Detection of Wideband Signal Number Based on Bootstrap Resampling
Directory of Open Access Journals (Sweden)
Jiaqi Zhen
2016-01-01
Knowing the source number correctly is a precondition for most spatial spectrum estimation methods; however, many snapshots are needed to determine the number of wideband signals. Therefore, a new method based on Bootstrap resampling is proposed in this paper. First, the signals are divided into several nonoverlapping subbands, and coherent signal methods (CSM) are applied to focus them on a single frequency. Then, the eigenvalues are fused with the corresponding eigenvectors of the focused covariance matrix. Subsequently, Bootstrap is used to construct the new resampling matrix. Finally, the number of wideband signals is calculated from the obtained vector sequences by a clustering technique. The method has a high probability of success under low signal-to-noise ratio (SNR) and a small number of snapshots.
Assessing Uncertainty in LULC Classification Accuracy by Using Bootstrap Resampling
Directory of Open Access Journals (Sweden)
Lin-Hsuan Hsiao
2016-08-01
Supervised land-use/land-cover (LULC) classifications are typically conducted using class assignment rules derived from a set of multiclass training samples. Consequently, classification accuracy varies with the training data set and is thus associated with uncertainty. In this study, we propose a bootstrap resampling and reclassification approach that can be applied to assess not only the uncertainty in the classification results of the bootstrap-training data sets, but also the classification uncertainty of individual pixels in the study area. Two measures of pixel-specific classification uncertainty, namely the maximum class probability and Shannon entropy, were derived from the class probability vector of individual pixels and used to identify unclassified pixels. Unclassified pixels identified using the traditional chi-square threshold technique represent outliers of individual LULC classes, but they are not necessarily associated with higher classification uncertainty. By contrast, unclassified pixels identified using the equal-likelihood technique are associated with higher classification uncertainty, and they mostly occur on or near the borders of different land-cover classes.
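A minimal sketch (not the authors' pipeline; the vote counts are hypothetical) of how the two pixel-level uncertainty measures can be computed once a pixel has been reclassified by many bootstrap-trained classifiers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: class labels assigned to one pixel by 100 classifiers,
# each trained on a different bootstrap-resampled training set.
n_classes = 4
labels = rng.integers(0, n_classes, size=100)

# Class probability vector of the pixel: vote shares over bootstrap runs.
p = np.bincount(labels, minlength=n_classes) / labels.size

# The two uncertainty measures named in the abstract:
max_class_prob = p.max()                                 # high = confident
shannon_entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))   # high = uncertain

print(f"max class probability {max_class_prob:.2f}, entropy {shannon_entropy:.2f}")
```

Pixels whose maximum class probability falls below a chosen cutoff (or whose entropy exceeds one) would be flagged as unclassified.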
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
Genetic divergence among cupuaçu accessions by multiscale bootstrap resampling
Directory of Open Access Journals (Sweden)
Vinicius Silva dos Santos
2015-06-01
This study aimed at investigating the genetic divergence of eighteen accessions of cupuaçu trees based on fruit morphometric traits, comparing usual methods of cluster analysis with the proposed multiscale bootstrap resampling methodology. The data were obtained from an experiment conducted in Tomé-Açu city (PA), Brazil, arranged in a completely randomized design with eighteen cupuaçu accessions and 10 repetitions, from 2004 to 2011. Genetic parameters were estimated by the restricted maximum likelihood/best linear unbiased prediction (REML/BLUP) methodology. The predicted breeding values were used in the study of genetic divergence through Unweighted Pair Group Method with Arithmetic Mean (UPGMA) hierarchical clustering and Tocher's optimization method based on standardized Euclidean distance. Clustering consistency and the optimal number of clusters in the UPGMA method were verified by the cophenetic correlation coefficient (CCC) and Mojena's criterion, respectively, besides the multiscale bootstrap resampling technique. The UPGMA clustering method with and without multiscale bootstrap resulted in four and five clusters, respectively, while Tocher's method resulted in seven clusters. The multiscale bootstrap resampling technique proved efficient for assessing the consistency of clustering in hierarchical methods and, consequently, the optimal number of clusters.
Shen, Meiyu; Machado, Stella G
2016-12-01
The estimates of CT,max/CR,max are obtained from the nonparametric bootstrap resampling samples and are used for the evaluation of bioequivalence for one-time sparse sampling data.
Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec
2008-07-01
Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatially variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low activation factors, and when no corrections were included in the reconstruction algorithm.
Takemoto, Seiji; Yamaoka, Kiyoshi; Nishikawa, Makiya; Takakura, Yoshinobu
2006-12-01
A bootstrap method is proposed for assessing statistical histograms of pharmacokinetic parameters (AUC, MRT, CL and V(ss)) from one-point sampling data in animal experiments. A computer program, MOMENT(BS), written in Visual Basic on Microsoft Excel, was developed for the bootstrap calculation and the construction of histograms. MOMENT(BS) was applied to one-point sampling data of the blood concentration of three physiologically active proteins ((111)In-labeled Hsp70, Suc(20)-BSA and Suc(40)-BSA) administered in different doses to mice. The histograms of AUC, MRT, CL and V(ss) were close to a normal (Gaussian) distribution when the bootstrap resampling number was 200 or more, judging by the skewness and kurtosis of the histograms. Good agreement of means and SD was obtained between the bootstrap and Bailer's approaches. The hypothesis test based on the normal distribution clearly demonstrated that the disposition of (111)In-Hsp70 and Suc(20)-BSA was almost independent of dose, whereas that of (111)In-Suc(40)-BSA was definitely dose-dependent. In conclusion, the bootstrap method was found to be an efficient method for assessing the histogram of pharmacokinetic parameters of blood or tissue disposition data by one-point sampling.
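The essence of bootstrapping one-point (destructive) sampling data can be sketched as follows. This is a hedged illustration with made-up concentrations, not the MOMENT(BS) program: at each time point a separate group of animals is resampled with replacement, and the AUC is recomputed by the trapezoidal rule for each pseudo-sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-point sampling data: at each time (h), a separate
# group of five mice each contributes one blood concentration.
times = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
conc = {t: rng.lognormal(mean=np.log(100 / t), sigma=0.2, size=5) for t in times}

def auc_trapezoid(mean_conc):
    # Trapezoidal AUC over the sampled time range.
    return float(np.sum((mean_conc[1:] + mean_conc[:-1]) / 2 * np.diff(times)))

# Bootstrap: resample animals within each time point, recompute the AUC.
boot_auc = []
for _ in range(200):          # resampling number 200, as in the abstract
    means = np.array([rng.choice(conc[t], size=conc[t].size, replace=True).mean()
                      for t in times])
    boot_auc.append(auc_trapezoid(means))

boot_auc = np.array(boot_auc)
print(f"AUC mean +/- SD: {boot_auc.mean():.1f} +/- {boot_auc.std(ddof=1):.1f}")
```

The resulting histogram of `boot_auc` is the kind of distribution whose normality the paper checks via skewness and kurtosis.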
Fingerprint resampling: A generic method for efficient resampling
Mestdagh, Merijn; Verdonck, Stijn; Duisters, Kevin; Tuerlinckx, Francis
2015-01-01
In resampling methods, such as bootstrapping or cross validation, a very similar computational problem (usually an optimization procedure) is solved over and over again for a set of very similar data sets. If it is computationally burdensome to solve this computational problem once, the whole resampling method can become unfeasible. However, because the computational problems and data sets are so similar, the speed of the resampling method may be increased by taking advantage of these similarities in method and data. As a generic solution, we propose to learn the relation between the resampled data sets and their corresponding optima. Using this learned knowledge, we are then able to predict the optima associated with new resampled data sets. First, these predicted optima are used as starting values for the optimization process. Once the predictions become accurate enough, the optimization process may even be omitted completely, thereby greatly decreasing the computational burden. The suggested method is validated using two simple problems (where the results can be verified analytically) and two real-life problems (i.e., the bootstrap of a mixed model and a generalized extreme value distribution). The proposed method led on average to a tenfold increase in speed of the resampling method. PMID:26597870
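The warm-start idea can be illustrated on a toy problem (a hedged sketch, not the authors' code): the MLE of an exponential rate is found by Newton's method, and a linear map from a resample's mean to its optimum, learned on the first few resamples, supplies starting values for the rest, cutting the iteration count.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.exponential(scale=2.0, size=200)   # true rate 0.5

def newton_mle_rate(x, lam0, tol=1e-10, max_iter=50):
    # Newton's method on the exponential log-likelihood; returns (MLE, iterations).
    n, s = x.size, x.sum()
    lam = lam0
    for it in range(1, max_iter + 1):
        score = n / lam - s
        if abs(score) < tol * n:
            return lam, it
        lam += score * lam**2 / n     # Newton step (Fisher information = n / lam^2)
    return lam, max_iter

resamples = [rng.choice(data, size=data.size, replace=True) for _ in range(200)]

# Phase 1: solve the first 20 resamples from a cold start and record
# (sample mean, optimum) pairs -- the "fingerprint" of the problem.
train, cold_iters = [], []
for x in resamples[:20]:
    lam_hat, it = newton_mle_rate(x, lam0=0.1)
    train.append((x.mean(), lam_hat))
    cold_iters.append(it)

# Phase 2: predict starting values for the remaining resamples from a
# linear fit of optimum on sample mean, then finish with Newton.
means, optima = map(np.array, zip(*train))
b, a = np.polyfit(means, optima, deg=1)
warm_iters = []
for x in resamples[20:]:
    lam_hat, it = newton_mle_rate(x, lam0=a + b * x.mean())
    warm_iters.append(it)

print(np.mean(cold_iters), np.mean(warm_iters))  # warm starts need fewer steps
```

In the paper's full scheme the prediction can eventually replace the optimization altogether once it is accurate enough; here it only supplies starting values.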
Sisvar: a Guide for its Bootstrap procedures in multiple comparisons
Directory of Open Access Journals (Sweden)
Daniel Furtado Ferreira
2014-04-01
Sisvar is a statistical analysis system widely used by the scientific community to produce statistical analyses and scientific results and conclusions. The wide use of Sisvar's statistical procedures by the scientific community is due to its being accurate, precise, simple and robust. Among its many analysis options, Sisvar has one that is not so widely used: multiple comparison procedures based on bootstrap approaches. This paper aims to review this subject and to show some advantages of using Sisvar to perform such analyses to compare treatment means. Tests like Dunnett, Tukey, Student-Newman-Keuls and Scott-Knott are performed alternatively by bootstrap methods and show greater power and better control of experimentwise type I error rates under non-normal, asymmetric, platykurtic or leptokurtic distributions.
Institute of Scientific and Technical Information of China (English)
Fang-Ling Tao; Shi-Fan Min; Wei-Jian Wu; Guang-Wen Liang; Ling Zeng
2008-01-01
Taking a published natural population life table of the rice leaf roller, Cnaphalocrocis medinalis (Lepidoptera: Pyralidae), as an example, we estimated the population trend index, I, via re-sampling methods (jackknife and bootstrap), determined its statistical properties and illustrated the application of these methods in determining the control effectiveness of bio-agents and chemical insecticides. Depending on the simulation outputs, the smoothed distribution pattern of the estimates of I by delete-1 jackknife is visually distinguishable from the normal density, but the smoothed pattern produced by delete-d jackknife, and the logarithm-transformed smoothed patterns produced by both empirical and parametric bootstraps, matched well the corresponding normal density. Thus, the estimates of I produced by delete-1 jackknife were not used to determine the suppressive effect of wasps and insecticides. The 95% confidence intervals, or the narrowest 95 percentiles, and the Z-test criterion were employed to compare the effectiveness of Trichogramma japonicum Ashmead and insecticides (powder, 1.5% mevinphos + 3% alpha-hexachlorocyclohexane) against the rice leaf roller based on the estimates of I produced by delete-d jackknife and bootstrap techniques. At the α = 0.05 level, there were statistical differences between the wasp treatment and the control, and between the wasp and insecticide treatments, if normality is ensured, or by the narrowest 95 percentiles. However, there was still no difference between the insecticide treatment and the control. By the Z-test criterion, the wasp treatment was better than the control and the insecticide treatment with P-value < 0.01. The insecticide treatment was similar to the control with P-value > 0.2, indicating that the 95% confidence intervals procedure is more conservative. Although similar conclusions may be drawn by re-sampling techniques, such as the delta method, about the suppressive effect of trichogramma and insecticides, the normality of the estimates can be checked and guaranteed
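The two re-sampling schemes compared above can be sketched generically. This is a hedged toy example, not the paper's life-table calculation: the trend index I is simplified to the mean number of next-generation offspring per individual, with hypothetical count data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical per-individual offspring counts reaching the next generation;
# I > 1 would indicate a growing population.
offspring = rng.poisson(lam=1.3, size=60)
n = offspring.size

stat = lambda x: x.mean()
I_hat = stat(offspring)

# Delete-1 jackknife: recompute I with each individual left out once.
jack = np.array([stat(np.delete(offspring, i)) for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

# Empirical bootstrap: recompute I on resamples drawn with replacement.
boot = np.array([stat(rng.choice(offspring, size=n, replace=True))
                 for _ in range(2000)])
se_boot = boot.std(ddof=1)

print(f"I = {I_hat:.3f}, SE(jackknife) = {se_jack:.4f}, SE(bootstrap) = {se_boot:.4f}")
```

For a mean-type statistic the two standard errors agree closely; the paper's point is that the *shape* of the delete-1 jackknife distribution can nevertheless depart from normality.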
Directory of Open Access Journals (Sweden)
Osmir José Lavoranti
2010-06-01
Reliable evaluation of the stability of genotypes and environments is of prime concern to plant breeders, but the lack of a comprehensive analysis of the structure of the genotype x environment (GE) interaction has been a stumbling block to the recommendation of varieties. The Additive Main Effects and Multiplicative Interaction (AMMI) model currently offers a good approach to the interpretation and understanding of the GE interaction, but lacks a way of assessing the stability of its estimates. The present contribution proposes the use of bootstrap resampling in the AMMI model, and applies it to obtain both a graphical and a numerical analysis of the phenotypic stability of 20 Eucalyptus grandis progenies from Australia that were planted in seven environments in the Southern and Southeastern regions of Brazil. The results showed distinct behaviors of genotypes and environments, and the genotype x environment interaction was significant (p < 0.01). The bootstrap coefficient of stability based on the squared Mahalanobis distance of the scores showed that genotypes and environments can be differentiated in terms of their stabilities. Graphical analysis of the AMMI biplot provided a better understanding of the interpretation of phenotypic stability. The proposed AMMI bootstrap eliminated the uncertainties regarding the identification of low scores in traditional analyses.
A bootstrap procedure to select hyperspectral wavebands related to tannin content
Ferwerda, J.G.; Skidmore, A.K.; Stein, A.
2006-01-01
Detection of hydrocarbons in plants with hyperspectral remote sensing is hampered by overlapping absorption pits, while the 'optimal' wavebands for detecting some surface characteristics (e.g. chlorophyll, lignin, tannin) may shift. We combined a phased regression with a bootstrap procedure to find
Directory of Open Access Journals (Sweden)
Izabela CHMIEL
2012-03-01
Aim: To determine and analyse an alternative methodology for the analysis of a set of Likert responses measured on a common attitudinal scale when the primary focus of interest is on the relative importance of items in the set, with primary application to health-related quality of life (HRQOL) measures. HRQOL questionnaires usually generate data that manifest evident departures from the fundamental assumptions of the Analysis of Variance (ANOVA) approach, not only because of their discrete, bounded and skewed distributions, but also due to significant correlation between mean scores and their variances. Material and Methods: A questionnaire survey with the SF-36 was conducted among 142 convalescents after acute pancreatitis. The estimated HRQOL scores were compared using multiple comparison procedures under Bonferroni-like adjustment, and using bootstrap procedures. Results: In the data set studied, with the SF-36 outcome, the use of multiple comparison and bootstrap procedures for analysing HRQOL data provides results quite similar to the conventional ANOVA and Rasch methods proposed within the frameworks of Classical Test Theory and Item Response Theory. Conclusions: These results suggest that multiple comparisons and the bootstrap are both valid methods for analysing HRQOL outcome data, particularly when the appropriateness of the standard methods is in doubt. Moreover, from a practical point of view, the results of the multiple comparison and bootstrap procedures seem much easier to interpret for non-statisticians aiming to practise evidence-based health care.
New resampling method for evaluating stability of clusters
Directory of Open Access Journals (Sweden)
Neuhaeuser Markus
2008-01-01
Background: Hierarchical clustering is a widely applied tool in the analysis of microarray gene expression data. The assessment of cluster stability is a major challenge in clustering procedures. Statistical methods are required to distinguish between real and random clusters. Several methods for assessing cluster stability have been published, including resampling methods such as the bootstrap. We propose a new resampling method based on continuous weights to assess the stability of clusters in hierarchical clustering. While in bootstrapping approximately one third of the original items is lost, continuous weights avoid zero elements and instead allow non-integer diagonal elements, which leads to retention of the full dimensionality of the space, i.e. each variable of the original data set is represented in the resampling sample. Results: Comparison of continuous weights and bootstrapping using real datasets and simulation studies reveals the advantage of continuous weights, especially when the dataset has only few observations, few differentially expressed genes, and the fold change of differentially expressed genes is low. Conclusion: We recommend the use of continuous weights in small as well as in large datasets, because according to our results they produce at least the same results as conventional bootstrapping and in some cases surpass it.
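The "one third of items lost" point, and how continuous weights avoid it, can be demonstrated numerically. This sketch uses Dirichlet weights as one concrete choice of continuous weights; it is an assumption for illustration, and the paper's weighting scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 100, 2000

# Standard bootstrap: integer resampling counts; many items get count 0.
lost = []
for _ in range(reps):
    counts = np.bincount(rng.integers(0, n, size=n), minlength=n)
    lost.append(np.mean(counts == 0))
print(f"mean fraction of items lost per bootstrap sample: {np.mean(lost):.3f}")
# theory: (1 - 1/n)^n ~ 1/e ~ 0.368 -- the "one third" in the abstract

# Continuous weights (here Dirichlet, scaled to mean weight 1): every item
# keeps a strictly positive weight, so no dimension of the data is lost.
w = rng.dirichlet(np.ones(n), size=reps) * n
print(f"minimum weight over all draws: {w.min():.2e} (> 0)")
```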
Bootstrap for the case-cohort design.
Huang, Yijian
2014-06-01
The case-cohort design facilitates economical investigation of risk factors in a large survival study, with covariate data collected only from the cases and a simple random subset of the full cohort. Methods that accommodate the design have been developed for various semiparametric models, but most inference procedures are based on asymptotic distribution theory. Such inference can be cumbersome to derive and implement, and does not permit confidence band construction. While bootstrap is an obvious alternative, how to resample is unclear because of complications from the two-stage sampling design. We establish an equivalent sampling scheme, and propose a novel and versatile nonparametric bootstrap for robust inference with an appealingly simple single-stage resampling. Theoretical justification and numerical assessment are provided for a number of procedures under the proportional hazards model.
Neal, Dan J; Simons, Jeffrey S
2007-12-01
Analysis of alcohol use data and other low base rate risk behaviors using ordinary least squares regression models can be problematic. This article presents 2 alternative statistical approaches, generalized linear models and bootstrapping, that may be more appropriate for such data. First, the basic theory behind the approaches is presented. Then, using a data set of alcohol use behaviors and consequences, results based on these approaches are contrasted with the results from ordinary least squares regression. The less traditional approaches consistently demonstrated better fit with model assumptions, as demonstrated by graphical analysis of residuals, and identified more significant variables potentially resulting in theoretically different interpretations of the models of alcohol use. In conclusion, these models show significant promise for furthering the understanding of alcohol-related behaviors.
Introductory statistics and analytics a resampling perspective
Bruce, Peter C
2014-01-01
A concise, thoroughly class-tested primer that features basic statistical concepts in the context of analytics, resampling, and the bootstrap. A uniquely developed presentation of key statistical topics, Introductory Statistics and Analytics: A Resampling Perspective provides an accessible approach to statistical analytics, resampling, and the bootstrap for readers with various levels of exposure to basic probability and statistics. Originally class-tested at one of the first online learning companies in the discipline, www.statistics.com, the book primarily focuses on application
Efficient p-value evaluation for resampling-based tests
Yu, K.
2011-01-05
The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculation of the test statistic on a large number of simulated data sets for its significance level assessment, and thus can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily for estimating the p-value for any resampling-based test. We show through numeric simulations that the proposed procedure can be 100 to 500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^-6). With its computational burden reduced by the proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.
2010-01-01
Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
Variance estimation in neutron coincidence counting using the bootstrap method
Energy Technology Data Exchange (ETDEWEB)
Dubi, C., E-mail: chendb331@gmail.com [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Ocherashvilli, A.; Ettegui, H. [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Pedersen, B. [Nuclear Security Unit, Institute for Transuranium Elements, Via E. Fermi, 2749 JRC, Ispra (Italy)
2015-09-11
In this study, we demonstrate the implementation of the "bootstrap" method for a reliable estimation of the statistical error in Neutron Multiplicity Counting (NMC) on plutonium samples. The "bootstrap" method estimates the variance of a measurement through a re-sampling process, in which a large number of pseudo-samples are generated and used to build the so-called bootstrap distribution. The aim of the present study is to give a full description of the bootstrapping procedure and to validate, through experimental results, the reliability of the estimated variance. Results indicate both a very good agreement between the measured variance and the variance obtained through the bootstrap method, and a robustness of the method with respect to the duration of the measurement and the bootstrap parameters.
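The generic bootstrap variance estimate behind such a study can be sketched as follows. The data here are hypothetical Poisson gate counts, not NMC measurements, and the statistic is simply the mean count rate.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical measurement: neutron counts in 500 one-second gates.
gates = rng.poisson(lam=12.0, size=500)

# Bootstrap: resample the gates with replacement, recompute the statistic,
# and take the variance of the bootstrap distribution.
boot = np.array([rng.choice(gates, size=gates.size, replace=True).mean()
                 for _ in range(1000)])
var_boot = boot.var(ddof=1)

# Sanity check against the textbook estimate var(mean) = s^2 / n.
var_classic = gates.var(ddof=1) / gates.size
print(f"bootstrap: {var_boot:.4f}, classical: {var_classic:.4f}")
```

The appeal in NMC is that the same recipe works unchanged for statistics whose variance has no simple closed form.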
Inferences of Coordinates in Multidimensional Scaling by a Bootstrapping Procedure in R
Kim, Donghoh; Kim, Se-Kang; Park, Soyeon
2015-01-01
Recently, MDS has been utilized to identify and evaluate cognitive ability latent profiles in a population. However, dimension coordinates do not carry any statistical properties. To cope with statistical incompetence of MDS, we investigated the common aspects of various studies utilizing bootstrapping, and provided an R function for its…
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
Directory of Open Access Journals (Sweden)
Larissa Ribeiro de Andrade
2014-01-01
The bootstrap method is generally performed by presupposing that each sample unit has the same probability of being re-sampled. However, when a sample with outliers is taken into account, the empirical distribution generated by this method may be influenced, or rather, may not accurately represent the original sample. The current study proposes a bootstrap algorithm that allows the use of measures of influence in the calculation of re-sampling probabilities. The method was reproduced in simulation scenarios taking into account the logistic growth curve model and the CovRatio measure to evaluate the impact of an influential observation on the determination of the covariance matrix of the parameter estimates. In most cases, bias estimates were reduced. Consequently, the method is suitable for non-linear models and allows the researcher to apply other measures for better bias reduction.
Resampling methods in Microsoft Excel® for estimating reference intervals.
Theodorsson, Elvar
2015-01-01
Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes native functions which lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5 and 97.5 percentiles. The purpose of this paper is to introduce the reader to resampling estimation techniques in general, and to using Microsoft Excel® 2010 for estimating reference intervals in particular. Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.
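The same bootstrap percentile estimate can be sketched outside Excel (a hedged illustration with simulated data; the choice of numpy's 'weibull' percentile method as the counterpart of Excel's exclusive interpolation is an assumption):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical skewed reference sample of 40 individuals (e.g. an analyte
# concentration) -- the small, non-Gaussian case where resampling is advised.
ref = rng.lognormal(mean=1.0, sigma=0.4, size=40)

B = 1000   # within the 500-1000 resamples recommended in the abstract
lo, hi = [], []
for _ in range(B):
    r = rng.choice(ref, size=ref.size, replace=True)
    # 'weibull' is the exclusive-style percentile interpolation (assumed
    # analogue of Excel's PERCENTILE.EXC).
    lo.append(np.percentile(r, 2.5, method="weibull"))
    hi.append(np.percentile(r, 97.5, method="weibull"))

print(f"bootstrap reference interval: {np.mean(lo):.2f} - {np.mean(hi):.2f}")
```

The spread of the `lo` and `hi` arrays also gives an indication of how uncertain the interval limits themselves are.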
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
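The core idea, resampling whole clusters so that within-cluster dependence is preserved, can be sketched with hypothetical longitudinal data. This simplified illustration uses a plain mean estimate rather than a full GEE fit:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical longitudinal data: 30 subjects, 5 correlated measurements each.
n_sub, n_obs = 30, 5
subject_effect = rng.normal(0, 1.0, size=n_sub)
y = subject_effect[:, None] + rng.normal(0, 0.5, size=(n_sub, n_obs))

# Cluster bootstrap: resample whole subjects, so within-subject
# dependence is carried intact into every pseudo-sample.
boot_means = np.array([y[rng.integers(0, n_sub, size=n_sub)].mean()
                       for _ in range(2000)])
se_cluster = boot_means.std(ddof=1)

# Naive bootstrap of individual observations ignores the dependence
# and understates the standard error.
flat = y.ravel()
naive = np.array([rng.choice(flat, size=flat.size, replace=True).mean()
                  for _ in range(2000)])
print(f"cluster SE: {se_cluster:.3f}, naive SE: {naive.std(ddof=1):.3f}")
```

In a GEE setting, each resampled set of clusters would be refitted and the spread of the coefficient estimates used for inference.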
On the Impact of Bootstrap in Stratified Random Sampling
Institute of Scientific and Technical Information of China (English)
LIU Cheng; ZHAO Lian-wen
2009-01-01
In general, the accuracy of the mean estimator can be improved by stratified random sampling. In this paper, we provide an idea, different from empirical methods, that the accuracy can be further improved through the bootstrap resampling method under some conditions. The determination of the sample size by the bootstrap method is also discussed, and a simulation is made to verify the accuracy of the proposed method. The simulation results show that the sample size based on bootstrapping is smaller than that based on the central limit theorem.
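A stratified bootstrap, in which resampling is done with replacement within each stratum so that every pseudo-sample respects the stratification, can be sketched as follows (hypothetical strata and sizes, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical population sample with three strata of different means and sizes.
strata = [rng.normal(mu, 1.0, size=n) for mu, n in [(10, 40), (20, 25), (35, 15)]]
weights = np.array([40, 25, 15]) / 80.0   # stratum shares of the population

def stratified_mean(samples):
    # Weighted combination of the stratum means.
    return sum(w * s.mean() for w, s in zip(weights, samples))

# Stratified bootstrap: resample within each stratum independently.
boot = np.array([
    stratified_mean([rng.choice(s, size=s.size, replace=True) for s in strata])
    for _ in range(2000)
])
print(f"estimate {stratified_mean(strata):.2f}, bootstrap SE {boot.std(ddof=1):.3f}")
```

The bootstrap SE of the stratified estimator is what a sample-size calculation of the kind described above would be based on.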
On Bootstrap Tests of Symmetry About an Unknown Median.
Zheng, Tian; Gastwirth, Joseph L
2010-07-01
It is important to examine the symmetry of an underlying distribution before applying some statistical procedures to a data set. For example, in the Zuni School District case, a formula originally developed by the Department of Education trimmed 5% of the data symmetrically from each end. The validity of this procedure was questioned at the hearing by Chief Justice Roberts. Most tests of symmetry (even nonparametric ones) are not distribution-free in finite sample sizes. Hence, using the asymptotic distribution may not yield an accurate type I error rate and/or may lose power in small samples. Bootstrap resampling from a symmetric empirical distribution function fitted to the data is proposed to improve the accuracy of the calculated p-value of several tests of symmetry. The results show that the bootstrap method is superior to previously used approaches relying on the asymptotic distribution of the tests, which assumed the data come from a normal distribution. Incorporating the bootstrap estimate in a recently proposed test due to Miao, Gel and Gastwirth (2006) preserved its level and showed that it has reasonable power properties for the family of distributions evaluated.
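Resampling from a symmetrized empirical distribution can be sketched as follows. This is a hedged toy version: the asymmetry statistic (mean minus median) and the reflection-about-the-median construction are illustrative choices, not the specific test studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical sample from a skewed distribution.
x = rng.exponential(scale=1.0, size=50)

def t_stat(v):
    # Simple asymmetry statistic: mean minus median (zero under symmetry).
    return v.mean() - np.median(v)

t_obs = t_stat(x)

# Symmetrize the empirical distribution about the sample median, then
# resample from it to approximate the null distribution of the statistic.
med = np.median(x)
symmetric = np.concatenate([x, 2 * med - x])   # data plus its mirror image
null = np.array([t_stat(rng.choice(symmetric, size=x.size, replace=True))
                 for _ in range(2000)])
p_value = np.mean(np.abs(null) >= abs(t_obs))
print(f"bootstrap p-value for symmetry: {p_value:.3f}")
```

Because the resampling distribution is built from the data rather than from a normal-theory approximation, the p-value does not rely on asymptotics.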
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
Confidence Intervals for Effect Sizes: Applying Bootstrap Resampling
Banjanovic, Erin S.; Osborne, Jason W.
2016-01-01
Confidence intervals for effect sizes (CIES) provide readers with an estimate of the strength of a reported statistic as well as the relative precision of the point estimate. These statistics offer more information and context than null hypothesis statistic testing. Although confidence intervals have been recommended by scholars for many years,…
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger;
We introduce a bootstrap procedure for high-frequency statistics of Brownian semistationary processes. More specifically, we focus on a hypothesis test on the roughness of sample paths of Brownian semistationary processes, which uses an estimator based on a ratio of realized power variations. In simulations we observe that the bootstrap-based hypothesis test provides considerable finite-sample improvements over an existing test that is based on a central limit theorem. This is important when studying the roughness properties of time series data; we illustrate this by applying the bootstrap method to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data.
Schick, Simon; Rössler, Ole; Weingartner, Rolf
2016-10-01
Based on a hindcast experiment for the period 1982-2013 in 66 sub-catchments of the Swiss Rhine, the present study compares two approaches of building a regression model for seasonal streamflow forecasting. The first approach selects a single "best guess" model, which is tested by leave-one-out cross-validation. The second approach implements the idea of bootstrap aggregating, where bootstrap replicates are employed to select several models, and out-of-bag predictions provide model testing. The target value is mean streamflow for durations of 30, 60 and 90 days, starting with the 1st and 16th day of every month. Compared to the best guess model, bootstrap aggregating reduces the mean squared error of the streamflow forecast by seven percent on average. Thus, if resampling is anyway part of the model building procedure, bootstrap aggregating seems to be a useful strategy in statistical seasonal streamflow forecasting. Since the improved accuracy comes at the cost of a less interpretable model, the approach might be best suited for pure prediction tasks, e.g. as in operational applications.
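The bootstrap-aggregating idea with out-of-bag testing described above can be sketched for a simple regression model. The predictor/target data below are synthetic stand-ins for climate indices and mean streamflow; nothing here reproduces the study's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (intercept plus one predictor).
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 1.5]) + rng.normal(scale=0.5, size=n)

B = 200
preds = np.full((B, n), np.nan)            # out-of-bag predictions
for b in range(B):
    idx = rng.integers(0, n, n)            # bootstrap replicate (with replacement)
    oob = np.setdiff1d(np.arange(n), idx)  # out-of-bag rows used for testing
    beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    preds[b, oob] = X[oob] @ beta

# Aggregate: average each observation's out-of-bag predictions,
# so model testing comes for free from the resampling itself.
y_hat = np.nanmean(preds, axis=0)
oob_mse = np.mean((y - y_hat) ** 2)
print(f"out-of-bag MSE: {oob_mse:.3f}")
```

As the abstract notes, if resampling is already part of model building, the out-of-bag predictions give an honest error estimate without a separate cross-validation loop.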
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap.This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
Resampling Methods Revisited: Advancing the Understanding and Applications in Educational Research
Bai, Haiyan; Pan, Wei
2008-01-01
Resampling methods including randomization test, cross-validation, the jackknife and the bootstrap are widely employed in the research areas of natural science, engineering and medicine, but they lack appreciation in educational research. The purpose of the present review is to revisit and highlight the key principles and developments of…
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
2012-01-01
The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics and the J-statistic for overidentifying restrictions …
Kim, Jae-In; Kim, Taejung
2016-03-22
Epipolar resampling is the procedure of eliminating vertical disparity between stereo images. Because of its importance, many methods have been developed in the computer vision and photogrammetry fields. However, we argue that epipolar resampling of image sequences, instead of a single pair, has not been studied thoroughly. In this paper, we compare epipolar resampling methods developed in both fields for handling image sequences. First, we briefly review the uncalibrated and calibrated epipolar resampling methods developed in computer vision, as well as photogrammetric epipolar resampling methods. While it is well known that the epipolar resampling methods developed in computer vision and in photogrammetry are mathematically identical, we also point out differences in parameter estimation between them. Second, we tested representative resampling methods from both fields and performed an analysis. We showed that for epipolar resampling of a single image pair all uncalibrated and photogrammetric methods tested could be used. More importantly, we also showed that, for image sequences, all methods tested, except the photogrammetric Bayesian method, showed significant variations in epipolar resampling performance. Our results indicate that the Bayesian method is favorable for epipolar resampling of image sequences.
Directory of Open Access Journals (Sweden)
André Luiz Missio
2014-02-01
Full Text Available The sampling sufficiency of the anatomical characteristics of fibres from Luehea divaricata wood was investigated through the bootstrap resampling method. Sampling sufficiency of fibre length, fibre diameter, lumen diameter and fibre wall thickness was determined. Three scenarios of sampling sufficiency evaluation were used: general, segregation between juvenile and mature wood, and accumulation throughout the life of the tree. The segregation of juvenile and mature wood showed that the sampling sufficiency in juvenile wood was higher than in mature wood, by up to three times for fibre length. In general, the results indicated higher values than those suggested in specific standards and by other authors. Therefore, the bootstrap resampling methodology, even though underused in forestry research, is an alternative for new studies, as it does not impose restrictive hypotheses such as data normality.
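A sampling-sufficiency check of the kind described above can be sketched with a bootstrap: find the smallest sample size at which the 95% CI half-width of the mean falls within a target precision. The fibre measurements, precision target, and step size below are all synthetic assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative fibre-length measurements (mm); values are synthetic.
fibres = rng.normal(1.1, 0.25, 300)

def halfwidth(sample, n, B=1000):
    """Bootstrap 95% CI half-width of the mean at subsample size n."""
    means = [rng.choice(sample, n, replace=True).mean() for _ in range(B)]
    lo, hi = np.percentile(means, [2.5, 97.5])
    return (hi - lo) / 2

# Sampling sufficiency: smallest n whose half-width is within 5% of the mean.
target = 0.05 * fibres.mean()
n_req = next(n for n in range(10, 301, 10) if halfwidth(fibres, n) <= target)
print("sufficient sample size:", n_req)
```

Because no normality assumption enters, the same sketch applies to skewed anatomical traits, which is the advantage the abstract highlights.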
Resampling methods for particle filtering: identical distribution, a new method, and comparable study
Institute of Scientific and Technical Information of China (English)
Tian-cheng LI; Gabriel VILLARRUBIA; Shu-dong SUN; Juan M CORCHADO; Javier BAJO
2015-01-01
Resampling is a critical procedure that is of both theoretical and practical significance for efficient implementation of the particle filter. To gain an insight into the resampling process and the filter, this paper contributes in three further respects as a sequel to the tutorial (Li et al., 2015). First, identical distribution (ID) is established as a general principle for resampling design, which requires the distribution of particles before and after resampling to be statistically identical. Three consistent metrics, including the (symmetrical) Kullback-Leibler divergence, the Kolmogorov-Smirnov statistic, and the sampling variance, are introduced for assessment of the ID attribute of resampling, and a corresponding qualitative ID analysis of representative resampling methods is given. Second, a novel resampling scheme that obtains the optimal ID attribute in the sense of minimum sampling variance is proposed. Third, more than a dozen typical resampling methods are compared via simulations in terms of sample size variation, sampling variance, computing speed, and estimation accuracy. These form a more comprehensive understanding of the algorithm, providing solid guidelines for either the selection of existing resampling methods or new implementations.
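The systematic-resampling baseline that such ID metrics are typically applied to can be sketched in a few lines. This is a generic textbook implementation, not the authors' code; the weights are a toy example.

```python
import numpy as np

rng = np.random.default_rng(7)

def systematic_resample(weights):
    """Systematic resampling: one uniform draw, then stratified positions.

    Returns indices of the resampled particles; the resampled set has
    (approximately) the same distribution as the weighted original,
    which is the 'identical distribution' property discussed above.
    """
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # one random offset
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against round-off
    return np.searchsorted(cumulative, positions)

# Toy example: four particles with normalized weights.
w = np.array([0.1, 0.2, 0.3, 0.4])
idx = systematic_resample(w / w.sum())
print(idx)   # heavier particles tend to appear more often
```

Systematic resampling is popular precisely because its sampling variance is low, which is one of the comparison metrics in the paper.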
A comparison of four different block bootstrap methods
Directory of Open Access Journals (Sweden)
Boris Radovanov
2014-12-01
Full Text Available The paper contains a description of four different block bootstrap methods: the non-overlapping block bootstrap, the overlapping block bootstrap (moving block bootstrap), the stationary block bootstrap, and subsampling. Furthermore, the basic goal of this paper is to quantify the relative efficiency of each mentioned block bootstrap procedure and then to compare those methods. To achieve this goal, we measure the mean square errors of the variance estimates of returns. The returns are calculated from 1250 daily observations of the Serbian stock market index BELEX15 from April 2009 to April 2014. Thereby, considering the effects of potential changes in decisions according to variations in the sample length and purposes of use, this paper introduces a stability analysis which contains robustness testing with different sample sizes and different block lengths. Testing results indicate some changes in bootstrap method efficiency when altering the sample size or the block length.
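The moving (overlapping) block bootstrap mentioned above can be sketched as follows. The series below is a synthetic AR(1) stand-in for daily index returns, and the block length of 20 is an arbitrary illustrative choice, not a value taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def moving_block_bootstrap(x, block_len):
    """Resample a series by concatenating random overlapping blocks,
    preserving the short-range dependence within each block."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

# Dependent toy series (AR(1) with coefficient 0.5).
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + rng.normal()

# Bootstrap distribution of the variance estimator.
boot_vars = [moving_block_bootstrap(x, 20).var(ddof=1) for _ in range(500)]
print(f"variance estimate: {x.var(ddof=1):.2f} "
      f"(bootstrap SE: {np.std(boot_vars):.2f})")
```

Comparing the spread of `boot_vars` under different block lengths is, in miniature, the robustness testing the paper carries out across the four methods.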
Echeverri, Alejandro Castedo; Serone, Marco
2016-01-01
We study the numerical bounds obtained using a conformal-bootstrap method - advocated in ref. [1] but never implemented so far - where different points in the plane of conformal cross ratios $z$ and $\bar z$ are sampled. In contrast to the most used method based on derivatives evaluated at the symmetric point $z=\bar z=1/2$, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and $O(n)$ vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the $O(n)$ vector models, with $n=2,3,4$, which have not yet been computed using bootstrap techniques.
Echeverri, Alejandro Castedo; von Harling, Benedict; Serone, Marco
2016-09-01
We study the numerical bounds obtained using a conformal-bootstrap method — advocated in ref. [1] but never implemented so far — where different points in the plane of conformal cross ratios z and z̄ are sampled. In contrast to the most used method based on derivatives evaluated at the symmetric point z = z̄ = 1/2, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n = 2, 3, 4, which have not yet been computed using bootstrap techniques.
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
Prabodh Shukla
2008-08-01
Bootstrap percolation transition may be first order or second order, or it may have a mixed character where a first-order drop in the order parameter is preceded by critical fluctuations. Recent studies have indicated that the mixed transition is characterized by power-law avalanches, while the continuous transition is characterized by truncated avalanches in a related sequential bootstrap process. We explain this behaviour on the basis of an analytical and numerical study of the avalanche distributions on a Bethe lattice.
Jongjoo, Kim; Davis, Scott K; Taylor, Jeremy F
2002-06-01
Empirical confidence intervals (CIs) for the estimated quantitative trait locus (QTL) location from selective and non-selective non-parametric bootstrap resampling methods were compared for a genome scan involving an Angus x Brahman reciprocal fullsib backcross population. Genetic maps, based on 357 microsatellite markers, were constructed for 29 chromosomes using CRI-MAP V2.4. Twelve growth, carcass composition and beef quality traits (n = 527-602) were analysed to detect QTLs utilizing (composite) interval mapping approaches. CIs were investigated for 28 likelihood ratio test statistic (LRT) profiles for the one QTL per chromosome model. The CIs from the non-selective bootstrap method were largest (87.7 cM average, or 79.2% coverage of test chromosomes). The Selective II procedure produced the smallest CI size (42.3 cM average). However, CI sizes from the Selective II procedure were more variable than those produced by the two-LOD drop method. CI ranges from the Selective II procedure were also asymmetrical (relative to the most likely QTL position) due to the bias caused by the tendency for the estimated QTL position to be at a marker position in the bootstrap samples, and due to the monotonicity and asymmetry of the LRT curve in the original sample.
Fung, Wing K; Yu, Kexin; Yang, Yingrui; Zhou, Ji-Yuan
2016-08-08
Monte Carlo evaluation of resampling-based tests is often conducted in statistical analysis. However, this procedure is generally computationally intensive. The pooling resampling-based method has been developed to reduce the computational burden but the validity of the method has not been studied before. In this article, we first investigate the asymptotic properties of the pooling resampling-based method and then propose a novel Monte Carlo evaluation procedure namely the n-times pooling resampling-based method. Theorems as well as simulations show that the proposed method can give smaller or comparable root mean squared errors and bias with much less computing time, thus can be strongly recommended especially for evaluating highly computationally intensive hypothesis testing procedures in genetic epidemiology.
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J.; Martín-Fernández, J.A.; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originating from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
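The core idea of re-imputing nondetects inside each bootstrap replicate, so that their uncertainty propagates, can be sketched as below. This uses a deliberately crude uniform imputation on a univariate mean; the paper itself favors model-based, log-ratio-compliant imputation for multivariate compositions. All data and the detection limit are synthetic.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic concentrations with a detection limit DL; values below DL
# are only known to lie in (0, DL). All numbers are illustrative.
DL = 0.5
true_conc = rng.lognormal(mean=0.0, sigma=0.8, size=200)
censored = true_conc < DL                      # nondetect flags

def bootstrap_mean_with_imputation(conc, censored, B=1000):
    """Percentile bootstrap of the mean, re-imputing nondetects in every
    replicate so that imputation uncertainty enters the interval."""
    n = len(conc)
    means = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)            # resample with replacement
        sample = conc[idx].copy()
        nd = censored[idx]
        sample[nd] = rng.uniform(0, DL, nd.sum())   # fresh imputation per draw
        means[b] = sample.mean()
    return np.percentile(means, [2.5, 97.5])

obs = np.where(censored, DL, true_conc)        # what the lab reports
lo, hi = bootstrap_mean_with_imputation(obs, censored)
print(f"95% CI for mean concentration: [{lo:.2f}, {hi:.2f}]")
```

Imputing once before resampling would understate uncertainty; drawing a fresh imputation inside each replicate is the structural point the abstract makes.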
Rejon-Barrera, Fernando
2015-01-01
We work out all of the details required for implementation of the conformal bootstrap program applied to the four-point function of two scalars and two vectors in an abstract conformal field theory in arbitrary dimension. This includes a review of which tensor structures make appearances, a construction of the projectors onto the required mixed symmetry representations, and a computation of the conformal blocks for all possible operators which can be exchanged. These blocks are presented as differential operators acting upon the previously known scalar conformal blocks. Finally, we set up the bootstrap equations which implement crossing symmetry. Special attention is given to the case of conserved vectors, where several simplifications occur.
BoCluSt: Bootstrap Clustering Stability Algorithm for Community Detection.
Garcia, Carlos
2016-01-01
The identification of modules or communities in sets of related variables is a key step in the analysis and modeling of biological systems. Procedures for this identification are usually designed to allow fast analyses of very large datasets and may produce suboptimal results when these sets are of a small to moderate size. This article introduces BoCluSt, a new, somewhat more computationally intensive, community detection procedure that is based on combining a clustering algorithm with a measure of stability under bootstrap resampling. Both computer simulation and analyses of experimental data showed that BoCluSt can outperform current procedures in the identification of multiple modules in data sets with a moderate number of variables. In addition, the procedure provides users with a null distribution of results to evaluate the support for the existence of community structure in the data. BoCluSt takes individual measures for a set of variables as input, and may be a valuable and robust exploratory tool of network analysis, as it provides 1) an estimation of the best partition of variables into modules, 2) a measure of the support for the existence of modular structures, and 3) an overall description of the whole structure, which may reveal hierarchical modular situations, in which modules are composed of smaller sub-modules.
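The gist of combining clustering with bootstrap stability can be sketched as follows: resample the variables, re-cluster each replicate, and record how often each pair of variables lands in the same module. This is a minimal stand-in (1-D data, a tiny hand-rolled 2-means), not the BoCluSt algorithm itself; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

def two_means(v, iters=20):
    """Minimal 1-D 2-means; returns a 0/1 label per element of v."""
    c = np.sort(rng.choice(v, 2, replace=False))
    for _ in range(iters):
        lab = np.abs(v[:, None] - c[None, :]).argmin(axis=1)
        # keep old center if a cluster momentarily empties
        c = np.array([v[lab == k].mean() if np.any(lab == k) else c[k]
                      for k in (0, 1)])
    return lab

# Twenty variables measured on two latent modules (synthetic).
x = np.concatenate([rng.normal(0, 0.3, 10), rng.normal(3, 0.3, 10)])

# Stability: how often each pair of variables is co-clustered across
# bootstrap replicates of the variable set.
B = 200
co = np.zeros((20, 20))
counts = np.zeros((20, 20))
for _ in range(B):
    idx = np.unique(rng.integers(0, 20, 20))   # variables drawn this round
    lab = two_means(x[idx])
    same = lab[:, None] == lab[None, :]
    co[np.ix_(idx, idx)] += same
    counts[np.ix_(idx, idx)] += 1
stability = co / np.maximum(counts, 1)
print(f"within-module stability: {stability[:10, :10].mean():.2f}")
```

High within-module and low between-module co-clustering rates are the kind of support-for-structure evidence the procedure reports against its null distribution.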
Franco, Glaura C.; Reisen, Valderio A.
2007-03-01
This paper deals with different bootstrap approaches and bootstrap confidence intervals in the fractionally autoregressive moving average (ARFIMA(p,d,q)) process [J. Hosking, Fractional differencing, Biometrika 68(1) (1981) 165-175] using parametric and semi-parametric estimation techniques for the memory parameter d. The bootstrap procedures considered are: the classical bootstrap in the residuals of the fitted model [B. Efron, R. Tibshirani, An Introduction to the Bootstrap, Chapman and Hall, New York, 1993], the bootstrap in the spectral density function [E. Paparoditis, D.N Politis, The local bootstrap for periodogram statistics. J. Time Ser. Anal. 20(2) (1999) 193-222], the bootstrap in the residuals resulting from the regression equation of the semi-parametric estimators [G.C Franco, V.A Reisen, Bootstrap techniques in semiparametric estimation methods for ARFIMA models: a comparison study, Comput. Statist. 19 (2004) 243-259] and the Sieve bootstrap [P. Bühlmann, Sieve bootstrap for time series, Bernoulli 3 (1997) 123-148]. The performance of these procedures and confidence intervals for d in the stationary and non-stationary ranges are empirically obtained through Monte Carlo experiments. The bootstrap confidence intervals here proposed are alternative procedures with some accuracy to obtain confidence intervals for d.
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, G.; Rahbek, Anders; Taylor, A.M.R.
2014-01-01
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates … We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability that the associated bootstrap sequential procedures select a rank smaller than the true rank converges to zero. This result is shown to hold for both the i.i.d. and wild bootstrap variants under conditional heteroskedasticity, but only for the latter under unconditional heteroskedasticity.
Comparison of interpolating methods for image resampling.
Parker, J; Kenyon, R V; Troxel, D E
1983-01-01
When resampling an image to a new set of coordinates (for example, when rotating an image), there is often a noticeable loss in image quality. To preserve image quality, the interpolating function used for the resampling should be an ideal low-pass filter. To determine which limited-extent convolving functions would provide the best interpolation, five functions were compared: A) nearest neighbor, B) linear, C) cubic B-spline, D) high-resolution cubic spline with edge enhancement (a = -1), and E) high-resolution cubic spline (a = -0.5). The functions which extend over four picture elements (C, D, E) were shown to have a better frequency response than those which extend over one (A) or two (B) pixels. The nearest neighbor function shifted the image by up to one-half a pixel. Linear and cubic B-spline interpolation tended to smooth the image. The best response was obtained with the high-resolution cubic spline functions. The location of the resampled points with respect to the initial coordinate system has a dramatic effect on the response of the sampled interpolating function: the data are exactly reproduced when the points are aligned, and the response has the most smoothing when the resampled points are equidistant from the original coordinate points. Thus, at the expense of some increase in computing time, image quality can be improved by resampling using the high-resolution cubic spline function rather than the nearest neighbor, linear, or cubic B-spline functions.
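The high-resolution cubic spline of case E corresponds to the standard cubic convolution kernel with a = -0.5. A 1-D sketch below demonstrates the aligned-points property noted in the abstract: when the resampled positions coincide with the original samples, the data are reproduced exactly. The signal and positions are illustrative.

```python
import numpy as np

def cubic_kernel(s, a=-0.5):
    """Cubic convolution kernel; a = -0.5 matches the 'high-resolution
    cubic spline' (case E) of the comparison."""
    s = np.abs(s)
    out = np.zeros_like(s)
    m1 = s <= 1
    m2 = (s > 1) & (s < 2)
    out[m1] = (a + 2) * s[m1] ** 3 - (a + 3) * s[m1] ** 2 + 1
    out[m2] = a * s[m2] ** 3 - 5 * a * s[m2] ** 2 + 8 * a * s[m2] - 4 * a
    return out

def resample_1d(signal, positions, kernel, support=2):
    """Resample a 1-D signal at fractional positions with a given kernel."""
    out = np.zeros(len(positions))
    for j, p in enumerate(positions):
        base = int(np.floor(p))
        for i in range(base - support + 1, base + support + 1):
            if 0 <= i < len(signal):
                out[j] += signal[i] * kernel(np.array([p - i]))[0]
    return out

x = np.sin(np.linspace(0, np.pi, 32))
# Aligned resampling reproduces the data exactly, as the paper notes:
aligned = resample_1d(x, np.arange(32, dtype=float), cubic_kernel)
print(np.allclose(aligned, x))   # True: the kernel is interpolating
```

The kernel equals 1 at offset 0 and 0 at all other integer offsets, which is exactly why aligned points pass through unchanged, while half-pixel positions receive the maximal smoothing.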
An approximate analytical approach to resampling averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, M.
2004-01-01
Using a novel reformulation, we develop a framework to compute approximate resampling data averages analytically. The method avoids multiple retraining of statistical models on the samples. Our approach uses a combination of the replica "trick" of statistical physics and the TAP approach for approximate inference.
Bootstrap Estimation for Nonparametric Efficiency Estimates
1995-01-01
This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...
Gap bootstrap methods for massive data sets with an application to transportation engineering
Lahiri, S.N.; Spiegelman, C.; Appiah, J.; Rilett, L.
2013-01-01
In this paper we describe two bootstrap methods for massive data sets. Naive applications of common resampling methodology are often impractical for massive data sets due to computational burden and due to complex patterns of inhomogeneity. In contrast, the proposed methods exploit certain structural properties of a large class of massive data sets to break up the original problem into a set of simpler subproblems, solve each subproblem separately where the data exhibit a …
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' (EFR) estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series was randomly resampled for different sample sizes (N = 500; ...; 1000), then subjected to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five EFR methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day, 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those EFR methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods and using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the assessment of EFR methods will be …
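A conventional-bootstrap uncertainty band for one of the flow-duration EFR statistics (Q90, the flow exceeded 90% of the time) can be sketched as below. The streamflow record is synthetic; note also that plain i.i.d. resampling ignores serial dependence in daily flows, which is precisely why the study also considers nearest-neighbor and moving-blocks variants.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic daily streamflow record (m^3/s), roughly lognormal.
flow = rng.lognormal(mean=1.0, sigma=0.6, size=24 * 365)

def q_exceed(x, p):
    """Flow exceeded p% of the time (a flow-duration-curve quantile)."""
    return np.percentile(x, 100 - p)

# Conventional bootstrap of the Q90 environmental-flow statistic.
boot_q90 = np.array([
    q_exceed(rng.choice(flow, len(flow), replace=True), 90)
    for _ in range(1000)
])
lo, hi = np.percentile(boot_q90, [2.5, 97.5])
print(f"Q90 = {q_exceed(flow, 90):.2f} m^3/s, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Repeating this for each EFR statistic and each resampling scheme, then comparing the resulting intervals, is the essence of the assessment described above.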
Collier, Scott; Yin, Xi
2016-01-01
We constrain the spectrum of two-dimensional unitary, compact conformal field theories with central charge c > 1 using modular bootstrap. Upper bounds on the gap in the dimension of primary operators of any spin, as well as in the dimension of scalar primaries, are computed numerically as functions of the central charge using semi-definite programming. Our bounds refine those of Hellerman and Friedan-Keller, and are in some cases saturated by known CFTs. In particular, we show that unitary CFTs with c < 8 must admit relevant deformations, and that a nontrivial bound on the gap of scalar primaries exists for c < 25. We also study bounds on the dimension gap in the presence of twist gaps, bounds on the degeneracy of operators, and demonstrate how "extremal spectra" which maximize the degeneracy at the gap can be determined numerically.
Testing for heteroscedasticity in jumpy and noisy high-frequency data: A resampling approach
DEFF Research Database (Denmark)
Christensen, Kim; Hounyo, Ulrich; Podolskij, Mark
… in the presence of a heteroscedastic volatility term (and has a standard normal distribution otherwise). The test is inspected in a general Monte Carlo simulation setting, where we note that in finite samples the asymptotic theory is severely distorted by infinite-activity price jumps. To improve inference, we suggest a bootstrap approach to test the null of homoscedasticity. We prove the first-order validity of this procedure, while in simulations the bootstrap leads to almost correctly sized tests. As an illustration, we apply the bootstrapped version of our t-statistic to a large cross-section of equity high-frequency data …
Bootstrapping quarks and gluons
Energy Technology Data Exchange (ETDEWEB)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity ±1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Full Text Available Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks Q. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons G exist as longitudinal modes V_L of the weak bosons and would couple to Q with Yukawa coupling λ_Q. With m_Q ≳ 700 GeV from the LHC, the strong λ_Q ≳ 4 could lead to deeply bound QQ̄ states. We postulate that the leading "collapsed state," the color-singlet (heavy) isotriplet, pseudoscalar QQ̄ meson π_1, is G itself, and a gap equation without a Higgs is constructed. Dynamical symmetry breaking is effected via strong λ_Q, generating m_Q while self-consistently justifying treating G as massless in the loop; hence, "bootstrap." Solving such a gap equation, we find that m_Q should be several TeV, or λ_Q ≳ 4π, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find an analogy with the π-N system, by which we conjecture possible annihilation phenomena of QQ̄ → nV_L with high multiplicity, the search for which might be aided by Yukawa-bound QQ̄ resonances.
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
Bootstrapping quality of Web Services
Directory of Open Access Journals (Sweden)
Zainab Aljazzaf
2015-07-01
Full Text Available A distributed application may be composed of global services provided by different organizations and having different properties. To select a service from many similar services, it is important to distinguish between them. Quality of services (QoS has been used as a distinguishing factor between similar services and plays an important role in service discovery, selection, and composition. Moreover, QoS is an important contributing factor to the evolution of distributed paradigms, such as service-oriented computing and cloud computing. There are many research works that assess services and justify the QoS at the finding, composition, or binding stages of services. However, there is a need to justify the QoS once new services are registered and before any requestors use them; this is called bootstrapping QoS. Bootstrapping QoS is the process of evaluating the QoS of the newly registered services at the time of publishing the services. Thus, this paper proposes a QoS bootstrapping solution for Web Services and builds a QoS bootstrapping framework. In addition, Service Oriented Architecture (SOA is extended and a prototype is built to support QoS bootstrapping. Experiments are conducted and a case study is presented to test the proposed QoS bootstrapping solution.
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker......" variance estimator derived from the "small bandwidth" asymptotic framework. The results of a small-scale Monte Carlo experiment are found to be consistent with the theory and indicate in particular that sensitivity with respect to the bandwidth choice can be ameliorated by using the "robust...
Increased-diversity systematic resampling in particle filtering for BLAST
Institute of Scientific and Technical Information of China (English)
Zheng Jianping; Bai Baoming; Wang Xinmei
2009-01-01
Two variants of systematic resampling (S-RS) are proposed to increase the diversity of particles and thereby improve the performance of particle filtering when it is utilized for detection in Bell Laboratories Layered Space-Time (BLAST) systems. In the first variant, a Markov chain Monte Carlo transition is integrated into the S-RS procedure to increase the diversity of particles with large importance weights. In the second, all particles are first partitioned into two sets according to their importance weights, and then a double S-RS is introduced to increase the diversity of particles with small importance weights. Simulation results show that both variants can improve the bit error performance efficiently compared with the standard S-RS, with little added complexity.
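As a point of reference for the two variants, standard systematic resampling (S-RS) can be sketched as follows; the weights below are illustrative, not taken from the paper:

```python
import random

def systematic_resample(weights, rng):
    """Standard S-RS: one uniform draw, stratified across n equally spaced positions."""
    n = len(weights)
    u0 = rng.random() / n
    positions = [u0 + i / n for i in range(n)]
    # Cumulative weights.
    cum, s = [], 0.0
    for w in weights:
        s += w
        cum.append(s)
    # Walk positions and cumulative weights in lockstep.
    indices, j = [], 0
    for pos in positions:
        while pos >= cum[j]:
            j += 1
        indices.append(j)
    return indices

rng = random.Random(0)
idx = systematic_resample([0.1, 0.2, 0.3, 0.4], rng)
```

Because a single uniform draw positions all n strata, each particle is replicated either floor(n·w) or ceil(n·w) times, which is the low-variance property the paper's variants build on.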
Liang, Rong; Zhou, Shu-dong; Li, Li-xia; Zhang, Jun-guo; Gao, Yan-hui
2013-09-01
This paper aims to achieve bootstrapping in hierarchical data and to provide a method for estimating the confidence interval (CI) of the intraclass correlation coefficient (ICC). First, we use a mixed-effects model to estimate the ICC from repeated-measurement data and from two-stage sampling data. Then, we use the bootstrap method to estimate the CIs of the related ICCs. Finally, the influence of different bootstrapping strategies on the ICC's CIs is compared. The repeated-measurement example shows that the CI from cluster bootstrapping contains the true ICC value, whereas random bootstrapping, which ignores the hierarchical structure of the data, yields an invalid CI. Results from the two-stage example show bias among the cluster-bootstrapped ICC means; the ICC of the original sample is the smallest, but has a wide CI. It is necessary to treat the structure of the data as important when hierarchical data are resampled. Bootstrapping appears to perform better at higher levels than at lower levels.
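The cluster bootstrap idea, resampling whole clusters so the hierarchy stays intact, can be sketched as follows. The data, the one-way ANOVA ICC estimator, and the percentile interval are illustrative stand-ins for the paper's mixed-effects approach:

```python
import random

def icc1(clusters):
    """One-way random-effects ICC for a balanced design (ANOVA estimator)."""
    k = len(clusters[0])          # measurements per cluster
    n = len(clusters)             # number of clusters
    grand = sum(sum(c) for c in clusters) / (n * k)
    msb = k * sum((sum(c) / k - grand) ** 2 for c in clusters) / (n - 1)
    msw = sum((x - sum(c) / k) ** 2 for c in clusters for x in c) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = random.Random(0)
data = [[10.2, 10.5, 10.1], [12.0, 11.8, 12.3], [9.5, 9.9, 9.7], [11.1, 11.4, 10.9]]
point = icc1(data)

# Cluster bootstrap: resample whole clusters, never individual observations.
reps = sorted(
    icc1([data[rng.randrange(len(data))] for _ in range(len(data))])
    for _ in range(2000)
)
ci = (reps[int(0.025 * 2000)], reps[int(0.975 * 2000)])
```

Resampling individual observations instead of clusters would break the within-cluster correlation, which is exactly the failure mode the abstract reports for "random" bootstrapping.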
The bootstrap in bioequivalence studies.
Pigeot, Iris; Hauschke, Dieter; Shao, Jun
2011-11-01
In 1997, the U.S. Food and Drug Administration (FDA) suggested in its draft guidance the use of new concepts for assessing the bioequivalence of two drug formulations, namely, the concepts of population and individual bioequivalence. Aggregate moment-based and probability-based measures of bioequivalence were introduced to derive criteria in order to decide whether two formulations should be regarded as bioequivalent or not. The statistical decision may be made via a nonparametric bootstrap percentile interval. In this article, we review the history of population and individual bioequivalence with special focus on the role of the bootstrap in this context.
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
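A toy version of this innovation-resampling scheme, with a hypothetical AR(1) model for monthly log-flows (the coefficient, history, and drought threshold below are invented for illustration), might look like:

```python
import random

rng = random.Random(42)

# Hypothetical monthly log-flow history and fitted AR(1) coefficient.
history = [2.3, 2.1, 2.6, 2.4, 2.0, 2.2, 2.5, 2.7, 2.1, 1.9, 2.4, 2.3]
phi, mean = 0.6, sum(history) / len(history)

# Model residuals (innovations); resampled rather than assumed normal.
resid = [history[t] - mean - phi * (history[t - 1] - mean)
         for t in range(1, len(history))]

def simulate_trace(months, start):
    """One bootstrap flow sequence: AR(1) driven by resampled residuals."""
    x, out = start, []
    for _ in range(months):
        x = mean + phi * (x - mean) + rng.choice(resid)
        out.append(x)
    return out

# Condition every trace on the current (most recent) flow.
ensemble = [simulate_trace(6, history[-1]) for _ in range(500)]
# Probability that flow drops below a hypothetical drought threshold in 6 months.
p_low = sum(min(tr) < 1.8 for tr in ensemble) / len(ensemble)
```

In the actual position analysis the traces would drive a reservoir simulation model; here the threshold on the flows themselves stands in for that step.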
Resampling: An improvement of importance sampling in varying population size models.
Merle, C; Leblois, R; Rousset, F; Pudlo, P
2017-04-01
Sequential importance sampling algorithms have been defined to estimate likelihoods in models of ancestral population processes. However, these algorithms are based on features of the models with constant population size, and become inefficient when the population size varies in time, making likelihood-based inferences difficult in many demographic situations. In this work, we modify a previous sequential importance sampling algorithm to improve the efficiency of the likelihood estimation. Our procedure is still based on features of the model with constant size, but uses a resampling technique with a new resampling probability distribution depending on the pairwise composite likelihood. We tested our algorithm, called sequential importance sampling with resampling (SISR) on simulated data sets under different demographic cases. In most cases, we divided the computational cost by two for the same accuracy of inference, in some cases even by one hundred. This study provides the first assessment of the impact of such resampling techniques on parameter inference using sequential importance sampling, and extends the range of situations where likelihood inferences can be easily performed.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interest were nonnormal Likert-type and binary items.…
Institute of Scientific and Technical Information of China (English)
郑蓉建; 周林成; 潘丰
2012-01-01
Fault monitoring of bioprocesses is important to ensure reactor safety and maintain high product quality. It is difficult to build an accurate mechanistic model for a bioprocess, so fault monitoring based on a rich historical or online database is an effective alternative. Resampling groups of data stochastically with the bootstrap method can improve the generalization capability of a model. In this paper, online fault monitoring using generalized additive models (GAMs) combined with the bootstrap is proposed for a glutamate fermentation process. GAMs and the bootstrap are first used to determine confidence intervals based on the online and off-line normal sampled data from glutamate fermentation experiments. The GAMs are then used for online fault monitoring of time, dissolved oxygen, oxygen uptake rate, and carbon dioxide evolution rate. The method provides accurate fault alarms online and helps supply useful information for removing faults and abnormal phenomena in the fermentation.
Einecke, Sabrina; Bissantz, Nicolai; Clevermann, Fabian; Rhode, Wolfgang
2016-01-01
Astroparticle experiments such as IceCube or MAGIC require a deconvolution of their measured data with respect to the response function of the detector to provide the distributions of interest, e.g. energy spectra. In this paper, appropriate uncertainty limits that also allow conclusions to be drawn about the geometric shape of the underlying distribution are determined using bootstrap methods, which are frequently applied in statistical applications. Bootstrap is a collective term for resampling methods that can be employed to approximate unknown probability distributions or features thereof. A clear advantage of bootstrap methods is their wide range of applicability. For instance, they yield reliable results even if the usual normality assumption is violated. The use, meaning and construction of uncertainty limits to any user-specific confidence level in the form of confidence intervals and levels are discussed. The precise algorithms for the implementation of these methods, applicable for any deconvolution algor...
Bootstrap Current in Spherical Tokamaks
Institute of Scientific and Technical Information of China (English)
王中天; 王龙
2003-01-01
A variational principle for neoclassical theory has been developed by including a momentum-restoring term in the electron-electron collision operator, which gives an additional free parameter maximizing the heat production rate. All transport coefficients are obtained, including the bootstrap current. The essential feature of the study is that the aspect ratio affects the function of the electron-electron collision operator through a geometrical factor. When the aspect ratio approaches unity, the fraction of circulating particles goes to zero and the contribution to the particle flux from electron-electron collisions vanishes. The resulting diffusion coefficient is in rough agreement with Hazeltine. When the aspect ratio approaches infinity, the results are in agreement with Rosenbluth. The formalism connects these two extreme cases. The theory is particularly important for the calculation of the bootstrap current in spherical tokamaks and in present tokamaks, in which the square root of the inverse aspect ratio is, in general, not small.
Conformal Bootstrap in Mellin Space
Gopakumar, Rajesh; Sen, Kallol; Sinha, Aninda
2016-01-01
We propose a new approach towards analytically solving for the dynamical content of Conformal Field Theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the epsilon expansion of the Wilson-Fisher fixed point by computing operator dimensions and, strikingly, OPE coefficients to higher orders in epsilon than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement of certain observables in the 3d Ising model, with the precise numerical values that...
Colin, Yannick; Goñi-Urriza, Marisol; Caumette, Pierre; Guyoneaud, Rémy
2015-03-01
The development of new high-throughput cultivation methods aims to increase isolation efficiency compared to standard techniques, which often require enrichment procedures to compensate for low microbial recovery. In the current study, estuarine sulfate-reducing bacteria were isolated using an anaerobic isolation procedure in 384-well microplates. Ninety-nine strains were recovered from the initial sediments. Isolates were identified according to their partial 16S rRNA sequences and clustered into 13 phylotypes. In addition, the increase in species richness obtained through enrichment or resampling was investigated. Forty-four enrichment procedures were conducted, and shifts in sulfate-reducing bacterial communities were investigated through dsrAB gene fingerprinting. Despite efforts in conducting numerous enrichment conditions, only a few of them were statistically different from the initial sample. The culturable diversity obtained from 3 of the most divergent enrichments, as well as from resampled sediments, equally contributed to raising the sulfate-reducing diversity to 22 phylotypes. Enrichment (selection of metabolism) or resampling (transient populations and micro-heterogeneity) may still be helpful for assessing new microbial phylotypes. Nevertheless, the newly cultivated strains were all representatives of minor Operational Taxonomic Units and could eventually be recovered by maintaining a high-throughput isolation effort from the initial sediments.
NAIP Aerial Imagery (Resampled), Salton Sea - 2005 [ds425
California Department of Resources — NAIP 2005 aerial imagery that has been resampled from 1-meter source resolution to approximately 30-meter resolution. This is a mosaic composed from several NAIP...
Medical Image Retrieval Based on Multi-Layer Resampling Template
Institute of Scientific and Technical Information of China (English)
WANG Xin-rui; YANG Yun-feng
2014-01-01
Medical images are being used ever more widely in clinical diagnosis and treatment; how to manage the large number of images in an image management system, and how to assist doctors in analysis and diagnosis, are therefore very important issues. This paper studies medical image retrieval based on a multi-layer resampling template under the framework of wavelet decomposition. The image retrieval method consists of two stages, coarse retrieval and fine retrieval. The coarse retrieval stage retrieves medical images based on image contour features. The fine retrieval stage retrieves medical images based on the multi-layer resampling template: a multi-layer sampling operator is employed to extract a resampled image at each layer, and these resampled images are then retrieved step by step, completing the process from coarse to fine retrieval.
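A minimal sketch of the multi-layer resampling idea, building a pyramid of progressively coarser images by 2x2 block averaging (a simple stand-in for the paper's wavelet-style sampling operator; the image is synthetic):

```python
def downsample(img):
    """Halve resolution by averaging each 2x2 block (one resampling layer)."""
    return [
        [
            (img[2 * i][2 * j] + img[2 * i][2 * j + 1]
             + img[2 * i + 1][2 * j] + img[2 * i + 1][2 * j + 1]) / 4.0
            for j in range(len(img[0]) // 2)
        ]
        for i in range(len(img) // 2)
    ]

# Hypothetical 8x8 grayscale "image".
img = [[float((i + j) % 4) for j in range(8)] for i in range(8)]

# Multi-layer pyramid: each level is a coarser resampling of the previous one.
pyramid = [img]
while len(pyramid[-1]) > 1:
    pyramid.append(downsample(pyramid[-1]))
```

Coarse-to-fine retrieval would then compare query and database images level by level, starting from the smallest layer and refining only the surviving candidates.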
On adaptive resampling strategies for sequential Monte Carlo methods
Del Moral, Pierre; Doucet, Arnaud; Jasra, Ajay
2012-01-01
Sequential Monte Carlo (SMC) methods are a class of techniques to sample approximately from any sequence of probability distributions using a combination of importance sampling and resampling steps. This paper is concerned with the convergence analysis of a class of SMC methods where the times at which resampling occurs are computed online using criteria such as the effective sample size. This is a popular approach amongst practitioners but there are very few convergence results available for...
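The effective-sample-size criterion mentioned in this abstract can be sketched in a few lines; the particles, weights, and threshold below are illustrative:

```python
import random

def ess(weights):
    """Effective sample size of normalized importance weights."""
    return 1.0 / sum(w * w for w in weights)

def maybe_resample(particles, weights, rng, threshold=0.5):
    """Multinomial resampling, triggered only when ESS falls below threshold * N."""
    n = len(particles)
    if ess(weights) < threshold * n:
        particles = rng.choices(particles, weights=weights, k=n)
        weights = [1.0 / n] * n  # weights reset to uniform after resampling
    return particles, weights

rng = random.Random(0)
# Highly uneven weights: ESS is near 1, so resampling fires.
p, w = maybe_resample(["a", "b", "c", "d"], [0.97, 0.01, 0.01, 0.01], rng)
```

Resampling only adaptively, rather than at every step, avoids unnecessary Monte Carlo variance when the weights are still well balanced, which is precisely the regime the paper's convergence analysis addresses.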
Model-Consistent Sparse Estimation through the Bootstrap
Bach, Francis
2009-01-01
We consider the least-squares linear regression problem with regularization by the $\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
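The Bolasso procedure, intersecting Lasso supports across bootstrap replications, can be sketched as below. The coordinate-descent Lasso, the regularization level, and the simulated data are all illustrative choices, not the paper's exact setup:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent Lasso with soft-thresholding (illustrative)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j, then soft-threshold.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
true_beta = np.array([3.0, -2.0, 1.5, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ true_beta + 0.5 * rng.standard_normal(n)

# Bolasso: intersect the Lasso supports over bootstrap replications.
support = set(range(p))
for _ in range(32):
    idx = rng.integers(0, n, size=n)
    b = lasso_cd(X[idx], y[idx], lam=30.0)
    support &= {j for j in range(p) if abs(b[j]) > 1e-6}
selected = sorted(support)
```

A spurious variable must survive every single replication to remain in the intersection, which is why the irrelevant variables (selected with merely positive probability on each run) are eliminated.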
Bootstrap percolation on spatial networks
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of some networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links’ lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, mixed of a second-order phase transition and a hybrid phase transition with two varying critical points, otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value -1 is just equal or very close to the values of many real online social networks, including LiveJournal, HP Labs email network, Belgian mobile phone network, etc. This work helps us in better understanding the self-organization of spatial structure of online social networks, in terms of the effective function for information spreading.
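The threshold activation process described in this abstract can be written compactly; the small network, seed set, and threshold below are invented for illustration:

```python
def bootstrap_percolation(adj, seeds, threshold):
    """Activate seeds, then any node with >= threshold active neighbors, until stable."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in active and sum(nb in active for nb in nbrs) >= threshold:
                active.add(node)
                changed = True
    return active

# Small hypothetical undirected network (adjacency lists).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
final = bootstrap_percolation(adj, seeds={0, 1}, threshold=2)
```

The size of `final` (the giant active component in a large graph) is the order parameter whose behavior as a function of the seed probability and link-length exponent the paper studies.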
Statistical Analysis of Random Simulations : Bootstrap Tutorial
Deflandre, D.; Kleijnen, J.P.C.
2002-01-01
The bootstrap is a simple but versatile technique for the statistical analysis of random simulations. This tutorial explains the basics of that technique and applies it to the well-known M/M/1 queuing simulation. In that numerical example, different responses are studied. For some responses, bootstrap…
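A minimal version of the tutorial's exercise, simulating M/M/1 waiting times via Lindley's recursion and bootstrapping the mean response (naively treating the waits as independent, which a careful analysis would need to address):

```python
import random

rng = random.Random(7)
lam, mu = 0.8, 1.0  # arrival rate, service rate (traffic intensity 0.8)

# Lindley's recursion for successive waiting times in an M/M/1 queue.
w, waits = 0.0, []
for _ in range(5000):
    w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
    waits.append(w)

# Percentile bootstrap for the mean wait.
n = len(waits)
means = sorted(sum(rng.choices(waits, k=n)) / n for _ in range(300))
ci = (means[int(0.025 * 300)], means[int(0.975 * 300)])
```

Successive waiting times are autocorrelated, so in practice one would bootstrap batch means or replications rather than individual observations; the sketch only shows the mechanics.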
Analytic bootstrap at large spin
Kaviraj, Apratim; Sinha, Aninda
2015-01-01
We use analytic conformal bootstrap methods to determine the anomalous dimensions and OPE coefficients for large spin operators in general conformal field theories in four dimensions containing a scalar operator of conformal dimension $\\Delta_\\phi$. It is known that such theories will contain an infinite sequence of large spin operators with twists approaching $2\\Delta_\\phi+2n$ for each integer $n$. By considering the case where such operators are separated by a twist gap from other operators at large spin, we analytically determine the $n$, $\\Delta_\\phi$ dependence of the anomalous dimensions. We find that for all $n$, the anomalous dimensions are negative for $\\Delta_\\phi$ satisfying the unitarity bound, thus extending the Nachtmann theorem to non-zero $n$. In the limit when $n$ is large, we find agreement with the AdS/CFT prediction corresponding to the Eikonal limit of a 2-2 scattering with dominant graviton exchange.
Bishara, Anthony J; Hittner, James B
2012-09-01
It is well known that when data are nonnormally distributed, a test of the significance of Pearson's r may inflate Type I error rates and reduce power. Statistics textbooks and the simulation literature provide several alternatives to Pearson's correlation. However, the relative performance of these alternatives has been unclear. Two simulation studies were conducted to compare 12 methods, including Pearson, Spearman's rank-order, transformation, and resampling approaches. With most sample sizes (n ≥ 20), Type I and Type II error rates were minimized by transforming the data to a normal shape prior to assessing the Pearson correlation. Among transformation approaches, a general purpose rank-based inverse normal transformation (i.e., transformation to rankit scores) was most beneficial. However, when samples were both small (n ≤ 10) and extremely nonnormal, the permutation test often outperformed other alternatives, including various bootstrap tests.
On adaptive resampling strategies for sequential Monte Carlo methods
Del Moral, Pierre; Jasra, Ajay; 10.3150/10-BEJ335
2012-01-01
Sequential Monte Carlo (SMC) methods are a class of techniques to sample approximately from any sequence of probability distributions using a combination of importance sampling and resampling steps. This paper is concerned with the convergence analysis of a class of SMC methods where the times at which resampling occurs are computed online using criteria such as the effective sample size. This is a popular approach amongst practitioners but there are very few convergence results available for these methods. By combining semigroup techniques with an original coupling argument, we obtain functional central limit theorems and uniform exponential concentration estimates for these algorithms.
Simulating ensembles of source water quality using a K-nearest neighbor resampling approach.
Towler, Erin; Rajagopalan, Balaji; Seidel, Chad; Summers, R Scott
2009-03-01
Climatological, geological, and water management factors can cause significant variability in surface water quality. As drinking water quality standards become more stringent, the ability to quantify the variability of source water quality becomes more important for decision-making and planning in water treatment for regulatory compliance. However, paucity of long-term water quality data makes it challenging to apply traditional simulation techniques. To overcome this limitation, we have developed and applied a robust nonparametric K-nearest neighbor (K-nn) bootstrap approach utilizing the United States Environmental Protection Agency's Information Collection Rule (ICR) data. In this technique, first an appropriate "feature vector" is formed from the best available explanatory variables. The nearest neighbors to the feature vector are identified from the ICR data and are resampled using a weight function. Repetition of this results in water quality ensembles, and consequently the distribution and the quantification of the variability. The main strengths of the approach are its flexibility, simplicity, and the ability to use a large amount of spatial data with limited temporal extent to provide water quality ensembles for any given location. We demonstrate this approach by applying it to simulate monthly ensembles of total organic carbon for two utilities in the U.S. with very different watersheds and to alkalinity and bromide at two other U.S. utilities.
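The K-nn bootstrap described here can be sketched as below. The records, feature vector, and the 1/rank weight kernel (a common choice in K-nn resampling, assumed here rather than taken from the paper) are all illustrative:

```python
import random

rng = random.Random(3)

# Hypothetical (temperature, flow) -> total organic carbon (TOC) records.
records = [
    ((12.0, 3.1), 4.2), ((14.5, 2.8), 4.8), ((11.0, 3.5), 3.9),
    ((20.1, 1.9), 6.1), ((19.0, 2.2), 5.7), ((13.2, 3.0), 4.4),
    ((21.5, 1.7), 6.4), ((10.5, 3.8), 3.6),
]

def knn_resample(query, k, n_draws):
    """Draw TOC values from the k nearest neighbors, weighting closer ones more."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(feat, query)), toc) for feat, toc in records
    )
    nbrs = [toc for _, toc in dists[:k]]
    weights = [1.0 / (i + 1) for i in range(k)]  # 1/rank kernel, assumed
    return rng.choices(nbrs, weights=weights, k=n_draws)

# Ensemble of simulated TOC values for a hypothetical site condition.
ensemble = knn_resample(query=(13.0, 3.0), k=3, n_draws=1000)
```

The spread of the ensemble quantifies the water quality variability that a single observed record cannot.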
N=1 Supersymmetric Boundary Bootstrap
Toth, G Z
2004-01-01
We investigate the boundary bootstrap programme for finding exact reflection matrices of integrable boundary quantum field theories with N=1 boundary supersymmetry. The bulk S-matrix and the reflection matrix are assumed to take the form S=S_1S_0, R=R_1R_0, where S_0 and R_0 are the S-matrix and reflection matrix of some integrable non-supersymmetric boundary theory that is assumed to be known, and S_1 and R_1 describe the mixing of supersymmetric indices. Under the assumption that the bulk particles transform in the kink and boson/fermion representations and the ground state is a singlet we present rules by which the supersymmetry representations and reflection factors for excited boundary bound states can be determined. We apply these rules to the boundary sine-Gordon model, to the boundary a_2^(1) and a_4^(1) affine Toda field theories, to the boundary sinh-Gordon model and to the free particle.
PARTICLE FILTER BASED VEHICLE TRACKING APPROACH WITH IMPROVED RESAMPLING STAGE
Directory of Open Access Journals (Sweden)
Wei Leong Khong
2014-02-01
Full Text Available Optical-sensor-based vehicle tracking can be widely implemented in traffic surveillance and flow control. The vast development of video surveillance infrastructure in recent years has drawn current research focus towards vehicle tracking using high-end, low-cost optical sensors. However, tracking vehicles via such sensors can be challenging due to the high probability of changes in vehicle appearance and illumination, besides occlusion and overlapping incidents. The particle filter has been proven to be an approach that can overcome the nonlinear and non-Gaussian situations caused by cluttered backgrounds and occlusion incidents. Unfortunately, the conventional particle filter approach encounters particle degeneracy, especially during and after occlusion. Sampling importance resampling (SIR) is an important step in overcoming this drawback of the particle filter, but SIR faces the problem of sample impoverishment when heavy particles are statistically selected many times. In this work, a genetic algorithm is proposed for the particle filter resampling stage, so that the estimated position can converge faster to the real position of the target vehicle under various occlusion incidents. The experimental results show that the improved particle filter with genetic algorithm resampling manages to increase the tracking accuracy while reducing the particle sample size in the resampling stage.
Energy Technology Data Exchange (ETDEWEB)
Sohn, S.Y
1999-12-01
We consider a robust parameter design of the process for forming contact windows in complementary metal-oxide semiconductor (CMOS) circuits. Robust design is often used to find the optimal levels of process conditions that provide output of consistent quality as close as possible to a target value. In this paper, we analyze the results of a fractional factorial design of nine factors: mask dimension, viscosity, bake temperature, spin speed, bake time, aperture, exposure time, developing time, and etch time, where the outcome of the experiment is measured in terms of a categorized window size with five categories. Random-effect analysis is employed to model both the mean and variance of the categorized window size as functions of controllable factors as well as random errors. Empirical Bayes procedures are then utilized to fit both models and eventually find a robust design of the CMOS circuit process by means of a bootstrap resampling approach.
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Guiseppe, Cavaliere; Rahbæk, Anders; Taylor, A.M. Robert
with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... that the probability of selecting a rank smaller than (equal to) the true co-integrating rank will converge to zero (one minus the marginal significance level), as the sample size diverges, for general I(1) processes. No such likelihood-based procedure is currently known to be available. In this paper we fill this gap...
Using re-sampling methods in mortality studies.
Directory of Open Access Journals (Sweden)
Igor Itskovich
Full Text Available Traditional methods of computing standardized mortality ratios (SMR) in mortality studies rely upon a number of conventional statistical propositions to estimate confidence intervals for obtained values. Those propositions include a common but arbitrary choice of the confidence level and the assumption that the observed number of deaths in the test sample is a purely random quantity. The latter assumption may not be fully justified for a series of periodic "overlapping" studies. We propose a new approach to evaluating the SMR, along with its confidence interval, based on a simple re-sampling technique. The proposed method is quite straightforward and requires neither the above assumptions nor any of the rigorous sample-selection techniques employed in modern re-sampling theory. Instead, we include in the re-sampling analysis all possible samples that correspond to the specified time window of the study. As a result, directly obtained confidence intervals for repeated overlapping studies may be tighter than those yielded by conventional methods. The proposed method is illustrated by evaluating mortality due to a hypothetical risk factor in a life insurance cohort. With this method, SMR values can be forecast more precisely than with the traditional approach. As a result, the appropriate risk assessment would have smaller uncertainties.
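For comparison with the authors' window-based scheme, a plain percentile bootstrap for an SMR (observed over expected deaths) looks like this; the cohort below is simulated, not insurance data:

```python
import random

rng = random.Random(11)

# Simulated cohort: (died, expected deaths from a standard life table) per person.
cohort = [(rng.random() < 0.02, 0.015 + 0.01 * rng.random()) for _ in range(5000)]

def smr(sample):
    observed = sum(d for d, _ in sample)   # True counts as 1
    expected = sum(e for _, e in sample)
    return observed / expected

point = smr(cohort)

# Percentile bootstrap over individuals (a stand-in for the paper's
# all-overlapping-windows scheme).
m = len(cohort)
reps = sorted(
    smr([cohort[rng.randrange(m)] for _ in range(m)]) for _ in range(400)
)
ci = (reps[int(0.025 * 400)], reps[int(0.975 * 400)])
```

The paper's point is that for overlapping periodic studies one can enumerate all window-consistent samples instead of drawing random ones, which this sketch does not attempt.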
Bootstrap Percolation on Random Geometric Graphs
Bradonjić, Milan
2012-01-01
Bootstrap percolation has been used effectively to model phenomena as diverse as emergence of magnetism in materials, spread of infection, diffusion of software viruses in computer networks, adoption of new technologies, and emergence of collective action and cultural fads in human societies. It is defined on an (arbitrary) network of interacting agents whose state is determined by the state of their neighbors according to a threshold rule. In a typical setting, bootstrap percolation starts by random and independent "activation" of nodes with a fixed probability $p$, followed by a deterministic process for additional activations based on the density of active nodes in each neighborhood ($\theta$ activated nodes). Here, we study bootstrap percolation on random geometric graphs in the regime when the latter are (almost surely) connected. Random geometric graphs provide an appropriate model in settings where the neighborhood structure of each node is determined by geographical distance, as in wireless {\it ad hoc} ...
Particle filter based on iterated importance density function and parallel resampling
Institute of Scientific and Technical Information of China (English)
武勇; 王俊; 曹运合
2015-01-01
The design, analysis and parallel implementation of the particle filter (PF) were investigated. Firstly, to tackle the particle degeneracy problem in the PF, an iterated importance density function (IIDF) was proposed, where a new term associated with the current measurement information (CMI) was introduced into the expression for the sampled particles. Through the repeated use of the least squares estimate, the CMI can be integrated into the sampling stage in an iterative manner, conducing to greatly improved sampling quality. By running the IIDF, an iterated PF (IPF) can be obtained. Subsequently, a parallel resampling (PR) scheme was proposed for the parallel implementation of the IPF; its main idea is the same as that of systematic resampling (SR), but it is performed differently. The PR directly uses the integer part of the product of the particle weight and the particle number as the number of times a particle is replicated, and it simultaneously eliminates the particles with the smallest weights; these are the two key differences from the SR. The detailed implementation procedures of the IPF based on the PR on a graphics processing unit are presented last. The performance of the IPF, the PR and their parallel implementations is illustrated via a one-dimensional numerical simulation and a practical application to passive radar target tracking.
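The replication rule described for the PR scheme can be sketched as follows. The rule for filling leftover slots below (copying the largest-weight particles) is a guess at the paper's elimination step, not its exact algorithm, and the weights are illustrative:

```python
def parallel_resample(weights):
    """Replicate particle i int(n * w_i) times; fill leftover slots from the
    largest-weight particles (fill rule assumed, not the paper's exact one)."""
    n = len(weights)
    # Integer part of n * w_i gives each particle's replication count directly,
    # with no cumulative-sum walk, which is what makes the step parallelizable.
    indices = [i for i, w in enumerate(weights) for _ in range(int(n * w))]
    by_weight = sorted(range(n), key=lambda i: weights[i], reverse=True)
    k = 0
    while len(indices) < n:
        indices.append(by_weight[k % n])
        k += 1
    return indices

idx = parallel_resample([0.5, 0.3, 0.1, 0.1])
```

Note how the smallest-weight particles (here indices 2 and 3, with n·w < 1) receive zero copies and are eliminated automatically.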
Dettinger, M.
2006-01-01
In many meteorological and climatological modeling applications, the availability of ensembles of predictions containing very large numbers of members would substantially ease statistical analyses and validations. This study describes and demonstrates an objective approach for generating large ensembles of "additional" realizations from smaller ensembles, where the additional ensemble members share important first- and second-order statistical characteristics and some dynamic relations within the original ensemble. By decomposing the original ensemble members into assuredly independent time-series components (using a form of principal component decomposition) that can then be resampled randomly and recombined, the component-resampling procedure generates additional time series that follow the large- and small-scale structures in the original ensemble members, without requiring any tuning by the user. The method is demonstrated by applications to operational medium-range weather forecast ensembles from a single NCEP weather model and application to a multi-model, multi-emission-scenarios ensemble of 21st Century climate-change projections. © Springer 2006.
BOOTSTRAPPING FOR EXTRACTING RELATIONS FROM LARGE CORPORA
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A new approach to relation extraction is described in this paper. It adopts a bootstrapping model with a novel iteration strategy, which generates more precise examples of a specific relation. Compared with previous methods, the proposed method has three main advantages: first, it needs less manual intervention; second, richer and more reasonable information is introduced to represent a relation pattern; third, it reduces the risk of circular dependency occurrence in bootstrapping. Scalable evaluation methodology and metrics are developed for our task with comparable techniques over the TianWang 100G corpus. The experimental results show that it achieves 90% precision and has excellent extensibility.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the area of bootstrap, jackknife and Monte Carlo tests. Addressing both the novice and the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in disciplines such as biometry, epidemiology, computer science, economics and sociology, but also theoretical researchers, should consult the book to be informed on the state of the art in this area.
Bootstrap inversion for Pn wave velocity in North-Western Italy
Directory of Open Access Journals (Sweden)
C. Eva
1997-06-01
Full Text Available An inversion of Pn arrival times from regional distance earthquakes (180-800 km), recorded by 94 seismic stations operating in North-Western Italy and surrounding areas, was carried out to image lateral variations of P-wave velocity at the crust-mantle boundary, and to estimate the static delay time at each station. The reliability of the obtained results was assessed using both synthetic tests and the bootstrap Monte Carlo resampling technique. Numerical simulations demonstrated the existence of a trade-off between cell velocities and estimated station delay times along the edge of the model. Bootstrap inversions were carried out to determine the standard deviation of velocities and time terms. Low Pn velocity anomalies are detected beneath the outer side of the Alps (-6%) and the Western Po plain (-4%) in correspondence with two regions of strong crustal thickening and negative Bouguer anomaly. In contrast, high Pn velocities are imaged beneath the inner side of the Alps (+4%), indicating the presence of high velocity and density lower crust-upper mantle. The Ligurian sea shows high Pn velocities close to the Ligurian coastlines (+3%) and low Pn velocities (-1.5%) in the middle of the basin, in agreement with the upper mantle velocity structure revealed by seismic refraction profiles.
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Deterministic bootstrap percolation in high dimensional grids
Huang, Hao; Lee, Choongbum
2013-01-01
In this paper, we study the k-neighbor bootstrap percolation process on the d-dimensional grid [n]^d, and show that the minimum number of initial vertices that percolate is $(1-d/k)n^d + O(n^{d-1})$ when $d \le k \le 2d$.
On the estimation of the extremal index based on scaling and resampling
Hamidieh, Kamal; Michailidis, George
2010-01-01
The extremal index parameter theta characterizes the degree of local dependence in the extremes of a stationary time series and has important applications in a number of areas, such as hydrology, telecommunications, finance and environmental studies. In this study, a novel estimator for theta based on the asymptotic scaling of block-maxima and resampling is introduced. It is shown to be consistent and asymptotically normal for a large class of m-dependent time series. Further, a procedure for the automatic selection of its tuning parameter is developed, and different types of confidence intervals that prove useful in practice are proposed. The performance of the estimator is examined through simulations, which show its highly competitive behavior. Finally, the estimator is applied to three real data sets of daily crude oil prices, daily returns of the S&P 500 stock index, and high-frequency, intra-day traded volumes of a stock. These applications demonstrate additional diagnostic features of statistical plots ...
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate for addressing questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in recent decades and has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives, and none of these measures is associated with individual statements, on which the interpretation is based. This paper presents an innovative approach to bootstrapping Q in order to obtain additional and more detailed measures of variability, which helps researchers better understand their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may strengthen or weaken particular arguments used to describe the perspectives. We illustrate the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.
Bootstrapping Relational Affordances of Object Pairs using Transfer
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert;
2016-01-01
…leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new … affordance predictor is augmented with the output of previously learnt affordances. In the second approach (category based bootstrapping), we form categories that capture underlying commonalities of a pair of existing affordances and augment the state-space with this category classifier's output. In addition … We also show that there is no significant difference in performance between direct and category based bootstrapping.
Extremal bootstrapping: go with the flow
El-Showk, Sheer
2016-01-01
The extremal functional method determines approximate solutions to the constraints of crossing symmetry, which saturate bounds on the space of unitary CFTs. We show that such solutions are characterized by extremality conditions, which may be used to flow continuously along the boundaries of parameter space. Along the flow there is generically no further need for optimization, which dramatically reduces computational requirements, bringing calculations from the realm of computing clusters to laptops. Conceptually, extremality sheds light on possible ways to bootstrap without positivity, extending the method to non-unitary theories, and implies that theories saturating bounds, and especially those sitting at kinks, have unusually sparse spectra. We discuss several applications, including the first high-precision bootstrap of a non-unitary CFT.
Bootstrapping ${\mathcal N}=2$ chiral correlators
Lemos, Madalena
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional ${\mathcal N}=2$ SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of ${\mathcal N}=2$ SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tensor…
Bootstrapping Deep Lexical Resources: Resources for Courses
Baldwin, Timothy
2007-01-01
We propose a range of deep lexical acquisition methods which make use of morphological, syntactic and ontological language resources to model word similarity and bootstrap from a seed lexicon. The different methods are deployed in learning lexical items for a precision grammar, and shown to each have strengths and weaknesses over different word classes. A particular focus of this paper is the relative accessibility of different language resource types, and predicted ``bang for the buck'' associated with each in deep lexical acquisition applications.
TASI Lectures on the Conformal Bootstrap
Simmons-Duffin, David
2016-01-01
These notes are from courses given at TASI and the Advanced Strings School in summer 2015. Starting from principles of quantum field theory and the assumption of a traceless stress tensor, we develop the basics of conformal field theory, including conformal Ward identities, radial quantization, reflection positivity, the operator product expansion, and conformal blocks. We end with an introduction to numerical bootstrap methods, focusing on the 2d and 3d Ising models.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things
Directory of Open Access Journals (Sweden)
Dan Garcia-Carrillo
2016-03-01
Full Text Available The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Dexter, Troy A; Kowalewski, Michał
2013-12-01
Quantitative estimates of growth rates can augment ecological and paleontological applications of body-size data. However, in contrast to body-size estimates, assessing growth rates is often time-consuming, expensive, or unattainable. Here we use an indirect approach, a jackknife-corrected parametric bootstrap, for efficient approximation of growth rates using nearest living relatives with known age-size relationships. The estimate is developed by (1) collecting a sample of published growth rates of closely related species, (2) calculating the average growth curve using those published age-size relationships, (3) resampling iteratively these empirically known growth curves to estimate the standard errors and confidence bands around the average growth curve, and (4) applying the resulting estimate of uncertainty to bracket age-size relationships of the species of interest. This approach was applied to three monophyletic families (Donacidae, Mactridae, and Semelidae) of bivalve mollusks, a group characterized by indeterminate shell growth but widely used in ecological, paleontological, and geochemical research. The resulting indirect estimates were tested against two previously published geochemical studies and, in both cases, yielded highly congruent age estimates. In addition, a case study in applied fisheries was used to illustrate the potential of the proposed approach for augmenting aquaculture management practices. The resulting estimates of growth rates place body-size data in a constrained temporal context, and the confidence intervals associated with resampling estimates allow for assessing the statistical uncertainty around derived temporal ranges. The indirect approach should allow for improved evaluation of diverse research questions, from the sustainability of industrial shellfish harvesting to climatic interpretations of stable isotope proxies extracted from fossil skeletons.
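Steps (1)-(3) above can be sketched as follows, using a von Bertalanffy growth model and made-up parameter values in place of the published age-size relationships; the jackknife bias correction is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

def vb(age, linf, k):
    # von Bertalanffy growth curve (illustrative model choice).
    return linf * (1.0 - np.exp(-k * age))

# Step 1 (made-up values): published (L_inf, k) pairs for related species.
params = np.array([[50.0, 0.30], [55.0, 0.25], [48.0, 0.35],
                   [60.0, 0.20], [52.0, 0.28]])
ages = np.linspace(0.0, 10.0, 21)

# Step 2: average growth curve across the related species.
curves = np.array([vb(ages, L, k) for L, k in params])
avg_curve = curves.mean(axis=0)

# Step 3: resample the species set to get a confidence band around the
# average curve; the band then brackets age estimates for a new shell.
B = 2000
boot = np.empty((B, ages.size))
for b in range(B):
    idx = rng.integers(0, len(params), size=len(params))
    boot[b] = curves[idx].mean(axis=0)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
```

Inverting the band at a measured shell size (step 4) then yields a temporal range with an attached statistical uncertainty.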
Accidental Symmetries and the Conformal Bootstrap
Chester, Shai M; Iliesiu, Luca V; Klebanov, Igor R; Pufu, Silviu S; Yacoby, Ran
2015-01-01
We study an ${\cal N} = 2$ supersymmetric generalization of the three-dimensional critical $O(N)$ vector model that is described by $N+1$ chiral superfields with superpotential $W = g_1 X \sum_i Z_i^2 + g_2 X^3$. By combining the tools of the conformal bootstrap with results obtained through supersymmetric localization, we argue that this model exhibits a symmetry enhancement at the infrared superconformal fixed point due to $g_2$ flowing to zero. This example is special in that the existence of an infrared fixed point with $g_1, g_2 \neq 0$…
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement model, and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of the 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable, because the infit and outfit mean square statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, item and person misfit did not share the same critical range. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons, as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size.
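The core idea, a bootstrapped percentile CI for a mean-square fit statistic in place of a fixed rule-of-thumb cutoff, can be sketched as follows (synthetic residuals, not Rasch-calibrated data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical squared standardized residuals for one item over 300
# persons; under good fit these average about 1, like an infit
# mean-square statistic.
z2 = rng.chisquare(df=1, size=300)
msq = z2.mean()

# Percentile bootstrap CI for the mean-square statistic, used instead
# of a fixed rule-of-thumb cutoff.
B = 5000
boot = np.array([z2[rng.integers(0, z2.size, z2.size)].mean()
                 for _ in range(B)])
ci = np.percentile(boot, [2.5, 97.5])
# Flag an item as misfitting only when its observed statistic falls
# outside the interval built under a fitting baseline.
```

Because the CI width adapts to the actual sampling variability of the statistic, it accounts for the sample-size dependence that fixed cutoffs ignore.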
Bandpass-resampling effects for the retrieval of surface emissivity.
Richter, Rudolf; Coll, Cesar
2002-06-20
The retrieval of surface emissivity in the 8-14-microm region from remotely sensed thermal imagery requires channel-averaged values of atmospheric transmittance, path radiance, and downwelling sky flux. Band-pass resampling introduces inherent retrieval errors that depend on atmospheric conditions, spectral region, bandwidth, flight altitude, and surface temperature. This simulation study is performed for clear sky conditions and moderate atmospheric water vapor contents. It shows that relative emissivity retrieval errors can reach as much as 3% for broadband sensors (1-2-microm bandwidth) and 0.8% for narrowband instruments (0.15 microm), even for constant surface emissivity. For spectrally varying surface emissivities the relative retrieval error increases for the broadband instrument by approximately 2% in channels with strong emissivity changes of 0.05-0.1. The corresponding retrieval errors for narrowband sensors increase by approximately 3-4%. The channels in the atmospheric window regions with lower transmittance, i.e., 8-8.5 and 12.5-14 microm, are most sensitive to retrieval errors.
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
…pre-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995)…
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
Full Text Available We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Conformal Bootstrap, Universality and Gravitational Scattering
Jackson, Steven; Verlinde, Herman
2014-01-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large $c$, a sparse light spectrum and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches with the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Uncertainty estimation in diffusion MRI using the nonlocal bootstrap.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2014-08-01
In this paper, we propose a new bootstrap scheme, called the nonlocal bootstrap (NLB) for uncertainty estimation. In contrast to the residual bootstrap, which relies on a data model, or the repetition bootstrap, which requires repeated signal measurements, NLB is not restricted by the data structure imposed by a data model and obviates the need for time-consuming multiple acquisitions. NLB hinges on the observation that local imaging information recurs in an image. This self-similarity implies that imaging information coming from spatially distant (nonlocal) regions can be exploited for more effective estimation of statistics of interest. Evaluations using in silico data indicate that NLB produces distribution estimates that are in closer agreement with those generated using Monte Carlo simulations, compared with the conventional residual bootstrap. Evaluations using in vivo data demonstrate that NLB produces results that are in agreement with our knowledge on white matter architecture.
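A much-simplified sketch of the nonlocal idea: resample the center values of the K patches whose neighborhoods are most similar to that of the pixel of interest, wherever they occur in the image. The sum-of-squares similarity measure and the plain nearest-patch pooling are illustrative assumptions; the paper's scheme for diffusion MRI signals is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(3)

def nonlocal_bootstrap(img, i, j, patch=1, K=8, B=1000, rng=rng):
    """Estimate the value and its uncertainty at pixel (i, j) by
    bootstrapping the center values of the K patches most similar to
    the neighborhood around (i, j)."""
    H, W = img.shape
    ref = img[i - patch:i + patch + 1, j - patch:j + patch + 1]
    centers, dists = [], []
    for a in range(patch, H - patch):
        for b in range(patch, W - patch):
            cand = img[a - patch:a + patch + 1, b - patch:b + patch + 1]
            dists.append(((cand - ref) ** 2).sum())  # patch similarity
            centers.append(img[a, b])
    pool = np.asarray(centers)[np.argsort(dists)[:K]]  # nonlocal pool
    means = pool[rng.integers(0, K, size=(B, K))].mean(axis=1)
    return means.mean(), means.std()

img = rng.normal(loc=100.0, scale=2.0, size=(16, 16))
est, sd = nonlocal_bootstrap(img, 8, 8)
```

The key point is that the resampling pool is assembled from self-similar regions anywhere in the image, so no repeated acquisitions and no residual model are needed.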
Effects of magnetic islands on bootstrap current in toroidal plasmas
Dong, G.; Lin, Z.
2017-03-01
The effects of magnetic islands on electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change in the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Finally, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
Enders, Craig K.
2002-01-01
Proposed a method for extending the Bollen-Stine bootstrap test of model fit (K. Bollen and R. Stine, 1992) to structural equation models with missing data. Developed a Statistical Analysis System macro program to implement this procedure, and assessed its usefulness in a simulation. The new method yielded model rejection rates close to the nominal 5%…
Lee, Taesam
2017-02-01
The outputs from general circulation models (GCMs) provide useful information about the rate and magnitude of future climate change. The temperature variable is more reliable than other variables in GCM outputs. However, hydrological variables (e.g., precipitation) from GCM outputs for future climate change possess an uncertainty that is too high for practical use. Therefore, a method called intentionally biased bootstrapping (IBB), which simulates the increase of the temperature variable by a certain level as ascertained from observed global warming data, is proposed. In addition, precipitation data were resampled by employing a block-wise sampling technique associated with the temperature simulation. In summary, a warming temperature scenario is simulated, along with the corresponding precipitation values whose time indices are the same as those of the simulated warming temperature scenario. The proposed method was validated with annual precipitation data by truncating the recent years of the record. The proposed model was also employed to assess the future changes in seasonal precipitation in South Korea within a global warming scenario as well as in weekly timescales. The results illustrate that the proposed method is a good alternative for assessing the variation of hydrological variables such as precipitation under the warming condition.
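One simplified reading of IBB can be sketched with exponential tilting: bias the resampling weights toward warm years until the simulated mean temperature sits a chosen increment above the observed mean, carrying the paired precipitation values along by the same indices. The data and the tilting scheme are illustrative assumptions; the paper's method uses block-wise sampling of the paired series.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical paired annual records (not from the paper).
years = np.arange(1980, 2020)
temp = 12.0 + 0.03 * (years - 1980) + rng.normal(0.0, 0.4, years.size)
prcp = 1200.0 + rng.normal(0.0, 150.0, years.size)

def ibb(temp, prcp, delta, n=1000, rng=rng):
    """Resample year indices with weights tilted toward warm years so
    the simulated mean temperature sits `delta` above the observed
    mean; precipitation is carried along by the same indices."""
    target = temp.mean() + delta
    lo_b, hi_b = 0.0, 50.0
    for _ in range(100):                  # bisect on the tilt strength
        beta = 0.5 * (lo_b + hi_b)
        w = np.exp(beta * (temp - temp.mean()))
        w /= w.sum()
        if (w * temp).sum() < target:
            lo_b = beta
        else:
            hi_b = beta
    idx = rng.choice(temp.size, size=n, p=w)
    return temp[idx], prcp[idx]           # paired resampling

t_sim, p_sim = ibb(temp, prcp, delta=0.5)
```

Resampling whole year indices (rather than temperature and precipitation separately) is what preserves the observed dependence between the two variables under the warming scenario.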
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
Bootstrap Power of Time Series Goodness of fit tests
Directory of Open Access Journals (Sweden)
Sohail Chand
2013-10-01
Full Text Available In this article, we look at the power of various versions of the Box-Pierce statistic and the Cramér-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison is also made between the semiparametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation reverses for the Cramér-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed-design bootstrap method.
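A power calculation of the kind compared in the article can be sketched as a Monte Carlo loop: simulate under an AR(1) alternative, compute the Box-Pierce statistic, and count rejections. The asymptotic chi-square critical value is hardcoded here for self-containment; the bootstrap variants studied in the article would replace it with a resampled one.

```python
import numpy as np

rng = np.random.default_rng(5)

def box_pierce(x, m=10):
    # Box-Pierce portmanteau statistic: n times the sum of the first m
    # squared sample autocorrelations.
    n = x.size
    xc = x - x.mean()
    denom = (xc ** 2).sum()
    acf = np.array([(xc[k:] * xc[:-k]).sum() / denom
                    for k in range(1, m + 1)])
    return n * (acf ** 2).sum()

def power(phi, n=200, reps=500, crit=18.307):  # chi2(0.95, df=10)
    # Monte Carlo rejection rate against an AR(1) alternative.
    rej = 0
    for _ in range(reps):
        e = rng.normal(size=n)
        x = np.empty(n)
        x[0] = e[0]
        for t in range(1, n):
            x[t] = phi * x[t - 1] + e[t]
        rej += box_pierce(x) > crit
    return rej / reps

size_null = power(0.0)   # rejection rate under the null
power_alt = power(0.5)   # power against a linear AR(1) model
```

Against a linear AR(1) alternative the portmanteau test rejects almost always, consistent with the article's finding of good power for linear models.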
Bootstrapping Object Coreferencing on the Semantic Web
Institute of Scientific and Technical Information of China (English)
Wei Hu; Yu-Zhong Qu; Xing-Zhi Sun
2011-01-01
An object on the Semantic Web is likely to be denoted with several URIs by different parties. Object coreferencing is a process to identify "equivalent" URIs of objects for achieving a better Data Web. In this paper, we propose a bootstrapping approach for object coreferencing on the Semantic Web. For an object URI, we firstly establish a kernel that consists of semantically equivalent URIs from the same-as, (inverse) functional properties and (max-)cardinalities, and then extend the kernel with respect to the textual descriptions (e.g., labels and local names) of URIs. We also propose a trustworthiness-based method to rank the coreferent URIs in the kernel, as well as a similarity-based method for ranking the URIs in the extension of the kernel. We implement the proposed approach, called ObjectCoref, on a large-scale dataset that contains 76 million URIs collected by the Falcons search engine until 2008. The evaluation on precision, relative recall and response time demonstrates the feasibility of our approach. Additionally, we apply the proposed approach to investigate the popularity of the URI alias phenomenon on the current Semantic Web.
Simulation-optimization via Kriging and bootstrapping : A survey
Kleijnen, Jack P.C.
2014-01-01
This article surveys optimization of simulated systems. The simulation may be either deterministic or random. The survey reflects the author's extensive experience with simulation-optimization through Kriging (or Gaussian process) metamodels, analysed through parametric bootstrapping for deterministic…
Bootstrapping the statistical uncertainties of NN scattering data
Perez, R Navarro; Arriola, E Ruiz
2014-01-01
We use the Monte Carlo bootstrap as a method to simulate pp and np scattering data below pion production threshold from an initial set of over 6700 experimental mutually $3\sigma$ consistent data. We compare the results of the bootstrap, based on 1020 statistically generated samples of the full database, with the standard covariance matrix method of error propagation. No significant differences in scattering observables and phase shifts are found. This suggests alternative strategies for propagating errors of nuclear forces in nuclear structure calculations.
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
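The core loop of the proposed test can be sketched as follows, under assumptions: small toy samples instead of cloud-object footprint data, the Euclidean distance (the statistic the study recommends), and pooled resampling with replacement to realize the null hypothesis that both summary histograms describe one population. All names are illustrative.

```python
import random

def normalize(counts):
    total = sum(counts)
    return [c / total for c in counts]

def euclidean(h1, h2):
    # Euclidean distance between two normalized histograms.
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5

def histogram(xs, bins):
    counts = [0] * len(bins)
    for x in xs:
        for i, (lo, hi) in enumerate(bins):
            if lo <= x < hi:
                counts[i] += 1
                break
    return normalize(counts)

def bootstrap_pvalue(sample_a, sample_b, bins, n_boot=500, seed=1):
    """Significance level of the distance between two summary histograms,
    under the null that both samples come from the same population."""
    rng = random.Random(seed)
    observed = euclidean(histogram(sample_a, bins), histogram(sample_b, bins))
    pooled = sample_a + sample_b
    exceed = 0
    for _ in range(n_boot):
        # Resample both groups from the pooled data with replacement.
        ra = [rng.choice(pooled) for _ in sample_a]
        rb = [rng.choice(pooled) for _ in sample_b]
        if euclidean(histogram(ra, bins), histogram(rb, bins)) >= observed:
            exceed += 1
    return exceed / n_boot
```

Identical samples give an observed distance of zero and hence p = 1; well-separated samples give p near zero, mirroring the hypothesis tests reported in the abstract.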
Control of bootstrap current in the pedestal region of tokamaks
Energy Technology Data Exchange (ETDEWEB)
Shaing, K. C. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China); Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796 (United States); Lai, A. L. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China)
2013-12-15
The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field and a sonic poloidal flow U_{p,m} that consists of the poloidal components of the E×B flow and the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit where ‖U_{p,m}‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_{p,m} and the gradient of the radial electric field. This, in turn, can control plasma stability, such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_{p,m}‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for viscous coefficients that join results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Epipolar Resampling of Cross-Track Pushbroom Satellite Imagery Using the Rigorous Sensor Model.
Jannati, Mojtaba; Valadan Zoej, Mohammad Javad; Mokhtarzade, Mehdi
2017-01-11
Epipolar resampling aims to eliminate the vertical parallax of stereo images. Due to the dynamic nature of the exterior orientation parameters of linear pushbroom satellite imagery and the complexity of reconstructing the epipolar geometry using rigorous sensor models, no epipolar resampling approach based on these models has been proposed so far. In this paper it is shown for the first time that the orientation of the instantaneous baseline (IB) of conjugate image points (CIPs) in linear pushbroom satellite imagery can be modeled with high precision in terms of the row and column numbers of the CIPs. Taking advantage of this feature, a novel approach is then presented for epipolar resampling of cross-track linear pushbroom satellite imagery. The proposed method is based on the rigorous sensor model. As the instantaneous position of the sensors remains fixed, a digital elevation model of the area of interest is not required in the resampling process. Experimental results obtained from two pairs of SPOT and one pair of RapidEye stereo imagery with different terrain conditions show that the proposed epipolar resampling approach achieves superior accuracy, as the remaining vertical parallaxes of all CIPs in the normalized images are close to zero.
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location, and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
A robust Kalman framework with resampling and optimal smoothing.
Kautz, Thomas; Eskofier, Bjoern M
2015-02-27
The Kalman filter (KF) is an extremely powerful and versatile tool for signal processing that has been applied extensively in various fields. We introduce a novel Kalman-based analysis procedure that encompasses robustness towards outliers, Kalman smoothing and real-time conversion from non-uniformly sampled inputs to a constant output rate. These features have been mostly treated independently, so that not all of their benefits could be exploited at the same time. Here, we present a coherent analysis procedure that combines the aforementioned features and their benefits. To facilitate utilization of the proposed methodology and to ensure optimal performance, we also introduce a procedure to calculate all necessary parameters. Thereby, we substantially expand the versatility of one of the most widely-used filtering approaches, taking full advantage of its most prevalent extensions. The applicability and superior performance of the proposed methods are demonstrated using simulated and real data. Possible areas of application for the presented analysis procedure range from movement analysis and medical imaging to brain-computer interfaces, robot navigation, and meteorological studies.
Building Intuitions about Statistical Inference Based on Resampling
Watson, Jane; Chance, Beth
2012-01-01
Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…
Stability of response characteristics of a Delphi panel: application of bootstrap data expansion
Directory of Open Access Journals (Sweden)
Cole Bryan R
2005-12-01
Full Text Available Abstract Background Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. Methods The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. Results The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. Conclusion Panels of similarly trained experts (who possess a general understanding in the field of interest provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making.
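The bootstrap data-expansion step described above can be sketched in a few lines. The Likert-style panel responses, the 5% trim level, and the normal-approximation confidence interval are illustrative assumptions, not the study's exact choices.

```python
import random
import statistics

def bootstrap_expand(responses, n_resamples, seed=42):
    """Resample a small panel with replacement to a larger synthetic sample
    and report its response characteristics."""
    rng = random.Random(seed)
    sample = sorted(rng.choice(responses) for _ in range(n_resamples))
    trim = int(0.05 * len(sample))          # 5% trimmed mean (assumed level)
    mean = statistics.mean(sample)
    sd = statistics.stdev(sample)
    half = 1.96 * sd / len(sample) ** 0.5   # normal-approximation 95% CI
    return {
        "mean": mean,
        "trimmed_mean": statistics.mean(sample[trim:len(sample) - trim]),
        "sd": sd,
        "ci95": (mean - half, mean + half),
    }

# Hypothetical 7-point ratings of one survey item from a 23-member panel.
panel = [5, 6, 7, 5, 6, 6, 7, 4, 5, 6, 7, 6, 5, 6, 7, 5, 6, 6, 4, 7, 6, 5, 6]
stats_1000 = bootstrap_expand(panel, 1000)
stats_2000 = bootstrap_expand(panel, 2000)
```

Stability in the study's sense means the characteristics of the 1000- and 2000-iteration expanded samples stay close to those of the 23 raw responses.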
Unbiased bootstrap error estimation for linear discriminant analysis.
Vu, Thang; Sima, Chao; Braga-Neto, Ulisses M; Dougherty, Edward R
2014-12-01
Convex bootstrap error estimation is a popular tool for classifier error estimation in gene expression studies. A basic question is how to determine the weight for the convex combination between the basic bootstrap estimator and the resubstitution estimator such that the resulting estimator is unbiased at finite sample sizes. The well-known 0.632 bootstrap error estimator uses asymptotic arguments to propose a fixed 0.632 weight, whereas the more recent 0.632+ bootstrap error estimator attempts to set the weight adaptively. In this paper, we study the finite sample problem in the case of linear discriminant analysis under Gaussian populations. We derive exact expressions for the weight that guarantee unbiasedness of the convex bootstrap error estimator in the univariate and multivariate cases, without making asymptotic simplifications. Using exact computation in the univariate case and an accurate approximation in the multivariate case, we obtain the required weight and show that it can deviate significantly from the constant 0.632 weight, depending on the sample size and Bayes error for the problem. The methodology is illustrated by application on data from a well-known cancer classification study.
Resampling-based approaches to study variation in morphological modularity.
Directory of Open Access Journals (Sweden)
Carmelo Fruciano
Full Text Available Modularity has been suggested to be connected to evolvability because a higher degree of independence among parts allows them to evolve as separate units. Recently, the Escoufier RV coefficient has been proposed as a measure of the degree of integration between modules in multivariate morphometric datasets. However, it has been shown, using randomly simulated datasets, that the value of the RV coefficient depends on sample size. Also, so far there is no statistical test for the difference in the RV coefficient between a priori defined groups of observations. Here, we (1) show, using a rarefaction analysis, that the value of the RV coefficient depends on sample size also in real geometric morphometric datasets; (2) propose a permutation procedure to test for the difference in the RV coefficient between a priori defined groups of observations; (3) show, through simulations, that such a permutation procedure has an appropriate Type I error; (4) suggest that a rarefaction procedure could be used to obtain sample-size-corrected values of the RV coefficient; and (5) propose a nearest-neighbor procedure that could be used when studying the variation of modularity in geographic space. The approaches outlined here, readily extendable to non-morphometric datasets, allow study of the variation in the degree of integration between a priori defined modules. A Java application that performs the proposed test through a graphical user interface has also been developed and is available at the Morphometrics at Stony Brook Web page (http://life.bio.sunysb.edu/morph/).
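The two central ingredients, Escoufier's RV coefficient and the proposed permutation test, can be sketched as follows, assuming numpy arrays of toy data in place of landmark configurations; the function names are illustrative, not those of the Java application.

```python
import numpy as np

def rv_coefficient(X, Y):
    """Escoufier's RV coefficient between two blocks of variables
    measured on the same observations (rows)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

def rv_difference_test(Xa, Ya, Xb, Yb, n_perm=499, seed=7):
    """Permutation test for a difference in RV between two a priori groups:
    observations are reshuffled between the groups and the absolute RV
    difference recomputed under the null of no group difference."""
    rng = np.random.default_rng(seed)
    observed = abs(rv_coefficient(Xa, Ya) - rv_coefficient(Xb, Yb))
    X, Y = np.vstack([Xa, Xb]), np.vstack([Ya, Yb])
    na = len(Xa)
    exceed = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(X))
        d = abs(rv_coefficient(X[idx[:na]], Y[idx[:na]])
                - rv_coefficient(X[idx[na:]], Y[idx[na:]]))
        exceed += d >= observed
    return (exceed + 1) / (n_perm + 1)   # standard permutation p-value
```

A block compared with itself gives RV = 1 (full integration); independent blocks give values near 0, which is why sample-size correction matters when comparing groups of different sizes.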
Using Resampling To Estimate the Precision of an Empirical Standard-Setting Method.
Muijtjens, Arno M. M.; Kramer, Anneke W. M.; Kaufman, David M.; Van der Vleuten, Cees P. M.
2003-01-01
Developed a method to estimate the cutscore precisions for empirical standard-setting methods by using resampling. Illustrated the method with two actual datasets consisting of 86 Dutch medical residents and 155 Canadian medical students taking objective structured clinical examinations. Results show the applicability of the method. (SLD)
Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'
DEFF Research Database (Denmark)
de Nijs, Robin
2015-01-01
…methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half-count image had a severe impact on counting statistics, also in the case of rounding off of the images.
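The contrast the comment draws, between a statistically proper count reduction and plain scaling with rounding, can be sketched as follows. Binomial thinning is used here as the count-reducing resampler; treating it as equivalent to the note's exact Poisson resampling procedure is an assumption, and the toy "image" is just a list of Poisson-like pixel counts.

```python
import random

def thin_counts(image, fraction, seed=11):
    """Simulate a reduced-count acquisition: keep each recorded count
    independently with probability `fraction`. Thinning a Poisson image
    yields another Poisson image with the reduced mean."""
    rng = random.Random(seed)
    return [sum(rng.random() < fraction for _ in range(n)) for n in image]

def scale_and_round(image, fraction):
    # The naive alternative: scaling and rounding off before saving,
    # which distorts the counting statistics.
    return [round(n * fraction) for n in image]

# Toy image: 500 pixels with approximately Poisson(20) counts.
rng = random.Random(5)
full = [sum(rng.random() < 0.01 for _ in range(2000)) for _ in range(500)]

def sample_variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

half_poisson = thin_counts(full, 0.5)
half_rounded = scale_and_round(full, 0.5)
```

For Poisson counts the variance should track the mean: the thinned image stays near variance ≈ mean ≈ 10, while the scaled-and-rounded image has variance near 5, half of what Poisson statistics require, which is the kind of distortion the comment attributes to rounding off.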
A steady-State Genetic Algorithm with Resampling for Noisy Inventory Control
Prestwich, S.; Tarim, S.A.; Rossi, R.; Hnich, B.
2008-01-01
Noisy fitness functions occur in many practical applications of evolutionary computation. A standard technique for solving these problems is fitness resampling but this may be inefficient or need a large population, and combined with elitism it may overvalue chromosomes or reduce genetic diversity.
Benchmark of the bootstrap current simulation in helical plasmas
Huang, Botsz; Kanno, Ryutaro; Sugama, Hideo; Goto, Takuya
2016-01-01
The importance of parallel momentum conservation for the bootstrap current evaluation in nonaxisymmetric systems is demonstrated by benchmarks among the local drift-kinetic equation solvers, i.e., the Zero-Orbit-Width (ZOW), DKES, and PENTA codes. The ZOW model is extended to include the effect of the ion parallel mean flow on the electron-ion parallel friction. Compared to the DKES model, in which only the pitch-angle-scattering term is included in the collision operator, the PENTA model employs the Sugama-Nishimura method to correct the momentum balance. The ZOW and PENTA models agree well with each other on the calculations of the bootstrap current. The DKES results without parallel momentum conservation deviate significantly from those of the ZOW and PENTA models. This work verifies the reliability of the bootstrap current calculation with the ZOW and PENTA models for helical plasmas.
Point Set Denoising Using Bootstrap-Based Radial Basis Function
Ramli, Ahmad; Abd. Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study. PMID:27315105
Design and Implementation of a Bootstrap Trust Chain
Institute of Scientific and Technical Information of China (English)
YU Fajiang; ZHANG Huanguo
2006-01-01
The chain of trust in the bootstrap process is the basis of whole-system trust in the trusted computing group (TCG) definition. This paper presents a design and implementation of a bootstrap trust chain in a PC based on Windows and today's commodity hardware, which merely depends on the availability of an embedded security module (ESM). The ESM and a security-enhanced BIOS form the root of trust; PMBR (Pre-MBR), a checking agent stored in the ESM, checks the integrity of the boot data and the Windows kernel. Finally, the paper analyzes the mathematical expression of the chain of trust and the runtime performance compared with the common booting process. The trust chain bootstrap greatly strengthens the security of the personal computer system and affects runtime performance by adding only about 12% to the booting time.
Addressing the P2P Bootstrap Problem for Small Networks
Wolinsky, David Isaac; Boykin, P Oscar; Figueiredo, Renato
2010-01-01
P2P overlays provide a framework for building distributed applications consisting of few to many resources with features including self-configuration, scalability, and resilience to node failures. Such systems have been successfully adopted in large-scale services for content delivery networks, file sharing, and data storage. In small-scale systems, they can be useful to address privacy concerns and for network applications that lack dedicated servers. The bootstrap problem, finding an existing peer in the overlay, remains a challenge to enabling these services for small-scale P2P systems. In large networks, the solution to the bootstrap problem has been the use of dedicated services, though creating and maintaining these systems requires expertise and resources, which constrain their usefulness and make them unappealing for small-scale systems. This paper surveys and summarizes requirements that allow peers potentially constrained by network connectivity to bootstrap small-scale overlays through the use of e...
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages obtained by Monte-Carlo sampling.
PyCFTBoot: A flexible interface for the conformal bootstrap
Behan, Connor
2016-01-01
We introduce PyCFTBoot, a wrapper designed to reduce the barrier to entry in conformal bootstrap calculations that require semidefinite programming. Symengine and SDPB are used for the most intensive symbolic and numerical steps respectively. After reviewing the built-in algorithms for conformal blocks, we explain how to use the code through a number of examples that verify past results. As an application, we show that the multi-correlator bootstrap still appears to single out the Wilson-Fisher fixed points as special theories in dimensions between 3 and 4 despite the recent proof that they violate unitarity.
The conditional resampling model STARS: weaknesses of the modeling concept and development
Menz, Christoph
2016-04-01
The Statistical Analogue Resampling Scheme (STARS) is based on a modeling concept of Werner and Gerstengarbe (1997). The model uses a conditional resampling technique to create a simulation time series from daily observations. Unlike other time series generators (such as stochastic weather generators), STARS only needs a linear regression specification of a single variable as the target condition for the resampling. Since its first implementation, the algorithm has been further extended to allow for a spatially distributed trend signal and to preserve the seasonal cycle and the autocorrelation of the observation time series (Orlovsky, 2007; Orlovsky et al., 2008). This evolved version was successfully used in several climate impact studies. However, a detailed evaluation of the simulations revealed two fundamental weaknesses of the utilized resampling technique. 1. The restriction of the resampling condition to a single variable can lead to a misinterpretation of the change signal of other variables when the model is applied to a multivariate time series (F. Wechsung and M. Wechsung, 2014). As one example, the short-term correlations between precipitation and temperature (cooling of the near-surface air layer after a rainfall event) can be misinterpreted as a climatic change signal in the simulation series. 2. The model restricts the linear regression specification to the annual mean time series, precluding the specification of seasonally varying trends. To overcome these fundamental weaknesses, a redevelopment of the whole algorithm was undertaken. The poster discusses the main weaknesses of the earlier model implementation and the methods applied to overcome them in the new version. Based on the new model, idealized simulations were conducted to illustrate the enhancement.
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
Determining the co-integrating rank of a system of variables has become a fundamental aspect of applied research in macroeconomics and finance. It is well known that the standard asymptotic likelihood ratio tests for co-integration rank of Johansen (1996) can be unreliable in small samples, with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense that the probability of selecting a rank smaller than (equal to) the true co-integrating rank will converge to zero (one minus the marginal significance level), as the sample size diverges, for general I(1) processes. No such likelihood-based procedure is currently known to be available. In this paper we fill this gap…
Porto, Paolo; Walling, Des E; Alewell, Christine; Callegari, Giovanni; Mabit, Lionel; Mallimo, Nicola; Meusburger, Katrin; Zehringer, Markus
2014-12-01
Soil erosion and both its on-site and off-site impacts are increasingly seen as a serious environmental problem across the world. The need for an improved evidence base on soil loss and soil redistribution rates has directed attention to the use of fallout radionuclides, and particularly (137)Cs, for documenting soil redistribution rates. This approach possesses important advantages over more traditional means of documenting soil erosion and soil redistribution. However, one key limitation of the approach is the time-averaged or lumped nature of the estimated erosion rates. In nearly all cases, these will relate to the period extending from the main period of bomb fallout to the time of sampling. Increasing concern for the impact of global change, particularly that related to changing land use and climate change, has frequently directed attention to the need to document changes in soil redistribution rates within this period. Re-sampling techniques, which should be distinguished from repeat-sampling techniques, have the potential to meet this requirement. As an example, the use of a re-sampling technique to derive estimates of the mean annual net soil loss from a small (1.38 ha) forested catchment in southern Italy is reported. The catchment was originally sampled in 1998 and samples were collected from points very close to the original sampling points again in 2013. This made it possible to compare the estimate of mean annual erosion for the period 1954-1998 with that for the period 1999-2013. The availability of measurements of sediment yield from the catchment for parts of the overall period made it possible to compare the results provided by the (137)Cs re-sampling study with the estimates of sediment yield for the same periods. In order to compare the estimates of soil loss and sediment yield for the two different periods, it was necessary to establish the uncertainty associated with the individual estimates. In the absence of a generally accepted procedure
Zhu, Feng; Feng, Weiyue; Wang, Huajian; Huang, Shaosen; Lv, Yisong; Chen, Yong
2013-01-01
X-ray spectral imaging provides quantitative imaging of trace elements in biological sample with high sensitivity. We propose a novel algorithm to promote the signal-to-noise ratio (SNR) of X-ray spectral images that have low photon counts. Firstly, we estimate the image data area that belongs to the homogeneous parts through confidence interval testing. Then, we apply the Poisson regression through its maximum likelihood estimation on this area to estimate the true photon counts from the Poisson noise corrupted data. Unlike other denoising methods based on regression analysis, we use the bootstrap resampling methods to ensure the accuracy of regression estimation. Finally, we use a robust local nonparametric regression method to estimate the baseline and subsequently subtract it from the X-ray spectral data to further improve the SNR of the data. Experiments on several real samples show that the proposed method performs better than some state-of-the-art approaches to ensure accuracy and precision for quantit...
C*-Algebras over Topological Spaces: The Bootstrap Class
Meyer, Ralf
2007-01-01
We carefully define and study C*-algebras over topological spaces, possibly non-Hausdorff, and review some relevant results from point-set topology along the way. We explain the triangulated category structure on the bivariant Kasparov theory over a topological space. We introduce and describe an analogue of the bootstrap class for C*-algebras over a finite topological space.
Bootstrapping Rapidity Anomalous Dimensions for Transverse-Momentum Resummation
Energy Technology Data Exchange (ETDEWEB)
Li, Ye; Zhu, Hua Xing
2017-01-01
The soft function relevant for transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. An intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
Sidecoin: a snapshot mechanism for bootstrapping a blockchain
Krug, Joseph; Peterson, Jack
2015-01-01
Sidecoin is a mechanism that allows a snapshot to be taken of Bitcoin's blockchain. We compile a list of Bitcoin's unspent transaction outputs, then use these outputs and their corresponding balances to bootstrap a new blockchain. This allows the preservation of Bitcoin's economic state in the context of a new blockchain, which may provide new features and technical innovations.
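The snapshot step can be sketched generically: fold a list of unspent outputs into per-address balances that seed the new chain's genesis allocation. The UTXO tuples, field names, and dust threshold below are hypothetical stand-ins, not Bitcoin's actual data format or Sidecoin's implementation.

```python
def snapshot_balances(utxos):
    """Aggregate unspent transaction outputs into per-address balances.
    `utxos` is an iterable of (address, amount_in_satoshi) pairs."""
    balances = {}
    for address, amount in utxos:
        balances[address] = balances.get(address, 0) + amount
    return balances

def genesis_allocation(utxos, dust_threshold=0):
    """Build the bootstrap allocation for the new chain, optionally
    dropping balances at or below a threshold (an assumed knob)."""
    return {addr: bal
            for addr, bal in snapshot_balances(utxos).items()
            if bal > dust_threshold}

# Hypothetical snapshot of three outputs over two addresses.
utxos = [("addr1", 50_000), ("addr2", 10_000), ("addr1", 2_500)]
```

The resulting mapping is what "preservation of Bitcoin's economic state" amounts to: the new chain starts with the same ownership distribution the snapshot recorded.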
A neural network based reputation bootstrapping approach for service selection
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied as a way to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; such methods, however, favour either the newcomers or the existing services. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and the performance of existing services are learned through an artificial neural network (ANN) and are then generalised to establish a tentative reputation when evaluating new and unknown services. Reputations of services published previously by the same provider, if available, are also incorporated into the bootstrapping. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach are presented at the end.
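As a rough illustration of the learning step described above, the sketch below trains a small feed-forward network (plain NumPy, batch gradient descent) on synthetic service features and uses it to bootstrap a tentative reputation for a newcomer. The features, the hidden feature-reputation relation, and the network size are all invented for the example and do not come from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: feature vectors of existing services (columns could be
# e.g. normalized price, response time, provider rating -- illustrative only)
# and their observed reputation scores in [0, 1].
X = rng.random((200, 3))
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]  # assumed hidden relation

# One-hidden-layer network trained by gradient descent on squared error.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.3
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                    # hidden activations
    err = (h @ W2 + b2).ravel() - y             # prediction error
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * h.T @ err[:, None] / len(y); b2 -= lr * err.mean()
    W1 -= lr * X.T @ dh / len(y);           b1 -= lr * dh.mean(axis=0)

def tentative_reputation(features):
    """Bootstrap a newcomer's reputation from its features alone."""
    return float(np.tanh(features @ W1 + b1) @ W2 + b2)

print(round(tentative_reputation(np.array([0.5, 0.5, 0.5])), 2))
```

The trained network plays the role of the correlation model: a newcomer with no interaction history gets a tentative score from its features, to be refined later by actual feedback.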
Bootstrapping the energy flow in the beginning of life
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend started up only slowly, with the low-energy initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build-up of the energy flow follows a bootstrapping process similar to that found in t
Janssen, Steve M J; Chessa, Antonio G; Murre, Jaap M J
2007-10-01
The reminiscence bump is the effect that people recall more personal events from early adulthood than from childhood or later adulthood. The bump has been examined extensively. However, the question of whether the bump is caused by differential encoding or re-sampling is still unanswered. To examine this issue, participants were asked to name their three favourite books, movies, and records, and to report when they first encountered them. We compared the temporal distributions and found that they all showed recency effects and reminiscence bumps. The distribution of favourite books had the largest recency effect, and the distribution of favourite records had the largest reminiscence bump. We can explain these results by differences in rehearsal: books are read two or three times, movies are watched more frequently, whereas records are listened to numerous times. The results suggest that differential encoding initially causes the reminiscence bump and that re-sampling increases the bump further.
Quasi-Epipolar Resampling of High Resolution Satellite Stereo Imagery for Semi Global Matching
Tatar, N.; Saadatseresht, M.; Arefi, H.; Hadavand, A.
2015-12-01
Semi-global matching is a well-known stereo matching algorithm in the photogrammetry and computer vision communities, and epipolar images are assumed as its input. The epipolar curves of linear array scanners are not straight lines, as they are in the case of frame cameras. Traditional epipolar resampling algorithms demand rational polynomial coefficients (RPCs), a physical sensor model, or ground control points. In this paper we propose a new epipolar resampling method that works without this information. In the proposed method, automatic feature extraction algorithms are employed to generate corresponding features for registering the stereo pair, and the original images are divided into small tiles. By omitting the need for extra information in this way, the speed of the matching algorithm is increased and the memory requirement is decreased. Our experiments on a GeoEye-1 stereo pair captured over the city of Qom, Iran, demonstrate that the epipolar images are generated with sub-pixel accuracy.
Automotive FMCW Radar-enhanced Range Estimation via a Local Resampling Fourier Transform
2016-01-01
In complex traffic scenarios, more accurate measurement and discrimination are required of an automotive frequency-modulated continuous-wave (FMCW) radar for intelligent robots, driverless cars and driver-assistance systems. A more accurate range estimation method based on a local resampling Fourier transform (LRFT) for FMCW radar is developed in this paper. Radar signal correlation in the phase space gives a higher signal-to-noise ratio (SNR) to achieve more accurate ranging, and the LRFT - whi...
Bárcena, Teresa G; Gundersen, Per; Vesterdal, Lars
2014-09-01
Chronosequences are commonly used to assess soil organic carbon (SOC) sequestration after land-use change, but SOC dynamics predicted by this space-for-time substitution approach have rarely been validated by resampling. We conducted a combined chronosequence/resampling study in a former cropland area (Vestskoven) afforested with oak (Quercus robur) and Norway spruce (Picea abies) over the past 40 years. The aims of this study were (i) to compare present and previous chronosequence trends in forest floor and top mineral soil (0-25 cm) C stocks; (ii) to compare chronosequence estimates with current rates of C stock change based on resampling at the stand level; (iii) to estimate SOC changes in the subsoil (25-50 cm); and (iv) to assess the influence of two tree species on SOC dynamics. The two chronosequence trajectories for forest floor C stocks revealed consistently higher rates of C sequestration in spruce than in oak. The chronosequence trajectory was validated by resampling, and current rates of forest floor C sequestration decreased with stand age. Chronosequence trends in topsoil SOC in 2011 did not differ significantly from those reported in 1998; however, there was a shift from a negative rate (1998: -0.3 Mg C ha(-1) yr(-1)) to no change in 2011. In contrast, SOC stocks in the subsoil increased with stand age, though not significantly (P = 0.1), suggesting different C dynamics in and below the former plough layer. Current rates of C change estimated by repeated sampling decreased with stand age in forest floors but increased in the topsoil. The contrasting temporal changes in forest floor and mineral soil C sequestration rates indicate a shift in C source-sink strength after approximately 40 years. We conclude that afforestation of former cropland within the temperate region may induce soil C loss during the first decades, followed by a recovery phase of yet unknown duration.
Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping.
Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca
2015-08-12
Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are similar spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled-images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based-image-analysis (OBIA) implemented for the early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights.
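Resampling in the sense used here, producing a coarser-resolution version of an image, can be illustrated with simple block averaging. This is a simplification: the paper's resampling of georeferenced UAV imagery works with ground sample distances rather than bare integer factors.

```python
import numpy as np

def resample_mean(img, factor):
    """Downsample a 2-D image by an integer factor using block averaging,
    analogous to producing a coarser-resolution version of a UAV image."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    img = img[:h2 * factor, :w2 * factor]          # trim any ragged border
    return img.reshape(h2, factor, w2, factor).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
print(resample_mean(img, 2))  # block means: [[2.5, 4.5], [10.5, 12.5]]
```

Each output pixel is the mean of a `factor x factor` block of input pixels, which is why the resampled image preserves the average spectral values that the classification then operates on.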
Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping
Directory of Open Access Journals (Sweden)
Irene Borra-Serrano
2015-08-01
Full Text Available Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are similar spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled-images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based-image-analysis (OBIA) implemented for the early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights.
BOOTSTRAP WAVELET IN THE NONPARAMETRIC REGRESSION MODEL WITH WEAKLY DEPENDENT PROCESSES
Institute of Scientific and Technical Information of China (English)
林路; 张润楚
2004-01-01
This paper introduces a method of bootstrap wavelet estimation in a nonparametric regression model with weakly dependent processes for both fixed and random designs. The asymptotic bounds for the bias and variance of the bootstrap wavelet estimators are given in the fixed design model. The conditional normality for a modified version of the bootstrap wavelet estimators is obtained in the fixed design model. The consistency of the bootstrap wavelet estimator is also proved in the random design model. These results show that the bootstrap wavelet method is valid for the model with weakly dependent processes.
Bootstrapping GEE models for fMRI regional connectivity.
D'Angelo, Gina M; Lazar, Nicole A; Zhou, Gongfu; Eddy, William F; Morris, John C; Sheline, Yvette I
2012-12-01
An Alzheimer's fMRI study has motivated us to evaluate inter-regional correlations during rest between groups. We apply generalized estimating equation (GEE) models to test for differences in regional correlations across groups. Both the GEE marginal model and GEE transition model are evaluated and compared to the standard pooling Fisher-z approach using simulation studies. Standard errors of all methods are estimated both theoretically (model-based) and empirically (bootstrap). Of all the methods, we find that the transition models have the best statistical properties. Overall, the model-based standard errors and bootstrap standard errors perform about the same. We also demonstrate the methods with a functional connectivity study in a healthy cognitively normal population of ApoE4+ participants and ApoE4- participants who are recruited from the Adult Children's Study conducted at the Washington University Knight Alzheimer's Disease Research Center.
Bolasso: model consistent Lasso estimation through the bootstrap
Bach, Francis
2008-01-01
We consider the least-squares linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, compares favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning rep...
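The Bolasso procedure described in the abstract, fitting the Lasso on bootstrap resamples and intersecting the selected supports, can be sketched in a few lines. The Lasso solver below is a minimal ISTA (iterative soft-thresholding) implementation written only for self-containment, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first two of six predictors are relevant.
n, p = 200, 6
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0, 0, 0, 0]) + 0.1 * rng.normal(size=n)

def lasso_ista(X, y, lam, n_iter=1000):
    """Minimal Lasso solver: proximal gradient descent (ISTA) on
    (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n = len(y)
    w = np.zeros(X.shape[1])
    step = n / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant
    for _ in range(n_iter):
        w -= step * (X.T @ (X @ w - y) / n)
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

def bolasso_support(X, y, lam, n_boot=32):
    """Bolasso: intersect the Lasso supports over bootstrap resamples."""
    support = set(range(X.shape[1]))
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))     # bootstrap resample
        w = lasso_ista(X[idx], y[idx], lam)
        support &= set(np.flatnonzero(np.abs(w) > 1e-6))
    return sorted(support)

print(bolasso_support(X, y, lam=0.1))  # recovers the true support [0, 1]
```

The intersection step is what makes the selection consistent: a noise variable may slip into the support of one resample, but rarely into all of them.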
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 1970s, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information from strongly interacting theories, such as the 3D Ising model, the O(N) vector model, and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world-record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
Bootstrap bound for conformal multi-flavor QCD on lattice
Nakayama, Yu
2016-01-01
The recent work by Iha et al. shows an upper bound on the mass anomalous dimension $\gamma_m$ of multi-flavor massless QCD at the renormalization group fixed point from the conformal bootstrap in $SU(N_f)_V$ symmetric conformal field theories, under the assumption that the fixed point is realizable with the lattice regularization based on staggered fermions. We show that an almost identical but slightly stronger bound applies to the regularization based on Wilson fermions (or domain wall fermions) by studying the conformal bootstrap in $SU(N_f)_L \times SU(N_f)_R$ symmetric conformal field theories. For $N_f=8$, our bound implies $\gamma_m < 1.31$ to avoid dangerously irrelevant operators that are not compatible with the lattice symmetry.
A conformal bootstrap approach to critical percolation in two dimensions
Picco, Marco; Santachiara, Raoul
2016-01-01
We study four-point functions of critical percolation in two dimensions, and more generally of the Potts model. We propose an exact ansatz for the spectrum: an infinite, discrete and non-diagonal combination of representations of the Virasoro algebra. Based on this ansatz, we compute four-point functions using a numerical conformal bootstrap approach. The results agree with Monte-Carlo computations of connectivities of random clusters.
Bootstrapping a Five-Loop Amplitude from Steinmann Relations
Caron-Huot, Simon; McLeod, Andrew; von Hippel, Matt
2016-01-01
The analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
Non-critical string, Liouville theory and geometric bootstrap hypothesis
Hadasz, Leszek; Jaskolski, Zbigniew
2003-01-01
Based on the standard construction of critical string amplitudes, we analyze properties of the longitudinal sector of the non-critical Nambu-Goto string. We demonstrate that it cannot be described by a standard (in the sense of BPZ) conformal field theory. As an alternative we propose a new version of the geometric approach to Liouville theory and formulate its basic consistency condition - the geometric bootstrap equation.
Necessary Condition for Emergent Symmetry from the Conformal Bootstrap
Nakayama, Yu; Ohtsuki, Tomoki
2016-09-01
We use the conformal bootstrap program to derive necessary conditions for emergent symmetry enhancement from a discrete symmetry (e.g., Z_n) to a continuous symmetry (e.g., U(1)) under the renormalization group flow. In three dimensions, in order for Z_2 symmetry to be enhanced to U(1) symmetry, the conformal bootstrap program predicts that the scaling dimension of the order parameter field at the infrared conformal fixed point must satisfy Δ_1 > 1.08. We also obtain similar necessary conditions for Z_3 symmetry, with Δ_1 > 0.580, and for Z_4 symmetry, with Δ_1 > 0.504, from a simultaneous conformal bootstrap analysis of multiple four-point functions. As applications, we show that our necessary conditions impose severe constraints on the nature of the chiral phase transition in QCD, the deconfinement criticality in Néel valence bond solid transitions, and anisotropic deformations in critical O(n) models. We prove that some fixed points proposed in the literature are unstable under perturbations that cannot be forbidden by the discrete symmetry. In these situations, a second-order phase transition with enhanced symmetry cannot happen.
Truncatable bootstrap equations in algebraic form and critical surface exponents
Gliozzi, Ferdinando
2016-01-01
We describe examples of drastic truncations of conformal bootstrap equations encoding much more information than that obtained by a direct numerical approach. A three-term truncation of the four point function of a free scalar in any space dimensions provides algebraic identities among conformal block derivatives which generate the exact spectrum of the infinitely many primary operators contributing to it. In boundary conformal field theories, we point out that the appearance of free parameters in the solutions of bootstrap equations is not an artifact of truncations, rather it reflects a physical property of permeable conformal interfaces which are described by the same equations. Surface transitions correspond to isolated points in the parameter space. We are able to locate them in the case of 3d Ising model, thanks to a useful algebraic form of 3d boundary bootstrap equations. It turns out that the low-lying spectra of the surface operators in the ordinary and the special transitions of 3d Ising model form...
Soybean yield modeling using bootstrap methods for small samples
Energy Technology Data Exchange (ETDEWEB)
Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.
2016-11-01
One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is the bootstrap methodology, which in its non-parametric version does not require knowing or guessing the probability distribution that generated the original sample. In this work we used a small set of soybean yield data together with physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct the confidence intervals of the parameters and to identify the points that had great influence on the estimated parameters. (Author)
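The core of the procedure, refitting the regression on non-parametric bootstrap resamples and reading confidence limits off the percentiles, can be sketched as follows. The data here are a synthetic stand-in, not the soybean data set:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a small soil/yield data set: 30 plots, two covariates.
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([2.0, 1.5, -0.8])
y = X @ beta_true + 0.5 * rng.normal(size=n)

def ols(X, y):
    """Ordinary least-squares coefficient estimates."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_hat = ols(X, y)

# Non-parametric (case-resampling) bootstrap of the coefficients.
B = 2000
boot = np.array([ols(X[idx], y[idx])
                 for idx in rng.integers(0, n, (B, n))])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
for j in range(3):
    print(f"beta_{j}: estimate {beta_hat[j]:.2f}, "
          f"95% CI [{lo[j]:.2f}, {hi[j]:.2f}]")
```

Because entire (x, y) cases are resampled, the intervals make no assumption about the error distribution, which is exactly the appeal for small samples noted in the abstract.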
Directory of Open Access Journals (Sweden)
Rohin Anhal
2013-10-01
Full Text Available The aim of this paper is to examine the direction of causality between real GDP on the one hand and final energy and coal consumption on the other in India, for the period from 1970 to 2011. The methodology adopted is the non-parametric bootstrap procedure, which is used to construct the critical values for the hypothesis tests of causality. The results of the bootstrap tests show that, for total energy consumption, there exists no causal relationship in either direction with the GDP of India. However, if coal consumption is considered, we find evidence in support of unidirectional causality running from coal consumption to GDP. This clearly has important implications for the Indian economy; the most important is that curbing coal consumption in order to reduce carbon emissions would in turn have a limiting effect on economic growth. Our analysis contributes to the literature in three distinct ways. First, this is the first paper to use the bootstrap method to examine the growth-energy connection for the Indian economy. Second, we analyze data for the period 1970 to 2011, thereby utilizing recently available data that have not been used by others. Finally, in contrast to recent studies, we adopt a disaggregated approach to the analysis of the growth-energy nexus by considering not only aggregate energy consumption but coal consumption as well.
van de Water, S; Kraan, A C; Breedveld, S; Schillemans, W; Teguh, D N; Kooy, H M; Madden, T M; Heijmen, B J M; Hoogeman, M S
2013-10-01
This study investigates whether 'pencil beam resampling', i.e. iterative selection and weight optimization of randomly placed pencil beams (PBs), reduces optimization time and improves plan quality for multi-criteria optimization in intensity-modulated proton therapy, compared with traditional modes in which PBs are distributed over a regular grid. Resampling consisted of repeatedly performing: (1) random selection of candidate PBs from a very fine grid, (2) inverse multi-criteria optimization, and (3) exclusion of low-weight PBs. The newly selected candidate PBs were added to the PBs in the existing solution, causing the solution to improve with each iteration. Resampling and traditional regular grid planning were implemented into our in-house developed multi-criteria treatment planning system 'Erasmus iCycle'. The system optimizes objectives successively according to their priorities as defined in the so-called 'wish-list'. For five head-and-neck cancer patients and two PB widths (3 and 6 mm sigma at 230 MeV), treatment plans were generated using: (1) resampling, (2) anisotropic regular grids and (3) isotropic regular grids, while using varying sample sizes (resampling) or grid spacings (regular grid). We assessed differences in optimization time (for comparable plan quality) and in plan quality parameters (for comparable optimization time). Resampling reduced optimization time by a factor of 2.8 and 5.6 on average (7.8 and 17.0 at maximum) compared with the use of anisotropic and isotropic grids, respectively. Doses to organs-at-risk were generally reduced when using resampling, with median dose reductions ranging from 0.0 to 3.0 Gy (maximum: 14.3 Gy, relative: 0%-42%) compared with anisotropic grids and from -0.3 to 2.6 Gy (maximum: 11.4 Gy, relative: -4%-19%) compared with isotropic grids. Resampling was especially effective when using thin PBs (3 mm sigma). Resampling plans contained on average fewer PBs, energy layers and protons than anisotropic grid
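The three-step resampling loop described above can be caricatured with a toy dose model. The non-negative least-squares fit below is a deliberately simplified stand-in for Erasmus iCycle's multi-criteria optimization, and all sizes, names and thresholds are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dose-influence matrix: dose delivered to 40 voxels per unit weight of
# each of 200 candidate pencil beams (sizes are illustrative only).
n_voxels, n_candidates = 40, 200
D_all = rng.random((n_voxels, n_candidates))
target = np.full(n_voxels, 2.0)  # prescribed dose per voxel

def fit_weights(D, n_iter=3000):
    """Non-negative beam weights by projected gradient descent -- a
    simplified stand-in for the paper's multi-criteria optimization."""
    w = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    for _ in range(n_iter):
        w = np.maximum(w - step * (D.T @ (D @ w - target)), 0.0)
    return w

selected = np.array([], dtype=int)
for _ in range(10):
    # (1) randomly draw candidate PBs and add them to the current solution
    selected = np.union1d(selected, rng.choice(n_candidates, 20, replace=False))
    # (2) re-optimize the weights of the enlarged PB set
    w = fit_weights(D_all[:, selected])
    # (3) exclude low-weight PBs before the next iteration
    selected = selected[w > 0.01 * w.max()]

w = fit_weights(D_all[:, selected])
residual = np.linalg.norm(D_all[:, selected] @ w - target)
print(f"{len(selected)} beams kept, dose residual {residual:.3f}")
```

Each pass enlarges the candidate pool only where it helps, so the final plan uses far fewer beams than a fine regular grid would, mirroring the speed-up reported in the abstract.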
DEFF Research Database (Denmark)
Huang, Shaojun; Mathe, Laszlo; Teodorescu, Remus
2013-01-01
Two existing methods of implementing the resampling modulation technique for the modular multilevel converter (MMC) (where the sampling frequency is a multiple of the carrier frequency) are a software solution (using a microcontroller) and a hardware solution (using an FPGA). The former has a certain level of inaccuracy in terms of the switching instant, while the latter is inflexible and has a higher cost. In order to overcome these drawbacks, this paper proposes an alternative solution, which uses a high-frequency saw-tooth carrier and two independent comparators, usually available in microcontrollers, to create
A resampling-based meta-analysis for detection of differential gene expression in breast cancer
Directory of Open Access Journals (Sweden)
Ergul Gulusan
2008-12-01
Full Text Available Abstract Background Accuracy in the diagnosis of breast cancer and classification of cancer subtypes has improved over the years with the development of well-established immunohistopathological criteria. More recently, diagnostic gene-sets at the mRNA expression level have been tested as better predictors of disease state. However, breast cancer is heterogeneous in nature; thus extraction of differentially expressed gene-sets that stably distinguish normal tissue from various pathologies poses challenges. Meta-analysis of high-throughput expression data using a collection of statistical methodologies leads to the identification of robust tumor gene expression signatures. Methods A resampling-based meta-analysis strategy, which involves the combined use of resampling and distribution statistics to assess the degree of significance in differential expression between sample classes, was developed. Two independent microarray datasets that contain normal breast, invasive ductal carcinoma (IDC), and invasive lobular carcinoma (ILC) samples were used for the meta-analysis. Expression of the genes selected from the gene list for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes was tested on 10 independent primary IDC samples and matched non-tumor controls by real-time qRT-PCR. Other existing breast cancer microarray datasets were used in support of the resampling-based meta-analysis. Results The two independent microarray studies were found to be comparable, although differing in their experimental methodologies (Pearson correlation coefficient, R = 0.9389 and R = 0.8465 for ductal and lobular samples, respectively). The resampling-based meta-analysis has led to the identification of a highly stable set of genes for classification of normal breast samples and breast tumors encompassing both the ILC and IDC subtypes. The expression results of the selected genes obtained through real
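The resampling-plus-distribution-statistics idea can be illustrated with a label-permutation scheme on synthetic expression data. This is a simplification of the paper's strategy, and the matrix sizes, effect size and significance threshold are all invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy expression matrix: 1000 genes x (10 normal + 10 tumor) samples.
# The first 50 genes are shifted upward in the tumor group.
n_genes = 1000
normal = rng.normal(size=(n_genes, 10))
tumor = rng.normal(size=(n_genes, 10))
tumor[:50] += 3.0

def tstat(a, b):
    """Per-gene two-sample t-statistic (Welch form)."""
    return (a.mean(1) - b.mean(1)) / np.sqrt(
        a.var(1, ddof=1) / a.shape[1] + b.var(1, ddof=1) / b.shape[1])

observed = tstat(tumor, normal)

# Null distribution via resampling: shuffle sample labels and recompute.
data = np.hstack([normal, tumor])
null = []
for _ in range(200):
    perm = rng.permutation(20)
    null.append(tstat(data[:, perm[:10]], data[:, perm[10:]]))
null = np.abs(np.array(null)).ravel()

threshold = np.percentile(null, 99.9)
selected = np.flatnonzero(np.abs(observed) > threshold)
print(f"{len(selected)} genes called differentially expressed")
```

Genes whose observed statistic exceeds the tail of the resampled null distribution are called differentially expressed; most of the calls here fall in the 50 truly shifted genes.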
A Bootstrap Approach to an Affordable Exploration Program
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that invests in technologies able to build a space infrastructure from very modest initial capabilities. Human exploration has a history of flight programs with high development and operational costs. Since Apollo, human exploration budgets have been very constrained and are expected to be constrained in the future. Due to their high operations costs it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics cost. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of resources available. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon the initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows and becomes self-sustaining and eventually begins producing the energy, products and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and
Two novel applications of bootstrap currents: Snakes and jitter stabilization
Energy Technology Data Exchange (ETDEWEB)
Thyagaraja, A.; Haas, F.A. (AEA Fusion (AEA Fusion/Euratom Fusion Association), Culham Laboratory, Abingdon, OX14 3DB (United Kingdom))
1993-09-01
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of noninductive current are considered in this work. In the first, an earlier model of the nonlinearly saturated m=1 tearing mode is extended to include the stabilizing effect of a bootstrap current inside the island. This is used to explain several observed features of the so-called "snake" reported in the Joint European Torus (JET) [R. D. Gill, A. W. Edwards, D. Pasini, and A. Weller, Nucl. Fusion 32, 723 (1992)]. The second application involves an alternating current (ac) form of bootstrap current, produced by pressure-gradient fluctuations. It is suggested that a time-dependent (in the plasma frame), radio-frequency (rf) power source can be used to produce localized pressure fluctuations of suitable frequency and amplitude to implement the dynamic stabilization method for suppressing gross modes in tokamaks suggested in a recent paper [A. Thyagaraja, R. D. Hazeltine, and A. Y. Aydemir, Phys. Fluids B 4, 2733 (1992)]. This method works by "detuning" the resonant layer by rapid current/shear fluctuations. Estimates made for the power source requirements both for small machines such as COMPASS and for larger machines like JET suggest that the method could be practically feasible. This "jitter" (i.e., dynamic) stabilization method could provide a useful form of active instability control to avoid both gross/disruptive and fine-scale/transportive instabilities, which may set severe operating/safety constraints in the reactor regime. The results are also capable, in principle, of throwing considerable light on the local properties of current generation and diffusion in tokamaks, which may be enhanced by turbulence, as has been suggested recently by several researchers.
Bootstrap confidence intervals and subsampling
Sebastião, João; Nunes, Sara
2003-01-01
In this work we consider resampling and subsampling methodologies. These computationally intensive methodologies are now widely used in statistical inference to compute confidence intervals for a given parameter of interest. In a simulation study, we apply the bootstrap and subsampling to sets of observations drawn from normal populations, with the aim of determining confidence intervals for the mean.
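The simulation described in this record, bootstrap confidence intervals for the mean of normally distributed observations, reduces to a few lines of the percentile method. All parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
sample = rng.normal(loc=5.0, scale=2.0, size=100)  # observations from N(5, 2^2)

# Percentile bootstrap: resample with replacement, recompute the mean,
# and read the confidence limits off the empirical distribution.
B = 5000
boot_means = np.array([rng.choice(sample, size=sample.size).mean()
                       for _ in range(B)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap CI for the mean: [{lo:.2f}, {hi:.2f}]")
```

Subsampling differs only in drawing blocks of size b < n without replacement and rescaling, which is what makes it valid in some settings where the bootstrap fails.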
Two novel applications of bootstrap currents: snakes and jitter stabilization
Energy Technology Data Exchange (ETDEWEB)
Thyagaraja, A. [AEA Technology, Culham (United Kingdom); Haas, F.A. [The Open University, Oxford Research Unit, Oxford (United Kingdom)
1993-12-31
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of non-inductive current are considered in this work. The first is an explanation of the "snake" phenomenon observed in JET based on steady-state nonlinear tearing theory. The second is an active method of dynamic stabilization of the m=1 mode using the "jitter" approach suggested by Thyagaraja et al. in a recent paper. (author) 11 refs.
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and
Bootstrap and jackknife resampling in reliability: the exponential and Weibull cases
Directory of Open Access Journals (Sweden)
Javier Ramírez-Montoya
2016-01-01
Full Text Available The Bootstrap-t and the delete-I and delete-II jackknife resampling methods are compared using the nonparametric Kaplan-Meier and Nelson-Aalen estimators, which are frequently used in practice, taking into account different censoring percentages, sample sizes, and times of interest. The comparison is carried out via simulation, using the mean squared error.
The S-matrix Bootstrap II: Two Dimensional Amplitudes
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
ENSO-conditioned weather resampling method for seasonal ensemble streamflow prediction
Beckers, Joost V. L.; Weerts, Albrecht H.; Tijdeman, Erik; Welles, Edwin
2016-08-01
Oceanic-atmospheric climate modes, such as El Niño-Southern Oscillation (ENSO), are known to affect the local streamflow regime in many rivers around the world. A new method is proposed to incorporate climate mode information into the well-known ensemble streamflow prediction (ESP) method for seasonal forecasting. The ESP is conditioned on an ENSO index in two steps. First, a number of original historical ESP traces are selected based on similarity between the index value in the historical year and the index value at the time of forecast. In the second step, additional ensemble traces are generated by a stochastic ENSO-conditioned weather resampler. These resampled traces compensate for the reduction of ensemble size in the first step and prevent degradation of skill at forecasting stations that are less affected by ENSO. The skill of the ENSO-conditioned ESP is evaluated over 50 years of seasonal hindcasts of streamflows at three test stations in the Columbia River basin in the US Pacific Northwest. An improvement in forecast skill of 5 to 10 % is found for two test stations. The streamflows at the third station are less affected by ENSO and no change in forecast skill is found here.
Visibility graph analysis for re-sampled time series from auto-regressive stochastic processes
Zhang, Rong; Zou, Yong; Zhou, Jie; Gao, Zhong-Ke; Guan, Shuguang
2017-01-01
Visibility graph (VG) and horizontal visibility graph (HVG) play a crucial role in modern complex network approaches to nonlinear time series analysis. However, depending on the underlying dynamic processes, characterizing the exponents of the presumably exponential degree distributions remains an open problem. It has recently been conjectured that there is a critical value of the exponent, λ_c = ln(3/2), which separates chaotic from correlated stochastic processes. Here, we systematically apply (H)VG analysis to time series from autoregressive (AR) models, which confirms the hypothesis that an increased correlation length results in larger values of λ > λ_c. On the other hand, we numerically find a regime of negatively correlated process increments where λ < λ_c, which is in contrast to this hypothesis. Furthermore, by constructing graphs based on re-sampled time series, we find that network measures show non-trivial dependencies on the autocorrelation functions of the processes. We propose to choose the decorrelation time as the maximal re-sampling delay for the algorithm. Our results are detailed for time series from AR(1) and AR(2) processes.
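The horizontal visibility graph used in studies like this one maps a time series to a network by a simple geometric rule; a brute-force O(n²) sketch (illustrative only, not the authors' implementation):

```python
def hvg_degrees(series):
    """Horizontal visibility graph: nodes i < j are linked iff every value
    strictly between them lies below both endpoints."""
    n = len(series)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg
```

For a monotone series only adjacent points see each other, so the interior degrees are 2 and the endpoints have degree 1; the exponent λ is then read off from the exponential tail of the degree distribution.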
Methods of soil resampling to monitor changes in the chemical concentrations of forest soils
Lawrence, Gregory B.; Fernandez, Ivan J.; Hazlett, Paul W.; Bailey, Scott W.; Ross, Donald S.; Villars, Thomas R.; Quintana, Angelica; Ouimet, Rock; McHale, Michael; Johnson, Chris E.; Briggs, Russell D.; Colter, Robert A.; Siemion, Jason; Bartlett, Olivia L.; Vargas, Olga; Antidormi, Michael; Koppers, Mary Margaret
2016-01-01
Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that the sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise.
Directory of Open Access Journals (Sweden)
Bryan R Conroy
Full Text Available Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others; that is, small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many (often on the order of thousands) of related decoding models. In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and model reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto optimal classifiers, with a single-optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design. Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a
Bootstrap embedding: An internally consistent fragment-based method
Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy
2016-08-01
Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments "embedded" in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed "Bootstrap Embedding," a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.
Bootstrap Percolation on Complex Networks with Community Structure
Chong, Wu; Rui, Zhang; Liujun, Chen; Jiawei, Chen; Xiaobin, Li; Yanqing, Hu
2014-01-01
Real complex networks usually involve community structure. How innovation and new products spread on social networks which have internal structure is a practically interesting and fundamental question. In this paper we study bootstrap percolation on a single network with community structure, in which we initiate the bootstrap process by activating a different fraction of nodes in each community. A previously inactive node becomes active if it detects at least $k$ active neighbors. The fraction of active nodes in community $i$ in the final state $S_i$ and its giant component size $S_{gci}$ are theoretically obtained as functions of the initial fractions of active nodes $f_i$. We show that such functions undergo multiple discontinuous transitions; the discontinuous jump of $S_i$ or $S_{gci}$ in one community may trigger a simultaneous jump of that in the other, which leads to multiple discontinuous transitions for the total fraction of active nodes $S$ and its associated giant component size $S_{gc}$...
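The activation rule studied here is easy to state in code; a minimal sketch on a toy two-community graph (the adjacency list, seeds, and threshold are invented for illustration):

```python
def bootstrap_percolation(adj, seeds, k):
    """Activate the seed nodes, then iteratively activate any node with at
    least k active neighbours until no node changes state."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in active and sum(n in active for n in nbrs) >= k:
                active.add(node)
                changed = True
    return active

# Two triangles bridged through nodes 3 and 4; seed only the first community
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4],
       4: [3, 5, 6], 5: [4, 6], 6: [4, 5]}
final = bootstrap_percolation(adj, seeds={0, 1}, k=2)
```

Here the cascade saturates the seeded community but cannot jump the single bridge to the second one, the kind of community-limited behavior whose discontinuous transitions the abstract analyzes.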
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
Full Text Available The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to energetic planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models under the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. A new approach is then proposed to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The obtained results using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to a better use of water resources in the energy operation planning.
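The bootstrap idea behind such proposals — resample residuals, rebuild the series, refit — can be sketched for a plain AR(1) model (a simplification of the monthly PAR(p) setting; the data, seeds, and replicate count are hypothetical, not the inflow series):

```python
import random

def fit_ar1(x):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t."""
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(a * a for a in x[:-1])
    return num / den

def ar1_bootstrap_ci(x, n_boot=500, seed=3):
    """Residual bootstrap: rebuild the series from resampled residuals,
    refit phi each time, and report a 95% percentile interval."""
    rng = random.Random(seed)
    phi = fit_ar1(x)
    resid = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
    phis = []
    for _ in range(n_boot):
        xb = [x[0]]
        for _ in range(len(x) - 1):
            xb.append(phi * xb[-1] + rng.choice(resid))
        phis.append(fit_ar1(xb))
    phis.sort()
    return phis[int(0.025 * n_boot)], phis[int(0.975 * n_boot) - 1]

gen = random.Random(0)
x = [0.0]
for _ in range(300):
    x.append(0.6 * x[-1] + gen.gauss(0, 1))
lo, hi = ar1_bootstrap_ci(x)
```

The PAR(p) case repeats this per calendar month and uses the interval widths to decide whether higher-order terms are warranted.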
Bootstrap Learning and Visual Processing Management on Mobile Robots
Directory of Open Access Journals (Sweden)
Mohan Sridharan
2010-01-01
Full Text Available A central goal of robotics and AI is to enable a team of robots to operate autonomously in the real world and collaborate with humans over an extended period of time. Though developments in sensor technology have resulted in the deployment of robots in specific applications, the ability to accurately sense and interact with the environment is still missing. Key challenges to the widespread deployment of robots include the ability to learn models of environmental features based on sensory inputs, bootstrap off of the learned models to detect and adapt to environmental changes, and autonomously tailor the sensory processing to the task at hand. This paper summarizes a comprehensive effort towards such bootstrap learning, adaptation, and processing management using visual input. We describe probabilistic algorithms that enable a mobile robot to autonomously plan its actions to learn models of color distributions and illuminations. The learned models are used to detect and adapt to illumination changes. Furthermore, we describe a probabilistic sequential decision-making approach that autonomously tailors the visual processing to the task at hand. All algorithms are fully implemented and tested on robot platforms in dynamic environments.
Bootstrapping One-Loop QCD Amplitudes with General Helicities
Energy Technology Data Exchange (ETDEWEB)
Berger, Carola F.; Bern, Zvi; Dixon, Lance J.; Forde, Darren; Kosower, David A.
2006-04-25
The recently developed on-shell bootstrap for computing one-loop amplitudes in non-supersymmetric theories such as QCD combines the unitarity method with loop-level on-shell recursion. For generic helicity configurations, the recursion relations may involve undetermined contributions from non-standard complex singularities or from large values of the shift parameter. Here we develop a strategy for sidestepping difficulties through use of pairs of recursion relations. To illustrate the strategy, we present sets of recursion relations needed for obtaining n-gluon amplitudes in QCD. We give a recursive solution for the one-loop n-gluon QCD amplitudes with three or four color-adjacent gluons of negative helicity and the remaining ones of positive helicity. We provide an explicit analytic formula for the QCD amplitude A_{6;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+), as well as numerical results for A_{7;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+, 7^+), A_{8;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+, 7^+, 8^+), and A_{8;1}(1^-, 2^-, 3^-, 4^-, 5^+, 6^+, 7^+, 8^+). We expect the on-shell bootstrap approach to have widespread applications to phenomenological studies at colliders.
CUDA accelerated uniform re-sampling for non-Cartesian MR reconstruction.
Feng, Chaolu; Zhao, Dazhe
2015-01-01
A grid-driven gridding (GDG) method is proposed to uniformly re-sample non-Cartesian raw data acquired in PROPELLER, in which a trajectory window for each Cartesian grid is first computed. The intensity of the reconstructed image at this grid is the weighted average of the raw data in this window. Considering the single-instruction, multiple-data (SIMD) property of the proposed GDG, a CUDA-accelerated method is then proposed to improve its performance. Two groups of raw data sampled by PROPELLER at two resolutions are reconstructed by the proposed method. To balance the computation resources of the GPU and obtain the best performance improvement, four thread-block strategies are adopted. Experimental results demonstrate that although the proposed GDG is more time consuming than traditional data-driven gridding (DDG), the CUDA-accelerated GDG is almost 10 times faster than traditional DDG.
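In one dimension the grid-driven idea reduces to a window-and-weight average per grid point; a toy sketch (real PROPELLER data live in 2-D k-space, and the triangular weight here is an assumption, not the paper's kernel):

```python
def grid_driven_gridding(samples, grid, half_width):
    """Grid-driven resampling: for each grid point, take the weighted average
    of the raw samples whose coordinates fall inside its window."""
    image = []
    for g in grid:
        wsum = vsum = 0.0
        for coord, value in samples:
            d = abs(coord - g)
            if d <= half_width:
                w = 1.0 - d / half_width  # simple triangular weight
                wsum += w
                vsum += w * value
        image.append(vsum / wsum if wsum > 0 else 0.0)
    return image

# Non-uniformly placed samples re-sampled onto a uniform grid
samples = [(0.0, 2.0), (0.4, 3.0), (1.0, 4.0)]
grid = [0.0, 0.5, 1.0]
vals = grid_driven_gridding(samples, grid, half_width=0.5)
```

The outer loop runs independently per grid point, which is exactly the SIMD structure the abstract exploits by mapping one CUDA thread (or thread block) to each grid point.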
Lu, Siliang; Wang, Xiaoxian; He, Qingbo; Liu, Fang; Liu, Yongbin
2016-12-01
Transient signal analysis (TSA) has been proven an effective tool for motor bearing fault diagnosis, but has yet to be applied in processing bearing fault signals with variable rotating speed. In this study, a new TSA-based angular resampling (TSAAR) method is proposed for fault diagnosis under speed fluctuation condition via sound signal analysis. By applying the TSAAR method, the frequency smearing phenomenon is eliminated and the fault characteristic frequency is exposed in the envelope spectrum for bearing fault recognition. The TSAAR method can accurately estimate the phase information of the fault-induced impulses using neither complicated time-frequency analysis techniques nor external speed sensors, and hence it provides a simple, flexible, and data-driven approach that realizes variable-speed motor bearing fault diagnosis. The effectiveness and efficiency of the proposed TSAAR method are verified through a series of simulated and experimental case studies.
A Monte Carlo Resampling Approach for the Calculation of Hybrid Classical and Quantum Free Energies.
Cave-Ayland, Christopher; Skylaris, Chris-Kriton; Essex, Jonathan W
2017-02-14
Hybrid free energy methods allow estimation of free energy differences at the quantum mechanics (QM) level with high efficiency by performing sampling at the classical mechanics (MM) level. Various approaches to allow the calculation of QM corrections to classical free energies have been proposed. The single step free energy perturbation approach starts with a classically generated ensemble, a subset of structures of which are postprocessed to obtain QM energies for use with the Zwanzig equation. This gives an estimate of the free energy difference associated with the change from an MM to a QM Hamiltonian. Owing to the poor numerical properties of the Zwanzig equation, however, recent developments have produced alternative methods which aim to provide access to the properties of the true QM ensemble. Here we propose an approach based on the resampling of MM structural ensembles and application of a Monte Carlo acceptance test which, in principle, can generate the exact QM ensemble or intermediate ensembles between the MM and QM states. We carry out a detailed comparison against the Zwanzig equation and recently proposed non-Boltzmann methods. As a test system we use a set of small molecule hydration free energies for which hybrid free energy calculations are performed at the semiempirical Density Functional Tight Binding level. Equivalent ensembles at this level of theory have also been generated allowing the reverse QM to MM perturbations to be performed along with a detailed analysis of the results. Additionally, a previously published nucleotide base pair data set simulated at the QM level using ab initio molecular dynamics is also considered. We provide a strong rationale for the use of the Monte Carlo Resampling and non-Boltzmann approaches by showing that configuration space overlaps can be estimated which provide useful diagnostic information regarding the accuracy of these hybrid approaches.
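The two ingredients the abstract contrasts can be sketched in reduced units: the Zwanzig (single-step perturbation) estimator, and a Metropolis-style resampling of MM snapshots toward the QM-reweighted ensemble (the snapshot labels and energy lists are toy inputs, not actual QM/MM energies):

```python
import math
import random

def zwanzig(dE, kT=1.0):
    """Zwanzig estimator: dF = -kT * ln <exp(-dE/kT)>_MM, where dE = E_QM - E_MM
    on MM-sampled snapshots (shifted by the minimum for numerical stability)."""
    m = min(dE)
    avg = sum(math.exp(-(d - m) / kT) for d in dE) / len(dE)
    return m - kT * math.log(avg)

def mc_resample(snapshots, dE, kT=1.0, n_out=500, seed=7):
    """Metropolis resampling of an MM ensemble: proposals are uniform draws of
    stored snapshots, accepted with prob min(1, exp(-(dE_new - dE_cur)/kT)),
    so the retained chain follows the QM-reweighted distribution."""
    rng = random.Random(seed)
    cur = rng.randrange(len(snapshots))
    out = []
    for _ in range(n_out):
        prop = rng.randrange(len(snapshots))
        delta = dE[prop] - dE[cur]
        if delta <= 0 or rng.random() < math.exp(-delta / kT):
            cur = prop
        out.append(snapshots[cur])
    return out

# A snapshot with a large energy penalty is almost never retained
resampled = mc_resample(["a", "b"], dE=[0.0, 50.0])
```

Unlike the exponential average, the resampled chain yields configurations themselves, so observables of the target ensemble can be estimated directly.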
Extraction of mismatch negativity using a resampling-based spatial filtering method
Lin, Yanfei; Wu, Wei; Wu, Chaohua; Liu, Baolin; Gao, Xiaorong
2013-04-01
Objective. It is currently a challenge to extract the mismatch negativity (MMN) waveform on the basis of a small number of EEG trials, which are typically unbalanced between conditions. Approach. In order to address this issue, a method combining the techniques of resampling and spatial filtering is proposed in this paper. Specifically, the first step of the method, termed ‘resampling difference’, randomly samples the standard and deviant sweeps, and then subtracts standard sweeps from deviant sweeps. The second step of the method employs the spatial filters designed by a signal-to-noise ratio maximizer (SIM) to extract the MMN component. The SIM algorithm can maximize the signal-to-noise ratio for event-related potentials (ERPs) to improve extraction. Simulation data were used to evaluate the influence of three parameters (i.e. trial number, repeated-SIM times and sampling times) on the performance of the proposed method. Main results. Results demonstrated that it was feasible and reliable to extract the MMN waveform using the method. Finally, an oddball paradigm with auditory stimuli of different frequencies was employed to record a few trials (50 trials of deviant sweeps and 250 trials of standard sweeps) of EEG data from 11 adult subjects. Results showed that the method could effectively extract the MMN using the EEG data of each individual subject. Significance. The extracted MMN waveform has a significantly larger peak amplitude and shorter latencies in response to the more deviant stimuli than in response to the less deviant stimuli, which agreed with the MMN properties reported in previous literature using grand-averaged EEG data of multi-subjects.
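The first step of the method, the "resampling difference" (the SIM spatial-filtering step is omitted here), can be sketched as follows, with invented toy sweeps standing in for EEG trials:

```python
import random

def resampling_difference(deviants, standards, k, n_resample=100, seed=5):
    """Repeatedly draw k deviant and k standard sweeps (without replacement),
    average each draw, and subtract standard from deviant averages to form a
    set of difference waveforms."""
    rng = random.Random(seed)

    def average(sweeps):
        return [sum(col) / len(col) for col in zip(*sweeps)]

    diffs = []
    for _ in range(n_resample):
        d_avg = average(rng.sample(deviants, k))
        s_avg = average(rng.sample(standards, k))
        diffs.append([a - b for a, b in zip(d_avg, s_avg)])
    return diffs

# Toy two-sample waveforms, with unbalanced trial counts as in the paper
deviants = [[1.0, 2.0]] * 50
standards = [[0.0, 1.0]] * 250
diffs = resampling_difference(deviants, standards, k=20)
```

The resampling balances the unequal deviant/standard trial counts before the SIM spatial filter is applied to the resulting difference waveforms.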
Reprioritizing genetic associations in hit regions using LASSO-based resample model averaging.
Valdar, William; Sabourin, Jeremy; Nobel, Andrew; Holmes, Christopher C
2012-07-01
Significance testing one SNP at a time has proven useful for identifying genomic regions that harbor variants affecting human disease. But after an initial genome scan has identified a "hit region" of association, single-locus approaches can falter. Local linkage disequilibrium (LD) can make both the number of underlying true signals and their identities ambiguous. Simultaneous modeling of multiple loci should help. However, it is typically applied ad hoc: conditioning on the top SNPs, with limited exploration of the model space and no assessment of how sensitive model choice was to sampling variability. Formal alternatives exist but are seldom used. Bayesian variable selection is coherent but requires specifying a full joint model, including priors on parameters and the model space. Penalized regression methods (e.g., LASSO) appear promising but require calibration, and, once calibrated, lead to a choice of SNPs that can be misleadingly decisive. We present a general method for characterizing uncertainty in model choice that is tailored to reprioritizing SNPs within a hit region under strong LD. Our method, LASSO local automatic regularization resample model averaging (LLARRMA), combines LASSO shrinkage with resample model averaging and multiple imputation, estimating for each SNP the probability that it would be included in a multi-SNP model in alternative realizations of the data. We apply LLARRMA to simulations based on case-control genome-wide association studies data, and find that when there are several causal loci and strong LD, LLARRMA identifies a set of candidates that is enriched for true signals relative to single locus analysis and to the recently proposed method of Stability Selection.
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Timmerman, Marieke E.; Kiers, Henk A.L.; Smilde, Age K.
2007-01-01
Confidence intervals (CIs) in principal component analysis (PCA) can be based on asymptotic standard errors and on the bootstrap methodology. The present paper offers an overview of possible strategies for bootstrapping in PCA. A motivating example shows that CI estimates for the component loadings
Institute of Scientific and Technical Information of China (English)
2000-01-01
In this paper, the author studies the asymptotic accuracies of the one-term Edgeworth expansions and the bootstrap approximation for the studentized MLE from a randomly censored exponential population. It is shown that the Edgeworth expansions and the bootstrap approximation are asymptotically close to the exact distribution of the studentized MLE with a rate.
Spanning Trees and bootstrap reliability estimation in correlation based networks
Tumminello, M; Lillo, F; Micciché, S; Mantegna, R N
2006-01-01
We introduce a new technique to associate a spanning tree to the average linkage cluster analysis. We term this tree the Average Linkage Minimum Spanning Tree. We also introduce a technique to associate a value of reliability to the links of correlation-based graphs by using bootstrap replicas of data. Both techniques are applied to the portfolio of the 300 most capitalized stocks traded at the New York Stock Exchange during the time period 2001-2003. We show that the Average Linkage Minimum Spanning Tree recognizes economic sectors and sub-sectors as communities in the network slightly better than the Minimum Spanning Tree does. We also show that the average reliability of links in the Minimum Spanning Tree is slightly greater than the average reliability of links in the Average Linkage Minimum Spanning Tree.
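The bootstrap link-reliability idea can be sketched with a correlation threshold standing in for spanning-tree membership (the real method asks whether a link reappears in the tree built from each replica; the series, threshold, and replicate count below are toy choices):

```python
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def link_reliability(x, y, threshold=0.5, n_boot=200, seed=11):
    """Bootstrap value of a link: fraction of resampled replicas of the paired
    observations in which the correlation still exceeds the threshold."""
    rng = random.Random(seed)
    n = len(x)
    hits = 0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        if pearson([x[i] for i in idx], [y[i] for i in idx]) > threshold:
            hits += 1
    return hits / n_boot

returns_a = [0.1 * t for t in range(30)]
returns_b = [0.1 * t for t in range(30)]   # perfectly co-moving toy series
rel = link_reliability(returns_a, returns_b)
```

A link that survives in nearly every replica is structurally robust; links with low bootstrap values are likely artifacts of sampling noise.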
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs.
González-Tablas, Ana I; Tapiador, Juan E
2016-05-11
We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN) composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user's security preferences implicitly captured by policies already in place.
A bootstrap method for estimating uncertainty of water quality trends
Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura
2015-01-01
Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs
Directory of Open Access Journals (Sweden)
Ana I. González-Tablas
2016-05-01
Full Text Available We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user’s security preferences implicitly captured by policies already in place.
Bootstrapping Inductive and Coinductive Types in HasCASL
Schröder, Lutz
2008-01-01
We discuss the treatment of initial datatypes and final process types in the wide-spectrum language HasCASL. In particular, we present specifications that illustrate how datatypes and process types arise as bootstrapped concepts using HasCASL's type class mechanism, and we describe constructions of types of finite and infinite trees that establish the conservativity of datatype and process type declarations adhering to certain reasonable formats. The latter amounts to modifying known constructions from HOL to avoid unique choice; in categorical terminology, this means that we establish that quasitoposes with an internal natural numbers object support initial algebras and final coalgebras for a range of polynomial functors, thereby partially generalising corresponding results from topos theory. Moreover, we present similar constructions in categories of internal complete partial orders in quasitoposes.
Bootstrapping Mixed Correlators in the 3D Ising Model
Kos, Filip; Simmons-Duffin, David
2014-01-01
We study the conformal bootstrap for systems of correlators involving non-identical operators. The constraints of crossing symmetry and unitarity for such mixed correlators can be phrased in the language of semidefinite programming. We apply this formalism to the simplest system of mixed correlators in 3D CFTs with a $\\mathbb{Z}_2$ global symmetry. For the leading $\\mathbb{Z}_2$-odd operator $\\sigma$ and $\\mathbb{Z}_2$-even operator $\\epsilon$, we obtain numerical constraints on the allowed dimensions $(\\Delta_\\sigma, \\Delta_\\epsilon)$ assuming that $\\sigma$ and $\\epsilon$ are the only relevant scalars in the theory. These constraints yield a small closed region in $(\\Delta_\\sigma, \\Delta_\\epsilon)$ space compatible with the known values in the 3D Ising CFT.
Bootstrapping Pure Quantum Gravity in AdS3
Bae, Jin-Beom; Lee, Sungjay
2016-01-01
The three-dimensional pure quantum gravity with negative cosmological constant is supposed to be dual to the extremal conformal field theory of central charge $c=24k$ in two dimensions. We employ the conformal bootstrap method to analyze the extremal CFTs, and find numerical evidence for the non-existence of the extremal CFTs for sufficiently large central charge ($k \ge 20$). We also explore near-extremal CFTs, a small modification of extremal ones, and find similar evidence for their non-existence for large central charge. This indicates, under the assumption of holomorphic factorization, that pure gravity in weakly curved AdS$_3$ does not exist as a consistent quantum theory.
The Inverse Bagging Algorithm: Anomaly Detection by Inverse Bootstrap Aggregating
Vischia, Pietro
2016-01-01
For data sets populated by a very well modeled process and by another process of unknown probability density function (PDF), a desired feature when manipulating the fraction of the unknown process (either to enhance or to suppress it) is to avoid modifying the kinematic distributions of the well modeled one. A bootstrap technique is used to identify sub-samples rich in the well modeled process, and to classify each event according to the frequency with which it is part of such sub-samples. Comparisons with general MVA algorithms will be shown, as well as a study of the asymptotic properties of the method, making use of a public domain data set that models a typical search for new physics as performed at hadronic colliders such as the Large Hadron Collider (LHC).
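A toy version of this classification-by-subsample-frequency idea, with a sample-mean cut standing in for the paper's richness statistic (the events, cut value, and sizes are invented; the real method uses physics observables):

```python
import random
import statistics

def inverse_bagging_scores(events, n_boot=300, frac=0.5, seed=13):
    """Draw bootstrap sub-samples, flag those whose mean is consistent with
    the well-modeled process (assumed centred at 0 here), and score each event
    by how often its appearances fall in flagged ('background-like') draws."""
    rng = random.Random(seed)
    n = len(events)
    k = int(frac * n)
    in_flagged = [0] * n
    appearances = [0] * n
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(k)]
        background_like = abs(statistics.fmean(events[i] for i in idx)) < 0.5
        for i in set(idx):
            appearances[i] += 1
            if background_like:
                in_flagged[i] += 1
    return [f / a if a else 0.0 for f, a in zip(in_flagged, appearances)]

# 20 background-like events near 0 plus 5 signal-like outliers
events = [0.0] * 20 + [10.0] * 5
scores = inverse_bagging_scores(events)
```

Events with low scores rarely appear in background-rich sub-samples and are therefore candidates for the unknown process, without any cut being placed on their own kinematics.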
Adjorlolo, Clement; Mutanga, Onisimo; Cho, Moses A.; Ismail, Riyad
2013-04-01
In this paper, a user-defined inter-band correlation filter function was used to resample hyperspectral data and thereby mitigate the problem of multicollinearity in classification analysis. The proposed resampling technique convolves the spectral dependence information between a chosen band-centre and its shorter and longer wavelength neighbours. Weighting threshold of inter-band correlation (WTC, Pearson's r) was calculated, whereby r = 1 at the band-centre. Various WTC (r = 0.99, r = 0.95 and r = 0.90) were assessed, and bands with coefficients beyond a chosen threshold were assigned r = 0. The resultant data were used in the random forest analysis to classify in situ C3 and C4 grass canopy reflectance. The respective WTC datasets yielded improved classification accuracies (kappa = 0.82, 0.79 and 0.76) with less correlated wavebands when compared to resampled Hyperion bands (kappa = 0.76). Overall, the results obtained from this study suggested that resampling of hyperspectral data should account for the spectral dependence information to improve overall classification accuracy as well as reducing the problem of multicollinearity.
DEFF Research Database (Denmark)
Hounyo, Ulrich
We propose a bootstrap method for estimating the distribution (and functionals of it, such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method suggested for the pre-averaged realized volatility estimator......-studentized statistics, our results justify using the bootstrap to estimate the covariance matrix of a broad class of covolatility estimators. The bootstrap variance estimator is positive semi-definite by construction, an appealing feature that is not always shared by existing variance estimators of the integrated...
JuliBootS: a hands-on guide to the conformal bootstrap
Paulos, Miguel F
2014-01-01
We introduce {\\tt JuliBootS}, a package for numerical conformal bootstrap computations coded in {\\tt Julia}. The centre-piece of {\\tt JuliBootS} is an implementation of Dantzig's simplex method capable of handling arbitrary precision linear programming problems with continuous search spaces. Current supported features include conformal dimension bounds, OPE bounds, and bootstrap with or without global symmetries. The code is trivially parallelizable on one or multiple machines. We exemplify usage extensively with several real-world applications. In passing we give a pedagogical introduction to the numerical bootstrap methods.
Automotive FMCW Radar-enhanced Range Estimation via a Local Resampling Fourier Transform
Directory of Open Access Journals (Sweden)
Cailing Wang
2016-02-01
Full Text Available In complex traffic scenarios, more accurate measurement and discrimination is required of an automotive frequency-modulated continuous-wave (FMCW) radar for intelligent robots, driverless cars and driver-assistance systems. A more accurate range estimation method based on a local resampling Fourier transform (LRFT) for an FMCW radar is developed in this paper. Correlating the radar signal in phase space yields a higher signal-to-noise ratio (SNR) for more accurate ranging, and the LRFT, which acts on a local neighbourhood as a refinement step, can achieve a more accurate target range. The rough range is first estimated through conventional pulse compression (PC); then, around this initial rough estimate, a refined estimate obtained through the LRFT in the local region achieves greater precision. Furthermore, the LRFT algorithm is tested in numerous simulations and physical system experiments, which show that it achieves more precise range estimation than traditional FFT-based algorithms, especially for lower-bandwidth signals.
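The coarse-then-refined idea can be illustrated with a simple stand-in: a coarse FFT peak search followed by a dense local DTFT evaluation around it. This is not the published LRFT, only a sketch of the "refine locally" strategy; the function and parameter names are invented.

```python
import numpy as np

def refine_peak(x, fs, n_fine=200, span=1.0):
    """Coarse frequency estimate from an FFT peak, then a denser DTFT
    evaluation on a local grid around it, a simple stand-in for the
    local-refinement step of the LRFT."""
    n = len(x)
    spec = np.abs(np.fft.rfft(x))
    f_coarse = spec.argmax() * fs / n       # coarse peak, one-bin resolution
    df = fs / n                             # FFT bin width
    f_grid = np.linspace(f_coarse - span * df, f_coarse + span * df, n_fine)
    t = np.arange(n) / fs
    fine = np.abs(np.exp(-2j * np.pi * f_grid[:, None] * t) @ x)
    return f_grid[fine.argmax()]            # refined peak frequency
```

In an FMCW radar the beat frequency maps linearly to range, so refining the peak frequency beyond the FFT bin width directly refines the range estimate.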
Epipolar resampling of linear pushbroom satellite imagery by a new epipolarity model
Wang, Mi; Hu, Fen; Li, Jonathan
This paper presents a practical epipolarity model for high-resolution linear pushbroom satellite images acquired in either along-track or cross-track mode, based on the projection reference plane in object space. A new method for epipolar resampling of satellite stereo imagery based on this model is then developed. In this method, the pixel-to-pixel relationship between the original image and the generated epipolar image is established directly by the geometric sensor model. The approximate epipolar images are generated in a manner similar to digital image rectification. In addition, by arranging the approximate epipolar lines on the defined projection reference plane, a stereoscopic model with consistent ground sampling distance and parallel to the object space is thus available, which is more convenient for three-dimensional measurement and interpretation. The results obtained from SPOT5, IKONOS, IRS-P5, and QuickBird stereo images indicate that the generated epipolar images all achieve high accuracy. Moreover, the vertical parallaxes at check points are at sub-pixel level, thus proving the feasibility, correctness, and applicability of the method.
Su, Min; Fang, Liang; Su, Zheng
2013-05-01
Dichotomizing a continuous biomarker is a common practice in medical research. Various methods exist in the literature for dichotomizing continuous biomarkers. The most widely adopted minimum p-value approach uses a sequence of test statistics for all possible dichotomizations of a continuous biomarker, and it chooses the cutpoint that is associated with the maximum test statistic, or equivalently, the minimum p-value of the test. We herein propose a likelihood and resampling-based approach to dichotomizing a continuous biomarker. In this approach, the cutpoint is considered as an unknown variable in addition to the unknown outcome variables, and the likelihood function is maximized with respect to the cutpoint variable as well as the outcome variables to obtain the optimal cutpoint for the continuous biomarker. The significance level of the test for whether a cutpoint exists is assessed via a permutation test using the maximum likelihood values calculated based on the original as well as the permutated data sets. Numerical comparisons of the proposed approach and the minimum p-value approach showed that the proposed approach was not only more powerful in detecting the cutpoint but also provided markedly more accurate estimates of the cutpoint than the minimum p-value approach in all the simulation scenarios considered.
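A minimal sketch of the cutpoint search and permutation test described above, assuming a binary outcome modeled with a separate Bernoulli rate on each side of the cutpoint. This is a simplification of the likelihood model in the paper, and all names are hypothetical.

```python
import numpy as np

def max_cutpoint_loglik(x, y):
    """Maximum log-likelihood over all candidate cutpoints of x,
    modeling the binary outcome y with a separate Bernoulli rate on
    each side of the cutpoint."""
    def ll(p, k, n):
        if n == 0 or p == 0.0 or p == 1.0:
            return 0.0                      # pure sides contribute 0
        return k * np.log(p) + (n - k) * np.log(1.0 - p)
    best = -np.inf
    for c in np.unique(x)[:-1]:
        lo, hi = y[x <= c], y[x > c]
        best = max(best, ll(lo.mean(), lo.sum(), lo.size)
                         + ll(hi.mean(), hi.sum(), hi.size))
    return best

def cutpoint_permutation_pvalue(x, y, n_perm=99, rng=None):
    """Permutation p-value for the existence of a cutpoint: compare the
    observed maximum likelihood with maxima under permuted outcomes."""
    rng = np.random.default_rng() if rng is None else rng
    observed = max_cutpoint_loglik(x, y)
    exceed = sum(max_cutpoint_loglik(x, rng.permutation(y)) >= observed
                 for _ in range(n_perm))
    return (1 + exceed) / (1 + n_perm)
```

Because the maximization over cutpoints is repeated inside every permutation, the test automatically accounts for the multiple looks at the data that make the naive minimum p-value approach anti-conservative.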
Depth inloop resampling using dilation filter for free viewpoint video system
Lee, Seok; Lee, Seungsin; Wey, Hocheon; Lee, Jaejoon; Park, Dusik
2013-03-01
A depth dilation filter is proposed for a free viewpoint video system based on mixed-resolution multi-view video plus depth (MVD). By applying a gray-scale dilation filter to depth images, foreground regions are extended into the background region so that synthesis artifacts occur outside boundary edges; thus, the objective and subjective quality of view synthesis results is improved. The depth dilation filter is applied to the in-loop resampling part of encoding/decoding, and to the post-processing part after decoding. Accurate view synthesis is important in virtual view generation for autostereoscopic displays; moreover, many coding tools in 3D video coding, such as view synthesis prediction (VSP) and depth-based motion vector prediction (DMVP), use view synthesis to reduce inter-view redundancy, so compression efficiency can also be improved by accurate view synthesis. Coding and synthesis experiments were performed to evaluate the dilation filter with MPEG test sequences. The dilation filter was implemented on top of the MPEG reference software for AVC-based 3D video coding. By applying the depth dilation filter, BD-rate gains of 0.5% and 6.0% were obtained in terms of PSNR of decoded views and synthesized views, respectively.
Pseudocontact Shift-Driven Iterative Resampling for 3D Structure Determinations of Large Proteins.
Pilla, Kala Bharath; Otting, Gottfried; Huber, Thomas
2016-01-29
Pseudocontact shifts (PCSs) induced by paramagnetic lanthanides produce pronounced effects in nuclear magnetic resonance spectra, which are easily measured and which deliver valuable long-range structure restraints. Even sparse PCS data greatly enhance the success rate of 3D (3-dimensional) structure predictions of proteins by the modeling program Rosetta. The present work extends this approach to 3D structures of larger proteins, comprising more than 200 residues, which are difficult to model by Rosetta without additional experimental restraints. The new algorithm improves the fragment assembly method of Rosetta by utilizing PCSs generated from paramagnetic lanthanide ions attached at four different sites as the only experimental restraints. The sparse PCS data are utilized at multiple stages, to identify native-like local structures, to rank the best structural models and to rebuild the fragment libraries. The fragment libraries are refined iteratively until convergence. The PCS-driven iterative resampling algorithm is strictly data dependent and shown to generate accurate models for a benchmark set of eight different proteins, ranging from 100 to 220 residues, using solely PCSs of backbone amide protons.
A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Liang, Faming
2013-03-01
The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
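The subsample-then-update iteration can be illustrated on a toy i.i.d. Gaussian model, where the subsample gradient of the mean log-likelihood drives a Robbins-Monro update. This deliberately ignores the spatial covariance structure that motivates the paper; it is purely a sketch of the stochastic approximation mechanics, with all constants chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(3.0, 1.0, size=100_000)   # stand-in for a large data set

mu = 0.0                                    # parameter to estimate
for t in range(1, 2001):
    sub = rng.choice(data, size=50)         # small random subsample per iteration
    grad = np.mean(sub - mu)                # subsample gradient of the mean log-likelihood
    mu += grad / t ** 0.6                   # Robbins-Monro decaying step size
```

Each iteration touches only 50 of the 100,000 observations, which is the point of the method: no pass over (or inversion involving) the full data set is ever required.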
A Poisson resampling method for simulating reduced counts in nuclear medicine images.
White, Duncan; Lawson, Richard S
2015-05-07
Nuclear medicine computers now commonly offer resolution recovery and other software techniques which have been developed to improve image quality for images with low counts. These techniques potentially mean that these images can give equivalent clinical information to a full-count image. Reducing the number of counts in nuclear medicine images has the benefits of either allowing reduced activity to be administered or reducing acquisition times. However, because acquisition and processing parameters vary, each user should ideally evaluate the use of images with reduced counts within their own department, and this is best done by simulating reduced-count images from the original data. Reducing the counts in an image by division and rounding off to the nearest integer value, even if additional Poisson noise is added, is inadequate because it gives incorrect counting statistics. This technical note describes how, by applying Poisson resampling to the original raw data, simulated reduced-count images can be obtained while maintaining appropriate counting statistics. The authors have developed manufacturer independent software that can retrospectively generate simulated data with reduced counts from any acquired nuclear medicine image.
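One standard way to implement Poisson resampling of raw counts is binomial thinning: keeping each recorded event independently with probability f turns a Poisson(λ) pixel into a Poisson(fλ) pixel, which preserves correct counting statistics, unlike dividing and rounding. The sketch below assumes numpy, and the function name is hypothetical.

```python
import numpy as np

def poisson_resample(counts, fraction, rng=None):
    """Simulate a reduced-count image by binomial thinning: each
    recorded event is kept independently with probability `fraction`,
    so Poisson statistics are preserved at the reduced count level."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.binomial(counts, fraction)

# Example: simulate a half-count acquisition of a 64 x 64 count image
rng = np.random.default_rng(0)
full = rng.poisson(100.0, size=(64, 64))
half = poisson_resample(full, 0.5, rng)
```

Applying this to the raw projection or list-mode-derived count data, rather than to processed images, keeps the simulated acquisition statistically faithful.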
Language bootstrapping: learning word meanings from perception-action association.
Salvi, Giampiero; Montesano, Luis; Bernardino, Alexandre; Santos-Victor, José
2012-06-01
We address the problem of bootstrapping language acquisition for an artificial system similarly to what is observed in experiments with human infants. Our method works by associating meanings to words in manipulation tasks, as a robot interacts with objects and listens to verbal descriptions of the interactions. The model is based on an affordance network, i.e., a mapping between robot actions, robot perceptions, and the perceived effects of these actions upon objects. We extend the affordance model to incorporate spoken words, which allows us to ground the verbal symbols to the execution of actions and the perception of the environment. The model takes verbal descriptions of a task as the input and uses temporal co-occurrence to create links between speech utterances and the involved objects, actions, and effects. We show that the robot is able form useful word-to-meaning associations, even without considering grammatical structure in the learning process and in the presence of recognition errors. These word-to-meaning associations are embedded in the robot's own understanding of its actions. Thus, they can be directly used to instruct the robot to perform tasks and also allow to incorporate context in the speech recognition task. We believe that the encouraging results with our approach may afford robots with a capacity to acquire language descriptors in their operation's environment as well as to shed some light as to how this challenging process develops with human infants.
A Mellin space approach to the conformal bootstrap
Gopakumar, Rajesh; Sen, Kallol; Sinha, Aninda
2016-01-01
We describe in more detail our approach to the conformal bootstrap which uses the Mellin representation of $CFT_d$ four point functions and expands them in terms of crossing symmetric combinations of $AdS_{d+1}$ Witten exchange functions. We consider arbitrary external scalar operators and set up the conditions for consistency with the operator product expansion. Namely, we demand cancellation of spurious powers (of the cross ratios, in position space) which translate into spurious poles in Mellin space. We discuss two contexts in which we can immediately apply this method by imposing the simplest set of constraint equations. The first is the epsilon expansion. We mostly focus on the Wilson-Fisher fixed point as studied in an epsilon expansion about $d=4$. We reproduce Feynman diagram results for operator dimensions to $O(\\epsilon^3)$ rather straightforwardly. This approach also yields new analytic predictions for OPE coefficients to the same order which fit nicely with recent numerical estimates for the Isin...
Bootstrap equations for $\\mathcal{N}=4$ SYM with defects
Liendo, Pedro
2016-01-01
This paper focuses on the analysis of $4d$ $\\mathcal{N}=4$ superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will write the Ward identities associated to two-point functions of $\\tfrac{1}{2}$-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to $4d$ $\\mathcal{N}=4$ superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: $4d$ $\\mathcal{N}=4$ superconformal theories with a line defect, $3d$ $\\mathcal{N}=4$ superconformal theories with no defect, and $OSP(4^*|4)$ superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootst...
Bootstrapping Multi-Parton Loop Amplitudes in QCD
Energy Technology Data Exchange (ETDEWEB)
Bern, Zvi; /UCLA; Dixon, Lance J.; /SLAC; Kosower, David A.; /Saclay, SPhT
2005-07-06
We present a new method for computing complete one-loop amplitudes, including their rational parts, in non-supersymmetric gauge theory. This method merges the unitarity method with on-shell recursion relations. It systematizes a unitarity-factorization bootstrap approach previously applied by the authors to the one-loop amplitudes required for next-to-leading order QCD corrections to the processes $e^+ e^- \\to Z$, $\\gamma^* \\to 4$ jets and $pp \\to W + 2$ jets. We illustrate the method by reproducing the one-loop color-ordered five-gluon helicity amplitudes in QCD that interfere with the tree amplitude, namely $A_{5;1}(1^-, 2^-, 3^+, 4^+, 5^+)$ and $A_{5;1}(1^-, 2^+, 3^-, 4^+, 5^+)$. We then describe the construction of the six- and seven-gluon amplitudes with two adjacent negative-helicity gluons, $A_{6;1}(1^-, 2^-, 3^+, 4^+, 5^+, 6^+)$ and $A_{7;1}(1^-, 2^-, 3^+, 4^+, 5^+, 6^+, 7^+)$, which uses the previously computed logarithmic parts of the amplitudes as input. We present a compact expression for the six-gluon amplitude. No loop integrals are required to obtain the rational parts.
Consonant Inventories in the Spontaneous Speech of Young Children: A Bootstrapping Procedure
Van Severen, Lieve; Van Den Berg, Renate; Molemans, Inge; Gillis, Steven
2012-01-01
Consonant inventories are commonly drawn to assess the phonological acquisition of toddlers. However, the spontaneous speech data that are analysed often vary substantially in size and composition. Consequently, comparisons between children and across studies are fundamentally hampered. This study aims to examine the effect of sample size on the…
Baisden, W. T.; Prior, C.; Lambie, S.; Tate, K.; Bruhn, F.; Parfitt, R.; Schipper, L.; Wilde, R. H.; Ross, C.
2006-12-01
Soil organic matter contains more C than terrestrial biomass and atmospheric CO2 combined, and reacts to climate and land-use change on timescales requiring long-term experiments or monitoring. The direction and uncertainty of soil C stock changes has been difficult to predict and incorporate in decision support tools for climate change policies. Moreover, standardization of approaches has been difficult because historic methods of soil sampling have varied regionally, nationally and temporally. The most common and uniform type of historic sampling is soil profiles, which have commonly been collected, described and archived in the course of both soil survey studies and research. Resampling soil profiles has considerable utility in carbon monitoring and in parameterizing models to understand the ecosystem responses to global change. Recent work spanning seven soil orders in New Zealand's grazed pastures has shown that, averaged over approximately 20 years, 31 soil profiles lost 106 g C m^-2 y^-1 (p=0.01) and 9.1 g N m^-2 y^-1 (p=0.002). These losses are unexpected and appear to extend well below the upper 30 cm of soil. Following on these recent results, additional advantages of resampling soil profiles can be emphasized. One of the most powerful applications afforded by resampling archived soils is the use of the pulse label of radiocarbon injected into the atmosphere by thermonuclear weapons testing circa 1963 as a tracer of soil carbon dynamics. This approach allows estimation of the proportion of soil C that is `passive' or `inert' and therefore unlikely to respond to global change. Evaluation of resampled soil horizons in a New Zealand soil chronosequence confirms that the approach yields consistent values for the proportion of `passive' soil C, reaching 25% of surface horizon soil C over 12,000 years. Across whole profiles, radiocarbon data suggest that the proportion of `passive' C in New Zealand grassland soil can be less than 40% of total soil C. Below 30 cm
DEFF Research Database (Denmark)
Awan, Mehmood-Ur-Rehman; Le Moullec, Yannick; Koch, Peter;
2012-01-01
, and power optimization for field programmable gate array (FPGA) based architectures in an M -path polyphase filter bank with modified N -path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones...... that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M data-load’s time period. We present a load...... of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within N data-load time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors...
High Bootstrap Current Fraction during the Synergy of LHCD and IBW on the HT-7 Tokamak
Institute of Scientific and Technical Information of China (English)
ZHANG Xian-Mei; WAN Bao-Nian; WU Zhen-Wei; HT-7 Team
2005-01-01
More than 70% of the total plasma current is sustained by the bootstrap current and current drive during the synergy of lower hybrid current drive (LHCD) and ion Bernstein wave (IBW) heating on the HT-7 tokamak. The lower hybrid non-inductive current source is off-axis and well localized, and a plasma with more than 35% bootstrap current has been obtained. IBW control of the electron pressure profile can be integrated into the LHCD target plasma. The steepest gradient of the electron pressure profile, in the region ρ ≈ 0.5-0.7, comes mostly from the electron temperature profile, which may induce the large bootstrap current fraction. The large off-axis bootstrap current can help to create negative magnetic shear, and good plasma confinement is achieved.
Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, M.W.; Buuren, S. van; Knol, D.L.; Mechelen, W. van; Vet, H.C.W. de
2007-01-01
Background. Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection.
BOOTSTRAP TECHNIQUE FOR ROC ANALYSIS: A STABLE EVALUATION OF FISHER CLASSIFIER PERFORMANCE
Institute of Scientific and Technical Information of China (English)
Xie Jigang; Liu Zhengding
2007-01-01
This paper presents a novel bootstrap-based method for Receiver Operating Characteristic (ROC) analysis of the Fisher classifier. By defining the Fisher classifier's output as a statistic, the bootstrap technique is used to obtain the sampling distributions of the outputs for the positive class and the negative class respectively. As a result, the ROC curve is a plot of all the (False Positive Rate (FPR), True Positive Rate (TPR)) pairs obtained by varying the decision threshold over the whole range of the bootstrap sampling distributions. The advantage of this method is that the bootstrap-based ROC curves are much more stable than those of the holdout or cross-validation approaches, indicating a more stable ROC analysis of the Fisher classifier. Experiments on five publicly available data sets demonstrate the effectiveness of the proposed method.
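A sketch of the procedure, assuming two Gaussian classes and a hand-rolled Fisher discriminant. The function names, the bootstrap replicate count and the quantile-based threshold grid are illustrative choices, not the paper's exact recipe.

```python
import numpy as np

def fisher_direction(X_pos, X_neg):
    """Fisher discriminant direction w = Sw^{-1} (m_pos - m_neg)."""
    Sw = (np.cov(X_pos.T) * (len(X_pos) - 1)
          + np.cov(X_neg.T) * (len(X_neg) - 1))
    return np.linalg.solve(Sw, X_pos.mean(0) - X_neg.mean(0))

def bootstrap_roc(X_pos, X_neg, n_boot=200, n_thresh=101, rng=None):
    """(FPR, TPR) pairs computed from bootstrap sampling distributions
    of the Fisher classifier's scalar output for each class."""
    rng = np.random.default_rng() if rng is None else rng
    w = fisher_direction(X_pos, X_neg)
    s_pos, s_neg = X_pos @ w, X_neg @ w
    # Bootstrap-resample the per-class score distributions
    bp = rng.choice(s_pos, size=(n_boot, len(s_pos))).ravel()
    bn = rng.choice(s_neg, size=(n_boot, len(s_neg))).ravel()
    thresholds = np.quantile(np.concatenate([bp, bn]),
                             np.linspace(0.0, 1.0, n_thresh))
    tpr = np.array([(bp >= t).mean() for t in thresholds])
    fpr = np.array([(bn >= t).mean() for t in thresholds])
    return fpr, tpr
```

Sweeping the threshold over the pooled bootstrap distributions, rather than over a single held-out score set, is what smooths the resulting ROC curve.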
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping results from the integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping differs from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood.
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to surveillance in criminology and epidemiology. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
Predicting disease risk using bootstrap ranking and classification algorithms.
Manor, Ohad; Segal, Eran
2013-01-01
Genome-wide association studies (GWAS) are widely used to search for genetic loci that underlie human disease. Another goal is to predict disease risk for different individuals given their genetic sequence. Such predictions could either be used as a "black box" in order to promote changes in life-style and screening for early diagnosis, or as a model that can be studied to better understand the mechanism of the disease. Current methods for risk prediction typically rank single nucleotide polymorphisms (SNPs) by the p-value of their association with the disease, and use the top-associated SNPs as input to a classification algorithm. However, the predictive power of such methods is relatively poor. To improve the predictive power, we devised BootRank, which uses bootstrapping in order to obtain a robust prioritization of SNPs for use in predictive models. We show that BootRank improves the ability to predict disease risk of unseen individuals in the Wellcome Trust Case Control Consortium (WTCCC) data and results in a more robust set of SNPs and a larger number of enriched pathways being associated with the different diseases. Finally, we show that combining BootRank with seven different classification algorithms improves performance compared to previous studies that used the WTCCC data. Notably, diseases for which BootRank results in the largest improvements were recently shown to have more heritability than previously thought, likely due to contributions from variants with low minimum allele frequency (MAF), suggesting that BootRank can be beneficial in cases where SNPs affecting the disease are poorly tagged or have low MAF. Overall, our results show that improving disease risk prediction from genotypic information may be a tangible goal, with potential implications for personalized disease screening and treatment.
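The bootstrap-ranking idea might be sketched generically as follows, using absolute correlation as a stand-in association score rather than the GWAS association p-values used by BootRank. All names are hypothetical and the rank-aggregation rule is a simple choice for illustration.

```python
import numpy as np

def bootstrap_rank(X, y, n_boot=100, rng=None):
    """Aggregate a feature ranking over bootstrap replicates: in each
    replicate, score every column of X by |correlation| with y, rank
    the scores, and accumulate the ranks. Returns feature indices,
    strongest association first."""
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    rank_sum = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)             # bootstrap sample
        Xb, yb = X[idx], y[idx]
        score = np.abs([np.corrcoef(Xb[:, j], yb)[0, 1] for j in range(p)])
        rank_sum += score.argsort().argsort()        # rank 0 = weakest feature
    return np.argsort(-rank_sum)
```

Averaging ranks over replicates damps the sampling noise that makes a single full-data ranking unstable, which is the robustness property the abstract attributes to BootRank.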
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
Reynaud-Bouret, Patricia; Laurent, Béatrice
2012-01-01
Considering two independent Poisson processes, we address the question of testing the equality of their respective intensities. We construct multiple testing procedures from the aggregation of single tests whose testing statistics come from model selection, thresholding and/or kernel estimation methods. The corresponding critical values are computed through a non-asymptotic wild bootstrap approach. The obtained tests are proved to be exactly of level $\\alpha$ and to satisfy non-asymptotic oracle-type inequalities. From these oracle-type inequalities, we deduce that our tests are adaptive in the minimax sense over a large variety of classes of alternatives based on classical and weak Besov bodies in the univariate case, but also Sobolev and anisotropic Nikol'skii-Besov balls in the multivariate case. A simulation study furthermore shows that they perform strongly in practice.
DEFF Research Database (Denmark)
Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.
2012-01-01
A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable to all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach can be applied to Multiple Factor Analysis...... results regardless of the origin of the data and the nature of the original variables. The approach is suitable for getting an overview of product confidence intervals and is also applicable to data obtained from ‘one repetition’ evaluations. Furthermore, it is a convenient way to get an overview of variations...... in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results....
Fast Generation of Ensembles of Cosmological N-Body Simulations via Mode-Resampling
Energy Technology Data Exchange (ETDEWEB)
Schneider, M D; Cole, S; Frenk, C S; Szapudi, I
2011-02-14
We present an algorithm for quickly generating multiple realizations of N-body simulations to be used, for example, for cosmological parameter estimation from surveys of large-scale structure. Our algorithm uses a new method to resample the large-scale (Gaussian-distributed) Fourier modes in a periodic N-body simulation box in a manner that properly accounts for the nonlinear mode-coupling between large and small scales. We find that our method for adding new large-scale mode realizations recovers the nonlinear power spectrum to sub-percent accuracy on scales larger than about half the Nyquist frequency of the simulation box. Using 20 N-body simulations, we obtain a power spectrum covariance matrix estimate that matches the estimator from Takahashi et al. (from 5000 simulations) with < 20% errors in all matrix elements. Comparing the rates of convergence, we determine that our algorithm requires ~8 times fewer simulations to achieve a given error tolerance in estimates of the power spectrum covariance matrix. The degree of success of our algorithm indicates that we understand the main physical processes that give rise to the correlations in the matter power spectrum. Namely, the large-scale Fourier modes modulate both the degree of structure growth through the variation in the effective local matter density and also the spatial frequency of small-scale perturbations through large-scale displacements. We expect our algorithm to be useful for noise modeling when constraining cosmological parameters from weak lensing (cosmic shear) and galaxy surveys, rescaling summary statistics of N-body simulations for new cosmological parameter values, and any applications where the influence of Fourier modes larger than the simulation size must be accounted for.
Resampling method for applying density-dependent habitat selection theory to wildlife surveys.
Directory of Open Access Journals (Sweden)
Olivia Tardy
Full Text Available Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists of randomly placing blocks over the survey area and dividing those blocks into two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is repeated 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection.
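The block-resampling step described above can be sketched as follows. The landscape, animal locations, and block size are all hypothetical, and the sketch reproduces only the random placement and sub-block counting, not the isodar regressions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical survey: animal locations on a 100 x 100 landscape (x, y columns).
animals = rng.uniform(0, 100, size=(500, 2))

block = 20  # block side length; each block splits into two equal sub-blocks

pairs = []
for _ in range(100):  # the method places blocks 100 times
    x0 = rng.uniform(0, 100 - block)
    y0 = rng.uniform(0, 100 - block)
    in_block = ((animals[:, 0] >= x0) & (animals[:, 0] < x0 + block) &
                (animals[:, 1] >= y0) & (animals[:, 1] < y0 + block))
    # Split the block into two adjacent sub-blocks of equal size along x.
    left = in_block & (animals[:, 0] < x0 + block / 2)
    pairs.append((left.sum(), in_block.sum() - left.sum()))

# Abundance in each sub-block, one row per resampled block; these pairs would
# then be related to habitat differences to fit an isodar.
pairs = np.array(pairs)
print(pairs.shape)
```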
Directory of Open Access Journals (Sweden)
Yongbin Liu
2017-01-01
Full Text Available Envelope spectrum analysis is a simple, effective, and classic method for bearing fault identification. However, in a wayside acoustic health monitoring system, owing to the high relative speed between the railway vehicle and the wayside-mounted microphone, the recorded signal is embedded with the Doppler effect, which introduces a shift and expansion of the bearing fault characteristic frequency (FCF). Moreover, the background noise is relatively heavy, which makes it difficult to identify the FCF. To solve these two problems, this study introduces solutions for the wayside acoustic fault diagnosis of train bearings based on Doppler effect reduction using the improved time-domain interpolation resampling (TIR) method and diagnosis-relevant information enhancement using the Weighted-Correlation-Coefficient-Guided Stochastic Resonance (WCCSR) method. First, the traditional TIR method is improved by incorporating kinematic parameter estimation based on time-frequency analysis and curve fitting. Based on the estimated parameters, the Doppler effect is easily removed using TIR. Second, WCCSR is employed to enhance the diagnosis-relevant periodic signal component in the obtained Doppler-free signal. Finally, building on the above two procedures, the local fault is identified using envelope spectrum analysis. Simulated and experimental cases have verified the effectiveness of the proposed method.
Alfaro, Michael E; Zoller, Stefan; Lutzoni, François
2003-02-01
Bayesian Markov chain Monte Carlo sampling has become increasingly popular in phylogenetics as a method for both estimating the maximum likelihood topology and for assessing nodal confidence. Despite the growing use of posterior probabilities, the relationship between the Bayesian measure of confidence and the most commonly used confidence measure in phylogenetics, the nonparametric bootstrap proportion, is poorly understood. We used computer simulation to investigate the behavior of three phylogenetic confidence methods: Bayesian posterior probabilities calculated via Markov chain Monte Carlo sampling (BMCMC-PP), maximum likelihood bootstrap proportion (ML-BP), and maximum parsimony bootstrap proportion (MP-BP). We simulated the evolution of DNA sequences on 17-taxon topologies under 18 evolutionary scenarios, examined the performance of these methods in assigning confidence to correct and incorrect monophyletic groups, and examined the effects of increasing character number on support values. BMCMC-PP and ML-BP were often strongly correlated with one another but could provide substantially different estimates of support on short internodes. In contrast, BMCMC-PP correlated poorly with MP-BP across most of the simulation conditions that we examined. For a given threshold value, more correct monophyletic groups were supported by BMCMC-PP than by either ML-BP or MP-BP. When threshold values were chosen that fixed the rate of accepting incorrect monophyletic relationships as true at 5%, all three methods recovered most of the correct relationships on the simulated topologies, although BMCMC-PP and ML-BP performed better than MP-BP. BMCMC-PP was usually a less biased predictor of phylogenetic accuracy than either bootstrapping method. BMCMC-PP provided high support values for correct topological bipartitions with fewer characters than was needed for the nonparametric bootstrap.
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
Energy Technology Data Exchange (ETDEWEB)
Silva, Cleomacio Miguel da; Amaral, Romilton dos Santos; Santos Junior, Jose Araujo dos; Vieira, Jose Wilson; Leoterio, Dilmo Marques da Silva [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Radioecologia (RAE)], E-mail: cleomaciomiguel@yahoo.com.br; Amaral, Ademir [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Estudos em Radioprotecao e Radioecologia
2007-07-01
The distribution of natural radionuclides in samples from typically anomalous environments generally shows great asymmetry as a result of outliers. To diminish statistical fluctuation, researchers in radioecology commonly use the geometric mean or the median, since the average is not stable under the effect of outliers. As the median is not affected by anomalous values, this parameter of central tendency is the most frequently employed for evaluation of a set of data containing discrepant values. On the other hand, Efron presented a non-parametric method, the so-called bootstrap, that can be used to decrease the dispersion around the central-tendency value. Generally, in radioecology, statistical procedures are used to reduce the effect of anomalous values on averages. In this context, the present study had as an objective to evaluate the application of the non-parametric bootstrap method (BM) for determining the average concentration of ²²⁶Ra in forage palms (Opuntia spp.) cultivated in soils with uranium anomaly on dairy milk farms located in the cities of Pedra and Venturosa, Pernambuco, Brazil, as well as to discuss the utilization of this method in radioecology. The results of ²²⁶Ra in samples of forage palm varied from 1,300 to 25,000 mBq·kg⁻¹ (dry matter), with an arithmetic average of 5,965.86 ± 5,903.05 mBq·kg⁻¹. The result obtained for this average using the BM was 5,963.82 ± 1,202.96 mBq·kg⁻¹ (dry matter). The use of the BM allowed an automatic filtration of the experimental data, without the elimination of outliers, leading to the reduction of dispersion around the average. As a result, the BM permitted reaching an arithmetic average that is stable against the effects of the outliers. (author)
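The dispersion-reducing effect of the bootstrap on an outlier-contaminated mean can be illustrated with a minimal sketch; the concentrations below are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical skewed concentrations (mBq/kg dry matter) with a few outliers,
# mimicking the kind of anomalous data described in the abstract.
data = np.concatenate([rng.normal(4000, 800, 45), [18000, 22000, 25000]])

# Resample the data with replacement and average each resample.
B = 2000
boot_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                       for _ in range(B)])

# The bootstrap mean stays close to the sample mean, but the spread of the
# bootstrap means is far smaller than the outlier-inflated raw dispersion.
print(round(data.mean(), 1), round(boot_means.mean(), 1), round(boot_means.std(), 1))
```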
Baisden, W. T.; Parfitt, R. L.; Schipper, L. A.; Filley, T. R.; Ross, C.
2008-12-01
A New Zealand data set of archived and resampled pasture soil profiles has identified a systematic pattern of large soil C and N losses and gains that appear to be related to land-use intensity. We use isotope and organic geochemistry techniques in selected archived and resampled soil horizons to identify reasons for the observed large soil C and N losses and gains in intensive flat non-allophanic pasture and hill country soil profiles, respectively. These techniques allow us to examine 3 of the ~10 hypotheses proposed to explain the large losses initially observed in intensive pasture soils. These three hypotheses are: (1) soil C and N changes may be due to erosion and deposition; (2) pre-European forest-derived organic matter is being lost; and (3) changes in litter quality are reducing the amount of plant C and N stabilized in soil. To test hypothesis (1), we use 137Cs, accumulated in the soil clay fraction from nuclear fallout between 1945 and 1965. Measurements comparing archived (post-1965) and resampled horizons show losses or gains of 137Cs, which we interpret as erosion and deposition, respectively. Apparent wind erosion of up to ~6 cm of surface soil explains large surface soil C losses in 2 flat profiles, while apparent deposition explains soil C gains in two hill country profiles. Measurements of 14C assist in the evaluation of hypothesis (2) by suggesting that, after accounting for 137Cs-estimated erosion or deposition, surface soils are mainly losing C fixed since bomb 14C was injected into the atmosphere (post-1950). In contrast, soil C losses below 40 cm depth are dominated by C derived from pre-European forests. Biomarker compounds, particularly lignin-derivatives, allow us to evaluate hypotheses (2) and (3). Results to date suggest that failure to stabilize grass-derived C is more important than losses of forest-derived C in explaining soil C losses in the upper 30 cm. More broadly, biomarker and 137Cs measurements suggest that steady
de la Cruz, Rolando; Fuentes, Claudio; Meza, Cristian; Núñez-Antón, Vicente
2016-07-08
Consider longitudinal observations across different subjects such that the underlying distribution is determined by a non-linear mixed-effects model. In this context, we look at the misclassification error rate for allocating future subjects using cross-validation, bootstrap algorithms (parametric bootstrap, leave-one-out, .632 and .632+), and bootstrap cross-validation (which combines the first two approaches), and conduct a numerical study to compare the performance of the different methods. The simulation and comparisons in this study are motivated by real observations from a pregnancy study in which one of the main objectives is to predict normal versus abnormal pregnancy outcomes based on information gathered at early stages. Since in this type of study it is not uncommon to have insufficient data to simultaneously solve the classification problem and estimate the misclassification error rate, we pay special attention to situations where only a small sample size is available. We discuss how the misclassification error rate estimates may be affected by the sample size in terms of variability and bias, and examine conditions under which the misclassification error rate estimates perform reasonably well.
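A minimal sketch of the ordinary bootstrap estimate of a misclassification error rate, using a toy one-dimensional classifier instead of the paper's non-linear mixed-effects setting; the data and the `nearest_mean_*` helpers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two hypothetical classes of one-dimensional subject summaries.
X = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(1.5, 1.0, 30)])
y = np.concatenate([np.zeros(30), np.ones(30)])

def nearest_mean_fit(X, y):
    """Toy classifier: remember the two class means."""
    return X[y == 0].mean(), X[y == 1].mean()

def nearest_mean_predict(model, X):
    """Assign each point to the class with the nearer mean."""
    m0, m1 = model
    return (np.abs(X - m1) < np.abs(X - m0)).astype(float)

# Ordinary bootstrap estimate of the misclassification error rate:
# train on a resample, test on the observations left out of that resample.
B = 200
errs = []
for _ in range(B):
    idx = rng.integers(0, X.size, X.size)
    oob = np.setdiff1d(np.arange(X.size), idx)
    if oob.size == 0:
        continue
    model = nearest_mean_fit(X[idx], y[idx])
    errs.append(np.mean(nearest_mean_predict(model, X[oob]) != y[oob]))

err_boot = float(np.mean(errs))
print(round(err_boot, 3))
```

The .632 and .632+ variants reweight this out-of-bag error against the resubstitution error to reduce its pessimistic bias.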
The economics of bootstrapping space industries - Development of an analytic computer model
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
Bootstrap confidence intervals in a complex situation: A sequential paired clinical trial
Energy Technology Data Exchange (ETDEWEB)
Morton, S.C.
1988-06-01
This paper considers the problem of determining a confidence interval for the difference between two treatments in a simplified sequential paired clinical trial, which is analogous to setting an interval for the drift of a random walk subject to a parabolic stopping boundary. Three bootstrap methods of construction are applied: Efron's accelerated bias-corrected, the DiCiccio-Romano, and the bootstrap-t. The results are compared with a theoretical approximate interval due to Siegmund. Difficulties inherent in the use of these bootstrap methods in complex situations are illustrated. The DiCiccio-Romano method is shown to be the easiest to apply and to work well. 13 refs.
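The bootstrap-t construction mentioned above can be sketched in a plain, non-sequential setting; the paired treatment differences below are simulated, and the trial's stopping boundary is ignored.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical paired treatment differences (the trial's drift analogue).
d = rng.normal(0.5, 1.0, 40)

n, B = d.size, 2000
theta_hat = d.mean()
se_hat = d.std(ddof=1) / np.sqrt(n)

# Bootstrap-t: studentize each resample mean with its own standard error,
# so the interval adapts to the shape of the sampling distribution.
t_star = np.empty(B)
for b in range(B):
    s = rng.choice(d, n, replace=True)
    t_star[b] = (s.mean() - theta_hat) / (s.std(ddof=1) / np.sqrt(n))

lo, hi = np.percentile(t_star, [2.5, 97.5])
ci = (theta_hat - hi * se_hat, theta_hat - lo * se_hat)
print(tuple(round(c, 3) for c in ci))
```

Note the percentiles enter with reversed roles, a feature of the studentized construction.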
Closure of the Operator Product Expansion in the Non-Unitary Bootstrap
Esterlis, Ilya; Ramirez, David
2016-01-01
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Closure of the operator product expansion in the non-unitary bootstrap
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-01
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method.
The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Improving Web Learning through model Optimization using Bootstrap for a Tour-Guide Robot
Directory of Open Access Journals (Sweden)
Rafael León
2012-09-01
Full Text Available We perform a review of Web Mining techniques and we describe a Bootstrap Statistics methodology applied to pattern model classifier optimization and verification for Supervised Learning for Tour-Guide Robot knowledge repository management. It is virtually impossible to test Web page classifiers and many other Internet applications thoroughly with pure empirical data, due to the need for human intervention to generate training sets and test sets. We propose using the computer-based Bootstrap paradigm to design a test environment in which classifiers are checked with better reliability.
A New Regime for Studying the High Bootstrap Current Fraction Plasma
Institute of Scientific and Technical Information of China (English)
A. Isayama; Y. Kamada; K. Ushigusa; T. Fujita; T. Suzuki; X. Gao
2001-01-01
A new experimental regime has recently been studied for achieving a high fraction of the bootstrap current in JT-60U hydrogen discharges. The high poloidal beta (βp ≈ 3.61) plasma was obtained by high-power neutral beam injection heating in a very high edge safety factor (Ip = 0.3 MA, Bt = 3.65 T, qeff = 25-35) region, and the bootstrap current fraction (fBS) was correspondingly about 40% according to the ACCOME code calculation. It was observed that there were no magnetohydrodynamic instabilities to retard the increase of the βp and fBS parameters in the new regime.
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
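A rough sketch of testing an association between two discrete event series by comparison against resampled event times; the statistic and the data are invented, and the paper's exact bootstrap scheme is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)

# Two hypothetical series of event times; series_a is widely spaced, as the
# abstract requires for the bootstrap to apply.
series_a = np.sort(rng.uniform(0, 1000, 20))
# series_b events cluster near series_a events (an association by design).
series_b = np.sort(series_a[rng.integers(0, 20, 60)] + rng.normal(0, 2, 60))

def mean_nearest_gap(a, b):
    """Mean distance from each event in b to its nearest event in a."""
    return np.mean([np.min(np.abs(a - t)) for t in b])

t_obs = mean_nearest_gap(series_a, series_b)

# Reference distribution of the statistic under no association, from
# repeatedly redrawing unassociated (uniform) event times.
B = 1000
t_ref = np.array([mean_nearest_gap(series_a, rng.uniform(0, 1000, 60))
                  for _ in range(B)])
p_value = (1 + np.sum(t_ref <= t_obs)) / (B + 1)
print(round(t_obs, 2), round(p_value, 4))
```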
Excitons in solids with time-dependent density-functional theory: the bootstrap kernel and beyond
Byun, Young-Moo; Yang, Zeng-Hui; Ullrich, Carsten
Time-dependent density-functional theory (TDDFT) is an efficient method to describe the optical properties of solids. Lately, a series of bootstrap-type exchange-correlation (xc) kernels have been reported to produce accurate excitons in solids, but different bootstrap-type kernels exist in the literature, with mixed results. In this presentation, we reveal the origin of the confusion and show a new empirical TDDFT xc kernel to compute excitonic properties of semiconductors and insulators efficiently and accurately. Our method can be used for high-throughput screening calculations and large unit cell calculations. Work supported by NSF Grant DMR-1408904.
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Directory of Open Access Journals (Sweden)
Thawatchai Onjun
2012-02-01
Full Text Available The investigation of bootstrap current formation in ITER is carried out using BALDUR integrated predictive modelingcode. The combination of Mixed B/gB anomalous transport model and NLCASS module together with the pedestal model isused in BALDUR code to simulate the time evolution of temperature, density, and plasma current profiles. It was found inthe simulations that without the presence of ITB, a minimal fraction of bootstrap current (as well as low fusion performancewas achieved. The enhancement due to ITB depends sensitively on the strength of toroidal velocity. A sensitivity study wasalso carried out to optimize the bootstrap current fraction and plasma performance. It was found that the bootstrap currentfraction slightly improved; while the plasma performance greatly improved with increasing of NBI power or pedestal temperature.On the other hand, higher impurity concentration resulted in a significant degradation of fusion performance, buta smaller degradation in bootstrap current.
The use of vector bootstrapping to improve variable selection precision in Lasso models.
Laurin, Charles; Boomsma, Dorret; Lubke, Gitta
2016-08-01
The Lasso is a shrinkage regression method that is widely used for variable selection in statistical genetics. Commonly, K-fold cross-validation is used to fit a Lasso model. This is sometimes followed by using bootstrap confidence intervals to improve precision in the resulting variable selections. Nesting cross-validation within bootstrapping could provide further improvements in precision, but this has not been investigated systematically. We performed simulation studies of Lasso variable selection precision (VSP) with and without nesting cross-validation within bootstrapping. Data were simulated to represent genomic data under a polygenic model as well as under a model with effect sizes representative of typical GWAS results. We compared these approaches to each other as well as to software defaults for the Lasso. Nested cross-validation had the most precise variable selection at small effect sizes. At larger effect sizes, there was no advantage to nesting. We illustrated the nested approach with empirical data comprising SNPs and SNP-SNP interactions from the most significant SNPs in a GWAS of borderline personality symptoms. In the empirical example, we found that the default Lasso selected low-reliability SNPs and interactions which were excluded by bootstrapping.
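Bootstrapping Lasso variable selections can be sketched as follows, using a minimal coordinate-descent Lasso rather than a packaged implementation with nested cross-validation; the data, the fixed penalty `lam`, and the `lasso_cd` helper are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated genotype-like predictors with two true effects (illustrative only).
n, p = 120, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[1] = 1.0, 0.8
y = X @ beta_true + rng.normal(0, 1.0, n)

def lasso_cd(X, y, lam, n_iter=50):
    """Minimal cyclic coordinate-descent Lasso with a fixed penalty."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return beta

# Bootstrap the rows, refit the Lasso on each resample, and record which
# variables are selected; the selection frequency gauges selection precision.
B, lam = 50, 25.0
freq = np.zeros(p)
for _ in range(B):
    idx = rng.integers(0, n, n)
    freq += lasso_cd(X[idx], y[idx], lam) != 0

freq /= B
print(freq.round(2))
```

In the nested approach studied in the paper, the fixed `lam` would instead be chosen by cross-validation inside each bootstrap resample.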
Parametric bootstrap methods for testing multiplicative terms in GGE and AMMI models.
Forkman, Johannes; Piepho, Hans-Peter
2014-09-01
The genotype main effects and genotype-by-environment interaction effects (GGE) model and the additive main effects and multiplicative interaction (AMMI) model are two common models for analysis of genotype-by-environment data. These models are frequently used by agronomists, plant breeders, geneticists and statisticians for analysis of multi-environment trials. In such trials, a set of genotypes, for example, crop cultivars, are compared across a range of environments, for example, locations. The GGE and AMMI models use singular value decomposition to partition genotype-by-environment interaction into an ordered sum of multiplicative terms. This article deals with the problem of testing the significance of these multiplicative terms in order to decide how many terms to retain in the final model. We propose parametric bootstrap methods for this problem. Models with fixed main effects, fixed multiplicative terms and random normally distributed errors are considered. Two methods are derived: a full and a simple parametric bootstrap method. These are compared with the alternatives of using approximate F-tests and cross-validation. In a simulation study based on four multi-environment trials, both bootstrap methods performed well with regard to Type I error rate and power. The simple parametric bootstrap method is particularly easy to use, since it only involves repeated sampling of standard normally distributed values. This method is recommended for selecting the number of multiplicative terms in GGE and AMMI models. The proposed methods can also be used for testing components in principal component analysis.
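The simple parametric bootstrap test for the first multiplicative term can be sketched as follows, assuming a known unit error variance for brevity (the paper's full and simple methods estimate it from the data). The table is simulated, not from a real trial.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical genotype-by-environment table with one real multiplicative
# term plus unit-variance noise.
g, e = 8, 5
table = np.outer(rng.normal(0, 3, g), rng.normal(0, 1, e)) + rng.normal(0, 1, (g, e))

# Double-centre the table, as in AMMI, so only interaction remains.
ic = table - table.mean(0) - table.mean(1)[:, None] + table.mean()

s_obs = np.linalg.svd(ic, compute_uv=False)[0]   # first multiplicative term

# Parametric bootstrap: simulate pure-noise tables, double-centre them, and
# compare their leading singular value with the observed one.
B = 1000
s_null = np.empty(B)
for b in range(B):
    z = rng.normal(0, 1, (g, e))
    zc = z - z.mean(0) - z.mean(1)[:, None] + z.mean()
    s_null[b] = np.linalg.svd(zc, compute_uv=False)[0]

p_value = (1 + np.sum(s_null >= s_obs)) / (B + 1)
print(round(p_value, 3))
```

A small p-value supports retaining the first multiplicative term; the test would then be repeated on the deflated matrix for subsequent terms.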
Bootstrap-estimated land-to-water coefficients from the CBTN_v4 SPARROW model
Ator, Scott; Brakebill, John W.; Blomquist, Joel D.
2017-01-01
This file contains 200 sets of bootstrap-estimated land-to-water coefficients from the CBTN_v4 SPARROW model, which is documented in USGS Scientific Investigations Report 2011-5167. The coefficients were produced as part of CBTN_v4 model calibration to provide information about the uncertainty in model estimates.
Giving the Boot to the Bootstrap: How Not to Learn the Natural Numbers
Rips, Lance J.; Asmuth, Jennifer; Bloomfield, Amber
2006-01-01
According to one theory about how children learn the concept of natural numbers, they first determine that "one", "two", and "three" denote the size of sets containing the relevant number of items. They then make the following inductive inference (the Bootstrap): The next number word in the counting series denotes the size of the sets you get by…
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors
Random resampling masks: a non-Bayesian one-shot strategy for noise reduction in digital holography.
Bianco, V; Paturzo, M; Memmolo, P; Finizio, A; Ferraro, P; Javidi, B
2013-03-01
Holographic imaging may become severely degraded by a mixture of speckle and incoherent additive noise. Bayesian approaches reduce the incoherent noise, but prior information is needed on the noise statistics. With no prior knowledge, one-shot reduction of noise is a highly desirable goal, as the recording process is simplified and made faster. Indeed, neither multiple acquisitions nor a complex setup are needed. So far, this result has been achieved at the cost of a deterministic resolution loss. Here we propose a fast non-Bayesian denoising method that avoids this trade-off by means of a numerical synthesis of a moving diffuser. In this way, only one single hologram is required as multiple uncorrelated reconstructions are provided by random complementary resampling masks. Experiments show a significant incoherent noise reduction, close to the theoretical improvement bound, resulting in image-contrast improvement. At the same time, we preserve the resolution of the unprocessed image.
Institute of Scientific and Technical Information of China (English)
2012-01-01
In this paper, we describe resource-efficient hardware architectures for software-defined radio (SDR) front ends. These architectures are made efficient by using a polyphase channelizer that performs arbitrary sample rate changes, frequency selection, and bandwidth control. We discuss area, time, and power optimization for field programmable gate array (FPGA) based architectures in an M-path polyphase filter bank with a modified N-path polyphase filter. Such systems allow resampling by arbitrary ratios while simultaneously performing baseband aliasing from center frequencies at Nyquist zones that are not multiples of the output sample rate. A non-maximally decimated polyphase filter bank, where the number of data loads is not equal to the number of M subfilters, processes M subfilters in a time period that is either less than or greater than the M-data-load time period. We present a load-process architecture (LPA) and a runtime architecture (RA) (based on a serial polyphase structure) which have different scheduling. In LPA, N subfilters are loaded, and then M subfilters are processed at a clock rate that is a multiple of the input data rate. This is necessary to meet the output time constraint of the down-sampled data. In RA, M subfilter processes are efficiently scheduled within the N-data-load time while simultaneously loading N subfilters. This requires reduced clock rates compared with LPA, and potentially less power is consumed. A polyphase filter bank that uses different resampling factors for maximally decimated, under-decimated, over-decimated, and combined up- and down-sampled scenarios is used as a case study, and an analysis of area, time, and power for their FPGA architectures is given. For resource-optimized SDR front ends, RA is superior for reducing operating clock rates and dynamic power consumption. RA is also superior for reducing area resources, except when indices are prestored in LUTs.
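The core saving of a polyphase decimator, filtering each of the M phases at the low output rate rather than filtering at the input rate and discarding samples, can be sketched in a few lines. This is a software illustration of the textbook structure only, not the paper's FPGA scheduling; the signal and prototype filter are arbitrary, and the filter is assumed to have at least M taps.

```python
import numpy as np

rng = np.random.default_rng(2)

def polyphase_decimate(x, h, M):
    """Decimate-by-M via M polyphase branches.

    Numerically identical to the direct form np.convolve(x, h)[::M], but each
    branch runs at 1/M of the input rate, which is the resource saving a
    polyphase channelizer exploits.
    """
    n_out = (len(x) + len(h) - 1 + M - 1) // M
    y = np.zeros(n_out)
    for m in range(M):
        hm = h[m::M]                                   # filter phase m
        # Input phase m: samples x[j*M - m]; for m > 0 the j = 0 tap is zero.
        xm = x[0::M] if m == 0 else np.concatenate(([0.0], x[M - m::M]))
        bm = np.convolve(xm, hm)
        y[:len(bm)] += bm[:n_out]
    return y

x = rng.normal(size=200)
h = np.hanning(33)        # a toy lowpass prototype filter
M = 4

direct = np.convolve(x, h)[::M]     # filter at full rate, then discard
poly = polyphase_decimate(x, h, M)  # filter at the output rate
print(np.allclose(poly[:len(direct)], direct))
```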
Bootstrapping on undirected binary networks via statistical mechanics.
Fushing, Hsieh; Chen, Chen; Liu, Shan-Yu; Koehl, Patrice
2014-09-01
We propose a new method inspired by statistical mechanics for extracting geometric information from undirected binary networks and generating random networks that conform to this geometry. In this method an undirected binary network is perceived as a thermodynamic system with a collection of permuted adjacency matrices as its states. The task of extracting information from the network is then reformulated as a discrete combinatorial optimization problem of searching for its ground state. To solve this problem, we apply multiple ensembles of temperature regulated Markov chains to establish an ultrametric geometry on the network. This geometry is equipped with a tree hierarchy that captures the multiscale community structure of the network. We translate this geometry into a Parisi adjacency matrix, which has a relatively low energy level and is in the vicinity of the ground state. The Parisi adjacency matrix is then further optimized by making block permutations subject to the ultrametric geometry. The optimal matrix corresponds to the macrostate of the original network. An ensemble of random networks is then generated such that each of these networks conforms to this macrostate; the corresponding algorithm also provides an estimate of the size of this ensemble. By repeating this procedure at different scales of the ultrametric geometry of the network, it is possible to compute its evolution entropy, i.e. to estimate the evolution of its complexity as we move from a coarse to a fine description of its geometric structure. We demonstrate the performance of this method on simulated as well as real data networks.
Nortey, Ezekiel N N; Ansah-Narh, Theophilus; Asah-Asante, Richard; Minkah, Richard
2015-01-01
Although there exists numerous literature on procedures for forecasting or predicting election results, in Ghana only opinion poll strategies have been used. To fill this gap, this paper develops Markov chain models for forecasting the 2016 presidential election results at the Regional, Zonal (i.e. Savannah, Coastal and Forest) and National levels using past presidential election results of Ghana. The methodology develops a model for prediction of the 2016 presidential election results in Ghana using the Markov chain Monte Carlo (MCMC) methodology with bootstrap estimates. The results were that the ruling NDC may marginally win the 2016 Presidential Elections but would not obtain more than 50 % of the votes to be declared an outright winner. This means that there is going to be a run-off election between the two giant political parties: the ruling NDC and the major opposition party, the NPP. The prediction for the 2016 Presidential run-off election between the NDC and the NPP was rather in favour of the major opposition party, the NPP, with a little over 50 % of the votes.
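The forecasting idea can be sketched with a toy two-party Markov chain whose transition probabilities are bootstrapped from hypothetical historical transition counts; all numbers below are illustrative, not the paper's estimates:

```python
import numpy as np

# Hypothetical transition matrix between outcomes (NDC win, NPP win);
# values are illustrative only.
P = np.array([[0.6, 0.4],
              [0.3, 0.7]])

rng = np.random.default_rng(0)
state = np.array([1.0, 0.0])           # last observed outcome: NDC win
forecast = state @ P                   # one-step-ahead outcome probabilities

# Bootstrap the (hypothetical) transition counts to attach uncertainty
# to the forecast probability of an NDC win.
counts = np.array([[6, 4], [3, 7]])
boots = []
for _ in range(1000):
    P_b = np.vstack([
        rng.multinomial(row.sum(), row / row.sum()) / row.sum()
        for row in counts
    ])
    boots.append((state @ P_b)[0])
lo, hi = np.percentile(boots, [2.5, 97.5])
```

The bootstrap interval (lo, hi) quantifies how much the forecast would move under resampled election histories.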
Shi, Juanjuan; Liang, Ming; Guan, Yunpeng
2016-02-01
The conventional procedure for bearing fault diagnosis under variable rotational speed generally includes prefiltering, resampling based on the shaft rotating frequency, and order spectrum analysis. However, its application is confined by three major obstacles: (a) knowledge-demanding parameter determination required by prefiltering, (b) unavailability of the shaft rotating frequency for resampling, as it is coupled with the instantaneous fault characteristic frequency (IFCF) by a fault characteristic coefficient (FCC) which cannot be determined without knowing what fault actually exists, and (c) a complicated and error-prone resampling process. As such, we propose a new method to address these problems. The proposed method, free from prefiltering and resampling, mainly contains the following steps: (a) extracting the envelope by windowed fractal dimension (FD) transform, requiring no prefiltering; (b) performing short time Fourier transform (STFT) on the envelope signal to get a clear time-frequency representation (TFR), from which the IFCF and the basic demodulator for generalized demodulation (GD) can be obtained; (c) applying generalized demodulation to the envelope signal with the current demodulator, converting the trajectory of the current time-frequency component into a linear path parallel to the time axis; (d) frequency-analyzing the demodulated signal, followed by searching the amplitude at the constant frequency where the linear path is situated. Updating the demodulator by multiplying the basic demodulator by different real numbers (i.e., coefficient λ) and repeating steps (c)-(d), the resampling-free order spectrum is then obtained. Based on the resulting spectrum, the final diagnosis decision can be made. The implementation of the proposed method is illustrated on simulated data. Finally, experimental data are employed to validate the effectiveness of the proposed technique.
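The generalized-demodulation step (c) can be illustrated on a synthetic envelope whose instantaneous frequency rises linearly; the parameters are invented for this sketch, which only shows how multiplying by the conjugate phase trend turns a time-varying component into a constant-frequency line:

```python
import numpy as np

fs = 1000
t = np.arange(0, 2, 1 / fs)
ifcf = 20 + 10 * t                       # made-up linearly rising fault frequency (Hz)
phase = 2 * np.pi * np.cumsum(ifcf) / fs # numerical phase integral
env = np.cos(phase)                      # envelope with time-varying frequency

# Generalized demodulation: multiply by the conjugate of the phase trend,
# shifted so the demodulated component sits at a constant 20 Hz.
demod = env * np.exp(-1j * (phase - 2 * np.pi * 20 * t))
spectrum = np.abs(np.fft.fft(demod))
peak_hz = np.argmax(spectrum) * fs / len(demod)   # concentrates near 20 Hz
```

After demodulation, the chirp's energy collapses onto a single spectral bin, which is what makes the resampling-free order analysis possible.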
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, even where the theories are available for a specific indicator/parameter, they are based on assumptions that do not always hold in practice. The aim of this thesis was to illustrate the concept of bootstrap-based confidence estimation in PCA and MSPC. It particularly shows how to build bootstrap-based confidence limits in these areas to be used as alternatives to the traditional/asymptotic limits. The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. The bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus…
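As a rough illustration of the idea (not the thesis' own algorithm), a bootstrap-based control limit for the Hotelling T² statistic in PCA can be built by refitting the model on resampled training data and taking percentiles of the replicate limits; the data here are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))            # simulated in-control training data

def t2_limit(X, n_comp=2, alpha=0.95):
    # 95th percentile of Hotelling T^2 in the PCA score space.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_comp].T
    var = scores.var(axis=0, ddof=1)
    t2 = ((scores ** 2) / var).sum(axis=1)
    return np.quantile(t2, alpha)

# Bootstrap the limit: refit PCA on each resampled data set.
limits = [t2_limit(X[rng.integers(0, len(X), len(X))]) for _ in range(500)]
limit_lo, limit_hi = np.percentile(limits, [2.5, 97.5])
```

The spread (limit_lo, limit_hi) shows how sensitive the control limit is to the particular training sample, which is exactly the uncertainty the asymptotic limit hides.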
Langel, Steven E.; Khanafseh, Samer M.; Pervan, Boris
2016-06-01
Differential carrier phase applications that utilize cycle resolution need the probability density function of the baseline estimate to quantify its region of concentration. For the integer bootstrap estimator, the density function has an analytical definition that enables probability calculations given perfect statistical knowledge of measurement and process noise. This paper derives a method to upper bound the tail probability of the integer bootstrapped GNSS baseline when the measurement and process noise correlation functions are unknown, but can be upper and lower bounded. The tail probability is shown to be a non-convex function of a vector of conditional variances, whose feasible region is a convex polytope. We show how to solve the non-convex optimization problem globally by discretizing the polytope into small hyper-rectangular elements, and demonstrate the method for a static baseline estimation problem.
A Simple Voltage Controlled Oscillator Using Bootstrap Circuits and NOR-RS Flip Flop
Chaikla, Amphawan; Pongswatd, Sawai; Sasaki, Hirofumi; Fujimoto, Kuniaki; Yahara, Mitsutoshi
This paper presents a simple and successful design for a voltage controlled oscillator. The proposed circuit is based on the use of two identical bootstrap circuits and a NOR-RS Flip Flop to generate wide-tunable sawtooth and square waves. Increasing control voltage linearly increases the output oscillation frequency. Experimental results verifying the performances of the proposed circuit are in agreement with the calculated values.
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio
2014-01-01
Industrial applications of computed tomography (CT) for dimensional metrology on various components are fast increasing, owing to a number of favorable properties such as the capability of non-destructive internal measurements. Uncertainty evaluation is however more complex than in conventional … The problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
Economic growth and energy consumption causal nexus viewed through a bootstrap rolling window
Energy Technology Data Exchange (ETDEWEB)
Balcilar, Mehmet [Department of Economics, Eastern Mediterranean University, Famagusta, Turkish Republic of Northern Cyprus, via Mersin 10 (Turkey); Ozdemir, Zeynel Abidin [Department of Economics, Gazi University, Besevler, 06500, Ankara (Turkey); Arslanturk, Yalcin [Gazi University, Teknikokullar, 06500, Ankara (Turkey)
2010-11-15
One puzzling result in the literature on energy consumption-economic growth causality is the variability of results, particularly across sample periods, sample sizes, and model specifications. In order to overcome these issues, this paper analyzes the causal links between energy consumption and economic growth for G-7 countries using bootstrap Granger non-causality tests with fixed-size rolling subsamples. The data used include annual total energy consumption and real Gross Domestic Product (GDP) series from 1960 to 2006 for G-7 countries, excluding Germany, for which the sample period starts from 1971. Using the full sample bootstrap Granger causality test, we find that there is predictive power from energy consumption to economic growth only for Canada. However, parameter instability tests show that none of the estimated models have constant parameters and hence the full sample results are not reliable. Analogous to the full sample results, the results obtained from the bootstrap rolling window estimation indicate no consistent causal links between energy consumption and economic growth. We, however, find that causal links are present between the series in various subsamples. Furthermore, these subsample periods correspond to significant economic events, indicating that the findings are not statistical artefacts, but correspond to real economic changes. Our results encompass previous findings and offer an explanation for the varying findings.
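A minimal sketch of a fixed-size rolling-window Granger test (an ordinary F-test rather than the paper's bootstrap version, on simulated stand-in series) looks like:

```python
import numpy as np
from scipy import stats

def granger_pvalue(y, x, lag=1):
    # F-test p-value for "x Granger-causes y" with a single lag (sketch).
    Y = y[lag:]
    restricted = np.column_stack([np.ones(len(Y)), y[:-lag]])
    full = np.column_stack([restricted, x[:-lag]])
    rss = lambda A: ((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2).sum()
    dof = len(Y) - full.shape[1]
    F = (rss(restricted) - rss(full)) / (rss(full) / dof)
    return 1 - stats.f.cdf(F, 1, dof)

rng = np.random.default_rng(2)
n, window = 120, 40                        # annual data, fixed-size subsample
energy = rng.normal(size=n).cumsum()       # simulated stand-in series
gdp = 0.5 * np.concatenate([[0], energy[:-1]]) + rng.normal(size=n)

# Slide the window and collect a p-value per subsample period.
pvals = [granger_pvalue(gdp[s:s + window], energy[s:s + window])
         for s in range(n - window + 1)]
```

Plotting `pvals` against the window end date is what reveals the subsample-specific causal episodes the abstract describes; the bootstrap variant replaces the F distribution with a resampling-based null.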
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
Energy Technology Data Exchange (ETDEWEB)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)
2013-05-15
Surgical resection is the most effective method to recover liver function. However, since most treatment is palliative in unresectable stages of hepatocellular carcinoma (HCC), Yttrium-90 (Y-90) has been used as a new treatment, as it can be delivered to the tumors and results in greater radiation exposure to the tumors than external radiation. Recently, Y-90 has received much interest and been studied by many researchers. Imaging of Y-90 has most commonly been conducted using a gamma camera, but PET imaging is required due to the gamma camera's low sensitivity and resolution. The purpose of this study was to assess statistical characteristics and to improve the count rate of the image, enhancing image quality, by using a non-parametric bootstrap method. PET data were improved using the non-parametric bootstrap method, as verified by improved uniformity and SNR. Uniformity showed more improvement under low count-rate conditions, i.e. Y-90, in the phantom case, and uniformity and SNR showed improvements of 15.6% and 33.8%, respectively, in the mouse case. The bootstrap method performed in this study increased the count rate of the PET image, so the acquisition time can be reduced. It is expected to improve diagnostic performance.
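The event-resampling idea can be sketched as a non-parametric bootstrap over simulated list-mode events; the detector geometry and counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated list-mode events: each row is a (row, col) pixel coordinate.
events = rng.integers(0, 64, size=(5000, 2))

def image_from(ev, shape=(64, 64)):
    # Bin events into an image.
    img, _, _ = np.histogram2d(ev[:, 0], ev[:, 1], bins=shape,
                               range=[[0, 64], [0, 64]])
    return img

# Non-parametric bootstrap: resample events with replacement and average
# the replicate images to stabilize the low-count statistics.
reps = [image_from(events[rng.integers(0, len(events), len(events))])
        for _ in range(50)]
smoothed = np.mean(reps, axis=0)
```

Averaging replicates keeps the total counts fixed while reducing per-pixel variance, which is the mechanism behind the reported uniformity and SNR gains.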
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture.
Simulation of bootstrap current in 2D and 3D ideal magnetic fields in tokamaks
Raghunathan, M.; Graves, J. P.; Cooper, W. A.; Pedro, M.; Sauter, O.
2016-09-01
We aim to simulate the bootstrap current for a MAST-like spherical tokamak using two approaches for magnetic equilibria including externally caused 3D effects such as resonant magnetic perturbations (RMPs), the effect of toroidal ripple, and intrinsic 3D effects such as non-resonant internal kink modes. The first approach relies on known neoclassical coefficients in ideal MHD equilibria, using the Sauter (Sauter et al 1999 Phys. Plasmas 6 2834) expression valid for all collisionalities in axisymmetry, and the second approach being the quasi-analytic Shaing-Callen (Shaing and Callen 1983 Phys. Fluids 26 3315) model in the collisionless regime for 3D. Using the ideal free-boundary magnetohydrodynamic code VMEC, we compute the flux-surface averaged bootstrap current density, with the Sauter and Shaing-Callen expressions for 2D and 3D ideal MHD equilibria including an edge pressure barrier with the application of resonant magnetic perturbations, and equilibria possessing a saturated non-resonant 1/1 internal kink mode with a weak internal pressure barrier. We compare the applicability of the self-consistent iterative model on the 3D applications and discuss the limitations and advantages of each bootstrap current model for each type of equilibrium.
Finding confidence limits on population growth rates: bootstrap and analytic methods.
Picard, Nicolas; Chagneau, Pierrette; Mortier, Frédéric; Bar-Hen, Avner
2009-05-01
When predicting population dynamics, the value of the prediction is not enough and should be accompanied by a confidence interval that integrates the whole chain of errors, from observations to predictions via the estimates of the parameters of the model. Matrix models are often used to predict the dynamics of age- or size-structured populations. Their parameters are vital rates. This study aims (1) at assessing the impact of the variability of observations on vital rates, and then on the model's predictions, and (2) at comparing three methods for computing confidence intervals for values predicted from the models. The first method is the bootstrap. The second method is analytic and approximates the standard error of predictions by their asymptotic variance as the sample size tends to infinity. The third method combines use of the bootstrap to estimate the standard errors of vital rates with the analytical method to then estimate the errors of predictions from the model. Computations are done for a Usher matrix model that predicts the asymptotic (as time goes to infinity) stock recovery rate for three timber species in French Guiana. Little difference is found between the hybrid and the analytic method. Their estimates of bias and standard error converge towards the bootstrap estimates when the error on vital rates becomes small enough, which corresponds in the present case to a number of observations greater than 5000 trees.
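The bootstrap method (the first method) can be sketched on a toy 3-stage matrix model, resampling binary survival observations and recomputing the dominant eigenvalue (the asymptotic growth rate); all vital rates here are hypothetical, not the French Guiana estimates:

```python
import numpy as np

rng = np.random.default_rng(4)

def growth_rate(survival, fecundity):
    # Toy 3-stage Usher-type matrix; its dominant eigenvalue is the
    # asymptotic growth rate.
    A = np.array([[fecundity[0], fecundity[1], fecundity[2]],
                  [survival[0],  0.0,          0.0],
                  [0.0,          survival[1],  survival[2]]])
    return np.max(np.abs(np.linalg.eigvals(A)))

# Hypothetical vital-rate observations: 1 = survived, 0 = died, per stage.
obs = [rng.binomial(1, p, 200) for p in (0.6, 0.7, 0.8)]
fec = (0.0, 1.2, 1.5)                     # fixed fecundities for illustration

# Bootstrap: resample observations, re-estimate survival, recompute lambda.
lams = []
for _ in range(1000):
    surv = [o[rng.integers(0, len(o), len(o))].mean() for o in obs]
    lams.append(growth_rate(surv, fec))
ci = np.percentile(lams, [2.5, 97.5])
```

The percentile interval `ci` propagates observation-level variability all the way to the model prediction, which is what the analytic method approximates asymptotically.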
Stable bootstrap-current driven equilibria for low aspect ratio tokamaks
Energy Technology Data Exchange (ETDEWEB)
Miller, R.L.; Lin-Liu, Y.R.; Turnbull, A.D.; Chan, V.S. [General Atomics, San Diego, CA (United States); Pearlstein, L.D. [Lawrence Livermore National Lab., CA (United States); Sauter, O.; Villard, L. [Ecole Polytechnique Federale, Lausanne (Switzerland). Centre de Recherche en Physique des Plasma (CRPP)
1996-09-01
Low aspect ratio tokamaks can potentially provide a high ratio of plasma pressure to magnetic pressure β and high plasma current I at a modest size, ultimately leading to a high power density compact fusion power plant. For the concept to be economically feasible, bootstrap current must be a major component of the plasma current. A high value of the Troyon factor β_N and strong shaping are required to allow simultaneous operation at high β and high bootstrap current fraction. Ideal magnetohydrodynamic stability of a range of equilibria at aspect ratio 1.4 is systematically explored by varying the pressure profile and shape. The pressure and current profiles are constrained in such a way as to assure complete bootstrap current alignment. Both β_N and β are defined in terms of the vacuum toroidal field. Equilibria with β_N ≥ 8 and β ≈ 35% to 55% exist which are stable to n = ∞ ballooning modes, and stable to n = 0, 1, 2, 3 kink modes with a conducting wall. The dependence of β and β_N with respect to aspect ratio is also considered.
Multivariate bootstrapped relative positioning of spacecraft using GPS L1/Galileo E1 signals
Buist, Peter J.; Teunissen, Peter J. G.; Giorgi, Gabriele; Verhagen, Sandra
2011-03-01
GNSS-based precise relative positioning between spacecraft normally requires dual frequency observations, whereas attitude determination of the spacecraft, mainly due to the stronger model given by the a priori knowledge of the length and geometry of the baselines, can be performed precisely using only single frequency observations. When the Galileo signals become available, the number of observations at the L1 frequency will increase as we will have a GPS and Galileo multi-constellation. Moreover, the L1 observations of the Galileo system and modernized GPS are more precise than legacy GPS, and this, combined with the increased number of observations, will result in a stronger model for single frequency relative positioning. In this contribution we develop an even stronger model by combining the attitude determination problem with relative positioning. The attitude determination problem is solved by the recently developed Multivariate Constrained (MC-) LAMBDA method. We do this for each spacecraft and use the outcome for an ambiguity constrained solution on the baseline between the spacecraft. In this way the solution for the unconstrained baseline is bootstrapped from the MC-LAMBDA solutions of each spacecraft in what is called multivariate bootstrapped relative positioning. The developed approach is compared in simulations with relative positioning using a single antenna at each spacecraft (standard LAMBDA) and a vectorial bootstrapping approach. In the simulations we analyze single epoch, single frequency success rates as the most challenging application. The difference in performance of the approaches for single epoch solutions is a good indication of the strength of the underlying models. As the multivariate bootstrapping approach has a stronger model by applying information on the geometry of the constrained baselines, for applications with large observation noise and a limited number of observations this will result in a better…
Bootstrap current for the edge pedestal plasma in a diverted tokamak geometry
Energy Technology Data Exchange (ETDEWEB)
Koh, S.; Choe, W. [Korea Advanced Institute of Science and Technology, Department of Physics, Daejeon 305-701 (Korea, Republic of); Chang, C. S.; Ku, S.; Menard, J. E. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Weitzner, H. [Courant Institute of Mathematical Sciences, New York University, New York, New York 10012 (United States)
2012-07-15
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with ion contribution and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in plateau-collisional regime, there is significant deviation of numerical results from the existing analytic formulas, mainly due to large effective collisionality of the passing and the boundary layer trapped particles in edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high. On the other hand, when the aspect ratio is close to unity
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC…
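A bootstrap CI for the Clayton copula parameter can be sketched via the moment estimator θ = 2τ/(1 − τ) based on Kendall's τ (a simpler estimator than full maximum likelihood, used here only for illustration on simulated duration/severity pairs):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(5)
# Simulated drought duration/severity pairs with positive dependence.
L = np.linalg.cholesky(np.array([[1.0, 0.6], [0.6, 1.0]]))
z = rng.normal(size=(200, 2)) @ L.T
dur, sev = z[:, 0], z[:, 1]

def clayton_theta(u, v):
    # Moment estimator for the Clayton copula: theta = 2*tau / (1 - tau).
    tau, _ = kendalltau(u, v)
    return 2 * tau / (1 - tau)

# Bootstrap the pairs jointly to preserve their dependence structure.
thetas = []
for _ in range(500):
    idx = rng.integers(0, len(dur), len(dur))
    thetas.append(clayton_theta(dur[idx], sev[idx]))
ci = np.percentile(thetas, [2.5, 97.5])
```

Resampling the (duration, severity) pairs jointly, rather than each margin separately, is what keeps the bootstrap replicates consistent with the copula's dependence.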
A Novel Resampling Method for Differential Protection
Institute of Scientific and Technical Information of China (English)
王业; 陆于平; 徐以超; 许但轩
2012-01-01
For multi-rate sources, protection receives digital signals with different sampling rates from the communication interface, resulting in different sampling rates on the two sides of the differential protection and in errors. This paper concentrates on the resampling of input signals with different sampling rates from multi-rate sources, and proposes a frequency-domain transformation resampling method. First, a half-wave Fourier transformation is performed on the protection sampling signal. Second, the resulting frequency-domain representation is processed according to the difference in sampling rates. Third, the processed spectrum is transformed back to the time domain, yielding the resampled signal. Compared with existing resampling methods, this method can efficiently utilize the data from the waiting time of the R successive discriminations of sampled-value differential protection and increase the precision of resampling; it also limits the resampling error to a low level when signals contain rich high-order harmonics, and achieves zero delay for the resampled output. Simulations show that the average error rates of resampling precision during normal system operation and during external fault removal are only 0.50% and 1.10%, respectively, both better than the traditional time-domain resampling method.
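The transform–process–inverse-transform idea is analogous to textbook Fourier-domain resampling (the paper's half-wave Fourier variant is not reproduced here); a generic FFT-based sketch:

```python
import numpy as np

def fft_resample(x, n_out):
    """Frequency-domain resampling (sketch): FFT, truncate or zero-pad the
    spectrum to the target length, then inverse FFT."""
    n_in = len(x)
    X = np.fft.rfft(x)
    Y = np.zeros(n_out // 2 + 1, dtype=complex)
    k = min(len(X), len(Y))
    Y[:k] = X[:k]
    # Rescale so amplitudes are preserved under the new length.
    return np.fft.irfft(Y, n_out) * (n_out / n_in)

t = np.arange(1200) / 1200
sig = np.sin(2 * np.pi * 50 * t)          # 50 Hz tone sampled at 1200 Hz
resampled = fft_resample(sig, 800)        # same second re-expressed at 800 Hz
```

Because the processing happens on the spectrum, harmonics within the new Nyquist band pass through exactly, which is why such methods behave well on harmonic-rich protection signals.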
Institute of Scientific and Technical Information of China (English)
CHAN Kung-Sik; TONG Howell; STENSETH Nils Chr
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag, and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty on the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjarvi, Northern Finland.
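For comparison, the classical moving block bootstrap that the NBB builds on can be sketched in a few lines (the NBB's conditional drawing of blocks from an estimated future distribution is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(6)
# A short, noisy stand-in for a yearly abundance series.
series = np.sin(np.linspace(0, 20, 100)) + rng.normal(0, 0.3, 100)

def moving_block_bootstrap(x, block_len, rng):
    """Classical moving block bootstrap: concatenate overlapping blocks
    drawn uniformly at random, preserving short-range dependence."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

replicate = moving_block_bootstrap(series, block_len=10, rng=rng)
```

The NBB replaces the uniform block draw with one conditioned on the realized past of the bootstrap series, so successive blocks fit together more naturally for Markov-type dynamics.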
Bernau, Christoph; Augustin, Thomas; Boulesteix, Anne-Laure
2013-09-01
High-dimensional binary classification tasks, for example, the classification of microarray samples into normal and cancer tissues, usually involve a tuning parameter. By reporting the performance of the best tuning parameter value only, over-optimistic prediction errors are obtained. For correcting this tuning bias, we develop a new method which is based on a decomposition of the unconditional error rate involving the tuning procedure, that is, we estimate the error rate of wrapper algorithms as introduced in the context of internal cross-validation (ICV) by Varma and Simon (2006, BMC Bioinformatics 7, 91). Our subsampling-based estimator can be written as a weighted mean of the errors obtained using the different tuning parameter values, and thus can be interpreted as a smooth version of ICV, which is the standard approach for avoiding tuning bias. In contrast to ICV, our method guarantees intuitive bounds for the corrected error. Additionally, we suggest to use bias correction methods also to address the conceptually similar method selection bias that results from the optimal choice of the classification method itself when evaluating several methods successively. We demonstrate the performance of our method on microarray and simulated data and compare it to ICV. This study suggests that our approach yields competitive estimates at a much lower computational price.
The Particle Filter Algorithm Based on Improved Resampling Technology
Institute of Scientific and Technical Information of China (English)
李小婷; 史健芳
2016-01-01
The particle filter algorithm based on traditional resampling technology suffers from defects such as heavy computation, severe particle impoverishment, and low estimation accuracy. To solve these problems, this paper proposes an improved resampling-based particle filter algorithm (Improved Resample Particle Filter, IRPF). First, the initial particles are processed before resampling to obtain a new particle set. Then all particles are divided into two classes: a medium-weight particle collection, and a collection of large-weight and small-weight particles. The medium-weight collection remains unchanged without resampling, whereas the other collection is first checked against the resampling condition. Finally, if the condition is met, a linear combination of the large-weight and small-weight particles is used to generate new particles. Simulation results show that the proposed improved particle filter algorithm is feasible.
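For context, a standard low-variance scheme (systematic resampling) is sketched below; the paper's linear-combination step for large- and small-weight particles is not reproduced:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw, then evenly spaced
    positions through the cumulative weights. A common baseline that the
    improved schemes in the literature compete against."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(7)
w = np.array([0.1, 0.2, 0.3, 0.4])
idx = systematic_resample(w / w.sum(), rng)   # indices of surviving particles
```

High-weight particles are duplicated roughly in proportion to their weight, which is precisely the impoverishment mechanism that the IRPF's linear-combination step is designed to soften.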
Estimation of reference intervals by bootstrap methods
Institute of Scientific and Technical Information of China (English)
李小佩; 王建新; 施秀英; 王惠民
2016-01-01
Objective: To investigate the feasibility of establishing biological reference intervals with bootstrap methods and to determine the minimum sample size required. Methods: Serum urea nitrogen (BUN) was measured in 120 healthy subjects, and reference intervals were established with the parametric and nonparametric methods recommended by the Clinical and Laboratory Standards Institute (CLSI) document C28-A3 as well as with bootstrap methods. The reference interval limits, root mean square error (RMSE), and precision ratio (Id) were computed using RefValAdv to identify the minimum number of samples. Results: The reference intervals calculated by the nonparametric, parametric, percentile-bootstrap, and standard-bootstrap methods were 3.00-7.39 mmol/L, 3.05-7.33 mmol/L, 3.12-7.18 mmol/L, and 3.10-7.33 mmol/L, respectively; the overlap between intervals exceeded 95%, and the bootstrap results were consistent with the C28-A3 recommended methods. Further study found that as the sample size decreased, the reference intervals widened and the RMSE and Id gradually increased, with results stabilizing at a sample size of 60. Conclusion: Bootstrap methods can be applied to establish biological reference intervals for data of various distribution types, offer an advantage with small samples, and provide a new approach for clinical laboratories establishing reference intervals.
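A percentile-bootstrap reference interval of the kind compared above can be sketched as follows (illustrative only; CLSI C28-A3 details such as outlier screening and partitioning are omitted, and the data below are synthetic):

```python
import random
import statistics

def bootstrap_reference_interval(values, n_boot=1000, seed=1):
    """Percentile-bootstrap estimate of a 95% reference interval
    (2.5th and 97.5th percentiles), averaged over bootstrap resamples."""
    rng = random.Random(seed)
    lows, highs = [], []
    for _ in range(n_boot):
        resample = rng.choices(values, k=len(values))   # sample with replacement
        qs = statistics.quantiles(resample, n=40, method="inclusive")
        lows.append(qs[0])     # 2.5th percentile cut point
        highs.append(qs[-1])   # 97.5th percentile cut point
    return statistics.mean(lows), statistics.mean(highs)
```

Unlike the parametric approach, nothing here assumes normality, which is why the abstract notes the method works across distribution types.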
Holoien, Thomas W.-S.; Wechsler, Risa H.
2016-01-01
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to resample observed supernova and host galaxy populations. XDGMM is a new program for using Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms, with functionality not available in other XD tools. It allows the user to select between the AstroML (Vanderplas et al. 2012; Ivezic et al. 2015) and Bovy et al. (2011) fitting methods and is compatible with scikit-learn machine learning algorithms (Pedregosa et al. 2011). Most crucially, it allows the user to condition a model on the known values of a subset of parameters, producing a tool that can predict unknown parameters from a model conditioned on the known values of other parameters. EmpiriciSN is an example application of this functionality that can be used for fitting an XDGMM model to observed supernova/host datas...
Kiselev, V V
2012-01-01
The huge value of the cosmological constant characteristic of particle physics and the inflation of the early Universe are inherently related to each other: one can construct a fine-tuned superpotential which produces a flat inflaton potential with a constant energy density $V=\Lambda^4$ after taking into account the leading effects due to supergravity, so that introducing small quantum loop corrections to the parameters of this superpotential naturally results in a dynamical instability relaxing the primary cosmological constant by means of an inflationary regime. The model phenomenologically agrees with observational data on the large-scale structure of the Universe at $\Lambda \sim 10^{16}$ GeV.
Institute of Scientific and Technical Information of China (English)
王焱; 汪震; 黄民翔; 蔡祯祺; 杨濛濛
2014-01-01
An ultra-short-term wind power prediction method based on an online sequential extreme learning machine (OS-ELM) is proposed. Exploiting the OS-ELM's fast learning speed and strong generalization ability, the method combines batch processing with successive iteration, continuously updating the training data and network structure to correct numerical weather prediction wind speeds quickly and in real time and to predict wind turbine output power rapidly. A bootstrap method is then used to construct pseudo-samples by resampling and to estimate confidence intervals for the predicted power. Case studies show that, compared with the back-propagation (BP) network and support vector machine (SVM) methods, the proposed method better meets the demands of online application in computation time while achieving comparable forecasting accuracy, and thus has good application prospects. This work is supported by the National High Technology Research and Development Program of China (863 Program) (No. 2011AA050204) and the National Natural Science Foundation of China (No. 51277160).
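The bootstrap interval step can be sketched as follows (a simple residual-resampling interval around a point forecast; the OS-ELM model itself is not reproduced here, and the forecast value and residuals below are synthetic):

```python
import random

def bootstrap_prediction_interval(point_forecast, residuals, B=2000,
                                  alpha=0.1, seed=2):
    """Pseudo-sample interval around a point forecast: resample historical
    forecast errors with replacement and take empirical quantiles of the
    simulated forecasts."""
    rng = random.Random(seed)
    sims = sorted(point_forecast + rng.choice(residuals) for _ in range(B))
    lo = sims[int(B * alpha / 2)]          # lower (alpha/2) quantile
    hi = sims[int(B * (1 - alpha / 2)) - 1]  # upper (1 - alpha/2) quantile
    return lo, hi
```

With alpha = 0.1 this yields a 90% interval; the interval width reflects the spread of the model's past errors rather than a distributional assumption.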
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed boundary tokamak plasmas. The total plasma current profile is supposed to have contributions of the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models for different viscosity coefficient calculations and distinct forms for the Coulomb collision operator is performed for a variety of plasma parameters of the small aspect ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters in the frame of the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages or possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
Directory of Open Access Journals (Sweden)
Ristya Widi Endah Yani
2008-12-01
Full Text Available Background: Bootstrap is a computer-simulation-based method that provides estimation accuracy when estimating inferential statistical parameters. Purpose: This article describes a study using secondary data (n = 30) aimed at elucidating the bootstrap method as an estimator for the linear regression test, based on the computer programs MINITAB 13, SPSS 13, and MacroMINITAB. Methods: The bootstrap regression method proceeds as follows: obtain $\hat\beta$ and $\hat Y$ from OLS (ordinary least squares); compute the residuals $\varepsilon_i = Y_i - \hat Y_i$; choose the number of bootstrap repetitions $B$; draw $n$ residuals with replacement from the $\varepsilon_i$ to obtain $\varepsilon_{(i)}$; form $Y_i^* = \hat Y_i + \varepsilon_{(i)}$; and estimate $\hat\beta^*$ from the bootstrap sample. If fewer than $B$ repetitions have been completed, return to the step of drawing $n$ residuals with replacement; otherwise, take the bootstrap estimate of $\hat\beta$ as the average of the $\hat\beta^*$ values over the $B$ resamples. Result: The result is similar to the linear regression equation obtained with the OLS method (α = 5%). The resulting regression equation for caries was caries = 1.90 + 2.02 (OHI-S), indicating that every one-unit increase in OHI-S results in a caries increase of 2.02 units. Conclusion: The analysis was conducted with B of 10,500 and 10 iterations.
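The residual-resampling steps in the Methods section correspond to the standard residual bootstrap for simple linear regression; a minimal sketch (synthetic data, not the study's dental dataset):

```python
import random
import statistics

def ols_fit(x, y):
    """Ordinary least squares for y = b0 + b1*x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    return my - b1 * mx, b1

def residual_bootstrap(x, y, B=1000, seed=0):
    """Residual bootstrap of the slope, following the steps in the
    abstract: fit OLS, resample residuals with replacement, rebuild
    y* = yhat + eps*, refit, and average the B slope estimates."""
    rng = random.Random(seed)
    b0, b1 = ols_fit(x, y)
    fitted = [b0 + b1 * xi for xi in x]
    resid = [yi - fi for yi, fi in zip(y, fitted)]
    slopes = []
    for _ in range(B):
        eps = rng.choices(resid, k=len(x))          # resample residuals
        ystar = [fi + e for fi, e in zip(fitted, eps)]
        slopes.append(ols_fit(x, ystar)[1])
    return statistics.mean(slopes)
```

Averaging the resampled slopes reproduces the OLS slope closely, which is the agreement with the OLS equation that the Result section reports.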
A treatment procedure for Gemini North/NIFS data cubes: application to NGC 4151
Menezes, R B; Ricci, T V
2014-01-01
We present a detailed procedure for treating data cubes obtained with the Near-Infrared Integral Field Spectrograph (NIFS) of the Gemini North telescope. This process includes the following steps: correction of the differential atmospheric refraction, spatial re-sampling, Butterworth spatial filtering, 'instrumental fingerprint' removal and Richardson-Lucy deconvolution. The clearer contours of the structures obtained with the spatial re-sampling, the high spatial-frequency noise removed with the Butterworth spatial filtering, the removed 'instrumental fingerprints' (which take the form of vertical stripes along the images) and the improvement of the spatial resolution obtained with the Richardson-Lucy deconvolution result in images with a considerably higher quality. An image of the Br{\\gamma} emission line from the treated data cube of NGC 4151 allows the detection of individual ionized-gas clouds (almost undetectable without the treatment procedure) of the narrow-line region of this galaxy, which are also ...
Institute of Scientific and Technical Information of China (English)
郑红; 隋强强; 孙玉泉
2012-01-01
Missile warning and tracking for aircraft from image information poses several problems: 2D modeling of missile motion, nonlinearity of the motion model, and non-Gaussian interference. This paper studies the motion pattern, projected from 3D space onto the camera imaging plane, of a missile under proportional navigation, and establishes a 2D-projection motion-state model for proportionally navigated missiles. Because the main physical quantities in the model, such as velocity and acceleration, are nonlinear, and the random interference encountered by the missile (e.g., wind direction, wind force, cyclones, and air flow) is non-Gaussian, a particle filter is used to track the missile. To address particle degeneracy during tracking, an ordered weight residual resampling particle filter (OWRR-PF) method is proposed, which alleviates the degeneracy problem and improves tracking accuracy. Experiments on continuous video with the established missile motion model demonstrate that the OWRR-PF method improves tracking accuracy by about 70% over the standard particle filter and by about 15% over the residual-resampling particle filter.
Analytic Bounds and Emergence of $\\textrm{AdS}_2$ Physics from the Conformal Bootstrap
Mazac, Dalimil
2016-01-01
We study analytically the constraints of the conformal bootstrap on the low-lying spectrum of operators in field theories with global conformal symmetry in one and two spacetime dimensions. We introduce a new class of linear functionals acting on the conformal bootstrap equation. In 1D, we use the new basis to construct extremal functionals leading to the optimal upper bound on the gap above identity in the OPE of two identical primary operators of integer or half-integer scaling dimension. We also prove an upper bound on the twist gap in 2D theories with global conformal symmetry. When the external scaling dimensions are large, our functionals provide a direct point of contact between crossing in a 1D CFT and scattering of massive particles in large $\\textrm{AdS}_2$. In particular, CFT crossing can be shown to imply that appropriate OPE coefficients exhibit an exponential suppression characteristic of massive bound states, and that the 2D flat-space S-matrix should be analytic away from the real axis.
Bootstrapping Mixed Correlators in the Five Dimensional Critical O(N) Models
Li, Zhijin
2016-01-01
We use the conformal bootstrap approach to explore $5D$ CFTs with $O(N)$ global symmetry, which contain $N$ scalars $\\phi_i$ transforming as $O(N)$ vector. Specifically, we study multiple four-point correlators of the leading $O(N)$ vector $\\phi_i$ and the $O(N)$ singlet $\\sigma$. The crossing symmetry of the four-point functions and the unitarity condition provide nontrivial constraints on the scaling dimensions ($\\Delta_\\phi$, $\\Delta_\\sigma$) of $\\phi_i$ and $\\sigma$. With reasonable assumptions on the gaps between scaling dimensions of $\\phi_i$ ($\\sigma$) and the next $O(N)$ vector (singlet) scalar, we are able to isolate the scaling dimensions $(\\Delta_\\phi$, $\\Delta_\\sigma)$ in small islands. In particular, for large $N=500$, the isolated region is highly consistent with the result obtained from large $N$ expansion. We also study the interacting $O(N)$ CFTs for $1\\leqslant N\\leqslant100$. Isolated regions on $(\\Delta_\\phi,\\Delta_\\sigma)$ plane are obtained using conformal bootstrap program with lower or...
Impurities in a non-axisymmetric plasma: Transport and effect on bootstrap current
Energy Technology Data Exchange (ETDEWEB)
Mollén, A., E-mail: albertm@chalmers.se [Department of Applied Physics, Chalmers University of Technology, Göteborg (Sweden); Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Landreman, M. [Institute for Research in Electronics and Applied Physics, University of Maryland, College Park, Maryland 20742 (United States); Smith, H. M.; Helander, P. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Braun, S. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); German Aerospace Center, Institute of Engineering Thermodynamics, Pfaffenwaldring 38-40, D-70569 Stuttgart (Germany)
2015-11-15
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21, 042503 (2014)] which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν-scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, and which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We also use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z{sub eff} of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
Directory of Open Access Journals (Sweden)
Tásia Hickmann
2015-11-01
Full Text Available In this paper, an iterative forecasting methodology for time series prediction is put forward that integrates wavelet de-noising and decomposition with an Artificial Neural Network (ANN) and bootstrap methods. Basically, a given time series to be forecasted is initially decomposed into trend and noise (wavelet) components by a wavelet de-noising algorithm. Both trend and noise components are then further decomposed by a wavelet decomposition method, producing orthonormal Wavelet Components (WCs) for each. Each WC is separately modelled through an ANN in order to provide both in-sample and out-of-sample forecasts. At each time t, the respective forecasts of the WCs of the trend and noise components are simply added to produce the in-sample and out-of-sample forecasts of the underlying time series. Finally, out-of-sample predictive densities are empirically simulated by the bootstrap sampler and confidence intervals are then yielded, considering some level of credibility. The proposed methodology, when applied to the well-known Canadian lynx data, which exhibit non-linearity and non-Gaussian properties, outperformed other methods traditionally used to forecast this series.
Gray bootstrap method for estimating frequency-varying random vibration signals with small samples
Directory of Open Access Journals (Sweden)
Wang Yanqing
2014-04-01
Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for ensuring airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM), and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution, and the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error, and the estimated reliability. GBM is then applied to estimating data from a single test flight of a certain aircraft. Finally, to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in the test analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
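A minimal sketch of the grey bootstrap idea, combining bootstrap resampling with a GM(1,1) grey forecast of each resample. The indexes and reliability measure of the paper are not reproduced; the fallback guards and the sample data below are my own assumptions.

```python
import itertools
import math
import random
import statistics

def gm11_next(x0):
    """One-step GM(1,1) grey forecast of a short positive series."""
    n = len(x0)
    x1 = list(itertools.accumulate(x0))                  # accumulated series
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy = sum(x0[1:])
    szy = sum(zi * yi for zi, yi in zip(z, x0[1:]))
    denom = m * szz - sz * sz
    if abs(denom) < 1e-9:                                # degenerate fit
        return statistics.mean(x0)
    a = (sz * sy - m * szy) / denom                      # development coefficient
    b = (szz * sy - sz * szy) / denom                    # grey input
    if abs(a) < 1e-12 or abs(a) > 0.5:                   # implausible fit: fall back
        return statistics.mean(x0)
    return (x0[0] - b / a) * math.exp(-a * n) * (1 - math.exp(a))

def grey_bootstrap_interval(sample, B=200, seed=3):
    """Grey-bootstrap style estimate: resample the small sample, forecast
    each resample with GM(1,1), and report (lower, estimate, upper)."""
    rng = random.Random(seed)
    preds = sorted(gm11_next(rng.choices(sample, k=len(sample)))
                   for _ in range(B))
    return preds[int(0.025 * B)], statistics.mean(preds), preds[int(0.975 * B)]
```

The grey model supplies a forecast from very few points, and the bootstrap layer turns the scatter of those forecasts into an interval, which is why the combination suits small-sample dynamic signals.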
Directory of Open Access Journals (Sweden)
Gu Xun
2007-03-01
Full Text Available Background: Phylogenetically related miRNAs (miRNA families) convey important information about the function and evolution of miRNAs. Due to the special sequence features of miRNAs, pairwise sequence identity between miRNA precursors alone is often inadequate for unequivocally judging the phylogenetic relationships between miRNAs. Most current methods for miRNA classification rely heavily on manual inspection and lack measurements of the reliability of the results. Results: In this study, we designed an analysis pipeline, the Phylogeny-Bootstrap-Cluster (PBC) pipeline, to identify miRNA families based on branch stability in the bootstrap trees derived from overlapping genome-wide miRNA sequence sets. We tested the PBC analysis pipeline with the miRNAs from six animal species: H. sapiens, M. musculus, G. gallus, D. rerio, D. melanogaster, and C. elegans. The resulting classification was compared with the miRNA families defined in miRBase, and the two classifications were largely consistent. Conclusion: The PBC analysis pipeline is an efficient method for classifying large numbers of heterogeneous miRNA sequences. It requires minimal human involvement and provides measurements of the reliability of the classification results.
A bootstrapped switch employing a new clock feed-through compensation technique
Institute of Scientific and Technical Information of China (English)
Wu Xiaofeng; Liu Hongxia; Su Li; Hao Yue; Li Di; Hu Shigang
2009-01-01
Nonlinearity caused by the clock feed-through of a bootstrapped switch and its compensation techniques are analyzed. Various clock feed-through compensation configurations and their drawbacks are also investigated. It is pointed out that the delay-path match of the clock boosting circuit is the critical factor affecting the effectiveness of clock feed-through compensation. On this basis, a new clock feed-through compensation configuration and the corresponding bootstrapped switch are presented and optimally designed in the UMC mixed-mode/RF 0.18 μm 1P6M P-sub twin-well CMOS process, by identifying and carefully designing the switch MOSFETs that influence the delay-path match of the clock boosting circuit. HSPICE simulation results show that the proposed clock feed-through compensation configuration not only enhances the sampling accuracy under variations of process, power supply voltage, temperature and capacitance, but also effectively decreases the even harmonics, high-order odd harmonics and overall THD.
Lenardic, A.; Hoink, T.
2008-12-01
Several studies have highlighted the role of a low viscosity asthenosphere in promoting plate-like behavior in mantle convection models. It has also been argued that the asthenosphere is fed by mantle plumes (Phipps-Morgan et al. 1993; Deffeyes 1972) and that the existence of the specific plume types required for this depends on plate subduction (Lenardic and Kaula 1995; Jellinek et al. 2002). Independent of plumes, plate subduction can generate a non-adiabatic temperature gradient which, together with temperature dependent mantle viscosity, leads to a low viscosity near surface region. The above suggests a conceptual model in which the asthenosphere can not be defined solely in terms of material properties but must also be defined in terms of an active process, plate tectonics, which both maintains it and is maintained by it. The bootstrap aspect of the model is its circular causality between plates and the asthenosphere, neither being more fundamental than the other and the existence of each depending on the other. Several of the feedbacks key to the conceptual model will be quantified. The implications for modeling mantle convection in a plate-tectonic mode will also be discussed: 1) A key is to get numerical simulations into the bootstrap mode of operation and this is dependent on assumed initial conditions; 2) The model implies potentially strong hysteresis effects (e.g., transition between convection states, associated with variable yield stress, will occur at different values depending on whether the yield stress is systematically lowered or raised between successive models).
Salleh, Mad Ithnin; Ismail, Shariffah Nur Illiana Syed; Habidin, Nurul Fadly; Latip, Nor Azrin Md; Ishak, Salomawati
2014-12-01
The Malaysian government has placed greater attention on enhancing productivity in the Technical and Vocational Education and Training (TVET) sector to increase the development of a skilled workforce by the year 2020. The implementation of the National Higher Education Strategic Plan (NHESP) in 2007 led to changes in the Malaysian polytechnics sector, and the study of efficiency and productivity makes it possible to identify scope for the institutions to perform in a more efficient and effective manner. This paper aims to identify the efficiency and productivity of the 24 polytechnic main campuses as of 2007. Bootstrapped Malmquist indices are applied to investigate the effects of NHESP on technical efficiency and productivity changes in the individual Malaysian polytechnics over 2007-2010. This method enables a more robust analysis of technical efficiency and productivity changes among polytechnics, and the bootstrap simulation method can identify whether or not the computed productivity changes are statistically significant. The paper finds that the overall mean efficiency score demonstrates significant growth, and that the sector as a whole underwent positive productivity growth at the frontier during the post-NHESP period except in 2009-2010. The increase in productivity growth during the post-NHESP period was driven mainly by technological growth, and the empirical results indicate that all polytechnics showed significant TFP growth during this period. These findings show NHESP's contribution to the positive growth in the overall performance of Malaysia's polytechnics sector.
Institute of Scientific and Technical Information of China (English)
黎光明; 张敏强
2013-01-01
Bootstrap is a resampling-with-replacement method that can be used to estimate variance components and their variability in generalizability theory. Wiley (2001) applied an adjusted bootstrap in the p×i design for normal data, but did not compare the adjusted and unadjusted methods when estimating variability; this study extends that work. Monte Carlo techniques were used to simulate data from four distributions: normal, dichotomous, polytomous, and skewed. Researchers commonly focus on normal data and neglect non-normal data, yet non-normal data arise routinely in tests such as TOEFL and GRE. Several methods exist for estimating the variability of variance components, including the traditional method, the bootstrap, the jackknife, and Markov chain Monte Carlo (MCMC); earlier research (Li & Zhang, 2009) found the bootstrap significantly better than the traditional, jackknife, and MCMC methods in estimating variability for these four distributions. Based on the p×i design, the present study examined whether the adjusted bootstrap improves on the unadjusted bootstrap in estimating variance components and their variability for the four simulated distributions. The results show that, across all four distributions and from global to local measures, for both point estimates and variability estimates, the adjusted bootstrap outperforms the unadjusted bootstrap and improves the estimation of variance components and their variability in generalizability theory.
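For concreteness, an unadjusted bootstrap of variance components in a crossed p×i design can be sketched as below (resampling persons only; Wiley's adjustment factors are not applied, and the simulated scores are illustrative):

```python
import random
import statistics

def variance_components(scores):
    """ANOVA variance-component estimates for a crossed p x i design.

    scores[p][i] is the score of person p on item i.  Returns the
    person, item, and residual variance components from the usual
    expected-mean-square equations.
    """
    np_, ni = len(scores), len(scores[0])
    grand = statistics.mean(v for row in scores for v in row)
    pmeans = [statistics.mean(row) for row in scores]
    imeans = [statistics.mean(scores[p][i] for p in range(np_))
              for i in range(ni)]
    ss_p = ni * sum((m - grand) ** 2 for m in pmeans)
    ss_i = np_ * sum((m - grand) ** 2 for m in imeans)
    ss_tot = sum((v - grand) ** 2 for row in scores for v in row)
    ms_p = ss_p / (np_ - 1)
    ms_i = ss_i / (ni - 1)
    ms_res = (ss_tot - ss_p - ss_i) / ((np_ - 1) * (ni - 1))
    return (ms_p - ms_res) / ni, (ms_i - ms_res) / np_, ms_res

def bootstrap_vc_se(scores, B=300, seed=5):
    """Bootstrap-p: resample persons (rows) with replacement and report
    the standard error of the person variance component."""
    rng = random.Random(seed)
    ests = []
    for _ in range(B):
        rows = [scores[rng.randrange(len(scores))] for _ in scores]
        ests.append(variance_components(rows)[0])
    return statistics.stdev(ests)
```

The adjusted methods studied in the paper rescale such resampled estimates to remove the known bias of bootstrap variance-component estimators; only the unadjusted machinery is shown here.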
Institute of Scientific and Technical Information of China (English)
CHAN, Kung-Sik; TONG, Howell; STENSETH, Nils Chr.
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, continuous TAR models are additive autoregressive models, piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty in the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjärvi, Northern Finland.
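The NBB builds on the moving block bootstrap; for orientation, here is a sketch of the plain moving-block mechanism with a fixed block length (the random block sizes and nearest-neighbor block selection that distinguish the NBB are not implemented):

```python
import random

def moving_block_bootstrap(series, block_len=3, seed=7):
    """Moving-block bootstrap resample of a time series: concatenate
    randomly chosen overlapping blocks of consecutive observations
    until the original length is reached, preserving short-range
    dependence within each block."""
    rng = random.Random(seed)
    n = len(series)
    starts = list(range(n - block_len + 1))  # all admissible block starts
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]
```

The NBB replaces the uniform choice of block start with a draw from a non-parametric estimate of the future distribution given the realized past of the bootstrap series, which is what makes it suitable for the Markov assumption above.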
Braun, M
1995-01-01
The bootstrap condition is generalized to $n$ reggeized gluons. As a result it is demonstrated that the intercept generated by $n$ reggeized gluons cannot be lower than the one for $n=2$. Arguments are presented that in the limit $N_{c}\rightarrow\infty$ the bootstrap condition reduces the $n$-gluon chain with interacting neighbours to a single BFKL pomeron. In this limit the leading contribution from $n$ gluons corresponds to $n/2$ non-interacting BFKL pomerons (the $n/2$ pomeron cut). The sum over $n$ leads to a unitary $\gamma^{\ast}\gamma$ amplitude of the eikonal form.
Institute of Scientific and Technical Information of China (English)
HALL Peter
2009-01-01
This is a very attractive article. It combines fascinating new methodology with a most interesting dataset, and a highly motivating presentation. However, despite the many opportunities for discussion, I am going to confine attention to the issue of the block bootstrap, ingeniously developed in this paper into the nearest block bootstrap.
Resampling algorithm for particle filter based on layered transacting MCMC%基于分层转移的粒子滤波MCMC重采样算法
Institute of Scientific and Technical Information of China (English)
田隽; 钱建生; 李世银
2011-01-01
To resolve weight degeneracy while avoiding sample impoverishment in particle filter resampling, a layered transacting MCMC resampling algorithm is proposed. When the effective sample size falls below a fixed threshold, the particles are divided into layered subsets according to their individual weights. A mutation-breeding operator fused with particle swarm optimization (PSO) is used as the MCMC transition kernel and applied to the layered subsets; acceptance-rejection sampling is then performed with the Metropolis-Hastings algorithm, so that the constructed Markov chain converges to a stationary distribution equivalent to the true target posterior. Simulation results show that the proposed method is superior to other resampling algorithms in both estimation accuracy and convergence speed.
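A Metropolis-Hastings accept/reject step of the kind used above can be sketched as follows; a plain random-walk proposal stands in for the paper's mutation/PSO transition kernel, so this illustrates only the acceptance rule that leaves the target invariant.

```python
import math
import random

def mh_move(particles, log_target, step=0.3, seed=11):
    """One Metropolis-Hastings move applied to each particle after
    resampling: a symmetric random-walk proposal is accepted with
    probability min(1, target_ratio), which leaves the target
    distribution invariant and rejuvenates duplicated particles."""
    rng = random.Random(seed)
    out = []
    for x in particles:
        prop = x + rng.gauss(0.0, step)
        # accept if log(u) < log pi(prop) - log pi(x)
        if math.log(rng.random() + 1e-300) < log_target(prop) - log_target(x):
            out.append(prop)
        else:
            out.append(x)
    return out
```

Because the move targets the posterior itself, repeated application spreads a degenerate (heavily duplicated) particle set back over the posterior without changing the distribution it represents.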
A bootstrapped, low-noise, and high-gain photodetector for shot noise measurement
Energy Technology Data Exchange (ETDEWEB)
Zhou, Haijun; Yang, Wenhai; Li, Zhixiu; Li, Xuefeng; Zheng, Yaohui, E-mail: yhzheng@sxu.edu.cn [State Key Laboratory of Quantum Optics and Quantum Optics Devices, Institute of Opto-Electronics, Shanxi University, Taiyuan 030006 (China)
2014-01-15
We present a low-noise, high-gain photodetector based on a bootstrap structure and an L-C (inductance and capacitance) combination. The electronic characteristics of the photodetector, including electronic noise, gain, frequency response, and dynamic range, were verified with a single-frequency Nd:YVO{sub 4} laser at 1064 nm with coherent output. The measured shot noise of a 50 μW laser beam was 13 dB above the electronic noise at an analysis frequency of 2 MHz, and 10 dB at 3 MHz, and a maximum clearance of 28 dB at 2 MHz was achieved with 1.52 mW of illumination. In addition, the photodetector showed excellent linearity for both DC and AC amplification over the laser power range between 12.5 μW and 1.52 mW.
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
Dynamic Assessment of Vibration of Tooth Modification Gearbox Using Grey Bootstrap Method
Directory of Open Access Journals (Sweden)
Hui-liang Wang
2015-01-01
Full Text Available Because the correlation between gear tooth modification and the vibration characteristics of a transmission system is difficult to quantify, a novel small-sample method for predicting gearbox vibration, based on grey system theory and bootstrap theory, is presented. The method characterizes the baseline vibration features of a tooth-modified gearbox through the dynamic uncertainty, the estimated true value, and a systematic error measure, and these parameters can indirectly and dynamically evaluate the effect of tooth modification. Taking 100% reliability as the constraint and minimum average uncertainty as the target value, the method can evaluate the vibration signal of the gearbox with unmodified gears and with topologically modified gears installed, respectively. Computer simulation and experimental results showed that the vibration amplitude of the gearbox was partly decreased by topological tooth modification, and that the average dynamic uncertainty, mean true value, and systematic error measure were each smaller than the corresponding values without tooth modification. The study provides an important guide for tooth modification and dynamic performance optimization.
Solid oxide fuel cell power plant having a bootstrap start-up system
Lines, Michael T
2016-10-04
The bootstrap start-up system (42) achieves an efficient start-up of the power plant (10) that minimizes formation of soot within a reformed hydrogen-rich fuel. A burner (48) receives un-reformed fuel directly from the fuel supply (30) and combusts the fuel to heat cathode air, which then heats an electrolyte (24) within the fuel cell (12). A dilute hydrogen-forming gas (68) cycles through a sealed heat-cycling loop (66) to transfer heat and generated steam from an anode side (32) of the electrolyte (24) through fuel processing system (36) components (38, 40) and back to an anode flow field (26) until the fuel processing system components (38, 40) achieve predetermined optimal temperatures and steam content. Then the heat-cycling loop (66) is unsealed and the un-reformed fuel is admitted into the fuel processing system (36) and anode flow field (26) to commence ordinary operation of the power plant (10).
Renyi Entropies, the Analytic Bootstrap, and 3D Quantum Gravity at Higher Genus
Headrick, Matthew; Perlmutter, Eric; Zadeh, Ida G
2015-01-01
We compute the contribution of the vacuum Virasoro representation to the genus-two partition function of an arbitrary CFT with central charge $c>1$. This is the perturbative pure gravity partition function in three dimensions. We employ a sewing construction, in which the partition function is expressed as a sum of sphere four-point functions of Virasoro vacuum descendants. For this purpose, we develop techniques to efficiently compute correlation functions of holomorphic operators, which by crossing symmetry are determined exactly by a finite number of OPE coefficients; this is an analytic implementation of the conformal bootstrap. Expanding the results in $1/c$, corresponding to the semiclassical bulk gravity expansion, we find that---unlike at genus one---the result does not truncate at finite loop order. Our results also allow us to extend earlier work on multiple-interval Renyi entropies and on the partition function in the separating degeneration limit.
Directory of Open Access Journals (Sweden)
Xintao Xia
2013-07-01
Full Text Available This study proposed the bootstrap maximum-entropy method to evaluate the uncertainty of the starting torque of a slewing bearing. Addressing the variation coefficient of the slewing-bearing starting torque under load, the probability density function, estimated true value, and variation domain are obtained through experimental investigation of the starting torque under various loads. The probability density function is found to vary in shape, scale, and location with load. In addition, the estimated true value and the variation domain shrink with increasing load, indicating improving stability and reliability of the starting friction torque. Finally, a sensitive spot exists where the estimated true value and the variation domain rise abnormally, showing a fluctuation in the immunity and a degeneration in the stability and reliability of the starting friction torque.
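The resampling step behind an "estimated true value" and "variation domain" can be sketched with a plain percentile bootstrap; the maximum-entropy density estimation of the paper is omitted, and the torque measurements below are synthetic stand-ins for the experimental data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical starting-torque measurements (N*m) at one load level;
# real values would come from the slewing-bearing experiment.
torque = rng.normal(loc=58.0, scale=1.5, size=30)

B = 2000
boot_means = np.empty(B)
for b in range(B):
    resample = rng.choice(torque, size=torque.size, replace=True)
    boot_means[b] = resample.mean()

# Estimated true value: mean of the bootstrap distribution.
estimated_true_value = boot_means.mean()
# "Variation domain" taken here as a 95% percentile interval.
lower, upper = np.percentile(boot_means, [2.5, 97.5])
```

Repeating this at each load level would trace how the variation domain narrows (or spikes at a sensitive spot) as load increases.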
Application of Robust Regression and Bootstrap in Productivity Analysis of GERD Variable in EU27
Directory of Open Access Journals (Sweden)
Dagmar Blatná
2014-06-01
Full Text Available The GERD is one of Europe 2020 headline indicators being tracked within the Europe 2020 strategy. The headline indicator is the 3% target for the GERD to be reached within the EU by 2020. Eurostat defines "GERD" as total gross domestic expenditure on research and experimental development as a percentage of GDP. GERD depends on numerous factors of a general economic background, namely of employment, innovation and research, science and technology. The values of these indicators vary among the European countries, and consequently the occurrence of outliers can be anticipated in corresponding analyses. In such a case, a classical statistical approach – the least squares method – can be highly unreliable, the robust regression methods representing an acceptable and useful tool. The aim of the present paper is to demonstrate the advantages of robust regression and applicability of the bootstrap approach in regression based on both classical and robust methods.
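The bootstrap-in-regression idea can be illustrated with case (pairs) resampling of regression coefficients; the data, coefficients, and sample size below are invented, and plain least squares stands in for the robust estimators (e.g. LTS or MM-estimation) that the paper actually compares:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: a GERD-like response driven by two macro indicators
# for 27 "countries" (all values synthetic).
n = 27
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n), rng.uniform(0, 5, n)])
beta_true = np.array([0.5, 0.2, -0.1])
y = X @ beta_true + rng.normal(0, 0.3, n)

def fit(X, y):
    # Plain least squares; a robust fit would be substituted here
    # when outliers among the countries are suspected.
    return np.linalg.lstsq(X, y, rcond=None)[0]

B = 1000
boot_betas = np.empty((B, X.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, size=n)      # resample (x_i, y_i) pairs
    boot_betas[b] = fit(X[idx], y[idx])

se = boot_betas.std(axis=0, ddof=1)       # bootstrap standard errors
ci = np.percentile(boot_betas, [2.5, 97.5], axis=0)
```

Pairs resampling makes no assumption about the error distribution, which is exactly what one wants when outliers rule out classical inference.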
A spatial bootstrap technique for parameter estimation of rainfall annual maxima distribution
Directory of Open Access Journals (Sweden)
F. Uboldi
2013-09-01
Full Text Available Estimation of extreme event distributions and depth-duration-frequency (DDF) curves is achieved at any target site by repeated sampling among all available raingauge data in the surrounding area. The estimate is computed over a gridded domain in Northern Italy, using precipitation time series from 1929 to 2011, including data from historical analog stations and from the present-day automatic observational network. The presented local regionalisation naturally overcomes traditional station-point methods, with their demand for long historical series and their sensitivity to very rare events occurring at very few stations, which can cause unrealistic spatial gradients in DDF relations. At the same time, the presented approach allows for spatial dependence, necessary in a geographical domain such as Lombardy, complex in both its topography and its climatology. The bootstrap technique enables evaluating uncertainty maps for all estimated parameters and for rainfall depths at assigned return periods.
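A single-site version of this uncertainty estimate can be sketched as follows; the annual maxima are synthetic, a method-of-moments Gumbel fit replaces whatever estimator the paper uses, and the spatial pooling among neighbouring gauges is reduced to resampling one pooled sample:

```python
import numpy as np

rng = np.random.default_rng(2)
EULER = 0.5772156649  # Euler-Mascheroni constant

# Hypothetical annual maximum daily rainfall (mm) pooled around a target site.
maxima = rng.gumbel(loc=45.0, scale=12.0, size=80)

def gumbel_moments(x):
    # Method-of-moments fit of the Gumbel distribution
    beta = x.std(ddof=1) * np.sqrt(6) / np.pi
    mu = x.mean() - EULER * beta
    return mu, beta

def depth_for_return_period(mu, beta, T):
    # Rainfall depth with return period T years
    return mu - beta * np.log(-np.log(1 - 1 / T))

B = 1000
depths_100yr = np.empty(B)
for b in range(B):
    sample = rng.choice(maxima, size=maxima.size, replace=True)
    depths_100yr[b] = depth_for_return_period(*gumbel_moments(sample), T=100)

lo, hi = np.percentile(depths_100yr, [5, 95])  # uncertainty band for the 100-yr depth
```

Mapping `lo` and `hi` over a grid of target sites is what produces the uncertainty maps mentioned in the abstract.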
A bootstrapped, low-noise, and high-gain photodetector for shot noise measurement.
Zhou, Haijun; Yang, Wenhai; Li, Zhixiu; Li, Xuefeng; Zheng, Yaohui
2014-01-01
We present a low-noise, high-gain photodetector based on a bootstrap structure and an L-C (inductance and capacitance) combination. The electronic characteristics of the photodetector, including electronic noise, gain, frequency response, and dynamic range, were verified with a single-frequency Nd:YVO4 laser at 1064 nm with coherent output. The measured shot noise of a 50 μW laser was 13 dB above the electronic noise at an analysis frequency of 2 MHz, and 10 dB at 3 MHz, and a maximum clearance of 28 dB at 2 MHz was achieved when 1.52 mW of laser power illuminated the detector. In addition, the photodetector showed excellent linearity for both DC and AC amplification over the laser power range from 12.5 μW to 1.52 mW.
Integrating Multiple Microarray Data for Cancer Pathway Analysis Using Bootstrapping K-S Test
Directory of Open Access Journals (Sweden)
Bing Han
2009-01-01
Full Text Available Previous applications of microarray technology for cancer research have mostly focused on identifying genes that are differentially expressed between a particular cancer and normal cells. In a biological system, genes perform different molecular functions and regulate various biological processes via interactions with other genes thus forming a variety of complex networks. Therefore, it is critical to understand the relationship (e.g., interactions between genes across different types of cancer in order to gain insights into the molecular mechanisms of cancer. Here we propose an integrative method based on the bootstrapping Kolmogorov-Smirnov test and a large set of microarray data produced with various types of cancer to discover common molecular changes in cells from normal state to cancerous state. We evaluate our method using three key pathways related to cancer and demonstrate that it is capable of finding meaningful alterations in gene relations.
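The core statistical device, a bootstrapped two-sample Kolmogorov-Smirnov test, can be sketched in a few lines; the expression values are synthetic stand-ins, and the null distribution is approximated by resampling from the pooled sample:

```python
import numpy as np

rng = np.random.default_rng(3)

def ks_statistic(a, b):
    # Two-sample K-S statistic: maximum gap between the empirical CDFs
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return np.abs(cdf_a - cdf_b).max()

def bootstrap_ks_pvalue(a, b, B=2000, rng=rng):
    # Resample from the pooled sample to approximate the null distribution
    observed = ks_statistic(a, b)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(B):
        pa = rng.choice(pooled, size=a.size, replace=True)
        pb = rng.choice(pooled, size=b.size, replace=True)
        if ks_statistic(pa, pb) >= observed:
            count += 1
    return (count + 1) / (B + 1)

# Hypothetical expression values for one gene across two conditions
normal = rng.normal(0.0, 1.0, 60)
tumour = rng.normal(1.2, 1.0, 60)
p = bootstrap_ks_pvalue(normal, tumour)
```

Applied to pairs of genes within a pathway, such p-values flag relations that change between the normal and cancerous states.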
Bootstrap Methods for the Empirical Study of Decision-Making and Information Flows in Social Systems
DeDeo, Simon; Klingenstein, Sara; Hitchcock, Tim
2013-01-01
We characterize the statistical bootstrap for the estimation of information-theoretic quantities from data, with particular reference to its use in the study of large-scale social phenomena. Our methods allow one to preserve, approximately, the underlying axiomatic relationships of information theory---in particular, consistency under arbitrary coarse-graining---that motivate use of these quantities in the first place, while providing reliability comparable to the state of the art for Bayesian estimators. We show how information-theoretic quantities allow for rigorous empirical study of the decision-making capacities of rational agents, and the time-asymmetric flows of information in distributed systems. We provide illustrative examples by reference to ongoing collaborative work on the semantic structure of the British Criminal Court system and the conflict dynamics of the contemporary Afghanistan insurgency.
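The basic move, bootstrapping an information-theoretic estimator to get its sampling variability, can be sketched for plug-in Shannon entropy; the categorical data below are synthetic, and the bias corrections and coarse-graining consistency checks discussed in the paper are omitted:

```python
import numpy as np

rng = np.random.default_rng(4)

def shannon_entropy(samples):
    # Plug-in (maximum-likelihood) entropy estimate, in bits
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Hypothetical categorical outcomes, e.g. verdict categories in court records
data = rng.choice(["a", "b", "c", "d"], size=500, p=[0.4, 0.3, 0.2, 0.1])

B = 1000
boot = np.array([
    shannon_entropy(rng.choice(data, size=data.size, replace=True))
    for _ in range(B)
])
point = shannon_entropy(data)
lo, hi = np.percentile(boot, [2.5, 97.5])  # bootstrap interval for the entropy
```

The spread of `boot` is what allows two corpora (or two time periods) to be compared with an honest error bar rather than bare point estimates.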
Bootstrapping in a language of thought: a formal model of numerical concept learning.
Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D
2012-05-01
In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning.
Financial Development and Economic Growth in European Countries: Bootstrap Causality Analysis
Directory of Open Access Journals (Sweden)
Fuat Lebe
2016-01-01
Full Text Available In the present study, it was investigated whether there was a causality relationship between financial development and economic growth for sixteen European countries. Data from the period of 1988-2012 was analyzed using the bootstrap panel causality test, which takes cross-section dependence and heterogeneity into account. The results of the test showed that there was a strong causality relationship between financial development and economic growth in European countries. In European countries, there was a causality relationship from economic growth to financial development and from financial development to economic growth. These results support both the supply-leading and the demand-following hypotheses. Therefore, it can be said that the feedback hypothesis is valid for European countries.
A Bootstrapping Based Approach for Open Geo-entity Relation Extraction
Directory of Open Access Journals (Sweden)
YU Li
2016-05-01
Full Text Available Extracting spatial relations and semantic relations between two geo-entities from Web texts, asks robust and effective solutions. This paper puts forward a novel approach: firstly, the characteristics of terms (part-of-speech, position and distance are analyzed by means of bootstrapping. Secondly, the weight of each term is calculated and the keyword is picked out as the clue of geo-entity relations. Thirdly, the geo-entity pairs and their keywords are organized into structured information. Finally, an experiment is conducted with Baidubaike and Stanford CoreNLP. The study shows that the presented method can automatically explore part of the lexical features and find additional relational terms which neither the domain expert knowledge nor large scale corpora need. Moreover, compared with three classical frequency statistics methods, namely Frequency, TF-IDF and PPMI, the precision and recall are improved about 5% and 23% respectively.
Multi-level bootstrap analysis of stable clusters in resting-state fMRI.
Bellec, Pierre; Rosa-Neto, Pedro; Lyttelton, Oliver C; Benali, Habib; Evans, Alan C
2010-07-01
A variety of methods have been developed to identify brain networks with spontaneous, coherent activity in resting-state functional magnetic resonance imaging (fMRI). We propose here a generic statistical framework to quantify the stability of such resting-state networks (RSNs), which was implemented with k-means clustering. The core of the method consists in bootstrapping the available datasets to replicate the clustering process a large number of times and quantify the stable features across all replications. This bootstrap analysis of stable clusters (BASC) has several benefits: (1) it can be implemented in a multi-level fashion to investigate stable RSNs at the level of individual subjects and at the level of a group; (2) it provides a principled measure of RSN stability; and (3) the maximization of the stability measure can be used as a natural criterion to select the number of RSNs. A simulation study validated the good performance of the multi-level BASC on purely synthetic data. Stable networks were also derived from a real resting-state study for 43 subjects. At the group level, seven RSNs were identified which exhibited a good agreement with the previous findings from the literature. The comparison between the individual and group-level stability maps demonstrated the capacity of BASC to establish successful correspondences between these two levels of analysis and at the same time retain some interesting subject-specific characteristics, e.g. the specific involvement of subcortical regions in the visual and fronto-parietal networks for some subjects.
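The heart of BASC, bootstrapping a clustering and averaging co-assignment into a stability matrix, can be sketched as follows; the "regions x features" matrix is synthetic with three planted clusters, and a deliberately minimal Lloyd's k-means replaces a production implementation:

```python
import numpy as np

rng = np.random.default_rng(5)

def kmeans(X, k, n_iter=50, rng=rng):
    # Minimal Lloyd's algorithm (a library k-means would normally be used)
    centers = X[rng.choice(X.shape[0], k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic data: 60 "regions", 5 features, three planted clusters
X = np.vstack([rng.normal(m, 0.4, size=(20, 5)) for m in (0.0, 3.0, 6.0)])
n = X.shape[0]

B = 50
stability = np.zeros((n, n))
for _ in range(B):
    idx = np.unique(rng.integers(0, n, size=n))   # bootstrap the observations
    labels = np.full(n, -1)
    labels[idx] = kmeans(X[idx], k=3)
    # Count pairs assigned to the same cluster in this replication
    same = (labels[:, None] == labels[None, :]) & (labels[:, None] >= 0)
    stability += same
stability /= B   # fraction of replications in which two regions co-cluster
```

Thresholding or re-clustering `stability` yields the stable networks; in the multi-level scheme, the same construction is applied per subject and then again across subjects.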
BOBA FRET: bootstrap-based analysis of single-molecule FRET data.
Directory of Open Access Journals (Sweden)
Sebastian L B König
Full Text Available Time-binned single-molecule Förster resonance energy transfer (smFRET) experiments with surface-tethered nucleic acids or proteins make it possible to follow folding and catalysis of single molecules in real time. Due to the intrinsically low signal-to-noise ratio (SNR) in smFRET time traces, research over the past years has focused on the development of new methods to extract discrete states (conformations) from noisy data. However, limited observation time typically leads to pronounced cross-sample variability, i.e., single molecules display differences in the relative population of states and the corresponding conversion rates. Quantification of cross-sample variability is necessary to perform statistical testing in order to assess whether changes observed in response to an experimental parameter (metal ion concentration, the presence of a ligand, etc.) are significant. However, such hypothesis testing has been disregarded to date, precluding robust biological interpretation. Here, we address this problem by a bootstrap-based approach to estimate the experimental variability. Simulated time traces are presented to assess the robustness of the algorithm in conjunction with approaches commonly used in thermodynamic and kinetic analysis of time-binned smFRET data. Furthermore, a pair of functionally important sequences derived from the self-cleaving group II intron Sc.ai5γ (d3'EBS1/IBS1) is used as a model system. Through statistical hypothesis testing, divalent metal ions are shown to have a statistically significant effect on both the thermodynamic and kinetic aspects of their interaction. The Matlab source code used for analysis (bootstrap-based analysis of smFRET data, BOBA FRET), as well as a graphical user interface, is available via http://www.aci.uzh.ch/rna/.
The bootstrapped model--Lessons for the acceptance of intellectual technology.
Lovie, A D
1987-09-01
This paper is intended as a non-technical introduction to a growing aspect of what has been termed 'intellectual technology'. The particular area chosen is the use of simple linear additive models for judgement and decision-making purposes. Such models are said to either outperform, or perform at least as well as, the human judges on which they are based; hence they are said to 'bootstrap' such human inputs. Although the paper provides a fairly comprehensive list of recent applications of such models, from postgraduate selection to judgements of marital happiness, the work concentrates on the topic of credit scoring as an exemplar, that is, the assignment of credit by means of a simple additive rule. The paper also presents a simple system, due to Dawes, for classifying such models according to the form and source of their weights. The paper further discusses the reasons for bootstrapping and that other major phenomenon of such models: that one can rarely distinguish between the prescriptions of such models, however their weights have been arrived at. It is argued that this 'principle of the flat maximum' allows us to develop a technology of judgement. The paper continues with a brief historical survey of the reactions of human experts to such models and their superiority, and suggestions for a better mix of expert and model along human-engineering lines. Finally, after a brief comparison between expert systems and linear additive models, the paper concludes with a brief survey of possible future developments. A short Appendix describes two applications of such models.
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
Burg, van der Eeke; Leeuw, de Jan
1988-01-01
In this paper we discuss the estimation of mean and standard errors of the eigenvalues and category quantifications in generalized non-linear canonical correlation analysis (OVERALS). Starting points are the delta method equations, but the jack-knife and bootstrap are used to provide finite differen
DEFF Research Database (Denmark)
Callesen, Ingeborg; Vesterdal, Lars; Stupak, Inge;
Forest soil plots (N=112) of size 50x50 meters were sampled in 1989-90 (C1) and re-sampled in 2007-9 (C2) by soil auger, producing composite samples from the depths 0-25, 25-50, 50-75 and 75-100 cm. The soils were classified according to the carbon concentration in the uppermost mineral soil horizon (0-25 cm) at C1. Soils with less than 1.8% carbon gained carbon during the 18 yr period, while initially very carbon-rich soils and organic soils (C%>12) lost carbon. We hypothesize that the carbon losses reflect a very slow process of adaptation to the current more aerobic ... especially base-rich loamy soils to take full advantage of the nitrogen deposition and CO2 fertilization effects.
Shimada, Makoto K; Nishida, Tsunetoshi
2017-02-20
Felsenstein's PHYLIP package of molecular phylogeny tools has been used globally since 1980. The programs are receiving renewed attention because of their character-based user interface, which has the advantage of being scriptable for large-scale data studies on supercomputers or massively parallel computing clusters. However, we occasionally found that the output text file of the PHYLIP Consense program displays two or more divided bootstrap values for the same cluster in its result table, and when this happens the output Newick tree file incorrectly assigns only the last value to that cluster, which disturbs correct estimation of a consensus tree. We ascertained the cause of this aberrant behavior in the bootstrapping calculation. Our rewrite of the Consense program source code outputs bootstrap values, without redundancy, in its result table, together with a Newick tree file carrying the appropriate corresponding bootstrap values. Furthermore, we developed an add-on program and a shell script, add_bootstrap.pl and fasta2tre_bs.bsh: the former generates a Newick tree containing the topology and branch lengths inferred from the original data along with valid bootstrap values, and the latter automates the inference of a phylogenetic tree, containing the originally inferred topology and branch lengths with bootstrap values, from multiple unaligned sequences. These programs can be downloaded at: https://github.com/ShimadaMK/PHYLIP_enhance/.
Adams, Jenny; Cheng, Dunlei; Lee, John; Shock, Tiffany; Kennedy, Kathleen; Pate, Scotty
2014-07-01
Physical fitness testing is a common tool for motivating employees with strenuous occupations to reach and maintain a minimum level of fitness. Nevertheless, the use of such tests can be hampered by several factors, including required compliance with US antidiscrimination laws. The Highland Park (Texas) Department of Public Safety implemented testing in 1991, but no single test adequately evaluated its sworn employees, who are cross-trained and serve as police officers and firefighters. In 2010, the department's fitness experts worked with exercise physiologists from Baylor Heart and Vascular Hospital to develop and evaluate a single test that would be equitable regardless of race/ethnicity, disability, sex, or age >50 years. The new test comprised a series of exercises to assess overall fitness, followed by two sequences of job-specific tasks related to firefighting and police work, respectively. The study group of 50 public safety officers took the test; raw data (e.g., the number of repetitions performed or the time required to complete a task) were collected during three quarterly testing sessions. The statistical bootstrap method was then used to determine the levels of performance that would correlate with 0, 1, 2, or 3 points for each task. A sensitivity analysis was done to determine the overall minimum passing score of 17 points. The new physical fitness test and scoring system have been incorporated into the department's policies and procedures as part of the town's overall employee fitness program.
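One way such point cutoffs can be derived with the bootstrap is to resample the raw results and average the resampled quartiles; the repetition counts below are invented, and using quartile-based bands for the 0-3 point scale is an assumption, not the department's documented procedure:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical raw task results: repetitions performed by the 50 officers
reps = rng.integers(15, 61, size=50)

B = 2000
# Bootstrap the quartiles of performance to stabilize the cutoffs
boot_quartiles = np.empty((B, 3))
for b in range(B):
    sample = rng.choice(reps, size=reps.size, replace=True)
    boot_quartiles[b] = np.percentile(sample, [25, 50, 75])

cutoffs = boot_quartiles.mean(axis=0)   # thresholds for 1, 2 and 3 points

def score(raw, cutoffs):
    # 0-3 points depending on which bootstrap-derived band the result falls in
    return int(np.searchsorted(cutoffs, raw, side="right"))
```

Summing such per-task scores and running a sensitivity analysis over candidate totals is what yields an overall minimum passing score like the 17 points reported.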
Adamowski, J. F.; Quilty, J.; Khalil, B.; Rathinasamy, M.
2014-12-01
This paper explores forecasting short-term urban water demand (UWD) (using only historical records) through a variety of machine learning techniques coupled with a novel input variable selection (IVS) procedure. The proposed IVS technique, termed bootstrap rank-ordered conditional mutual information for real-valued signals (brCMIr), is multivariate, nonlinear, nonparametric, and probabilistic. The brCMIr method was tested in a case study using water demand time series for two urban water supply system pressure zones in Ottawa, Canada, to select the most important historical records for use with each machine learning technique in order to generate forecasts of average and peak UWD for the respective pressure zones at lead times of 1, 3, and 7 days ahead. All lead time forecasts are computed using Artificial Neural Networks (ANN) as the base model, and are compared with Least Squares Support Vector Regression (LSSVR), as well as a novel machine learning method for UWD forecasting: the Extreme Learning Machine (ELM). Results from one-way analysis of variance (ANOVA) and Tukey Honest Significant Difference (HSD) tests indicate that the LSSVR and ELM models are the best machine learning techniques to pair with brCMIr. However, ELM has significant computational advantages over LSSVR (and ANN) and provides a new and promising technique to explore in UWD forecasting.
Learning Biological Networks via Bootstrapping with Optimized GO-based Gene Similarity
Energy Technology Data Exchange (ETDEWEB)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.; Baddeley, Robert L.; Riensche, Roderick M.; Jensen, Russell S.; Verhagen, Marc
2010-08-02
Microarray gene expression data provide a unique information resource for learning biological networks using "reverse engineering" methods. However, there are a variety of cases in which we know which genes are involved in a given pathology of interest, but we do not have enough experimental evidence to support the use of fully-supervised/reverse-engineering learning methods. In this paper, we explore a novel semi-supervised approach in which biological networks are learned from a reference list of genes and a partial set of links for these genes extracted automatically from PubMed abstracts, using a knowledge-driven bootstrapping algorithm. We show how new relevant links across genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. We describe an application of this approach to the TGFB pathway as a case study and show how the ensuing results prove the feasibility of the approach as an alternate or complementary technique to fully supervised methods.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emissions in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emissions in both bivariate and multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences, insensitive to the time span as well as the lag length used. The empirical results show unidirectional causality running from energy consumption to carbon emissions in both the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
Kantar, E.; Deviren, B.; Keskin, M.
2011-11-01
We present a study, within the scope of econophysics, of the hierarchical structure of 98 of the largest international companies, including 18 of the largest Turkish companies, from the Banks, Automobile, Software-hardware, Telecommunication Services, Energy and Oil-Gas sectors, viewed as a network of interacting companies. We analyze the daily time series data of the Boerse-Frankfurt and Istanbul Stock Exchange. We examine the topological properties among the companies over the period 2006-2010 by using hierarchical structure methods (the minimal spanning tree (MST) and the hierarchical tree (HT)). The period is divided into three subperiods, namely 2006-2007, 2008 (the year of the global economic crisis), and 2009-2010, in order to test various time windows and observe temporal evolution. We carry out bootstrap analyses to associate a value of statistical reliability to the links of the MSTs and HTs. We also use average linkage clustering analysis (ALCA) in order to better observe the cluster structure. From these studies, we find that the interactions between the Banks/Energy sectors and the other sectors were reduced after the global economic crisis; hence the effects of the Banks and Energy sectors on the correlations of all companies decreased. Telecommunication Services were also greatly affected by the crisis. We also observed that the Automobile and Banks sectors, including Turkish companies as well as some companies from the USA, Japan and Germany, were strongly correlated with each other in all periods.
Economic policy uncertainty and housing returns in Germany: Evidence from a bootstrap rolling window
Directory of Open Access Journals (Sweden)
David Su
2016-06-01
Full Text Available The purpose of this investigation is to study the causal link between economic policy uncertainty (EPU) and housing returns (HR) in Germany. In the estimated vector autoregressive models, we test for stability and find that the short-run relationship between HR and EPU is unstable. As a result, a time-varying approach (a bootstrap rolling-window causality test) is utilized to revisit the dynamic causal link, and we find that EPU has no impact on HR, owing to the stability of the real estate market in Germany. HR does not have significant effects on EPU in most time periods. However, significant feedback (both positive and negative) from HR to EPU is found in several sub-periods, which indicates that the causal link from HR to EPU varies over time. The empirical results do not support the general equilibrium model of government policy choices, in that EPU does not play a role in the real estate market. The basic conclusion is that the German real estate market owes its stability to the social-welfare character and rational institutional arrangement of real estate in Germany, and that the market matters in that it has significant effects on economic policy choices in some periods when negative external shocks occur.
Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime
Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; Holcomb, C. T.; Lao, L. L.; McKee, G. R.; Meneghini, O.; Staebler, G. M.; Grierson, B. A.; Qian, J. P.; Solomon, W. M.; Turnbull, A. D.; Holland, C.; Guo, W. F.; Ding, S. Y.; Pan, C. K.; Xu, G. S.; Wan, B. N.
2016-06-01
Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady-state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long-pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of βp and βN, despite strong internal transport barriers. Good confinement has been achieved with reduced toroidal rotation. These high-βp plasmas challenge the current understanding of energy transport, especially in the electron energy channel. A new turbulent transport model, named TGLF-SAT1, has been developed which improves the transport prediction. Experiments extending these results to long pulse on EAST, based on the physics basis developed at DIII-D, have been conducted. Further investigations will be carried out on EAST as additional auxiliary power comes online in the near term.
Directory of Open Access Journals (Sweden)
Łukasz Lach
2010-06-01
Full Text Available This paper examines the size performance of the Toda-Yamamoto test for Granger causality in the case of trivariate integrated and cointegrated VAR systems. The standard asymptotic distribution theory and the residual-based bootstrap approach are applied. A variety of types of error-term distribution is considered. The impact of misspecification of initial parameters, as well as the influence of an increase in sample size and in the number of bootstrap replications, on the size performance of the Toda-Yamamoto test statistics is also examined. The results of the conducted simulation study confirm that standard asymptotic distribution theory may often cause significant over-rejection. Application of bootstrap methods usually improves the size performance of the Toda-Yamamoto test. However, in some cases the considered bootstrap method also leads to serious size distortion and performs worse than the traditional approach based on the χ2 distribution.
DJANSENA, Alradix; 田中, 宏明; 工藤, 亮
2015-01-01
CFRP has been used in aircraft structures for decades. Although CFRP is light, its lamination is its main weakness. We have developed a new method to increase the probability of detecting delamination in carbon fiber reinforced plastic (CFRP) by narrowing the confidence interval of the changes in natural frequency. The changes in the natural frequency of delaminated CFRP are tiny compared with measurement errors. We use the bootstrap method, a statistical technique that increases the estimation ac...
Directory of Open Access Journals (Sweden)
Campbell Michael J
2004-12-01
Full Text Available Abstract Health-Related Quality of Life (HRQoL) measures are becoming increasingly used in clinical trials as primary outcome measures. Investigators are now asking statisticians for advice on how to analyse studies that have used HRQoL outcomes. HRQoL outcomes, like the SF-36, are usually measured on an ordinal scale. However, most investigators assume that there exists an underlying continuous latent variable that measures HRQoL, and that the actual measured outcomes (the ordered categories) reflect contiguous intervals along this continuum. The ordinal scaling of HRQoL measures means they tend to generate data that have discrete, bounded and skewed distributions. Thus, standard methods of analysis such as the t-test and linear regression that assume Normality and constant variance may not be appropriate. For this reason, conventional statistical advice would suggest that non-parametric methods be used to analyse HRQoL data. The bootstrap is one such computer-intensive non-parametric method for analysing data. We used the bootstrap for hypothesis testing and the estimation of standard errors and confidence intervals for parameters, in four datasets (which illustrate the different aspects of study design). We then compared and contrasted the bootstrap with standard methods of analysing HRQoL outcomes. The standard methods included t-tests, linear regression, summary measures and General Linear Models. Overall, in the datasets we studied, using the SF-36 outcome, bootstrap methods produce results similar to conventional statistical methods. This is likely because the t-test and linear regression are robust to the violations of assumptions that HRQoL data are likely to cause (i.e. non-Normality). While particular to our datasets, these findings are likely to generalise to other HRQoL outcomes, which have discrete, bounded and skewed distributions. Future research with other HRQoL outcome measures, interventions and populations, is required to
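The bootstrap CI estimation the authors apply can be sketched in a few lines (a minimal percentile-bootstrap sketch; the scores below are invented stand-ins for discrete, bounded, skewed HRQoL data, not SF-36 values):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = random.Random(seed)
    boot = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    return (boot[int(alpha / 2 * n_boot)],
            boot[int((1 - alpha / 2) * n_boot) - 1])

# Discrete, bounded, skewed scores (invented values, not real SF-36 data)
scores = [5, 10, 10, 15, 20, 25, 25, 30, 40, 45, 50, 60, 75, 90, 100]
lo, hi = bootstrap_ci(scores)
print(lo <= statistics.mean(scores) <= hi)
```

No Normality assumption is needed: the interval comes directly from the empirical distribution of the resampled statistic.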
Directory of Open Access Journals (Sweden)
Enrico Zio
2008-01-01
Full Text Available In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to the order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in a RBMK-1500 nuclear reactor.
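The order-statistics side of such safety-margin estimation is commonly formalized with Wilks' formula; as a generic illustration (the standard first-order result, not necessarily the paper's exact procedure), the smallest number of code runs whose sample maximum bounds the 95th percentile with 95% confidence can be computed directly:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of runs n such that the sample maximum bounds the
    `coverage` quantile with probability at least `confidence`
    (first-order, one-sided Wilks formula: 1 - coverage**n >= confidence)."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())             # 59 runs for a one-sided 95%/95% bound
print(wilks_sample_size(0.95, 0.99))
```

The bootstrap then supplies a confidence interval around the margin estimated from those runs.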
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making.
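The bias-correction step that distinguishes Bootstrap-DEA from plain DEA rests on a generic bootstrap identity, sketched below on a deliberately biased toy statistic (the sample maximum, with invented scores); this is not the Simar-Wilson DEA algorithm itself:

```python
import random
import statistics

def bootstrap_bias_correct(data, stat, n_boot=1000, seed=1):
    """Generic bootstrap bias correction: corrected = theta_hat - bias,
    where bias = mean(theta*) - theta_hat over bootstrap resamples."""
    rng = random.Random(seed)
    theta_hat = stat(data)
    boot = [stat(rng.choices(data, k=len(data))) for _ in range(n_boot)]
    bias = statistics.mean(boot) - theta_hat
    return theta_hat - bias, bias

# The sample maximum systematically underestimates the population maximum,
# so the estimated bias should be negative (scores are invented).
scores = [0.61, 0.72, 0.55, 0.81, 0.69, 0.77, 0.90, 0.66, 0.73, 0.85]
corrected, bias = bootstrap_bias_correct(scores, max)
print(bias < 0, corrected > max(scores))
```

In the DEA setting the same identity is applied to frontier-based efficiency scores, whose plain estimates are biased toward the frontier.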
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion of the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentiles method, we found wide variability in the prevalence of the different phenotypes (3–100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35–3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
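The percentile-bootstrap interval for a prevalence can be sketched directly (counts below are illustrative, not taken from the 36-case series):

```python
import random

def prevalence_ci(n_cases, n_total, n_boot=5000, alpha=0.05, seed=2):
    """Percentile bootstrap CI for a phenotype prevalence."""
    rng = random.Random(seed)
    sample = [1] * n_cases + [0] * (n_total - n_cases)
    props = sorted(sum(rng.choices(sample, k=n_total)) / n_total
                   for _ in range(n_boot))
    return (props[int(alpha / 2 * n_boot)],
            props[int((1 - alpha / 2) * n_boot) - 1])

# e.g. a phenotype reported in 30 of 36 cases (counts are illustrative)
lo, hi = prevalence_ci(30, 36)
print(round(lo, 3), round(hi, 3))
```

With only 36 cases the interval is wide, which is exactly the point the study makes about small samples in rare diseases.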
Pope, Crystal L.; Crenshaw, D. Michael; Fischer, Travis C.
2016-01-01
We present a preliminary analysis of the inflows and outflows in the narrow-line regions of nearby active galaxies observed with the Gemini Near-Infrared Integral Field Spectrograph (NIFS). In addition to the standard reduction procedure for NIFS data cubes, these observations were treated for multiple sources of noise and artifacts from the adaptive optics observations and the NIFS instrument. This procedure included the following steps: correction of the differential atmospheric refraction, spatial resampling, low-pass Butterworth spatial filtering, removal of the "instrumental fingerprint", and the Richardson-Lucy deconvolution. We compare measurements from NIFS data cubes with and without the additional correction procedures to determine the effect of this data treatment on our scientific results.
Likelihood particle filter based on support vector machines resampling
Institute of Scientific and Technical Information of China (English)
蒋蔚; 伊国兴; 曾庆双
2011-01-01
To cope with state estimation problems of nonlinear/non-Gaussian dynamic systems with weak measurement noise, an improved likelihood particle filter (LPF) algorithm is proposed based on support vector machines (SVM) resampling. First, the algorithm employs the likelihood as the proposal distribution and takes account of the most recent observation, so it is considerably closer to the posterior than the transition prior used as the proposal. Then, the posterior probability density model of the states is estimated by SVM from the current particles and their importance weights during iteration. Finally, after resampling new particles from the given density model, the degeneration problem is solved effectively by these diversified particles. The simulation results show the feasibility and effectiveness of the algorithm.
Directory of Open Access Journals (Sweden)
PEREIRA JOSÉ ERIVALDO
2000-01-01
Full Text Available Bootstrap estimates of the arithmetic mean of the soybean genotypes 'Pickett', 'Peking', PI88788 and PI90763, together with confidence intervals obtained from normal theory and from the bootstrap distribution of this estimator (the bootstrap percentile interval and the BCa, bias-corrected and accelerated, interval) for the differentiation parameter of the susceptibility standard cultivar Lee, are used to classify races of the soybean cyst nematode. The confidence intervals obtained from the bootstrap distribution were narrower and very similar to each other; therefore, the lower limit of the bootstrap percentile confidence interval was taken as the reference level in the bootstrap distributions of the estimator of the arithmetic mean of the differentiating genotypes, allowing estimation of the empirical probability of a positive or negative reaction and, consequently, identification of the most probable race under a given test.
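The percentile and BCa intervals compared in this study follow Efron's standard constructions; a minimal pure-Python sketch (invented data values, not the genotype means):

```python
import math
import random
import statistics

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse normal CDF by bisection (adequate for this sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def bca_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=3):
    """Bias-corrected and accelerated (BCa) bootstrap interval (Efron)."""
    rng = random.Random(seed)
    n = len(data)
    theta = stat(data)
    boot = sorted(stat(rng.choices(data, k=n)) for _ in range(n_boot))
    # bias-correction z0 from the fraction of resamples below theta-hat
    frac = sum(b < theta for b in boot) / n_boot
    z0 = norm_ppf(min(max(frac, 1e-9), 1.0 - 1e-9))
    # acceleration a from the jackknife
    jack = [stat(data[:i] + data[i + 1:]) for i in range(n)]
    jbar = statistics.mean(jack)
    num = sum((jbar - j) ** 3 for j in jack)
    den = 6.0 * sum((jbar - j) ** 2 for j in jack) ** 1.5
    a = num / den if den else 0.0

    def endpoint(level):
        z = z0 + norm_ppf(level)
        adj = norm_cdf(z0 + z / (1.0 - a * z))
        return boot[min(n_boot - 1, max(0, int(adj * n_boot)))]

    return endpoint(alpha / 2), endpoint(1 - alpha / 2)

values = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 9.4, 12.7, 11.9, 10.1]
lo, hi = bca_ci(values)
print(round(lo, 2), round(hi, 2))
```

For symmetric data the BCa interval nearly coincides with the percentile interval; it differs when the bootstrap distribution is skewed or the statistic is biased.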
Tsai, Chia-Ling; Li, Chun-Yi; Yang, Gehua
2008-03-01
Red-free (RF) fundus retinal images and fluorescein angiogram (FA) sequence are often captured from an eye for diagnosis and treatment of abnormalities of the retina. With the aid of multimodal image registration, physicians can combine information to make accurate surgical planning and quantitative judgment of the progression of a disease. The goal of our work is to jointly align the RF images with the FA sequence of the same eye in a common reference space. Our work is inspired by Generalized Dual-Bootstrap Iterative Closest Point (GDB-ICP), which is a fully-automatic, feature-based method using structural similarity. GDB-ICP rank-orders Lowe keypoint matches and refines the transformation computed from each keypoint match in succession. Although GDB-ICP has been shown to be robust to image pairs with illumination difference, the performance is not satisfactory for multimodal and some FA pairs which exhibit substantial non-linear illumination changes. Our algorithm, named Edge-Driven DBICP, modifies generation of keypoint matches for initialization by extracting the Lowe keypoints from the gradient magnitude image, and enriching the keypoint descriptor with global-shape context using the edge points. Our dataset consists of 61 randomly selected pathological sequences, each on average having two RF and 13 FA images. There are a total of 4985 image pairs, out of which 1323 are multimodal pairs. Edge-Driven DBICP successfully registered 93% of all pairs, and 82% of multimodal pairs, whereas GDB-ICP registered 80% and 40%, respectively. Regarding registration of the whole image sequence in a common reference space, Edge-Driven DBICP succeeded in 60 sequences, which is a 26% improvement over GDB-ICP.
Rivola, Alessandro; Troncossi, Marco
2014-02-01
An experimental test campaign was performed on the valve train of a racing motorbike engine in order to get insight into the dynamics of the system. In particular, the valve motion was acquired in cold test conditions by means of a laser vibrometer able to acquire displacement and velocity signals. The valve time-dependent measurements needed to be referred to the camshaft angular position in order to analyse the data in the angular domain, as usually done for rotating machines. To this purpose the camshaft was fitted with a zebra tape whose dark and light stripes were tracked by means of an optical probe. Unfortunately, both manufacturing and mounting imperfections of the employed zebra tape, resulting in stripes with slightly different widths, precluded the possibility of directly obtaining the correct relationship between camshaft angular position and time. In order to overcome this problem, the identification of the zebra tape was performed by means of the original and practical procedure that is the focus of the present paper. The method consists of three main steps: an ad hoc test corresponding to special operating conditions, the computation of the instantaneous angular speed, and the final association of the stripes with the corresponding shaft angular position. The results reported in the paper demonstrate the suitability of the simple procedure for zebra tape identification, performed with the final purpose of implementing a computed order tracking technique for the data analysis.
Verifying interpretive criteria for bioaerosol data using (bootstrap) Monte Carlo techniques.
Spicer, R Christopher; Gangloff, Harry
2008-02-01
A number of interpretive descriptors have been proposed for bioaerosol data due to the lack of health-based numerical standards, but very few have been verified as to their ability to describe a suspect indoor environment. Culturable and nonculturable (spore trap) sampling analysed using the bootstrap version of Monte Carlo simulation (BMC) at several sites during 2003-2006 served as a source of indoor and outdoor data to test various criteria with regard to their variability in characterizing an indoor or outdoor environment. The purpose was to gain some insight into the reliability of some of the interpretive criteria in use, as well as to demonstrate the utility of BMC methods as a generalized technique for validation of various interpretive criteria for bioaerosols. The ratio of nonphylloplane (NP) fungi (total of Aspergillus and Penicillium) to phylloplane (P) fungi (total of Cladosporium, Alternaria, and Epicoccum), or NP/P, is a descriptor that has been used to identify "dominance" of nonphylloplane fungi (NP/P > 1.0), assumed to be indicative of a problematic indoor environment. However, BMC analysis of spore trap and culturable bioaerosol data using the NP/P ratio identified frequent dominance by nonphylloplane fungi in outdoor air. Similarly, the NP/P descriptor indicated dominance of nonphylloplane fungi in buildings with visible mold growth and/or known water intrusion with a frequency often in the range of 0.5 [...]. Fixed numerical criteria for spore trap data of 900 and 1300 spores/m³ for total spores and 750 Aspergillus/Penicillium spores/m³ exhibited similar variability, as did ratios of nonphylloplane to total fungi, phylloplane to total fungi, and indoor/outdoor ratios for total fungal spores. Analysis of bioaerosol data by BMC indicates that numerical levels or descriptors based on dominance of certain fungi are unreliable as criteria for characterizing a given environment. The utility of BMC analysis lies in its generalized application to test mathematically
Directory of Open Access Journals (Sweden)
Gogarten J Peter
2002-02-01
Full Text Available Abstract Background Horizontal gene transfer (HGT) played an important role in shaping microbial genomes. In addition to genes under sporadic selection, HGT also affects housekeeping genes and those involved in information processing, even ribosomal RNA encoding genes. Here we describe tools that provide an assessment and graphic illustration of the mosaic nature of microbial genomes. Results We adapted Maximum Likelihood (ML) mapping to the analyses of all detected quartets of orthologous genes found in four genomes. We have automated the assembly and analyses of these quartets of orthologs given the selection of four genomes. We compared the ML-mapping approach to more rigorous Bayesian probability and Bootstrap mapping techniques. The latter two approaches appear to be more conservative than the ML-mapping approach, but qualitatively all three approaches give equivalent results. All three tools were tested on mitochondrial genomes, which presumably were inherited as a single linkage group. Conclusions In some instances of interphylum relationships we find nearly equal numbers of quartets strongly supporting the three possible topologies. In contrast, our analyses of genome quartets containing the cyanobacterium Synechocystis sp. indicate that a large part of the cyanobacterial genome is related to that of low-GC Gram positives. Other groups that had been suggested as sister groups to the cyanobacteria contain many fewer genes that group with the Synechocystis orthologs. Interdomain comparisons of genome quartets containing the archaeon Halobacterium sp. revealed that Halobacterium sp. shares more genes with Bacteria that live in the same environment than with Bacteria that are more closely related based on rRNA phylogeny. Many of these genes encode proteins involved in substrate transport and metabolism and in information storage and processing. The performed analyses demonstrate that relationships among prokaryotes cannot be accurately
The analysis of slope flattening caused by DEM resampling
Institute of Scientific and Technical Information of China (English)
王宇; 白天路
2013-01-01
Based on a 5 m resolution DEM derived from 1:10000 digital topographic maps, DEMs at 10 m, 20 m and 40 m resolution were obtained by resampling and their slopes computed. Frequency analysis of the slopes shows that DEM slopes decrease as the resolution becomes coarser, i.e., the frequency of smaller slopes increases while the frequency of larger slopes decreases. When the DEM resolution is higher, the differences concentrate in the smaller slopes; as the resolution becomes lower, the distribution of curvature differences becomes more dispersed and concentrates in the larger slopes. Analysis of the profile-curvature frequency shows that profile curvature becomes smaller after resampling to lower resolution, indicating a loss of topographic information and a simplification of the spatial structure of the DEM.
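The slope-flattening effect of resampling can be reproduced on a synthetic 1-D profile (a random-walk stand-in for a DEM; purely illustrative):

```python
import math
import random

def mean_slope_deg(z, dx):
    """Mean central-difference slope magnitude of a profile, in degrees."""
    s = [abs(math.degrees(math.atan((z[i + 1] - z[i - 1]) / (2.0 * dx))))
         for i in range(1, len(z) - 1)]
    return sum(s) / len(s)

def block_mean(z, factor):
    """Aggregate a profile to a coarser resolution by block averaging."""
    return [sum(z[i:i + factor]) / factor
            for i in range(0, len(z) - factor + 1, factor)]

rng = random.Random(4)
z5, h = [], 0.0
for _ in range(2000):                  # synthetic rough profile, 5 m spacing
    h += rng.uniform(-2.0, 2.0)
    z5.append(h)

mean5 = mean_slope_deg(z5, 5.0)                    # 5 m DEM
mean10 = mean_slope_deg(block_mean(z5, 2), 10.0)   # 10 m DEM
mean20 = mean_slope_deg(block_mean(z5, 4), 20.0)   # 20 m DEM
print(mean5 > mean10 > mean20)  # slopes flatten as resolution coarsens
```

Averaging removes short-wavelength relief, so each coarsening step reduces the mean slope, mirroring the paper's frequency-analysis result.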
Directory of Open Access Journals (Sweden)
Dropkin Greg
2009-12-01
Full Text Available Abstract Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0-20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected accelerated (BCa) methods and simulation of the Likelihood Ratio Test lead to Confidence Intervals for Excess Relative Risk (ERR) and tests against the linear model. Results The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37-43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86) and across broad latency ranges. Confidence Intervals for ERR are comparable using Bootstrap and Likelihood Ratio Test methods and BCa 95% Confidence Intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0-20 mSv and 5-500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5-500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and Likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. Except for the pancreas, similar estimates of
Pismensky, Artem L
2015-01-01
A method of calculation of the $\varepsilon$-expansion in the model of a scalar field with $\varphi^3$ interaction, based on conformal bootstrap equations, is proposed. The technique rests on self-consistent skeleton equations involving the full propagator and the full triple vertex. Analytical computation of Fisher's index $\eta$ is performed in the four-loop approximation. The three-loop result coincides with the one obtained previously by the renormalization-group technique, which requires the calculation of a larger number of Feynman diagrams. The four-loop result agrees with the numerical value obtained by other authors.
Energy Technology Data Exchange (ETDEWEB)
Narayan, Paresh Kumar [Department of Accounting, Finance and Economics, Griffith University, Gold Coast (Australia); Prasad, Arti [School of Economics, University of the South Pacific, Suva (Fiji)
2008-02-15
The goal of this paper is to examine any causal effects between electricity consumption and real GDP for 30 OECD countries. We use a bootstrapped causality testing approach and unravel evidence in favour of electricity consumption causing real GDP in Australia, Iceland, Italy, the Slovak Republic, the Czech Republic, Korea, Portugal, and the UK. The implication is that electricity conservation policies will negatively impact real GDP in these countries. However, for the rest of the 22 countries our findings suggest that electricity conservation policies will not affect real GDP. (author)
Learning Semantic Lexicons Using Graph Mutual Reinforcement Based Bootstrapping
Institute of Scientific and Technical Information of China (English)
张奇; 邱锡鹏; 黄萱菁; 吴立德
2008-01-01
This paper presents a method to learn semantic lexicons using a new bootstrapping method based on graph mutual reinforcement (GMR). The approach uses only unlabeled data and a few seed words to learn new words for each semantic category. Different from other bootstrapping methods, we use GMR-based bootstrapping to sort the candidate words and patterns. Experimental results show that the GMR-based bootstrapping approach outperforms existing algorithms on both in-domain and out-of-domain data. Furthermore, it shows that the result depends not only on the size of the corpus but also on its quality.
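The mutual reinforcement between candidate words and extraction patterns can be sketched as a HITS-like power iteration on a bipartite graph (a toy reconstruction, not the authors' implementation):

```python
def gmr_rank(edges, n_words, n_patterns, iters=50):
    """HITS-style mutual reinforcement on a bipartite word-pattern graph:
    a word scores high if extracted by high-scoring patterns, and a pattern
    scores high if it extracts high-scoring words."""
    w = [1.0] * n_words
    p = [1.0] * n_patterns
    for _ in range(iters):
        new_w = [0.0] * n_words
        for wi, pi in edges:
            new_w[wi] += p[pi]
        norm = sum(v * v for v in new_w) ** 0.5
        w = [v / norm for v in new_w]
        new_p = [0.0] * n_patterns
        for wi, pi in edges:
            new_p[pi] += w[wi]
        norm = sum(v * v for v in new_p) ** 0.5
        p = [v / norm for v in new_p]
    return w, p

# toy graph: candidate word 0 is extracted by all three patterns,
# words 1-3 by a single pattern each
edges = [(0, 0), (0, 1), (0, 2), (1, 0), (2, 1), (3, 2)]
w, p = gmr_rank(edges, n_words=4, n_patterns=3)
best = w.index(max(w))
print(best)  # word 0 ranks highest
```

In the bootstrapping loop, the top-ranked words would then be added to the lexicon and the graph rebuilt from newly matched patterns.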
Ramponi, Denise R
2016-01-01
Dental problems are a common complaint in emergency departments in the United States. A wide variety of dental issues are addressed in emergency department visits, such as dental caries, loose teeth, dental trauma, gingival infections, and dry socket syndrome. Review of the most common dental blocks and dental procedures will allow the practitioner the opportunity to make the patient more comfortable and reduce the amount of analgesia the patient will need upon discharge. Familiarity with the dental equipment and tooth and mouth anatomy will help prepare the practitioner to perform these dental procedures.
Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.
2017-02-01
Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows the use of vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, it is necessary to use special sampling techniques to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimation of the rotor position obtained from the angle of the voltage vector is proposed. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single-point defects in the outer race of a bearing under variable speed and load conditions are presented.
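The core angular-resampling step, interpolating a time-sampled vibration signal onto a uniform angle grid given an estimated rotor angle (such as the PLL-derived one described above), can be sketched as follows, with synthetic data:

```python
import math

def angular_resample(x, theta, d_theta):
    """Linearly interpolate a time-sampled signal x onto a uniform angle
    grid; theta (rad) must be monotonically increasing."""
    out, ang, j = [], theta[0], 0
    while ang <= theta[-1]:
        while theta[j + 1] < ang:
            j += 1
        w = (ang - theta[j]) / (theta[j + 1] - theta[j])
        out.append(x[j] + w * (x[j + 1] - x[j]))
        ang += d_theta
    return out

# accelerating shaft: angle grows quadratically with time
t = [i * 1e-3 for i in range(2000)]                      # 1 kHz sampling
theta = [2 * math.pi * (10 * ti + ti * ti) for ti in t]  # rotor angle, rad
order = 5                                # vibration locked to 5x shaft speed
x = [math.sin(order * th) for th in theta]

xa = angular_resample(x, theta, d_theta=2 * math.pi / 64)
# in the angle domain the order component is a pure sinusoid again,
# so an ordinary spectrum of xa concentrates it in a single order bin
```

Linear interpolation suffices here because the time sampling is dense relative to the highest order of interest.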
Hasan, Mahmudul; Tycner, Christopher; Sigut, Aaron; Zavala, Robert T.
2017-01-01
We describe a modified bootstrap Monte Carlo method that was developed to assess quantitatively the impact of systematic residual errors on calibrated optical interferometry data from the Navy Precision Optical Interferometer. A variety of atmospheric and instrumental effects represent the sources of residual systematic errors that remain in the data after calibration, for example when there are atmospheric fluctuations with shorter time scales than the time scale between the observations of calibrator-target pairs. The modified bootstrap Monte Carlo method retains the inherent structure of how the underlying data set was acquired, by accounting for the fact that groups of data points are obtained simultaneously instead of individually. When telescope pairs (baselines) and spectral channels corresponding to a specific output beam from a beam combiner are treated as groups, this method provides more realistic (and typically larger) uncertainties for the fitted model parameters, such as angular diameters of resolved stars, than the standard method based solely on formal errors. This work has been supported by NSF grant AST-1614983.
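The key idea of resampling simultaneously acquired groups rather than individual points can be illustrated generically (synthetic data, not the NPOI pipeline):

```python
import random
import statistics

def group_bootstrap_se(groups, n_boot=2000, seed=5):
    """Bootstrap SE of the grand mean when whole groups of simultaneously
    acquired points are the independent units: resample groups, not points."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        sample = [v for g in rng.choices(groups, k=len(groups)) for v in g]
        reps.append(statistics.mean(sample))
    return statistics.stdev(reps)

def point_bootstrap_se(values, n_boot=2000, seed=5):
    """Naive bootstrap SE that ignores the group structure."""
    rng = random.Random(seed)
    return statistics.stdev(
        statistics.mean(rng.choices(values, k=len(values)))
        for _ in range(n_boot))

# synthetic data: points within a group share a common offset
rng = random.Random(6)
groups = [[off + rng.gauss(0.0, 0.2) for _ in range(5)]
          for off in (rng.gauss(0.0, 1.0) for _ in range(30))]
values = [v for g in groups for v in g]

se_group = group_bootstrap_se(groups)
se_point = point_bootstrap_se(values)
print(se_group > se_point)   # ignoring grouping understates the uncertainty
```

Because points within a group are correlated, treating them as independent inflates the effective sample size and yields optimistically small error bars, which is exactly the failure mode the modified method avoids.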
Eleuteri, A; Fisher, A C; Groves, D; Dewhurst, C J
2012-01-01
The heart rate variability (HRV) signal derived from the ECG is a beat-to-beat record of RR intervals and is, as a time series, irregularly sampled. It is common engineering practice to resample this record, typically at 4 Hz, onto a regular time axis for analysis in advance of time domain filtering and spectral analysis based on the DFT. However, it is recognised that resampling introduces noise and frequency bias. The present work describes the implementation of a time-varying filter using a smoothing priors approach based on a Gaussian process model, which does not require data to be regular in time. Its output is directly compatible with the Lomb-Scargle algorithm for power density estimation. A web-based demonstration is available over the Internet for exemplar data. The MATLAB (MathWorks Inc.) code can be downloaded as open source.
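Since the Lomb-Scargle periodogram is central here, a minimal pure-Python version (textbook formula; the irregular time axis below is synthetic) shows how a spectrum is obtained without resampling onto a regular grid:

```python
import math
import random

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle periodogram for irregularly sampled data;
    freqs in Hz, y assumed (approximately) zero-mean."""
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        yc = sum(yi * ci for yi, ci in zip(y, c))
        ys = sum(yi * si for yi, si in zip(y, s))
        power.append(0.5 * (yc * yc / sum(ci * ci for ci in c) +
                            ys * ys / sum(si * si for si in s)))
    return power

# irregular, RR-interval-like time axis; oscillation at 0.25 Hz
rng = random.Random(7)
t, now = [], 0.0
for _ in range(300):
    now += rng.uniform(0.6, 1.2)          # uneven beat-to-beat spacing
    t.append(now)
y = [math.sin(2 * math.pi * 0.25 * ti) for ti in t]

freqs = [0.01 * k for k in range(1, 51)]  # 0.01 .. 0.50 Hz
p = lomb_scargle(t, y, freqs)
peak = freqs[p.index(max(p))]
print(peak)  # peak recovered near 0.25 Hz without any resampling
```

For production use, `scipy.signal.lombscargle` implements the same periodogram (note it takes angular frequencies).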
A treatment procedure for Gemini North/NIFS data cubes: application to NGC 4151
Menezes, R. B.; Steiner, J. E.; Ricci, T. V.
2014-03-01
We present a detailed procedure for treating data cubes obtained with the Near-Infrared Integral Field Spectrograph (NIFS) of the Gemini North telescope. This process includes the following steps: correction of the differential atmospheric refraction, spatial re-sampling, Butterworth spatial filtering, `instrumental fingerprint' removal and Richardson-Lucy deconvolution. The clearer contours of the structures obtained with the spatial re-sampling, the high spatial-frequency noise removed with the Butterworth spatial filtering, the removed `instrumental fingerprints' (which take the form of vertical stripes along the images) and the improvement of the spatial resolution obtained with the Richardson-Lucy deconvolution result in images with a considerably higher quality. An image of the Brγ emission line from the treated data cube of NGC 4151 allows the detection of individual ionized-gas clouds (almost undetectable without the treatment procedure) of the narrow-line region of this galaxy, which are also seen in an [O III] image obtained with the Hubble Space Telescope. The radial velocities determined for each one of these clouds seem to be compatible with models of biconical outflows proposed by previous studies. Considering the observed improvements, we believe that the procedure we describe in this work may result in more reliable analysis of data obtained with this instrument.
Institute of Scientific and Technical Information of China (English)
徐红鹃; 王锋; 艾宪芸; 邵晖; 魏星
2014-01-01
The characteristics of airborne gamma-ray spectra and the basic principles of the bootstrap method are described in the paper. The 40K window of the gamma spectrum data measured by the GR-820 airborne multichannel gamma spectrometer at a height of 300 m above the ground is taken as a small sample. The bootstrap method is used to infer the distribution parameters of this small sample, and sampling from the inferred distribution yields calculated spectra. The calculated spectra are compared with the spectra measured at a height of 60 m above the ground, and satisfactory agreement was obtained. This provides a feasible technique for rapid airborne gamma-spectrum measurement at low activity levels.
A Hybrid Segmentation Framework for Computer-Assisted Dental Procedures
Hosntalab, Mohammad; Aghaeizadeh Zoroofi, Reza; Abbaspour Tehrani-Fard, Ali; Shirani, Gholamreza; Reza Asharif, Mohammad
Teeth segmentation in computed tomography (CT) images is a major and challenging task for various computer-assisted procedures. In this paper, we introduced a hybrid method for quantification of teeth in CT volumetric datasets inspired by our previous experiences and anatomical knowledge of teeth and jaws. In this regard, we propose a novel segmentation technique using adaptive thresholding, morphological operations, panoramic re-sampling and a variational level set algorithm. The proposed method consists of several steps as follows: First, we determine the operation region in CT slices. Second, the bony tissues are separated from other tissues by utilizing an adaptive thresholding technique based on 3D pulse-coupled neural networks (PCNN). Third, teeth tissue is classified from other bony tissues by employing panorex lines and anatomical knowledge of teeth in the jaws. In this case, the panorex lines are estimated using Otsu thresholding and mathematical morphology operators. The proposed method then calculates the orthogonal lines corresponding to the panorex lines and panoramically re-samples the dataset. Separation of upper and lower jaws and initial segmentation of teeth are performed by employing the integral projections of the panoramic dataset. Based on the above-mentioned procedures, an initial mask for each tooth is obtained. Finally, we utilize the initial mask of teeth and apply a variational level set to refine the initial teeth boundaries to the final contour. In the last step a surface rendering algorithm known as marching cubes (MC) is applied for volumetric visualization. The proposed algorithm was evaluated on 30 cases. Segmented images were compared with manually outlined contours. We compared the performance of the segmentation method, using ROC analysis, against thresholding, watershed and our previous works. The proposed method performed best. Also, our algorithm has the advantage of high speed compared to our previous works.
Hanson, Sonya M.; Ekins, Sean; Chodera, John D.
2015-12-01
All experimental assay data contains error, but the magnitude, type, and primary origin of this error is often not obvious. Here, we describe a simple set of assay modeling techniques based on the bootstrap principle that allow sources of error and bias to be simulated and propagated into assay results. We demonstrate how deceptively simple operations—such as the creation of a dilution series with a robotic liquid handler—can significantly amplify imprecision and even contribute substantially to bias. To illustrate these techniques, we review an example of how the choice of dispensing technology can impact assay measurements, and show how large contributions to discrepancies between assays can be easily understood and potentially corrected for. These simple modeling techniques—illustrated with an accompanying IPython notebook—can allow modelers to understand the expected error and bias in experimental datasets, and even help experimentalists design assays to more effectively reach accuracy and imprecision goals.
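The kind of bootstrap/Monte Carlo assay model described can be sketched for a serial dilution; the error magnitudes below are assumptions for illustration, not measured values:

```python
import random
import statistics

def serial_dilution(rng, steps=8, transfer=100.0, diluent=900.0,
                    cv=0.02, bias=0.01):
    """One simulated 10x serial-dilution chain: each pipetting step has
    random imprecision (cv) plus a systematic relative bias on the
    transferred volume. All error magnitudes here are assumptions."""
    conc = 1.0
    for _ in range(steps):
        vt = transfer * (1.0 + bias) * (1.0 + rng.gauss(0.0, cv))
        vd = diluent * (1.0 + rng.gauss(0.0, cv))
        conc *= vt / (vt + vd)
    return conc

rng = random.Random(8)
finals = [serial_dilution(rng) for _ in range(2000)]
nominal = 0.1 ** 8
ratio = statistics.mean(finals) / nominal        # bias amplification
cv_out = statistics.stdev(finals) / statistics.mean(finals)
print(round(ratio, 3), round(cv_out, 3))
```

A 1% per-step transfer bias compounds into a several-percent concentration bias after eight steps, and the per-step imprecision accumulates into a final CV well above the single-step value, which is the amplification effect the authors describe.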
Alonso-Prieto, Esther; Pancaroglu, Raika; Dalrymple, Kirsten A; Handy, Todd; Barton, Jason J S; Oruc, Ipek
2015-01-01
Prior event-related potential studies using group statistics within a priori selected time windows have yielded conflicting results about familiarity effects in face processing. Our goal was to evaluate the temporal dynamics of the familiarity effect at all time points at the single-subject level. Ten subjects were shown faces of anonymous people or celebrities. Individual results were analysed using a point-by-point bootstrap analysis. While familiarity effects were less consistent at later epochs, all subjects showed them between 130 and 195 ms in occipitotemporal electrodes. However, the relation between the time course of familiarity effects and the peak latency of the N170 was variable. We concluded that familiarity effects between 130 and 195 ms are robust and can be shown in single subjects. The variability of their relation to the timing of the N170 potential may lead to underestimation of familiarity effects in studies that use group-based statistics.
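A point-by-point bootstrap of a condition difference can be sketched as follows; the synthetic "ERP" data, trial counts, and the simple percentile-CI criterion are assumptions for illustration, not the authors' exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

def pointwise_bootstrap(cond_a, cond_b, n_boot=2000, alpha=0.05):
    """Bootstrap the condition-mean difference at every time point; return a
    mask of points whose (1 - alpha) percentile CI excludes zero."""
    n_a, n_t = cond_a.shape
    n_b = cond_b.shape[0]
    diffs = np.empty((n_boot, n_t))
    for i in range(n_boot):
        a = cond_a[rng.integers(0, n_a, n_a)].mean(axis=0)
        b = cond_b[rng.integers(0, n_b, n_b)].mean(axis=0)
        diffs[i] = a - b
    lo = np.percentile(diffs, 100 * alpha / 2, axis=0)
    hi = np.percentile(diffs, 100 * (1 - alpha / 2), axis=0)
    return (lo > 0) | (hi < 0)

# synthetic single-subject "ERP": a familiarity effect confined to 130-195 ms
t = np.arange(300)
effect = np.where((t >= 130) & (t <= 195), 1.0, 0.0)
cond_a = rng.normal(0.0, 1.0, (40, 300)) + effect  # familiar faces
cond_b = rng.normal(0.0, 1.0, (40, 300))           # anonymous faces
sig = pointwise_bootstrap(cond_a, cond_b)
print("fraction significant in effect window:", sig[130:196].mean())
```

Because every time point is tested, isolated false positives at the nominal alpha level are expected outside the true effect window; real analyses usually add a cluster or duration criterion.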
Indian Academy of Sciences (India)
SHATAKSHEE CHATTERJEE; PARTHA P. MAJUMDER; PRIYANKA PANDEY
2016-09-01
Study of temporal trajectory of gene expression is important. RNA sequencing is popular in genome-scale studies of transcription. Because of high expenses involved, many time-course RNA sequencing studies are challenged by inadequacy of sample sizes. This poses difficulties in conducting formal statistical tests of significance of null hypotheses. We propose a bootstrap algorithm to identify ‘cognizable’ ‘time-trends’ of gene expression. Properties of the proposed algorithm are derived using a simulation study. The proposed algorithm captured known ‘time-trends’ in the simulated data with a high probability of success, even when sample sizes were small (n<10). The proposed statistical method is efficient and robust to capture ‘cognizable’ ‘time-trends’ in RNA sequencing data.
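A minimal sketch of a bootstrap check for a "time-trend" in a small time-course sample; the linear-slope statistic and sign-consistency criterion are simplifications assumed here, not necessarily the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

def trend_bootstrap(expr, times, n_boot=2000):
    """Fraction of bootstrap resamples whose fitted slope keeps the
    sign of the observed slope; values near 1 suggest a real trend."""
    slope = np.polyfit(times, expr, 1)[0]
    same_sign, valid = 0, 0
    n = len(expr)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        if np.unique(times[idx]).size < 2:   # degenerate resample, skip
            continue
        b = np.polyfit(times[idx], expr[idx], 1)[0]
        valid += 1
        same_sign += np.sign(b) == np.sign(slope)
    return same_sign / valid

times = np.repeat([0.0, 2.0, 4.0, 8.0], 2)      # n = 8: a small sample
rising = 0.5 * times + rng.normal(0.0, 0.3, 8)  # gene with a time-trend
flat = rng.normal(0.0, 0.3, 8)                  # gene without one
f_rising = trend_bootstrap(rising, times)
f_flat = trend_bootstrap(flat, times)
print(f_rising, f_flat)
```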
Pang, Yi; Rong, Junchen; Su, Ning
2016-12-01
We consider ϕ³ theory in 6−2ε dimensions with F₄ global symmetry. The beta function is calculated up to 3 loops, and a stable unitary IR fixed point is observed. The anomalous dimensions of operators quadratic or cubic in ϕ are also computed. We then employ the conformal bootstrap technique to study the fixed point predicted by the perturbative approach. For each putative scaling dimension of ϕ (Δ_ϕ), we obtain the corresponding upper bound on the scaling dimension of the second-lowest scalar primary in the 26 representation (Δ_26^2nd) which appears in the OPE of ϕ × ϕ. In D = 5.95, we observe a sharp peak on the upper-bound curve located at Δ_ϕ equal to the value predicted by the 3-loop computation. In D = 5, we observe a weak kink on the upper-bound curve at (Δ_ϕ, Δ_26^2nd) = (1.6, 4).
Irurtzun, Aritz
2015-01-01
In recent research, Boeckx and Benítez-Burraco (2014a,b) have advanced the hypothesis that our species-specific language-ready brain should be understood as the outcome of developmental changes that occurred in our species after the split from Neanderthals-Denisovans, which resulted in a more globular braincase configuration in comparison to our closest relatives, who had elongated endocasts. According to these authors, the development of a globular brain is an essential ingredient for the language faculty, and in particular, it is the centrality occupied by the thalamus in a globular brain that allows its modulatory or regulatory role, essential for syntactico-semantic computations. Their hypothesis is that the syntactico-semantic capacities arise in humans as a consequence of a process of globularization, which significantly takes place postnatally (cf. Neubauer et al., 2010). In this paper, I show that Boeckx and Benítez-Burraco's hypothesis makes an interesting developmental prediction regarding the path of language acquisition: it teases apart the onset of phonological acquisition and the onset of syntactic acquisition (the latter starting significantly later, after globularization). I argue that this hypothesis provides a developmental rationale for the prosodic bootstrapping hypothesis of language acquisition (cf. i.a. Gleitman and Wanner, 1982; Mehler et al., 1988, et seq.; Gervain and Werker, 2013), which claims that prosodic cues are employed for syntactic parsing. The literature converges in the observation that a large amount of such prosodic cues (in particular, rhythmic cues) are already acquired before the completion of the globularization phase, which paves the way for the premises of the prosodic bootstrapping hypothesis, allowing babies to have a rich knowledge of the prosody of their target language before they can start parsing the primary linguistic data syntactically.
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
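The parametric bootstrap idea for standard errors of equating can be illustrated with linear (rather than kernel or equipercentile) equating under an assumed normal score model; all names and numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

def linear_equate(x_scores, y_scores, x0):
    """Map score x0 from form X onto the scale of form Y (linear equating)."""
    return (x0 - x_scores.mean()) / x_scores.std() * y_scores.std() + y_scores.mean()

def parametric_bootstrap_se(x_scores, y_scores, x0, n_boot=2000):
    """SE of the equated score under a fitted-normal parametric bootstrap:
    simulate new samples from the fitted model and re-equate each time."""
    mx, sx = x_scores.mean(), x_scores.std()
    my, sy = y_scores.mean(), y_scores.std()
    reps = [linear_equate(rng.normal(mx, sx, x_scores.size),
                          rng.normal(my, sy, y_scores.size), x0)
            for _ in range(n_boot)]
    return np.std(reps)

x = rng.normal(50.0, 10.0, 500)  # simulated form-X scores
y = rng.normal(52.0, 9.0, 500)   # simulated form-Y scores
eq = linear_equate(x, y, 60.0)
se = parametric_bootstrap_se(x, y, 60.0)
print("equated score %.2f, bootstrap SE %.2f" % (eq, se))
```

The same resampling loop applies unchanged to any equating function; only `linear_equate` would be swapped for a kernel or equipercentile implementation.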
An improved sample-and-hold reconstruction procedure for estimation of power spectra from LDA data
Energy Technology Data Exchange (ETDEWEB)
Simon, Laurent [Laboratoire d' Acoustique de l' Universite du Maine, UMR-CNRS 6613, Avenue Messiaen, 72085, Le Mans (France); Fitzpatrick, John [Mechanical Engineering Department, Trinity College, Dublin 2 (Ireland)
2004-08-01
Techniques for deriving the power spectral density (PSD) of turbulence from laser Doppler anemometry (LDA) measurements are reviewed briefly. The low-pass filter and step-noise errors associated with the sample-and-hold process are considered, and a discrete version of the low-pass filter for the resampled signal is derived. This is then used to develop a procedure by which PSD estimates obtained from sample-and-hold measurements can be corrected. The application of the procedure is examined using simulated data, and the results show that the frequency range of the analysis can be extended beyond the Nyquist frequency based on the mean sample rate. The results are shown to be comparable to those obtained using the method of Nobach et al. (1998), but the new procedures are more straightforward to implement. The technique is then used to determine the PSD of real LDA data, and the results are compared with those from a hot-wire anemometer. (orig.)
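The sample-and-hold resampling step (without the correction filter derived in the paper) can be sketched as follows; the arrival rate, tone frequency, and resampling rate are assumed values:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)

fs_resample = 1000.0   # uniform re-sampling rate, Hz (assumed)
mean_rate = 300.0      # mean LDA particle-arrival rate, Hz (assumed)
T = 20.0               # record length, s

# Poisson arrival times; "velocity" = 50 Hz tone plus measurement noise
arrivals = np.cumsum(rng.exponential(1.0 / mean_rate, int(mean_rate * T * 1.2)))
arrivals = arrivals[arrivals < T]
u = np.sin(2 * np.pi * 50.0 * arrivals) + 0.1 * rng.normal(size=arrivals.size)

# sample-and-hold: each uniform sample holds the most recent arrival's value
t_uniform = np.arange(0.0, T, 1.0 / fs_resample)
idx = np.searchsorted(arrivals, t_uniform, side="right") - 1
valid = idx >= 0
u_sh = u[np.clip(idx, 0, None)][valid]

f, psd = welch(u_sh, fs=fs_resample, nperseg=4096)
peak_freq = f[np.argmax(psd)]
print("PSD peak at %.1f Hz" % peak_freq)
```

The raw sample-and-hold PSD still carries the low-pass filtering and step-noise floor discussed in the paper; the correction procedure works on exactly this kind of resampled estimate.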
Bootstrap Analysis of Cointegration Parameters Based on the Johansen Procedure
Institute of Scientific and Technical Information of China (English)
叶光
2009-01-01
Considering both static and dynamic data-generating processes, Monte Carlo simulation is used to compare asymptotic analysis and bootstrap analysis of the long-run parameters from the Johansen procedure in terms of estimation bias, actual test size, and test power. The results show that, compared with asymptotic analysis, bootstrap analysis reduces the deviation of the actual test size from the nominal level, but at the cost of lower test power. Strictly speaking, bootstrap analysis lowers the probability of rejecting a true null hypothesis; if the VAR (Vector Autoregression) model fits the data well, bootstrap analysis may push the actual test size below the nominal level, in which case it should be used with caution. When the Johansen procedure is used to estimate cointegration parameters, abnormal estimates arise easily, so the bootstrap is not well suited to correcting estimation bias.
Institute of Scientific and Technical Information of China (English)
陈雁翔; 吴玺
2012-01-01
Blind forensics can discriminate a tampered signal without any side information. Audio tamperers often apply re-sampling to make a forgery more convincing, so re-sampling detection has gained high importance as a component of blind audio forensics. Re-sampling introduces correlation among samples, and this correlation appears periodically. We use a method based on Expectation Maximization to reveal the correlation and detect tampering by checking whether the correlation is periodic. To achieve better detection, the pipeline also includes singularity prevention, low-band elimination, and a normalized third-order origin moment. The experimental results show the method's robustness to various interpolation functions, and its effectiveness at different re-sampling rates as well as for spliced audio.
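A much-simplified version of the idea (a fixed linear neighbour predictor in place of the paper's EM estimation) still exposes the periodic correlation that interpolation leaves behind:

```python
import numpy as np

rng = np.random.default_rng(5)

def interp_residual_spectrum(x):
    """|residual| of each sample vs. the mean of its neighbours; a sharp
    spectral peak in this sequence betrays interpolation-based re-sampling."""
    r = np.abs(x[1:-1] - 0.5 * (x[:-2] + x[2:]))
    return np.abs(np.fft.rfft(r - r.mean()))

original = rng.normal(size=4000)
# upsample by 2 with linear interpolation (a simple re-sampling forgery)
upsampled = np.interp(np.arange(0, 4000, 0.5), np.arange(4000), original)

s_orig = interp_residual_spectrum(original)
s_up = interp_residual_spectrum(upsampled[:4000])
peak_ratio = lambda s: s[1:].max() / np.median(s[1:])
print("original: %.1f  upsampled: %.1f" % (peak_ratio(s_orig), peak_ratio(s_up)))
```

For the upsampled signal, every interpolated sample is exactly predictable from its neighbours, so the residual magnitude alternates periodically and produces a dominant spectral peak; the untouched signal shows no such peak.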
AlHakeem, Donna Ibrahim
This thesis focuses on short-term photovoltaic forecasting (STPVF) of the power generation of a solar PV system using probabilistic and deterministic forecasts. Uncertainty estimation, in the form of a probabilistic forecast, is emphasized in this thesis to quantify the uncertainties of the deterministic forecasts. Two hybrid intelligent models are proposed in two separate chapters to perform the STPVF. In Chapter 4, the framework of the proposed deterministic hybrid intelligent model is presented: a combination of the wavelet transform (WT), a data-filtering technique, with a soft computing model (SCM), the generalized regression neural network (GRNN). This combined WT+GRNN model is used to produce 1-hour-ahead forecasts of power generation for two random days in each season. The forecasts are analyzed using accuracy measures to determine model performance and are compared with another SCM. In Chapter 5, the proposed model combines the WT, an SCM based on the radial basis function neural network (RBFNN), and population-based stochastic particle swarm optimization (PSO). This deterministic approach, WT+RBFNN+PSO, is followed by a probabilistic forecast that uses bootstrap confidence intervals to quantify the uncertainty of the WT+RBFNN+PSO output. Chapter 5 extends the tests of Chapter 4, forecasting the power generation of two random days in each season for 1-hour-ahead, 3-hour-ahead, and 6-hour-ahead horizons, and additionally for different day types in each season: a sunny day (SD), a cloudy day (CD), and a rainy day (RD). These forecasts are further analyzed using accuracy measures, variance, and uncertainty estimation. The literature that is provided supports that the proposed
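A residual-resampling sketch of bootstrap forecast intervals of the kind described above; the point forecast, residual spread, and interval level are assumed for illustration, not taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(6)

def bootstrap_interval(point_forecast, residuals, n_boot=5000, alpha=0.10):
    """Pseudo-forecasts = point forecast + resampled historical errors;
    the percentiles of the pseudo-forecasts give the interval."""
    sims = point_forecast + rng.choice(residuals, size=n_boot, replace=True)
    return np.percentile(sims, [100 * alpha / 2, 100 * (1 - alpha / 2)])

residuals = rng.normal(0.0, 5.0, 200)  # held-out forecast errors, kW (assumed)
lo, hi = bootstrap_interval(120.0, residuals)
print("90%% interval: [%.1f, %.1f] kW" % (lo, hi))
```

In practice the residuals would come from a validation set of the deterministic model (here, WT+RBFNN+PSO), so the interval width reflects the model's actual error distribution rather than a normality assumption.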
Directory of Open Access Journals (Sweden)
Raéf Bahrini
2017-02-01
Full Text Available This paper measures and analyzes the technical efficiency of Islamic banks in the Middle East and North Africa (MENA) region during the period 2007–2012. To do this, the bootstrap Data Envelopment Analysis (DEA) approach was employed in order to provide a robust estimation of the overall technical efficiency and its components: pure technical efficiency and scale efficiency in the case of MENA Islamic banks. The main results show that over the period of study, pure technical inefficiency was the main source of overall technical inefficiency instead of scale inefficiency. This finding was confirmed for all MENA Islamic banks as well as for the two subsamples: Gulf Cooperation Council (GCC) and non-GCC Islamic banks. Furthermore, our results show that GCC Islamic banks had stable efficiency scores during the global financial crisis (2007–2008) and in the early post-crisis period (2009–2010). However, a decline in overall technical efficiency of all panels of MENA Islamic banks was recorded in the last two years of the study period (2011–2012). Thus, we recommend that MENA Islamic bank managers focus more on improving their management practices rather than increasing their sizes. We also recommend that financial authorities in MENA countries implement several regulatory and financial measures in order to ensure the development of MENA Islamic banking.
Shimada, Hirohiko; Hikami, Shinobu
2016-12-01
The fractal dimensions of polymer chains and high-temperature graphs in the Ising model, both in three dimensions, are determined using the conformal bootstrap applied to the continuation of the O(N) models from N=1 (Ising model) to N=0 (polymer). Even for non-integer N, the O(N) sum rule allows one to study the unitarity bound formally defined from positivity, which may be violated in a non-unitary CFT. This unitarity bound on the scaling dimension of the O(N) symmetric tensor develops a kink as a function of the fundamental field dimension, as in the case of the energy operator dimension in the Z_2 (Ising) sum rule. Although this kink structure becomes less pronounced as N tends to zero, we find instead an emerging asymmetric minimum in the current central charge C_J. Despite the non-unitarity of the O(N) model at non-integer N, we find that the C_J-kink along the unitarity bound lies very close to the location of the infrared (IR) O(N) CFT estimated by other methods. It is pointed out that certain level degeneracies at the IR CFT should induce these singular shapes of the unitarity bounds. As an application to quantum and classical spin systems, we also predict critical exponents associated with the N=1 supersymmetry, which could be relevant for locating the corresponding fixed point in the phase diagram.
Iha, Hisashi; Suzuki, Hiroshi
2016-01-01
We study four-dimensional conformal field theories with an $SU(N)$ global symmetry by employing the numerical conformal bootstrap. We consider the crossing relation associated with a four-point function of a spin~$0$ operator~$\\phi_i^{\\Bar{k}}$ which belongs to the adjoint representation of~$SU(N)$. For~$N=12$ for example, we found that the theory contains a spin~$0$ $SU(12)$-breaking relevant operator if the scaling dimension of~$\\phi_i^{\\Bar{k}}$, $\\Delta_{\\phi_i^{\\Bar{k}}}$, is smaller than~$1.63$. Considering the lattice simulation of the many-flavor QCD with $12$~flavors on the basis of the staggered fermion, the above $SU(12)$-breaking relevant operator, if it exists, would be induced by the flavor breaking effect of the staggered fermion and would prevent an approach to an infrared fixed point. Actual lattice simulations do not show such signs. Thus, assuming the absence of the above $SU(12)$-breaking relevant operator, we have an upper bound on the mass anomalous dimension at the fixed point~$\\gamma_m...
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, A.M. Robert
Empirical evidence from time series methods which assume the usual I(0)/I(1) paradigm suggests that the efficient market hypothesis, stating that spot and futures prices of a commodity should cointegrate with a unit slope on futures prices, does not hold. However, these statistical methods...... fractionally integrated model we are able to find a body of evidence in support of the efficient market hypothesis for a number of commodities. Our new tests are wild bootstrap implementations of score-based tests for the order of integration of a fractionally integrated time series. These tests are designed...... principle do. A Monte Carlo simulation study demonstrates that very significant improvements in finite-sample behaviour can be obtained by the bootstrap vis-à-vis the corresponding asymptotic tests in both heteroskedastic and homoskedastic environments.
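The wild bootstrap resampling scheme mentioned above can be illustrated on a plain regression slope (rather than the score-based fractional-integration tests of the paper); residuals are re-signed by Rademacher draws so each observation's error variance is preserved:

```python
import numpy as np

rng = np.random.default_rng(7)

def wild_bootstrap_se(x, y, n_boot=2000):
    """Wild-bootstrap SE of the OLS slope: fitted values plus residuals
    flipped in sign by i.i.d. Rademacher weights, robust to heteroskedasticity."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    fitted, resid = X @ beta, y - X @ beta
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=len(y))
        slopes[i] = np.linalg.lstsq(X, fitted + resid * w, rcond=None)[0][1]
    return slopes.std()

x = rng.normal(0.0, 1.0, 200)
y = 1.0 + 0.5 * x + np.abs(x) * rng.normal(0.0, 1.0, 200)  # heteroskedastic errors
se = wild_bootstrap_se(x, y)
print("wild-bootstrap slope SE: %.3f" % se)
```

Because each bootstrap residual keeps the magnitude of the original one, the scheme mimics the unknown heteroskedastic error distribution, which is exactly the property the paper exploits for its tests.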
Energy Technology Data Exchange (ETDEWEB)
Gonzalez-Manteiga, W.; Prada-Sanchez, J.M.; Fiestras-Janeiro, M.G.; Garcia-Jurado, I. (Universidad de Santiago de Compostela, Santiago de Compostela (Spain). Dept. de Estadistica e Investigacion Operativa)
1990-11-01
A statistical study of the dependence between various critical fusion temperatures of a certain kind of coal and its chemical components is carried out. As well as using classical dependence techniques (multiple, stepwise and PLS regression, principal components, canonical correlation, etc.) together with the corresponding inference on the parameters of interest, non-parametric regression and bootstrap inference are also performed. 11 refs., 3 figs., 8 tabs.
Directory of Open Access Journals (Sweden)
Hojin Moon
2006-08-01
Full Text Available A computational tool for testing for a dose-related trend and/or a pairwise difference in the incidence of an occult tumor via an age-adjusted bootstrap-based poly-k test and the original poly-k test is presented in this paper. The poly-k test (Bailer and Portier 1988) is a survival-adjusted Cochran-Armitage test, which achieves robustness to the effects of differential mortality across dose groups. The original poly-k test is asymptotically standard normal under the null hypothesis. However, the asymptotic normality is not valid if there is a deviation from the tumor onset distribution that is assumed in this test. Our age-adjusted bootstrap-based poly-k test assesses the significance of assumed asymptotic normal tests and investigates the empirical distribution of the original poly-k test statistic using an age-adjusted bootstrap method. A tumor of interest is an occult tumor for which the time to onset is not directly observable. Since most animal carcinogenicity studies are designed with a single terminal sacrifice, the present tool is applicable to rodent tumorigenicity assays that have a single terminal sacrifice. The present tool takes input information from a user screen and reports testing results back to the screen through a user interface. The computational tool is implemented in C/C++ and is applied to analyze a real data set as an example. Our tool enables the FDA and the pharmaceutical industry to implement a statistical analysis of tumorigenicity data from animal bioassays via our age-adjusted bootstrap-based poly-k test and the original poly-k test, which has been adopted by the National Toxicology Program as its standard statistical test.
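A sketch of the survival-adjusted poly-k trend statistic with k = 3 (normal approximation only, without the paper's age-adjusted bootstrap); the bioassay numbers are hypothetical:

```python
import numpy as np

def poly_k_trend(dose_levels, tumors, death_weeks, groups, k=3, t_max=104.0):
    """Poly-k survival-adjusted Cochran-Armitage trend statistic (normal
    approximation). Tumor-bearing animals get weight 1; others (t/t_max)**k."""
    w = np.where(tumors == 1, 1.0, (death_weeks / t_max) ** k)
    x = np.array([tumors[groups == g].sum() for g in range(len(dose_levels))])
    n = np.array([w[groups == g].sum() for g in range(len(dose_levels))])
    p = x.sum() / n.sum()
    d = dose_levels - np.average(dose_levels, weights=n)
    return np.sum(d * x) / np.sqrt(p * (1 - p) * np.sum(d**2 * n))

# hypothetical bioassay: 4 dose groups of 50, rising incidence, early deaths
counts = (2, 5, 10, 20)
tumors = np.concatenate([np.r_[np.ones(c), np.zeros(50 - c)]
                         for c in counts]).astype(int)
groups = np.repeat(np.arange(4), 50)
death_weeks = np.full(200, 104.0)
death_weeks[tumors == 1] = 90.0                  # tumor animals die earlier
early = np.where((groups == 3) & (tumors == 0))[0][:16]
death_weeks[early] = 52.0                        # competing mortality, top dose
z = poly_k_trend(np.array([0.0, 1.0, 2.0, 4.0]), tumors, death_weeks, groups)
print("poly-3 trend z = %.2f" % z)               # compare to N(0,1)
```

The down-weighting of animals dying early without a tumor is what makes the test robust to differential mortality; the paper's bootstrap replaces the N(0,1) reference with an empirical null distribution.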
Institute of Scientific and Technical Information of China (English)
付聪; 练士龙; 李强
2011-01-01
Objective: To explore the estimation of action potential conduction velocity (APCV) from surface electromyography (sEMG) signals. Methods: Based on a physiological simulation model of the sEMG signal, the APCV estimate was acquired using cross-correlation time-delay estimation, and a re-sampling technique was utilized to improve the estimation precision. Results: The experimental results showed that the APCV estimation error was clearly reduced for the re-sampled simulated signals. Conclusion: The proposed method can effectively acquire satisfactory APCV estimates.
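The cross-correlation delay estimate with re-sampling (upsampling) can be sketched as follows; the sampling rate, electrode spacing, and upsampling factor are assumed values, not those of the paper:

```python
import numpy as np
from scipy.signal import resample

rng = np.random.default_rng(8)

fs = 1024.0            # sEMG sampling rate, Hz (assumed)
d = 0.01               # inter-electrode distance, m (assumed)
true_delay = 2.4 / fs  # a non-integer number of samples

t = np.arange(0.0, 0.5, 1.0 / fs)
base = np.convolve(rng.normal(size=t.size), np.hanning(15), "same")
ch1 = base
ch2 = np.interp(t - true_delay, t, base)  # delayed copy of channel 1

up = 8                                     # re-sampling (upsampling) factor
a = resample(ch1, t.size * up)
b = resample(ch2, t.size * up)
lag = (np.argmax(np.correlate(b, a, "full")) - (a.size - 1)) / (fs * up)
apcv = d / lag
print("delay %.5f s -> APCV %.2f m/s" % (lag, apcv))
```

Without upsampling, the lag is quantized to whole samples of 1/fs, which for sub-sample physiological delays translates into large velocity errors; the finer grid after re-sampling is what reduces them.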
Design and Implementation of a Bootstrap-based Responsive Webpage
Institute of Scientific and Technical Information of China (English)
舒后; 熊一帆; 葛雪娇
2016-01-01
To make webpages display compatibly across mobile devices and ensure a good user experience, responsive web design techniques are proposed. This paper studies the web front-end development framework Bootstrap and analyzes its role in responsive webpage design. Taking the introduction of a digital media technology major as the background, it designs and builds a Bootstrap-based responsive website, achieving consistent page display between mobile devices and PCs.
Kent, Robert; Belitz, Kenneth; Fram, Miranda S.
2014-01-01
The Priority Basin Project (PBP) of the Groundwater Ambient Monitoring and Assessment (GAMA) Program was developed in response to the Groundwater Quality Monitoring Act of 2001 and is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The GAMA-PBP began sampling, primarily of public-supply wells, in May 2004. By the end of February 2006, seven (of what would eventually be 35) study units had been sampled over a wide area of the State. Selected wells in these first seven study units were resampled for water quality from August 2007 to November 2008 as part of an assessment of temporal trends in water quality by the GAMA-PBP. The initial sampling was designed to provide a spatially unbiased assessment of the quality of raw groundwater used for public water supplies within the seven study units. In the 7 study units, 462 wells were selected by using a spatially distributed, randomized grid-based method to provide statistical representation of the study area. Wells selected this way are referred to as grid wells or status wells. Approximately 3 years after the initial sampling, 55 of these previously sampled status wells (approximately 10 percent in each study unit) were randomly selected for resampling. The seven resampled study units, the total number of status wells sampled for each study unit, and the number of these wells resampled for trends are as follows, in chronological order of sampling: San Diego Drainages (53 status wells, 7 trend wells), North San Francisco Bay (84, 10), Northern San Joaquin Basin (51, 5), Southern Sacramento Valley (67, 7), San Fernando–San Gabriel (35, 6), Monterey Bay and Salinas Valley Basins (91, 11), and Southeast San Joaquin Valley (83, 9). The groundwater samples were analyzed for a large number of synthetic organic constituents (volatile organic compounds [VOCs], pesticides, and pesticide degradates), constituents of special interest (perchlorate, N
Neural network pruning with Tukey-Kramer multiple comparison procedure.
Duckro, Donald E; Quinn, Dennis W; Gardner, Samuel J
2002-05-01
Reducing a neural network's complexity improves the ability of the network to generalize future examples. Like an overfitted regression function, neural networks may miss their target because of the excessive degrees of freedom stored up in unnecessary parameters. Over the past decade, the subject of pruning networks produced nonstatistical algorithms like Skeletonization, Optimal Brain Damage, and Optimal Brain Surgeon as methods to remove connections with the least salience. The method proposed here uses the bootstrap algorithm to estimate the distribution of the model parameter saliences. Statistical multiple comparison procedures are then used to make pruning decisions. We show this method compares well with Optimal Brain Surgeon in terms of ability to prune and the resulting network performance.
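The bootstrap estimation of salience distributions can be illustrated on a linear model standing in for a network layer; the 5th-percentile pruning rule below is a deliberate simplification of the paper's Tukey-Kramer multiple-comparison step:

```python
import numpy as np

rng = np.random.default_rng(9)

def bootstrap_saliences(X, y, n_boot=2000):
    """Bootstrap distribution of |coefficient| 'saliences' of a linear model."""
    n = len(y)
    sal = np.empty((n_boot, X.shape[1]))
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        sal[i] = np.abs(np.linalg.lstsq(X[idx], y[idx], rcond=None)[0])
    return sal

n = 300
X = rng.normal(size=(n, 4))
y = X @ np.array([2.0, 0.7, 0.0, 0.0]) + rng.normal(0.0, 1.0, n)  # two dead weights
sal = bootstrap_saliences(X, y)
lower = np.percentile(sal, 5, axis=0)   # lower bound of each salience
prune = lower < 0.1                     # prune weights that may be negligible
print("prune mask:", prune)
```

The point, as in the paper, is that pruning decisions rest on an estimated *distribution* of each salience rather than a single point estimate, so connections are only removed when their salience is reliably small.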
Bootstrapping Time Dilation Decoherence
Gooding, Cisco
2015-01-01
We present a general relativistic model of a spherical shell of matter with a perfect fluid on its surface coupled to an internal oscillator, which generalizes a model recently introduced by the authors to construct a self-gravitating interferometer [1]. The internal oscillator evolution is defined with respect to the local proper time of the shell, allowing the oscillator to serve as a local clock that ticks differently depending on the shell's position and momentum. A Hamiltonian reduction is performed on the system, and an approximate quantum description is given to the reduced phase space. If we focus only on the external dynamics, we must trace out the clock degree of freedom, and this results in a form of intrinsic decoherence that shares some features with a proposed "universal" decoherence mechanism attributed to gravitational time dilation [2]. We show that the proposed decoherence remains present in the (gravity-free) limit of flat spacetime, indicating that the effect can be attributed entirely to ...
Energy Technology Data Exchange (ETDEWEB)
Rastelli, Leonardo [C.N. Yang Institute for Theoretical Physics, Stony Brook University, Stony Brook, NY (United States)
2016-04-15
This contribution collects background material and references for the overview talk that was delivered in the 21st European string workshop and 3rd COST MP1210 meeting, 'The String Theory Universe'. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Rahman, Syed Fazle
2015-01-01
If you are a web developer who has a basic understanding of Ruby on Rails, this is the book for you. You should definitely have previous knowledge about HTML and how it works. However, CSS and JavaScript knowledge is optional for this book.
Energy Technology Data Exchange (ETDEWEB)
Blandford, Roger; Funk, Stefan; /KIPAC, Menlo Park
2007-10-10
Recent observations with TeV telescopes strongly indicate that young supernova remnants are capable of accelerating cosmic ray protons almost to PeV energies. On quite general grounds, this, in turn, suggests that the magnetic field strength must be enhanced above the standard interstellar value by about two orders of magnitude. It is suggested that protons and electrons are accelerated through diffusive shock acceleration, with the highest energy protons streaming furthest ahead of the shock front. It is then shown that the pressure of the ~300 TeV protons dominates that of the ambient thermal particles and magnetic field and is likely to be sufficiently anisotropic to render the pre-shock fluid unstable to resonant and non-resonant instability. A new theory of the non-resonant instabilities is outlined. The nonlinear evolution of these instabilities requires careful numerical simulation but it is conjectured that the magnetic field is amplified in this location and provides the means for efficient acceleration of progressively lower energy particles as it is convected towards the subshock in the thermal plasma. Further possible implications of these ideas are sketched.
Wechsung, Frank; Wechsung, Maximilian
2016-11-01
The STatistical Analogue Resampling Scheme (STARS) statistical approach was recently used to project changes of climate variables in Germany corresponding to a supposed degree of warming. We show by theoretical and empirical analysis that STARS simply transforms interannual gradients between warmer and cooler seasons into climate trends. According to STARS projections, summers in Germany will inevitably become drier and winters wetter under global warming. Due to the dominance of negative interannual correlations between precipitation and temperature during the year, STARS has a tendency to generate a net annual decrease in precipitation under mean German conditions. Furthermore, according to STARS, the annual level of global radiation would increase in Germany. STARS can still be used, e.g., for generating scenarios in vulnerability and uncertainty studies. However, it is not suitable as a climate downscaling tool to assess risks arising from changing climate at spatial scales finer than that of a general circulation model (GCM).
A Comparison of Error Propagation Techniques in Astrophysics: Monte Carlo vs. Bootstrap
Zabot, Alexandre; Baptista, Raymundo
2005-07-01
This work presents a comparative study of two numerical algorithms used for error propagation in experimental data: the Monte Carlo method and the Bootstrap method. Recently, Dhillon & Watson argued that applying the Monte Carlo method introduces noise into the data, and proposed the Bootstrap as an alternative capable of producing superior results. The goal of this work is to test the validity of that claim. The two techniques were applied to three different problems: fitting simple LTE emission models and stellar-atmosphere models to observed stellar spectra, and fitting eclipse light curves of Cataclysmic Variables to determine the radial brightness distribution of their accretion discs. The methods were tested for robustness, i.e., their ability to provide mutually consistent results, and their solutions were compared. The results indicate no evidence that one method is superior to the other.
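The two error-propagation schemes compared in the work can be sketched on a toy line fit: Monte Carlo re-perturbs the data with the known measurement error, while the bootstrap resamples the observed points; the data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(10)

x = np.linspace(0.0, 10.0, 50)
sigma = 0.5                     # known measurement error
y = 2.0 + 0.3 * x + rng.normal(0.0, sigma, x.size)

def fit_slope(xv, yv):
    return np.polyfit(xv, yv, 1)[0]

# Monte Carlo: perturb the data with the known error and refit
mc = [fit_slope(x, y + rng.normal(0.0, sigma, x.size)) for _ in range(2000)]

# Bootstrap: resample (x, y) pairs with replacement and refit
bs = [fit_slope(x[i], y[i]) for i in
      (rng.integers(0, x.size, x.size) for _ in range(2000))]

print("MC slope sd %.4f, bootstrap slope sd %.4f" % (np.std(mc), np.std(bs)))
```

When the error model is correct and homoskedastic, the two spreads agree closely; they diverge when the assumed measurement errors are wrong, which is the regime where the comparison in the work becomes interesting.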
Directory of Open Access Journals (Sweden)
NI PUTU AYU DINITA TRISNAYANTI
2015-06-01
Full Text Available In this research, bootstrap methods are used to determine the inference points of biplot figures in AMMI analysis. If the environmental factors are assumed to be random factors, then Mixed AMMI is used as the model of analysis. In the stability analysis, the interaction principal component scores used are KUI1 and KUI2. The purpose of this study is to determine the biplot figures based on the two KUI scores with the greatest diversity under the Mixed AMMI model, and their inference points obtained using the bootstrap method. The stable genotypes obtained from the AMMI2 biplot are G1, G5, and G6. Based on the inference points of each genotype, G1 and G5 can be regarded as the most stable genotypes. This is because the distributions of G1 and G5 are the closest to the center point (0,0) and both of them have a small radius.
Energy Technology Data Exchange (ETDEWEB)
Greene, David L [ORNL; Duleep, Dr. K. G. [Energy and Environmental Analysis, Inc., an ICF Company
2008-10-01
The North American Proton Exchange Membrane (PEM) fuel cell industry may be at a critical juncture. A large-scale market for automotive fuel cells appears to be several years away and in any case will require a long-term, coordinated commitment by government and industry to ensure the co-evolution of hydrogen infrastructure and fuel cell vehicles (Greene et al., 2008). The market for non-automotive PEM fuel cells, on the other hand, may be much closer to commercial viability (Stone, 2006). Cost targets are less demanding and manufacturers appear to be close, perhaps within a factor of two, to meeting them. Hydrogen supply is a significant obstacle to market acceptance but may not be as great a barrier as it is for hydrogen-powered vehicles due to the smaller quantities of hydrogen required. PEM fuel cells appear to be potentially competitive in two markets: (1) backup power (BuP) supply, and (2) electrically-powered material handling equipment (MHE) (Mahadevan et al., 2007a, 2007b). There are several Original Equipment Manufacturers (OEMs) of PEM fuel cell systems for these applications but production levels have been quite low (on the order of 100-200 per year) and cumulative production experience is also limited (on the order of 1,000 units to date). As a consequence, costs remain above target levels and PEM fuel cell OEMs are not yet competitive in these markets. If cost targets can be reached and acceptable solutions to hydrogen supply found, a sustainable North American PEM fuel cell industry could be established. If not, the industry and its North American supply chain could disappear within a year or two. The Hydrogen Fuel Cell and Infrastructure Technologies (HFCIT) program of the U.S. Department of Energy (DOE) requested a rapid assessment of the potential for a government acquisition program to bootstrap the market for non-automotive PEM fuel cells by driving down costs via economies of scale and learning-by-doing. The six week study included in-depth interviews of three manufacturers
Computerized procedures system
Lipner, Melvin H.; Mundy, Roger A.; Franusich, Michael D.
2010-10-12
An online data driven computerized procedures system that guides an operator through a complex process facility's operating procedures. The system monitors plant data, processes the data and then, based upon this processing, presents the status of the current procedure step and/or substep to the operator. The system supports multiple users and a single procedure definition supports several interface formats that can be tailored to the individual user. Layered security controls access privileges and revisions are version controlled. The procedures run on a server that is platform independent of the user workstations that the server interfaces with and the user interface supports diverse procedural views.
Dynamic Allocation of Firewall Links Combining Machine Learning with Resampling
Institute of Scientific and Technical Information of China (English)
陈渝龙
2015-01-01
Dynamically allocating network firewall connections improves network security and the detection of attack data. Traditional methods design the dynamic allocation with multi-stage finite-queueing theory, which confuses the analytical queueing model the firewall uses for attack sequences and yields poor attack-capture performance. This paper proposes a dynamic allocation method for firewall links that combines resampling with machine learning. Incoming data packets enter the firewall model and queue for processing; an all-pole central-moment matrix of the firewall link information flow is defined; attack features are mined by Bayesian parameter estimation; and resampling is used to improve the firewall's resistance to interference and to obtain the resampled spectrum. The method raises the effective arrival rate of packet information and converts the dynamic allocation of firewall links into an optimized design based on dot products of data vectors. Experiments show that the algorithm effectively improves the detection of attack data, with a low error rate and superior performance.
Conflict among Testing Procedures?
1982-04-01
Daniel F. Kohler, April 1982. The Rand Paper Series, P-6765. 1. Introduction: Savin [1976] and Berndt and Savin [1977] ...
Directory of Open Access Journals (Sweden)
Carlos Montenegro Silva
2009-01-01
Full Text Available The performance of different sample sizes for estimating the size composition of squat lobster (Pleuroncodes monodon) catches was analyzed using a computational resampling procedure. The data were gathered in May 2002 between 29°10'S and 32°10'S. From these, seven fishing-trip sampling scenarios (1-7 trips), twelve scenarios for the number of individuals sampled (25, 50, ..., 300, in steps of 25), and two strategies for sampling tows within a fishing trip (a census of all tows and systematic tow sampling) were tested. Combining all of these scenarios made it possible to analyze the performance of 168 sample-size scenarios for estimating the size composition by sex. The results indicated that the error index for the estimated size-frequency distribution decreased as the number of fishing trips increased, with progressively smaller decreases between adjacent scenarios. Likewise, the error index decreased as the number of individuals sampled increased, with only marginal improvements above 175 individuals.
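The resampling evaluation of sample size can be sketched as follows, with an invented length-frequency population standing in for the catch data; the error index here is simply the mean absolute deviation between resampled and "true" frequencies.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical "true" carapace-length population (mm) with two modes.
population = np.concatenate([rng.normal(28, 3, 60_000),
                             rng.normal(35, 3, 40_000)])
bins = np.arange(15.0, 50.0, 1.0)
true_freq, _ = np.histogram(population, bins=bins, density=True)

def error_index(n_sampled, n_resamples=200):
    """Mean absolute deviation between resampled and true length frequencies."""
    errs = []
    for _ in range(n_resamples):
        sample = rng.choice(population, size=n_sampled, replace=True)
        freq, _ = np.histogram(sample, bins=bins, density=True)
        errs.append(np.abs(freq - true_freq).mean())
    return float(np.mean(errs))

# The error index falls as more individuals are sampled, with diminishing
# returns -- the qualitative pattern reported in the abstract.
for n in (25, 100, 300):
    print(n, round(error_index(n), 4))
```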
Nortey, Ezekiel N. N.; Ansah-Narh, Theophilus; Asah-Asante, Richard; Minkah, Richard
2015-01-01
Although there is extensive literature on procedures for forecasting or predicting election results, in Ghana only opinion-poll strategies have been used. To fill this gap, this paper develops Markov chain models for forecasting the 2016 presidential election results at the Regional, Zonal (i.e. Savannah, Coastal and Forest) and National levels, using past presidential election results of Ghana. The methodology develops a model for predicting the 2016 presidential election results...
Energy Technology Data Exchange (ETDEWEB)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.; Baddeley, Robert L.; Riensche, Roderick M.; Jensen, Russell S.; Verhagen, Marc; Pustejovsky, James
2011-02-18
Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.
2010-02-01
Traumatic experiences can produce post-traumatic stress disorder (PTSD) which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55) can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall accuracy of classification. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and which possess the potential for assessing and monitoring disease progression and effects of therapy.
Bathe, Klaus-Jürgen
2015-01-01
Finite element procedures are now an important and frequently indispensable part of engineering analyses and scientific investigations. This book focuses on finite element procedures that are very useful and are widely employed. Formulations for the linear and nonlinear analyses of solids and structures, fluids, and multiphysics problems are presented, appropriate finite elements are discussed, and solution techniques for the governing finite element equations are given. The book presents general, reliable, and effective procedures that are fundamental and can be expected to be in use for a long time. The given procedures form also the foundations of recent developments in the field.
Cardiac Procedures and Surgeries
Cardiac Procedures and Surgeries. Updated: Sep 16, 2016. Which procedure you need depends on the degree of coronary artery disease (CAD) you have. Angioplasty: also known as percutaneous coronary intervention (PCI), ...
DEFF Research Database (Denmark)
Aldashev, Gani; Kirchsteiger, Georg; Sebald, Alexander Christopher
2009-01-01
It is a persistent finding in psychology and experimental economics that people's behavior is not only shaped by outcomes but also by decision-making procedures. In this paper we develop a general framework capable of modelling these procedural concerns. Within the context of psychological games we...
DEFF Research Database (Denmark)
Werlauff, Erik
The book contains an up-to-date survey of Danish civil procedure after the profound Danish procedural reforms in 2007. It deals with questions concerning competence and function of Danish courts, commencement and preparation of civil cases, questions of evidence and burden of proof, international...
Institute of Scientific and Technical Information of China (English)
陈淑静; 马天才
2009-01-01
A regularized particle resampling algorithm is proposed to address particle degeneracy in particle filtering for infrared imaging denoising. The algorithm obtains a particle cloud {(x_k^j, n_j)}_{j=1}^{m} from particle-swarm resampling, which restores particle diversity and overcomes particle impoverishment. Auxiliary particles v are then added to mark the particles whose weights will be large for the next observation, making the particle weights ω_k^i ∝ p(x_k | y_{k-1}^i) / p(x_k | μ_{k-1}^i) more stable, and an infrared imaging denoising model of a moving object is established. Simulation experiments show that the regularized resampling algorithm with auxiliary particles achieves good infrared denoising, with imaging definition above 95%.
DEFF Research Database (Denmark)
Hammar, Emil
Through the theories of play by Gadamer (2004) and Henricks (2006), I will show how the relationship between play and game can be understood as dialectic and disruptive, thus challenging understandings of how the procedures of games determine player activity and vice versa. As such, I posit some...... analytical consequences for understandings of digital games as procedurally fixed (Bogost, 2006; Flanagan, 2009; Brathwaite & Sharp, 2010). That is, if digital games are argued to be procedurally fixed and if play is an appropriative and dialectic activity, then it could be argued that the latter affects...
2016-01-01
The Nuss procedure is now the preferred operation for surgical correction of pectus excavatum (PE). It is a minimally invasive technique, whereby one to three curved metal bars are inserted behind the sternum in order to push it into a normal position. The bars are left in situ for three years and then removed. This procedure significantly improves quality of life and, in most cases, also improves cardiac performance. Previously, the modified Ravitch procedure was used with resection of cartilage and the use of posterior support. This article details the new modified Nuss procedure, which requires the use of shorter bars than specified by the original technique. This technique facilitates the operation as the bar may be guided manually through the chest wall and no additional stabilizing sutures are necessary. PMID:27747185
Hemodialysis access procedures
Hemodialysis access procedures (MedlinePlus encyclopedia, article 007641). An access is needed for you to get hemodialysis; the access is where you receive hemodialysis ...
Canalith Repositioning Procedure
The canalith repositioning procedure can help relieve benign paroxysmal positional vertigo (BPPV), a condition in which you have brief episodes of dizziness that occur when you move your head. Vertigo usually comes from a problem with the part ...
... does it cost? As a rule, almost all cosmetic surgery is considered "elective" and is not typically covered ... the premier specialty group representing dermatologists performing all procedures: cosmetic, general, ...
Procedures for Sampling Vegetation
US Fish and Wildlife Service, Department of the Interior — This report outlines vegetation sampling procedures used on various refuges in Region 3. The importance of sampling the response of marsh vegetation to management...
Extracting Sentiment Words Using a Pattern-Based Bootstrapping Method
Institute of Scientific and Technical Information of China (English)
王昌厚; 王菲
2014-01-01
Sentiment (or opinion) lexicons play an important role in sentiment analysis. With the blooming of net neologisms, it is necessary to identify new sentiment words and improve current sentiment lexicons. This paper proposes a pattern-based bootstrapping method that extracts sentiment words from microblogs. The experimental results validate the effectiveness of the method: a large number of previously unrecorded sentiment words are extracted with reasonable precision.
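One bootstrapping pass of the pattern idea can be sketched on a toy corpus (every sentence and seed word below is invented): a conjunction pattern proposes candidate words appearing next to known sentiment words, and accepted candidates seed the next pass.

```python
import re

# Toy corpus and seed lexicon -- every sentence and seed is invented.
corpus = [
    "the film was good and enjoyable",
    "service was slow and disappointing",
    "a truly enjoyable and heartwarming story",
    "the plot felt dull and disappointing",
]
seeds = {"good", "slow"}

def bootstrap_pass(lexicon):
    """One pass: the pattern '<known> and <candidate>' proposes new words."""
    found = set(lexicon)
    for sent in corpus:
        for a, b in re.findall(r"(\w+) and (\w+)", sent):
            if a in found:
                found.add(b)
            if b in found:
                found.add(a)
    return found

lexicon = set(seeds)
for _ in range(3):          # iterate; newly accepted words seed the next pass
    lexicon = bootstrap_pass(lexicon)
print(sorted(lexicon))
```

Real systems add a precision filter before accepting candidates; this sketch accepts every pattern match, which is why seed quality matters so much in practice.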
Mobile Energy Laboratory Procedures
Energy Technology Data Exchange (ETDEWEB)
Armstrong, P.R.; Batishko, C.R.; Dittmer, A.L.; Hadley, D.L.; Stoops, J.L.
1993-09-01
Pacific Northwest Laboratory (PNL) has been tasked to plan and implement a framework for measuring and analyzing the efficiency of on-site energy conversion, distribution, and end-use application on federal facilities as part of its overall technical support to the US Department of Energy (DOE) Federal Energy Management Program (FEMP). The Mobile Energy Laboratory (MEL) Procedures establish guidelines for specific activities performed by PNL staff. PNL provided sophisticated energy monitoring, auditing, and analysis equipment for on-site evaluation of energy use efficiency. Specially trained engineers and technicians were provided to conduct tests in a safe and efficient manner with the assistance of host facility staff and contractors. Reports were produced to describe test procedures, results, and suggested courses of action. These reports may be used to justify changes in operating procedures, maintenance efforts, system designs, or energy-using equipment. The MEL capabilities can subsequently be used to assess the results of energy conservation projects. These procedures recognize the need for centralized MEL administration, test procedure development, operator training, and technical oversight. This need is evidenced by increasing requests for MEL use and the economies available by having trained, full-time MEL operators and near continuous MEL operation. DOE will assign new equipment and upgrade existing equipment as new capabilities are developed. The equipment and trained technicians will be made available to federal agencies that provide funding for the direct costs associated with MEL use.
Arianespace streamlines launch procedures
Lenorovitch, Jeffrey M.
1992-06-01
Ariane has entered a new operational phase in which launch procedures have been enhanced to reduce the length of launch campaigns, lower mission costs, and increase operational availability/flexibility of the three-stage vehicle. The V50 mission utilized the first vehicle from a 50-launcher production lot ordered by Arianespace, and was the initial flight with a stretched third stage that enhances Ariane's performance. New operational procedures were introduced gradually over more than a year, starting with the V42 launch in January 1991.
Procedure and Program Examples
Britz, Dieter
Here some modules, procedures and whole programs are described that may be useful to the reader, as they have been to the author. They are all in Fortran 90/95 and start with a generally useful module, which will be used in most procedures and programs in the examples, and another module useful for programs using a Rosenbrock variant. The source texts (except for the two modules) are not reproduced here, but can be downloaded from the web site www.springerlink.com/openurl.asp?genre=issue&issn=1616-6361&volume=666.
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2015-05-01
Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories.
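A minimal sketch of a non-parametric 1D bootstrap CI for a mean trajectory, on invented subject-by-node data: resampling subjects and calibrating the band on the maximum deviation over the whole trajectory gives simultaneous (1D) rather than pointwise (0D) coverage.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented dataset: 20 subjects x 101 time nodes (e.g. 0-100% of stance).
n_subj, n_time = 20, 101
t = np.linspace(0.0, 1.0, n_time)
true_mean = np.sin(2 * np.pi * t)
data = true_mean + rng.normal(0.0, 0.4, (n_subj, n_time))

# Non-parametric 1D bootstrap CI for the mean trajectory: resample subjects
# and calibrate the band on the MAXIMUM deviation across the whole
# trajectory, so coverage is simultaneous (1D), not pointwise (0D).
n_boot = 1000
mean_traj = data.mean(axis=0)
max_dev = np.empty(n_boot)
for i in range(n_boot):
    boot = data[rng.integers(0, n_subj, n_subj)]
    max_dev[i] = np.abs(boot.mean(axis=0) - mean_traj).max()
h = float(np.quantile(max_dev, 0.95))
lower, upper = mean_traj - h, mean_traj + h
print("simultaneous half-width:", round(h, 3))
```

A pointwise (0D) band built node-by-node would be visibly narrower than this simultaneous band, which is exactly the bias the paper warns about.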
DEFF Research Database (Denmark)
Pilegaard, Hans Kristian
2016-01-01
The Nuss procedure is now the preferred operation for surgical correction of pectus excavatum (PE). It is a minimally invasive technique, whereby one to three curved metal bars are inserted behind the sternum in order to push it into a normal position. The bars are left in situ for three years...
Straightening out Legal Procedures
Institute of Scientific and Technical Information of China (English)
无
2011-01-01
China’s top legislature mulls giving the green light to class action litigation. The long-awaited amendment of China’s Civil Procedure Law has taken a crucial step. On October 28, the Standing Committee of the National People’s Congress (NPC), China’s top legislature, reviewed a draft amendment to the law for the first time.
Educational Accounting Procedures.
Tidwell, Sam B.
This chapter of "Principles of School Business Management" reviews the functions, procedures, and reports with which school business officials must be familiar in order to interpret and make decisions regarding the school district's financial position. Among the accounting functions discussed are financial management, internal auditing,…
Anxiety Around Medical Procedures
... understand that these procedures are necessary for fighting cancer, your child may not understand, and it is often hard ...
Institute of Scientific and Technical Information of China (English)
孟祥松; 张福民; 曲兴华
2015-01-01
Frequency-modulated continuous-wave (FMCW) laser ranging is one of the most interesting techniques for precision distance metrology. It is a promising candidate for absolute distance measurement at large standoff distances (10 to 100 m) with high precision and accuracy, and no cooperative target is needed during the measuring process. How to improve the measurement resolution in practice has been the research focus of FMCW laser ranging in recent years. An FMCW laser ranging system converts the measurement of flight time into a frequency measurement, and in theory the ranging resolution is determined by the tuning range of the optical frequency sweep. The main factor reducing the resolution is the tuning nonlinearity of the laser source, which introduces error points into the sampled signal. A dual-interferometer FMCW laser ranging system is therefore adopted in this paper. Compared with the traditional Michelson scheme, an assistant interferometer is added. The assistant interferometer has an all-fiber Mach-Zehnder configuration, and its delay is at least twice the optical path difference (OPD) of the main interferometer. Because it provides the reference length, the length of the fiber must remain unchanged. The interference signal is obtained on a photodetector. At the time points of every peak and trough of the auxiliary interferometer signal, the beating signal from the main interferometer is re-sampled. The original samples are equally spaced in time, whereas the re-sampled signal is equally spaced in optical frequency. Based on this property of the re-sampled signal, a method of splicing the re-sampled signal to optimize the signal processing is proposed, by which the limitation imposed by the tuning range of the laser source can be overcome and high precision easily obtained. A simple high-speed measuring method is also proposed. Based on all the above principles, the two-fiber optical
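The equal-optical-frequency re-sampling step can be sketched numerically (all sweep and delay values below are invented): sampling the main beat at the extrema of the auxiliary beat removes the sweep nonlinearity, leaving a single clean tone whose frequency encodes the delay ratio of the two interferometers.

```python
import numpy as np

# Simulated FMCW beat signals with a nonlinear optical-frequency sweep.
# tau_aux (assistant Mach-Zehnder delay) exceeds twice the main delay tau_main.
n = 50_000
t = np.linspace(0.0, 1.0, n)
nu = 2.0e6 * (t + 0.15 * t**2)          # nonlinear optical-frequency sweep (arb.)
tau_main, tau_aux = 3.0e-5, 1.0e-4
main = np.cos(2 * np.pi * nu * tau_main)
aux = np.cos(2 * np.pi * nu * tau_aux)

# Re-sample the main beat at every peak/trough of the auxiliary beat: the
# extrema occur at equal optical-frequency steps of 1/(2*tau_aux), so the
# re-sampled main beat becomes a single tone despite the sweep nonlinearity.
d = np.diff(aux)
extrema = np.where(np.signbit(d[:-1]) != np.signbit(d[1:]))[0] + 1
resampled = main[extrema]

# The tone's frequency (cycles per re-sample) is tau_main / (2 * tau_aux).
spec = np.abs(np.fft.rfft(resampled * np.hanning(resampled.size)))
peak_bin = int(np.argmax(spec[1:]) + 1)
ratio_est = 2.0 * peak_bin / resampled.size
print("estimated delay ratio:", round(ratio_est, 3), " true:", tau_main / tau_aux)
```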
Security bootstrapping model of key pre-sharing in MANET
Institute of Scientific and Technical Information of China (English)
吴畏; 彭茜; 冯力; 张剑
2011-01-01
A random key pre-sharing model based on a one-way hash function and a (t, n) threshold scheme over a Lagrange interpolation polynomial group is proposed to implement security bootstrapping in mobile ad hoc network (MANET) environments. The model involves two phases: pre-sharing keys based on the one-way hash function and the Lagrange interpolation polynomial group, and recovering the secret key based on a threshold digital signature scheme. The one-way hash approach effectively prevents the split key pieces in a key pool from being exposed. The threshold digital signature also detects and blocks DoS attacks and other malicious fraudulent behavior during key reconstruction and recovery. Simulation experiments were performed to validate the approach in terms of the success rate of establishing secure links, computational complexity, the security of the bootstrapping process, the network's capability to recover from compromised nodes, resistance to various routing attacks, and network scale. The simulation results show that the approach hardens the security of the MANET environment with good performance.
Directory of Open Access Journals (Sweden)
AA. Khoshkhonejad
1994-06-01
Full Text Available Nowadays, owing to recent developments and research in dental science, it is possible to preserve and restore previously extracted cases, such as teeth with extensive caries, fractured teeth or teeth less appropriate for crown coverage, as well as teeth with external perforation caused by restorative pins. In order to restore teeth while preserving the periodontium, we should thoroughly understand the physiology of the periodontium and protect the biologic width, which is formed by the epithelial attachment and the supracrestal connective tissue attachment. Respecting the biologic width is one of the principal rules of tooth restoration; otherwise we may destroy periodontal tissues. Several factors are involved in placing a restoration, and one of the most important is where the restoration margin is terminated. Many studies have been conducted on the possible effects of the restoration margin on the gingiva, and these studies concluded that the restoration margin should be finished supragingivally. However, when we have to end the restoration under the gingival crest, a healthy gingival sulcus is required first, and we should not invade the biologic width. Since the normal biologic width is reported to be 2 mm and sound tooth tissue should be placed at least 2 mm coronal to the epithelial tissue, the distance between sound tooth tissue and the crown margin should be at least 4 mm. Thus, performing crown lengthening is essential to increase the clinical crown length. Basically, two objectives are considered: (1) restorative and (2) esthetic (gummy smile). The surgical procedure includes gingivectomy and flap procedures. The orthodontic procedure involves orthodontic extrusion, or the forced-eruption technique, which is the controlled vertical movement of teeth into occlusion; this procedure can also be used to extrude tooth defects from the gingival tissue. With crown lengthening, tooth extraction is not required and, furthermore, adjacent teeth preparation for placing a fixed
A Specification Test of Stochastic Diffusion Models
Institute of Scientific and Technical Information of China (English)
Shu-lin ZHANG; Zheng-hong WEI; Qiu-xiang BI
2013-01-01
In this paper, we propose a hypothesis-testing approach to checking model mis-specification in continuous-time stochastic diffusion models. The key idea behind the development of our test statistic is rooted in the generalized information equality in the context of martingale estimating equations. We propose a bootstrap resampling method to implement the diagnostic procedure numerically. Through intensive simulation studies, we show that our approach performs well in terms of type I error control, power, and computational efficiency.
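A generic sketch of implementing such a diagnostic by bootstrap resampling, on a toy Ornstein-Uhlenbeck diffusion; the statistic used here (residual kurtosis) merely stands in for the paper's information-equality statistic, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_ou(n, theta, sigma, dt=0.01):
    """Euler scheme for the toy diffusion dX = -theta*X dt + sigma dW."""
    dw = rng.normal(0.0, np.sqrt(dt), n - 1)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = x[i - 1] * (1.0 - theta * dt) + sigma * dw[i - 1]
    return x

def fit_and_stat(x, dt=0.01):
    """Estimate (theta, sigma^2) and a mis-specification statistic.
    The statistic contrasts the residual kurtosis with its Gaussian value;
    it is an illustration, not the paper's exact statistic."""
    dx = np.diff(x)
    theta = -np.sum(x[:-1] * dx) / (np.sum(x[:-1] ** 2) * dt)
    resid = dx + theta * x[:-1] * dt
    sigma2 = resid.var() / dt
    z = resid / resid.std()
    return theta, sigma2, abs((z ** 4).mean() - 3.0)

x = simulate_ou(2000, theta=1.0, sigma=0.5)
theta_hat, sigma2_hat, t_obs = fit_and_stat(x)

# Parametric bootstrap: re-simulate from the fitted model to approximate the
# null distribution of the statistic, then read off a bootstrap p-value.
t_boot = [fit_and_stat(simulate_ou(2000, theta_hat, np.sqrt(sigma2_hat)))[2]
          for _ in range(200)]
p_value = float(np.mean([tb >= t_obs for tb in t_boot]))
print("bootstrap p-value:", p_value)
```

Since the data here are simulated from the fitted model class, a small p-value should occur only at the nominal rate; rejecting for small p flags mis-specification.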
Recent Advances and Trends in Nonparametric Statistics
Akritas, MG
2003-01-01
The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o
Loop electrosurgical excisional procedure.
Mayeaux, E J; Harper, M B
1993-02-01
Loop electrosurgical excisional procedure, or LEEP, also known as loop diathermy treatment, loop excision of the transformation zone (LETZ), and large loop excision of the transformation zone (LLETZ), is a new technique for outpatient diagnosis and treatment of dysplastic cervical lesions. This procedure produces good specimens for cytologic evaluation, carries a low risk of affecting childbearing ability, and is likely to replace cryotherapy or laser treatment for cervical neoplasias. LEEP uses low-current, high-frequency electrical generators and thin stainless steel or tungsten loops to excise either lesions or the entire transformation zone. Complication rates are comparable to cryotherapy or laser treatment methods and include bleeding, incomplete removal of the lesion, and cervical stenosis. Compared with other methods, the advantages of LEEP include: removal of abnormal tissue in a manner permitting cytologic study, low cost, ease of acquiring necessary skills, and the ability to treat lesions with fewer visits. Patient acceptance of the procedure is high. Widespread use of LEEP by family physicians can be expected.
Usuelli, F G; Montrasio, U Alfieri
2012-06-01
Flexible flatfoot is one of the most common deformities. Arthroereisis procedures are designed to correct this deformity. Among them, the calcaneo-stop is a procedure with both biomechanical and proprioceptive properties. It is designed for pediatric treatment. Results similar to endorthesis procedure are reported. Theoretically the procedure can be applied to adults if combined with other procedures to obtain a stable plantigrade foot, but medium-term follow up studies are missing.
A Compressed Sensing and Resampling Based Noise Suppression Method for NMR
Institute of Scientific and Technical Information of China (English)
聂莉莎; 蒋滨; 张许; 刘买利
2016-01-01
Sensitivity enhancement is an everlasting topic in analytical chemistry, and the two basic routes are enhancing the signal and suppressing the noise. In nuclear magnetic resonance (NMR) spectroscopy, signal enhancement can be achieved with high-field spectrometers, cryogenic probes, and/or sophisticated pulse sequences, but these approaches are often associated with dramatically increased cost. The other way to enhance sensitivity is to de-noise the data by post-processing, which is more cost-effective and attractive. Based on the statistical resampling principle, we previously developed an NMR data post-processing method named NASR (an effective approach for simultaneous noise and artifact suppression in NMR spectroscopy), which reduces unwanted noise and suppresses artifacts in one-dimensional and multi-dimensional NMR experiments. In practice, however, the optimal parameter setting for NASR is often difficult to achieve. In this study, compressed sensing (CS) was incorporated into the original NASR approach, resulting in a novel and robust noise suppression method for NMR experiments (CS_NASR), which removes the influence of subjective parameter choices and improves the robustness of the processing results.
Directory of Open Access Journals (Sweden)
Simionescu, Mihaela
2014-12-01
Full Text Available The necessity of improving forecast accuracy grew in the context of the recent economic crisis, but few researchers have so far been interested in finding empirical strategies to improve their predictions. In this article, for inflation rate forecasts on the horizon 2010-2012, we show that one-step-ahead forecasts based on updated AR(2) models for Romania and ARMA(1,1) models for Bulgaria could be substantially improved by generating new predictions using the Monte Carlo method and the bootstrap technique to simulate the models' coefficients. We introduce a new methodology for constructing forecasts, using the limits of bias-corrected-accelerated (BCa) bootstrap intervals for the initial data series of the variable to predict. After evaluating the accuracy of the new forecasts, we found that all the proposed strategies improved the initial AR(2) and ARMA(1,1) forecasts. These techniques also improved the predictions of forecasting experts for Romania and the forecasts of the European Commission for Bulgaria. Our own method based on the lower limits of the BCa intervals generated the best forecasts. Uncertainty analysis was introduced into the ARMA-based forecasting process by calculating, under the hypothesis of normal distribution, the probability that the predicted value exceeds a critical value. For 2013, in both countries, we anticipate a decrease in the degree of uncertainty of the annual inflation rate.
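The bootstrap side of this methodology can be sketched as follows. The paper uses bias-corrected-accelerated (BCa) intervals, which are more involved; for brevity this Python sketch simulates AR(2) coefficients with a plain residual bootstrap and reports a percentile interval over the one-step-ahead forecast (coefficient uncertainty only, on invented synthetic data):

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ar2(y):
    """Least-squares fit of y[t] = c + a1*y[t-1] + a2*y[t-2] + e[t]."""
    X = np.column_stack([np.ones(len(y) - 2), y[1:-1], y[:-2]])
    coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
    return coef, y[2:] - X @ coef

def bootstrap_forecasts(y, n_boot=1000):
    """Residual bootstrap: rebuild the series from resampled residuals,
    refit, and collect the one-step-ahead point forecast each time."""
    coef, resid = fit_ar2(y)
    centered = resid - resid.mean()
    out = []
    for _ in range(n_boot):
        e = rng.choice(centered, size=len(resid), replace=True)
        yb = np.empty(len(y))
        yb[:2] = y[:2]
        for t in range(2, len(y)):
            yb[t] = coef[0] + coef[1] * yb[t - 1] + coef[2] * yb[t - 2] + e[t - 2]
        cb, _ = fit_ar2(yb)
        out.append(cb[0] + cb[1] * y[-1] + cb[2] * y[-2])
    return np.array(out)

# Simulate a stationary AR(2) series, then form a 95% percentile interval.
n = 300
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.2 + 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.normal(0, 1)

fc = bootstrap_forecasts(y)
lo, hi = np.percentile(fc, [2.5, 97.5])
```

The article's strategy of forecasting from the limits of the interval would here correspond to using `lo` (or `hi`) instead of the central point forecast.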
CELT optics Alignment Procedure
Mast, Terry S.; Nelson, Jerry E.; Chanan, Gary A.; Noethe, Lothar
2003-01-01
The California Extremely Large Telescope (CELT) is a project to build a 30-meter diameter telescope for research in astronomy at visible and infrared wavelengths. The current optical design calls for a primary, secondary, and tertiary mirror with Ritchey-Chrétien foci at two Nasmyth platforms. The primary mirror is a mosaic of 1080 actively stabilized hexagonal segments. This paper summarizes a CELT report that describes a step-by-step procedure for aligning the many degrees of freedom of the CELT optics.
Interventional radiology neck procedures.
Zabala Landa, R M; Korta Gómez, I; Del Cura Rodríguez, J L
2016-05-01
Ultrasonography has become extremely useful in the evaluation of masses in the head and neck. It enables us to determine the anatomic location of the masses as well as the characteristics of the tissues that compose them, thus making it possible to orient the differential diagnosis toward inflammatory, neoplastic, congenital, traumatic, or vascular lesions, although it is necessary to use computed tomography or magnetic resonance imaging to determine the complete extension of certain lesions. The growing range of interventional procedures, mostly guided by ultrasonography, now includes biopsies, drainages, infiltrations, sclerosing treatments, and tumor ablation.
How to Bootstrap Anonymous Communication
DEFF Research Database (Denmark)
Jakobsen, Sune K.; Orlandi, Claudio
2015-01-01
… formal study in this direction. To solve this problem, we introduce the concept of anonymous steganography: think of a leaker Lea who wants to leak a large document to Joe the journalist. Using anonymous steganography Lea can embed this document in innocent-looking communication on some popular website (such as cat videos on YouTube or funny memes on 9GAG). Then Lea provides Joe with a short key $k$ which, when applied to the entire website, recovers the document while hiding the identity of Lea among the large number of users of the website. Our contributions include introducing and formally defining anonymous steganography…
Optimization of Bootstrapping in Circuits
Benhamouda, Fabrice; Lepoint, Tancrède; Mathieu, Claire; Zhou, Hang
2016-01-01
In 2009, Gentry proposed the first Fully Homomorphic Encryption (FHE) scheme, an extremely powerful cryptographic primitive that makes it possible to perform computations, i.e., to evaluate circuits, on encrypted data without decrypting them first. This has many applications, in particular in cloud computing. In all currently known FHE schemes, encryptions are associated with some (non-negative integer) noise level, and at each evaluation of an AND gate, the noise level increases. This is problematic bec...
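The noise bookkeeping described above can be mimicked with a toy model. This is purely illustrative (real FHE noise growth is scheme-specific, and the paper's optimization problem concerns where to place bootstrap operations in a circuit, not the greedy rule used here):

```python
# Toy noise bookkeeping (not a real FHE scheme): every ciphertext carries an
# integer noise level, an AND gate outputs max(inputs) + 1, and a level that
# would exceed NOISE_LIMIT forces a bootstrap, which resets a level to 1.
# The greedy "bootstrap the noisier input" rule is an invented heuristic.
NOISE_LIMIT = 3

def and_gate(a, b, stats):
    """Combine two noise levels through an AND gate, bootstrapping if needed."""
    if max(a, b) + 1 > NOISE_LIMIT:
        if a >= b:          # greedily refresh the noisier input
            a = 1
        else:
            b = 1
        stats["bootstraps"] += 1
    return max(a, b) + 1

stats = {"bootstraps": 0}
level = 1                    # a fresh encryption starts at noise level 1
for _ in range(10):          # a chain of 10 ANDs against fresh inputs
    level = and_gate(level, 1, stats)
```

Because bootstrapping is by far the most expensive operation, minimizing the count tracked in `stats["bootstraps"]` over a whole circuit is exactly the kind of optimization problem the paper studies.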
Regulations and Procedures Manual
Energy Technology Data Exchange (ETDEWEB)
Young, Lydia J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2011-07-25
The purpose of the Regulations and Procedures Manual (RPM) is to provide LBNL personnel with a reference to University and Lawrence Berkeley National Laboratory (LBNL or Laboratory) policies and regulations by outlining normal practices and answering most policy questions that arise in the day-to-day operations of Laboratory organizations. Much of the information in this manual has been condensed from detail provided in LBNL procedure manuals, Department of Energy (DOE) directives, and Contract DE-AC02-05CH11231. This manual is not intended, however, to replace any of those documents. RPM sections on personnel apply only to employees who are not represented by unions. Personnel policies pertaining to employees represented by unions may be found in their labor agreements. Questions concerning policy interpretation should be directed to the LBNL organization responsible for the particular policy. A link to the Managers Responsible for RPM Sections is available on the RPM home page. If it is not clear which organization is responsible for a policy, please contact Requirements Manager Lydia Young or the RPM Editor.
Regulations and Procedures Manual
Energy Technology Data Exchange (ETDEWEB)
Young, Lydia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2010-09-30
The purpose of the Regulations and Procedures Manual (RPM) is to provide Laboratory personnel with a reference to University and Lawrence Berkeley National Laboratory policies and regulations by outlining the normal practices and answering most policy questions that arise in the day-to-day operations of Laboratory departments. Much of the information in this manual has been condensed from detail provided in Laboratory procedure manuals, Department of Energy (DOE) directives, and Contract DE-AC02-05CH11231. This manual is not intended, however, to replace any of those documents. The sections on personnel apply only to employees who are not represented by unions. Personnel policies pertaining to employees represented by unions may be found in their labor agreements. Questions concerning policy interpretation should be directed to the department responsible for the particular policy. A link to the Managers Responsible for RPM Sections is available on the RPM home page. If it is not clear which department should be called, please contact the Associate Laboratory Director of Operations.
Designing Flight Deck Procedures
Degani, Asaf; Wiener, Earl
2005-01-01
Three reports address the design of flight-deck procedures and various aspects of human interaction with cockpit systems that have direct impact on flight safety. One report, On the Typography of Flight-Deck Documentation, discusses basic research about typography and the kind of information needed by designers of flight deck documentation. Flight crews reading poorly designed documentation may easily overlook a crucial item on the checklist. The report surveys and summarizes the available literature regarding the design and typographical aspects of printed material. It focuses on typographical factors such as proper typefaces, character height, use of lower- and upper-case characters, line length, and spacing. Graphical aspects such as layout, color coding, fonts, and character contrast are discussed; and several cockpit conditions such as lighting levels and glare are addressed, as well as usage factors such as angular alignment, paper quality, and colors. Most of the insights and recommendations discussed in this report are transferable to paperless cockpit systems of the future and computer-based procedure displays (e.g., "electronic flight bag") in aerospace systems and similar systems that are used in other industries such as medical, nuclear systems, maritime operations, and military systems.
Machado, Eustáquio José
2014-01-01
The hyperbolic equation, known in the biochemical context as the Michaelis-Menten model, is used to describe the velocity of chemical reactions involving enzymes (enzyme kinetics). This study aimed to compare fits of the Michaelis-Menten model (1913) obtained with two non-linear models and four linearized models. Of the two non-linear models, one used the usual classical asymptotic method and the other used the bootstrap approach. The linearized models...
1997-01-01
[Scanned blood-bank standard operating procedures; only fragments are legible: ABO group confirmed; Rh of negative units confirmed; measure the supernatant hemoglobin concentration using 0.02 ml of whole blood diluted with 5.98 ml of Drabkin's reagent (1:251 dilution); Standard Operating Procedure for Red Blood Cells Collected in the CPDA-1 800 ml Primary PVC Plastic Bag Collection System.]
Some statistical properties of gene expression clustering for array data
DEFF Research Database (Denmark)
Abreu, G C G; Pinheiro, A; Drummond, R D;
2010-01-01
DNA arrays have been a rich source of data for the study of genomic expression of a wide variety of biological systems. Gene clustering is one of the paradigms often used to assess the significance of a gene (or group of genes). However, most gene clustering techniques are applied to cDNA array data without a corresponding statistical error measure. We propose an easy-to-implement and simple-to-use technique that uses bootstrap re-sampling to evaluate the statistical error of the nodes provided by SOM-based clustering. Comparisons between SOM and parametric clustering are presented for simulated as well as for two real data sets. We also implement a bootstrap-based pre-processing procedure for SOM that improves the false discovery ratio of differentially expressed genes. Code in Matlab is freely available, as well as some supplementary material, at the following address: https...
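A generic version of bootstrap-based cluster-stability assessment can be sketched as follows. Note that the paper uses SOM-based clustering, while this illustration substitutes a tiny k-means and an invented co-assignment score:

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=50):
    """Tiny k-means stand-in for the paper's SOM (illustrative only)."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def coassignment_stability(X, k, n_boot=50):
    """Bootstrap rows, recluster, and record how often each pair of original
    points lands in the same cluster (an invented stability score)."""
    n = len(X)
    same = np.zeros((n, n))
    count = np.zeros((n, n))
    for _ in range(n_boot):
        idx = rng.choice(n, size=n, replace=True)
        labels = kmeans(X[idx], k)
        first = {}                       # original index -> label of 1st copy
        for pos, orig in enumerate(idx):
            first.setdefault(orig, labels[pos])
        for a in first:
            for b in first:
                if a < b:
                    count[a, b] += 1
                    same[a, b] += first[a] == first[b]
    with np.errstate(invalid="ignore"):
        return same / count              # NaN where a pair never co-occurred

# Two well-separated blobs: cross-blob pairs should rarely co-cluster.
X = np.vstack([rng.normal(0, 0.3, (15, 2)), rng.normal(5, 0.3, (15, 2))])
stab = coassignment_stability(X, k=2)
```

Pairs whose co-assignment frequency stays high across resamples sit in stable nodes; pairs with intermediate frequencies flag clusters whose boundaries are artifacts of the particular sample.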
The selective therapeutic apheresis procedures.
Sanchez, Amber P; Cunard, Robyn; Ward, David M
2013-02-01
Selective apheresis procedures have been developed to target specific molecules, antibodies, or cellular elements in a variety of diseases. The advantage of the selective apheresis procedures over conventional therapeutic plasmapheresis is preservation of other essential plasma components such as albumin, immunoglobulins, and clotting factors. These procedures are more commonly employed in Europe and Japan, and few are available in the USA. Apheresis procedures discussed in this review include the various technologies available for low-density lipoprotein (LDL) apheresis, double filtration plasmapheresis (DFPP), cryofiltration, immunoadsorption procedures, adsorption resins that process plasma, extracorporeal photopheresis, and leukocyte apheresis.
NASA trend analysis procedures
1993-01-01
This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.
Directory of Open Access Journals (Sweden)
Kleber Rogério Moreira Prado
2013-06-01
Full Text Available This paper puts forward a biodegradation model for a dye used in the textile industry, based on a neural network supported by bootstrap resampling. The bootstrapped neural network is designed to generate estimates close to the results obtained in the underlying experiment, in which a chemical process is applied. Pseudomonas oleovorans was used in the biodegradation of Reactive Black 5. Results show a brief comparison between the estimates of the proposed approach and the experimental data, with a coefficient of correlation above 0.99 between real and predicted biodegradation rates. Dye concentration and the solution's pH did not interfere with the biodegradation rates. Dye biodegradation above 90% was achieved between 1.000 and 1.841 mL 10 mL-1 of microorganism concentration and between 1.000 and 2.000 g 100 mL-1 of glucose concentration within the experimental conditions under analysis.
Agreed-Upon Procedures: Procedures for Auditing European Grants
Directory of Open Access Journals (Sweden)
Daniel Petru VARTEIU
2016-12-01
The audit of EU-funded projects is an audit based on agreed-upon procedures, which are established by the Managing Authority or the Intermediate Body. Agreed-upon procedures can be defined as engagements made in accordance with ISRS 4400, applicable to agreed-upon procedures, where the auditor undertakes to carry out the agreed-upon procedures and issue a report on factual findings. The report provided by the auditor does not express any assurance. It allows users to form their own opinions about the conformity of the expenses with the project budget as well as the eligibility of the expenses.
Training for advanced endoscopic procedures.
Feurer, Matthew E; Draganov, Peter V
2016-06-01
Advanced endoscopy has evolved from diagnostic ERCP to an ever-increasing array of therapeutic procedures including EUS with FNA, ablative therapies, deep enteroscopy, luminal stenting, endoscopic suturing and endoscopic mucosal resection among others. As these procedures have become increasingly more complex, the risk of potential complications has also risen. Training in advanced endoscopy involves more than obtaining a minimum number of therapeutic procedures. The means of assessing a trainee's competence level and ability to practice independently continues to be a matter of debate. The use of quality indicators to measure performance levels may be beneficial as more advanced techniques and procedures become available.
Electronic Procedures for Medical Operations
2015-01-01
Electronic procedures are replacing text-based documents for recording the steps in performing medical operations aboard the International Space Station. S&K Aerospace, LLC, has developed a content-based electronic system, built on the Extensible Markup Language (XML) standard, that separates text from formatting standards and tags items contained in procedures so they can be recognized by other electronic systems. For example, to change a standard format, electronic procedures are changed in a single batch process, and the entire body of procedures will have the new format. Procedures can be quickly searched to determine which are affected by software and hardware changes. Similarly, procedures are easily shared with other electronic systems. The system also enables real-time data capture and automatic bookmarking of current procedure steps. In Phase II of the project, S&K Aerospace developed a Procedure Representation Language (PRL) and tools to support the creation and maintenance of electronic procedures for medical operations. The goal is to develop these tools in such a way that new advances can be inserted easily, leading to an eventual medical decision support system.
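The benefit of tagged content can be shown with a minimal, invented XML fragment in the spirit described (the element and attribute names below are assumptions, not PRL): because items such as hardware references are tagged, "which steps are affected by a hardware change?" becomes a simple query.

```python
import xml.etree.ElementTree as ET

# A minimal, invented fragment: content is tagged so other systems can
# recognize items, while formatting lives elsewhere.
doc = """
<procedure id="med-001" title="Ultrasound Exam Setup">
  <step n="1">Power on the <hardware ref="ultrasound-unit"/> scanner.</step>
  <step n="2">Launch the imaging <software ref="echo-app"/> application.</step>
  <step n="3">Apply gel and begin the exam.</step>
</procedure>
"""

root = ET.fromstring(doc)

# "Which steps are affected by a hardware change?" becomes a simple query:
affected = [step.get("n") for step in root.iter("step")
            if step.find("hardware") is not None]
```

Re-styling the whole body of procedures in one batch then amounts to applying a new stylesheet to the same tagged content, with no edits to individual documents.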
Procedural Personas for Player Decision Modeling and Procedural Content Generation
DEFF Research Database (Denmark)
Holmgård, Christoffer
2016-01-01
This thesis explores methods for creating low-complexity, easily interpretable, generative AI agents for use in game and simulation design. Based on insights from decision theory and behavioral economics, the thesis investigates how player decision making styles may be defined, operationalised, and measured in specific games. It further explores how simple utility functions, easily defined and changed by game designers, can be used to construct agents expressing a variety of decision making styles within a game, using a variety of contemporary AI approaches, naming the resulting agents "Procedural Personas." These methods for constructing procedural personas are then integrated with existing procedural content generation systems, acting as critics that shape the output of these systems, optimizing generated content for different personas and by extension, different kinds of players and their decision making styles.
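The utility-function idea can be sketched in a few lines; the game, outcome features, and personas below are invented for illustration, not taken from the thesis:

```python
# A "procedural persona" as a weight vector over game outcomes: the agent
# greedily picks the action maximizing its weighted utility.
ACTIONS = {
    "fight":     {"treasure": 2, "risk": 3, "progress": 1},
    "sneak":     {"treasure": 1, "risk": 1, "progress": 2},
    "rush_exit": {"treasure": 0, "risk": 0, "progress": 3},
}

def choose(persona):
    """Pick the action with the highest weighted utility for this persona."""
    def utility(outcomes):
        return sum(persona.get(feat, 0.0) * val for feat, val in outcomes.items())
    return max(ACTIONS, key=lambda action: utility(ACTIONS[action]))

# Two decision-making styles expressed purely as weight vectors:
treasure_hunter = {"treasure": 1.0, "risk": -0.1, "progress": 0.1}
speedrunner     = {"treasure": 0.0, "risk": -0.2, "progress": 1.0}
```

Swapping the weight vector changes the agent's play style without touching the decision procedure, which is what makes such personas cheap for designers to define and for content generators to use as critics.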
Wang, Xiaoming; Kindzierski, Warren; Kaul, Padma
2015-01-01
Adverse associations between air pollution and myocardial infarction (MI) are widely reported in the medical literature. However, inconsistency and sensitivity of the findings are still big concerns. An exploratory investigation was undertaken to examine associations between air pollutants and risk of acute MI (AMI) hospitalization in Alberta, Canada. A time-stratified case-crossover design was used to assess the transient effect of five air pollutants (carbon monoxide (CO), nitrogen dioxide (NO2), nitric oxide (NO), ozone (O3) and particulate matter with an aerodynamic diameter ≤2.5 μm (PM2.5)) on the risk of AMI hospitalization over the period 1999-2009. Subgroups were predefined to see if any susceptible group of individuals existed. A three-step procedure, including univariate analysis, multivariate analysis, and bootstrap model averaging, was used. The multivariate analysis was used in an effort to address adjustment uncertainty, whereas the bootstrap technique was used to account for regression model uncertainty. There were 25,894 AMI hospital admissions during the 11-year period. Estimating health effects that are properly adjusted for all possible confounding factors and accounting for model uncertainty are important for making interpretations of air pollution-health effect associations. The most robust findings included: (1) only 1-day lag NO2 concentrations (6-, 12- or 24-hour average), but not those of CO, NO, O3 or PM2.5, were associated with an elevated risk of AMI hospitalization; (2) there was suggestive evidence of an elevated risk of hospitalization for NSTEMI (Non-ST Segment Elevation Myocardial Infarction), but not for STEMI (ST Segment Elevation Myocardial Infarction); and (3) susceptible subgroups included elders (age ≥65) and elders with hypertension. As this was only an exploratory study, there is a need to replicate these findings with other methodologies and datasets.
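The bootstrap-model-averaging step can be sketched generically. The synthetic data, variable names, and plain least-squares models below are invented stand-ins (the study's actual analysis is a case-crossover design, not ordinary regression): candidate adjustment sets represent model uncertainty, and each candidate is refit on every bootstrap resample before averaging.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data standing in for the study's variables: the outcome depends on
# "no2" plus a confounder "temp" that is also correlated with no2.
n = 500
temp = rng.normal(0, 1, n)
no2 = 0.6 * temp + rng.normal(0, 1, n)
y = 0.4 * no2 + 0.8 * temp + rng.normal(0, 1, n)

def no2_coef(cols, yy):
    """OLS coefficient on no2 (always the first regressor after the intercept)."""
    X = np.column_stack([np.ones(len(yy))] + cols)
    beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
    return beta[1]

# Candidate adjustment sets capture "model uncertainty": with / without temp.
models = [lambda i: [no2[i]], lambda i: [no2[i], temp[i]]]

# Bootstrap model averaging: refit every candidate model on each resample.
draws = []
for _ in range(200):
    i = rng.choice(n, size=n, replace=True)
    for make_cols in models:
        draws.append(no2_coef(make_cols(i), y[i]))
est = float(np.mean(draws))
```

The spread of `draws` reflects both sampling variability and disagreement between candidate models, which is the uncertainty the study's averaging step is meant to expose.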
Simulation on bootstrap current for the compact fusion-fission hybrid reactor
Institute of Scientific and Technical Information of China (English)
陈美霞; 刘成岳; 舒双宝
2015-01-01
On the basis of the equilibrium code Jsolver, an advanced plasma equilibrium configuration design for the compact fusion-fission hybrid reactor is carried out, with emphasis on the reversed-shear operation mode. The calculation, distribution, and fraction of the bootstrap current in this configuration are also simulated.
Laparoscopic reversal of Hartmann's procedure
DEFF Research Database (Denmark)
Svenningsen, Peter Olsen; Bulut, Orhan; Jess, Per
2010-01-01
INTRODUCTION: A change in procedure from open to laparoscopic reversal of Hartmann's colostomy was implemented at our department between May 2005 and December 2008. The aim of the study was to investigate if this change was beneficial for the patients. MATERIAL AND METHODS: The medical records of all patients who underwent reversal of a colostomy after a primary Hartmann's procedure during the period May 2005 to December 2008 were reviewed retrospectively in a case-control study. RESULTS: A total of 43 patients were included. Twenty-one had a laparoscopic and 22 an open procedure. The two…
Institute of Scientific and Technical Information of China (English)
周西峰; 孙浩; 王丹; 郭前岗
2011-01-01
This paper analyzes the main problems of a brushless DC motor (BLDCM) drive control system that uses the bootstrap driving method. It is noted that the choice of PWM scheme is restricted when the bootstrap driving method is adopted, which gives rise to increased torque ripple, diode freewheeling in the inactive phase, difficulties in sensorless BLDCM control, and heating of the power switches. The root causes of these problems are analyzed in depth through theoretical derivation, and the correctness of the conclusions is verified by experiment. A solution is also proposed for applications with more demanding control specifications.
Assisted Medical Procedures (AMP) Project
National Aeronautics and Space Administration — Documentation and Development: The AMP was initially being developed as part of the Advanced Integrated Clinical System (AICS)-Guided Medical Procedure System for the...
Surface Environmental Surveillance Procedures Manual
Energy Technology Data Exchange (ETDEWEB)
Hanf, RW; Dirkes, RL
1990-02-01
This manual establishes the procedures for the collection of environmental samples and the performance of radiation surveys and other field measurements. Responsibilities are defined for those personnel directly involved in the collection of samples and the performance of field measurements.
Interventional procedures in the chest.
Vollmer Torrubiano, I; Sánchez González, M
2016-05-01
Many thoracic conditions will require an interventional procedure for diagnosis and/or treatment. For this reason, radiologists need to know the indications and the technique for each procedure. In this article, we review the various interventional procedures that radiologists should know and the indications for each procedure. We place special emphasis on the potential differences in the diagnostic results and complications between fine-needle aspiration and biopsy. We also discuss the indications for radiofrequency ablation of lung tumors and review the concepts related to the drainage of pulmonary abscesses. We devote special attention to the management of pleural effusion, covering the indications for thoracocentesis and when to use imaging guidance, and to the protocol for pleural drainage. We also discuss the indications for percutaneous treatment of pericardial effusion and the possible complications of this treatment. Finally, we discuss the interventional management of mediastinal lesions and provide practical advice about how to approach these lesions to avoid serious complications.
Wang, Yunsheng; Weinacker, Holger; Koch, Barbara
2008-06-12
A procedure for both vertical canopy structure analysis and 3D single tree modelling based on Lidar point clouds is presented in this paper. The whole research area is segmented into small study cells by a raster net. For each cell, a normalized point cloud whose point heights represent the absolute heights of the ground objects is generated from the original Lidar raw point cloud. The main tree canopy layers and their height ranges are detected through a statistical analysis of the height distribution probability of the normalized raw points. For the 3D modelling of individual trees, trees are detected and delineated not only from the top canopy layer but also from the sub-canopy layer. The normalized points are resampled into a local voxel space. A series of horizontal 2D projection images at different height levels are then generated with respect to the voxel space. Tree crown regions are detected from the projection images. Individual trees are then extracted by means of a pre-order forest traversal through all the tree crown regions at the different height levels. Finally, 3D tree crown models of the extracted individual trees are reconstructed. With further analyses of the 3D models of individual tree crowns, important parameters such as crown height range, crown volume, and crown contours at different height levels can be derived.
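The voxel resampling and horizontal projection steps can be sketched as follows, on an invented synthetic point cloud (the cell size and slice bounds are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic normalized point cloud (x, y, height): low "ground" returns plus a
# "tree crown" cluster around (5, 5) between 2 m and 10 m height.
ground = np.column_stack([rng.uniform(0, 10, (400, 2)), rng.uniform(0, 0.2, 400)])
crown = np.column_stack([rng.normal(5, 0.5, (200, 2)), rng.uniform(2, 10, 200)])
points = np.vstack([ground, crown])

def voxelize(points, cell=0.5):
    """Resample points into a voxel grid of per-cell point counts."""
    idx = np.floor(points / cell).astype(int)
    vox = np.zeros(tuple(idx.max(axis=0) + 1), dtype=int)
    np.add.at(vox, tuple(idx.T), 1)
    return vox

vox = voxelize(points)

# Horizontal 2D projection image for one height slice (here 4 m to 6 m):
z0, z1 = int(4 / 0.5), int(6 / 0.5)
proj = vox[:, :, z0:z1].sum(axis=2)
```

Thresholding such projection images at successive height slices yields the crown regions that the traversal step then links into individual trees.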
Mei, Simona; Blakeslee, John P.; Tonry, John L.; Jordan, Andres; Peng, Eric W.; Côté, Patrick; Ferrarese, Laura; Merritt, David; Milosavljevic, Milos; West, Michael J.
2005-01-01
The Advanced Camera for Surveys (ACS) Virgo Cluster Survey is a large program to image 100 early-type Virgo galaxies using the F475W and F850LP bandpasses of the Wide Field Channel of the ACS instrument on the Hubble Space Telescope (HST). The scientific goals of this survey include an exploration of the three-dimensional structure of the Virgo Cluster and a critical examination of the usefulness of the globular cluster luminosity function as a distance indicator. Both of these issues require accurate distances for the full sample of 100 program galaxies. In this paper, we describe our data reduction procedures and examine the feasibility of accurate distance measurements using the method of surface brightness fluctuations (SBF) applied to the ACS Virgo Cluster Survey F850LP imaging. The ACS exhibits significant geometrical distortions due to its off-axis location in the HST focal plane; correcting for these distortions by resampling the pixel values onto an undistorted frame results in pixel correlations tha...
Nominating Procedures in Democratic Polities
Kasapović, Mirjana
2002-01-01
One of the focuses of the study of political parties at the end of the 20th century has been the organizational structure and the relations within political parties, including the nominating procedures for the candidate selection for general elections. The manner in which parties fulfill their recruiting function and, eventually, the quality of the political and the governing elite in a “party” state directly depends on these procedures. Typologically, there are differences between the nomina...
Safety analysis procedures for PHWR
Energy Technology Data Exchange (ETDEWEB)
Min, Byung Joo; Kim, Hyoung Tae; Yoo, Kun Joong
2004-03-01
The methodology of safety analyses for CANDU reactors in Canada, the vendor country, uses a combination of best-estimate physical models and conservative input parameters so as to minimize the uncertainty of the plant behavior predictions. Because conservative input parameters are used, the results of the safety analyses are assured to meet regulatory requirements such as the public dose and the integrity of fuel, fuel channels, containment, and reactor structures. However, no comprehensive and systematic safety analysis procedures exist for CANDU reactors in Korea. In this regard, safety analysis procedures for CANDU reactors are being developed, not only to establish a safety analysis system, but also to enhance the quality assurance of safety assessments. In the first phase of this study, general procedures for deterministic safety analyses were developed. They cover the specification of the initiating event, selection of the methodology and accident sequences, computer codes, safety analysis procedures, verification of errors and uncertainties, etc. Finally, these general procedures are applied to the Large Break Loss Of Coolant Accident (LBLOCA) in the Final Safety Analysis Report (FSAR) for Wolsong units 2, 3, and 4.
Collected radiochemical and geochemical procedures
Energy Technology Data Exchange (ETDEWEB)
Kleinberg, J [comp.
1990-05-01
This revision of LA-1721, 4th Ed., Collected Radiochemical Procedures, reflects the activities of two groups in the Isotope and Nuclear Chemistry Division of the Los Alamos National Laboratory: INC-11, Nuclear and Radiochemistry; and INC-7, Isotope Geochemistry. The procedures fall into five categories: I. Separation of Radionuclides from Uranium, Fission-Product Solutions, and Nuclear Debris; II. Separation of Products from Irradiated Targets; III. Preparation of Samples for Mass Spectrometric Analysis; IV. Dissolution Procedures; and V. Geochemical Procedures. With one exception, the first category of procedures is ordered by the positions of the elements in the Periodic Table, with separate parts on the Representative Elements (the A groups); the d-Transition Elements (the B groups and the Transition Triads); and the Lanthanides (Rare Earths) and Actinides (the 4f- and 5f-Transition Elements). The members of Group IIIB (scandium, yttrium, and lanthanum) are included with the lanthanides, elements they resemble closely in chemistry and with which they occur in nature. The procedures dealing with the isolation of products from irradiated targets are arranged by target element.
Procedural Personas for Player Decision Modeling and Procedural Content Generation
DEFF Research Database (Denmark)
Holmgård, Christoffer
2016-01-01
This thesis explores methods for creating low-complexity, easily interpretable, generative AI agents for use in game and simulation design. Based on insights from decision theory and behavioral economics, the thesis investigates how player decision making styles may be defined, operationalised, and measured in specific games. It further explores how simple utility functions, easily defined and changed by game designers, can be used to construct agents expressing a variety of decision making styles within a game, using a variety of contemporary AI approaches, naming the resulting agents "Procedural Personas." These methods for constructing procedural personas are then integrated with existing procedural content generation systems, acting as critics that shape the output of these systems, optimizing generated content for different personas and, by extension, different kinds of players and their decision making styles.
Directory of Open Access Journals (Sweden)
Mabaso Musawenkosi LH
2007-09-01
Full Text Available Abstract Background Several malaria risk maps have been developed in recent years, many from the prevalence of infection data collated by the MARA (Mapping Malaria Risk in Africa) project and using various environmental data sets as predictors. Variable selection is a major obstacle due to analytical problems caused by over-fitting, confounding, and non-independence in the data. Testing and comparing every combination of explanatory variables in a Bayesian spatial framework remains unfeasible for most researchers. The aim of this study was to develop a malaria risk map using a systematic and practicable variable selection process for spatial analysis and mapping of historical malaria risk in Botswana. Results Of 50 potential explanatory variables from eight environmental data themes, 42 were significantly associated with malaria prevalence in univariate logistic regression and were ranked by the Akaike Information Criterion. Those correlated with higher-ranking relatives of the same environmental theme were temporarily excluded. The remaining 14 candidates were ranked by selection frequency after running automated step-wise selection procedures on 1000 bootstrap samples drawn from the data. A non-spatial multiple-variable model was developed through step-wise inclusion in order of selection frequency. Previously excluded variables were then re-evaluated for inclusion, using further step-wise bootstrap procedures, resulting in the exclusion of another variable. Finally, a Bayesian geo-statistical model using Markov Chain Monte Carlo simulation was fitted to the data, resulting in a final model of three predictor variables, namely summer rainfall, mean annual temperature, and altitude. Each was independently and significantly associated with malaria prevalence after allowing for spatial correlation. This model was used to predict malaria prevalence at unobserved locations, producing a smooth risk map for the whole country. Conclusion We have
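The bootstrap selection-frequency ranking described above can be sketched in a few lines of pure Python. This is a minimal illustration, not the paper's procedure: a simple univariate correlation screen stands in for the AIC-ranked stepwise logistic models, and all data and names are hypothetical.

```python
import math
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def selection_frequency(y, predictors, n_boot=200, r_threshold=0.3, seed=1):
    """Rank candidate predictors by how often a simple screening rule
    (|correlation with the outcome| > threshold) selects them across
    bootstrap resamples drawn with replacement from the data."""
    rng = random.Random(seed)
    n = len(y)
    counts = {name: 0 for name in predictors}
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]  # one bootstrap resample
        yb = [y[i] for i in idx]
        for name, x in predictors.items():
            if abs(pearson([x[i] for i in idx], yb)) > r_threshold:
                counts[name] += 1
    return sorted(counts.items(), key=lambda kv: -kv[1])

# Hypothetical data: y depends on x1 only; x2 and x3 are pure noise.
rng = random.Random(2)
x1 = [i / 10 for i in range(40)]
y = [2 * a + rng.gauss(0, 1) for a in x1]
x2 = [rng.gauss(0, 1) for _ in range(40)]
x3 = [rng.gauss(0, 1) for _ in range(40)]
ranking = selection_frequency(y, {"x1": x1, "x2": x2, "x3": x3})
```

The informative predictor is selected in nearly every resample, while the noise variables accumulate low counts, which is the intuition behind ranking by selection frequency.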
Institute of Scientific and Technical Information of China (English)
王超; 孔凡让; 胡飞; 刘方
2014-01-01
The Doppler effect widely exists in signals from moving acoustic sources. With the rapid development and speed-up of modern rail transport, the frequency shift and frequency-band expansion caused by the Doppler effect bear heavily on the reliability and accuracy of the wayside Acoustic Defective Bearing Detector (ADBD) system. To improve the condition monitoring of bearings on a train passing at high speed, a method for removing the Doppler effect from the signal is proposed. The Short Time Fourier Transform-Crazy Climber algorithm (STFT-CC) is first employed to obtain an instantaneous frequency estimate of the distorted signal. The parameters needed for time-domain interpolation re-sampling, based entirely on a kinematic model, are then acquired, and the re-sampling sequence is established in the time domain; the re-sampled signal is the signal with the Doppler effect removed. The STFT-CC algorithm and the time-domain re-sampling principle are described and compared with existing methods. The results of simulations and experiments indicate that the proposed method performs well in removing the Doppler effect and can easily be implemented for the condition monitoring and fault diagnosis of the bearings of high-speed trains.
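The time-domain re-sampling idea can be sketched as follows. This is a minimal stdlib illustration under stated assumptions: the kinematic distance function r(t) is taken as known (the paper estimates the required parameters from the STFT instantaneous frequency), and the tone, speeds, and geometry are hypothetical.

```python
import math

def doppler_resample(times, samples, r_of_t, c=340.0):
    """Re-sample a received signal at receiver times corresponding to
    uniformly spaced *emission* times, removing the Doppler distortion.
    r_of_t gives the source-receiver distance from the kinematic model."""
    # Emission time of each received sample (monotonic since |r'(t)| < c).
    t_e = [t - r_of_t(t) / c for t in times]
    n = len(times)
    te_uniform = [t_e[0] + i * (t_e[-1] - t_e[0]) / (n - 1) for i in range(n)]
    out, j = [], 0
    for te in te_uniform:
        # Locate the bracketing pair and linearly interpolate the signal.
        while j + 1 < n - 1 and t_e[j + 1] < te:
            j += 1
        frac = (te - t_e[j]) / (t_e[j + 1] - t_e[j])
        out.append(samples[j] + frac * (samples[j + 1] - samples[j]))
    return te_uniform, out

# Hypothetical example: a 100 Hz tone from a source passing a wayside sensor.
fs, f0, v, d, t0 = 8000, 100.0, 30.0, 5.0, 0.025
times = [i / fs for i in range(400)]
r = lambda t: math.sqrt(d * d + (v * (t - t0)) ** 2)
received = [math.sin(2 * math.pi * f0 * (t - r(t) / 340.0)) for t in times]
te, restored = doppler_resample(times, received, r)
```

After re-sampling on the uniform emission-time grid, the restored sequence is again a constant-frequency tone, which is what removes the frequency shift and band expansion.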
A Comparison of Two Strategies for Building an Exposure Prediction Model.
Heiden, Marina; Mathiassen, Svend Erik; Garza, Jennifer; Liv, Per; Wahlström, Jens
2016-01-01
Cost-efficient assessments of job exposures in large populations may be obtained from models in which 'true' exposures assessed by expensive measurement methods are estimated from easily accessible and cheap predictors. Typically, the models are built on the basis of a validation study comprising 'true' exposure data as well as an extensive collection of candidate predictors from questionnaires or company data, which cannot all be included in the models due to restrictions in the degrees of freedom available for modeling. In these situations, predictors need to be selected using procedures that can identify the best possible subset of predictors among the candidates. The present study compares two strategies for selecting a set of predictor variables. One strategy relies on stepwise hypothesis testing of associations between predictors and exposure, while the other uses cluster analysis to reduce the number of predictors without relying on empirical information about the measured exposure. Both strategies were applied to the same dataset on biomechanical exposure and candidate predictors among computer users, and they were compared in terms of identified predictors of exposure as well as the resulting model fit using bootstrapped resamples of the original data. The identified predictors were, to a large part, different between the two strategies, and the initial model fit was better for the stepwise testing strategy than for the clustering approach. Internal validation of the models using bootstrap resampling with fixed predictors revealed an equally reduced model fit in resampled datasets for both strategies. However, when predictor selection was incorporated in the validation procedure for the stepwise testing strategy, the model fit was reduced to the extent that both strategies showed similar model fit. Thus, the two strategies would both be expected to perform poorly with respect to predicting biomechanical exposure in other samples of computer users.
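The internal validation step above, refitting on bootstrap resamples and evaluating on the original data, can be sketched with a minimal optimism-bootstrap in the style of Harrell. This is an illustration only: a single-predictor least-squares model stands in for the exposure prediction models, and the data are hypothetical.

```python
import random

def fit_slr(x, y):
    """Ordinary least squares for a single predictor; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return my - b1 * mx, b1

def r2(x, y, b0, b1):
    """Coefficient of determination of the fitted line on (x, y)."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def optimism_corrected_r2(x, y, n_boot=200, seed=7):
    """Refit on each bootstrap resample, evaluate on the original sample,
    and subtract the mean optimism from the apparent model fit."""
    rng = random.Random(seed)
    apparent = r2(x, y, *fit_slr(x, y))
    n, opt = len(x), []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        xb, yb = [x[i] for i in idx], [y[i] for i in idx]
        b0, b1 = fit_slr(xb, yb)
        opt.append(r2(xb, yb, b0, b1) - r2(x, y, b0, b1))
    return apparent, apparent - sum(opt) / len(opt)

# Hypothetical exposure data: one predictor plus noise.
rng = random.Random(11)
x = [i / 5 for i in range(30)]
y = [1 + 2 * a + rng.gauss(0, 1) for a in x]
apparent, corrected = optimism_corrected_r2(x, y)
```

The corrected fit is lower than the apparent fit, mirroring the reduced model fit the study observed in resampled datasets.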
Complications of the Latarjet procedure.
Gupta, Ashish; Delaney, Ruth; Petkin, Kalojan; Lafosse, Laurent
2015-03-01
The Latarjet procedure is an operation performed either arthroscopically or open for recurrent anterior shoulder instability, in the setting of glenoid bone loss; with good to excellent functional results. Despite excellent clinical results, the complication rates are reported between 15 and 30 %. Intraoperative complications such as graft malpositioning, neurovascular injury, and graft fracture can all be mitigated with meticulous surgical technique and understanding of the local anatomy. Nonunion and screw breakage are intermediate-term complications that occur in less than 5 % of patients. The long-term complications such as graft osteolysis are still an unsolved problem, and future research is required to understand the etiology and best treatment option. Recurrent instability after the Latarjet procedure can be managed with iliac crest bone graft reconstruction of the anterior glenoid. Shoulder arthritis is another complication reported after the Latarjet procedure, which poses additional challenges to both the surgeon and patient.
Design Procedure for Hybrid Ventilation
DEFF Research Database (Denmark)
Heiselberg, Per; Tjelflaat, Per Olaf
Mechanical and natural ventilation systems have developed separately over many years. The natural next step in this development is the development of ventilation concepts that utilise and combine the best features of each system into a new type of ventilation system: Hybrid Ventilation. Buildings with hybrid ventilation often include other sustainable technologies, and energy optimisation requires an integrated approach in the design of the building and its mechanical systems. Therefore, the hybrid ventilation design procedure differs from the design procedure for conventional HVAC. The first ideas on a design procedure for hybrid ventilation are presented, and the different types of design methods needed in different phases of the design process are discussed.
Radiation control standards and procedures
Energy Technology Data Exchange (ETDEWEB)
1956-12-14
This manual contains the "Radiation Control Standards" and "Radiation Control Procedures" at Hanford Operations, which have been established to provide the necessary control of radiation exposures within the Irradiation Processing Department. Provision is also made for including, in the form of "Bulletins", other radiological information of general interest to IPD personnel. The purpose of the standards is to establish firm radiological limits within which the Irradiation Processing Department will operate, and to outline our radiation control program in sufficient detail to insure uniform and consistent application throughout all IPD facilities. Radiation Control Procedures are intended to prescribe the best method of accomplishing an objective within the limitations of the Radiation Control Standards. A procedure may be changed at any time provided the suggested change is generally agreeable to the management involved and is consistent with department policies and the Radiation Control Standards.
New procedure for departure formalities
HR & GS Departments
2011-01-01
As part of the process of simplifying procedures and rationalising administrative processes, the HR and GS Departments have introduced new personalised departure formalities on EDH. These new formalities have applied to students leaving CERN since last year and from 17 October 2011 this procedure will be extended to the following categories of CERN personnel: Staff members, Fellows and Associates. It is planned to extend this electronic procedure to the users in due course. What purpose do departure formalities serve? The departure formalities are designed to ensure that members of the personnel contact all the relevant services in order to return any necessary items (equipment, cards, keys, dosimeter, electronic equipment, books, etc.) and are aware of all the benefits to which they are entitled on termination of their contract. The new departure formalities on EDH have the advantage of tailoring the list of services that each member of the personnel must visit to suit his individual contractual and p...
Interventional chest procedures in pregnancy.
LENUS (Irish Health Repository)
Morgan, Ross K
2012-02-01
Interventional pulmonology encompasses diagnostic and therapeutic bronchoscopic procedures, and pleural interventions. In the last 10 years older techniques have been refined and exciting new technologies have extended the reach and application of the instruments used. The main areas within pulmonary medicine for which these interventions have a role are malignant and nonmalignant airway disease, pleural effusion, pneumothorax, and artificial airways. There are no data from well-designed prospective trials to guide recommendations for interventional pulmonary procedures in pregnancy. The recommendations provided in this article are based on critical review of reported case series, opinion from recognized experts, and personal observations.
Institute of Scientific and Technical Information of China (English)
杨翠丽; 秦荣; 杜灿谊
2011-01-01
An order analysis method based on equal-angle re-sampling using the Hilbert-Huang transform, which requires no tachometer signal, was applied to the fault diagnosis of an engine camshaft. The disturbance caused by speed fluctuation was effectively eliminated, making the signal processing and analysis results more accurate. The vibration signal of the block of a 6-cylinder diesel engine was analysed with the proposed method, and by comparing the characteristics of the vibration signal before and after repair of a fractured camshaft bearing, the fault features were extracted effectively.
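The equal-angle re-sampling step can be sketched as follows: integrate the instantaneous shaft frequency to a cumulative phase and interpolate the signal at uniform phase increments. This is a minimal stdlib illustration with a hypothetical run-up; the paper obtains the instantaneous frequency from the Hilbert-Huang transform, which is not reproduced here.

```python
import math

def equal_angle_resample(times, signal, inst_freq, samples_per_rev=32):
    """Integrate the instantaneous shaft frequency (Hz) to cumulative phase in
    revolutions (trapezoidal rule), then interpolate the signal at uniform
    phase increments so speed fluctuation no longer smears the orders."""
    phase = [0.0]
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        phase.append(phase[-1] + 0.5 * (inst_freq[i] + inst_freq[i - 1]) * dt)
    step, out, j, target = 1.0 / samples_per_rev, [], 0, 0.0
    while target <= phase[-1]:
        while phase[j + 1] < target:
            j += 1
        span = phase[j + 1] - phase[j]
        frac = (target - phase[j]) / span if span > 0 else 0.0
        out.append(signal[j] + frac * (signal[j + 1] - signal[j]))
        target += step
    return out

# Hypothetical run-up: speed ramps from 10 to 30 Hz; the signal is a pure
# 2nd-order (twice-per-revolution) component.
fs = 2000
times = [i / fs for i in range(fs + 1)]
freq = [10 + 20 * t for t in times]
true_phase = [10 * t + 10 * t * t for t in times]  # exact integral of freq
sig = [math.sin(2 * math.pi * 2 * true_phase[i]) for i in range(len(times))]
resampled = equal_angle_resample(times, sig, freq)
```

In the angle domain the 2nd-order component becomes a fixed 16-sample period regardless of the speed ramp, which is what makes the subsequent order spectrum sharp.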
Institute of Scientific and Technical Information of China (English)
岳立文; 闵捷; 王灿楠; 胡丹; 李朝赟; 肖珊
2011-01-01
Objective: To use the Bootstrap method to estimate the ratio of the residual pollutant (lead) concentration in several common processed foods to that in their raw materials, to compare these ratios with the conversion coefficients defined in this article, and to discuss the relationship between the two and its significance. Methods: Several common foods with high observation frequency were selected from the national database of monitoring pollutants in agricultural products in China from 2000 to 2006, and the ratio of residual lead in processed foods to that in their raw food materials was estimated by the Bootstrap method. Results: The ratio for juice, steamed bun, rice flour, and bean curd was less than one; that for biscuit, bread, dried tofu, and canned fish was approximately equal to one; and that for dried pork and preserved eggs was clearly greater than one. These ratios were frequently different from the conversion coefficients. Conclusion: Using the Bootstrap method to estimate the ratio of residual concentrations of pollutants in common foods is feasible. The amount of pollutants remaining in food is influenced by processing and cannot be explained by dilution and concentration alone.
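A bootstrap estimate of such a ratio can be sketched with a percentile confidence interval. This is a minimal stdlib illustration; the residue values below are hypothetical and do not come from the study's database.

```python
import random

def bootstrap_ratio_ci(processed, raw, n_boot=2000, alpha=0.05, seed=3):
    """Percentile-bootstrap confidence interval for the ratio of the mean
    residual concentration in a processed food to that in its raw material."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        # Resample each group independently, with replacement.
        p = [rng.choice(processed) for _ in processed]
        r = [rng.choice(raw) for _ in raw]
        ratios.append((sum(p) / len(p)) / (sum(r) / len(r)))
    ratios.sort()
    lo = ratios[int(n_boot * alpha / 2)]
    hi = ratios[int(n_boot * (1 - alpha / 2)) - 1]
    point = (sum(processed) / len(processed)) / (sum(raw) / len(raw))
    return point, lo, hi

# Hypothetical lead residues (mg/kg) in a processed food and its raw material.
raw = [0.10, 0.12, 0.09, 0.11, 0.13, 0.10, 0.08, 0.12, 0.11, 0.10]
processed = [0.05, 0.06, 0.04, 0.05, 0.07, 0.05, 0.04, 0.06, 0.05, 0.05]
point, lo, hi = bootstrap_ratio_ci(processed, raw)
```

A ratio whose whole interval lies below one corresponds to the juice/steamed-bun pattern in the abstract; an interval straddling one corresponds to the biscuit/bread pattern.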
Postoperative Complications of Beger Procedure
Directory of Open Access Journals (Sweden)
Nayana Samejima Peternelli
2015-01-01
Full Text Available Introduction. Chronic pancreatitis (CP) is considered an inflammatory disease that may cause varying degrees of pancreatic dysfunction. Conservative and surgical treatment options are available depending on dysfunction severity. Presentation of Case. A 36-year-old male with a history of heavy alcohol consumption and diagnosed CP underwent a duodenum-preserving pancreatic head resection (DPPHR, or Beger procedure) after conservative treatment failure. Refractory pain was reported on follow-up three months after surgery, and postoperative imaging uncovered stones within the main pancreatic duct and intestinal dilation. The patient was subsequently subjected to another surgical procedure, and intraoperative findings included protein plugs within the main pancreatic duct and a pancreaticojejunal anastomosis stricture. A V-shaped enlargement and main pancreatic duct dilation, in addition to reconstruction of the previous pancreaticojejunal anastomosis, were performed. The patient recovered with no further postoperative complications and was followed up at an outpatient clinic. Discussion. Main duct and pancreaticojejunal strictures are an unusual complication of the Beger procedure, but they were identified intraoperatively as the cause of the patient's refractory pain and explained the accumulation of intraductal protein plugs. Conclusion. Patients who undergo the Beger procedure should receive close outpatient clinical follow-up in order to ensure the success of postoperative conservative treatment and the early detection of postoperative complications.
Toddler test or procedure preparation
... care for other siblings or meals for the family so you can focus on supporting your child. Other considerations: Your child will probably resist the procedure and may even try to run away. A firm, direct approach from you and the health care ...
Environmental Impact Assessment: A Procedure.
Stover, Lloyd V.
Prepared by a firm of consulting engineers, this booklet outlines the procedural "whys and hows" of assessing environmental impact, particularly for the construction industry. Section I explores the need for environmental assessment and evaluation to determine environmental impact. It utilizes a review of the National Environmental Policy Act and…
77 FR 34186 - Appeal Procedures
2012-06-11
Natural Resources Conservation Service, 7 CFR Part 614, RIN 0578-AA59, Appeal Procedures. AGENCY: The rule addresses appeals pursuant to the Highly Erodible Land and Wetland Conservation (HELC/WC) provisions and related programs, including Wetland Conservation, the Wetlands Reserve Program, and Wetlands Reserve Enhancement.
Illustration of a Learning Procedure.
Borghouts-van Erp, J. W. M.
The paper describes evolution of an approach to teaching mathematically disabled and slow learning students through a Piagetian framework. It is explained that a step-by-step procedure is used to internalize material actions into mental actions via perception and verbalization. Formulae are introduced early, and emphasis is placed on promoting…
Security procedures in wireless networks
Institute of Scientific and Technical Information of China (English)
郑光
2009-01-01
In this paper, we introduce the mechanisms and the weaknesses of the Wired Equivalent Privacy (WEP) and 802.11i security procedures in wireless networks. After that, Wi-Fi Protected Access (WPA), a standards-based security mechanism that can eliminate most 802.11 security problems, is introduced.
Directory of Open Access Journals (Sweden)
G. M. R. Manzella
Full Text Available A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project. During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to southern boundaries approximately every 15 days, while a long east-west track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points, and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of the quality of the temperature profiles and the quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First of all, VOS-XBT data are transmitted with full resolution. Secondly, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high quality XBT data for further scientific analysis. The procedure includes: (1) position control; (2) elimination of spikes; (3) re-sampling at a 1 metre vertical interval; (4) filtering; (5) general malfunctioning check; (6) comparison with climatology (and distance from it in terms of standard deviations); (7) visual check; and (8) data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with a free-market software that allows NRT final data assessment.
Key words. Oceanography: physical (instruments and techniques; general circulation; hydrography)
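The spike-elimination and 1-metre re-sampling steps of the quality-control chain above can be sketched as follows. This is a minimal illustration under stated assumptions: a crude single-point spike test and linear interpolation stand in for the project's actual procedures, and the profile values are hypothetical.

```python
import math

def despike(depths, temps, max_jump=1.0):
    """Drop an interior point whose temperature differs from both neighbours
    by more than max_jump degrees (a crude single-point spike test)."""
    keep_d, keep_t = [depths[0]], [temps[0]]
    for i in range(1, len(temps) - 1):
        if abs(temps[i] - temps[i - 1]) > max_jump and abs(temps[i] - temps[i + 1]) > max_jump:
            continue  # isolated spike: discard
        keep_d.append(depths[i])
        keep_t.append(temps[i])
    keep_d.append(depths[-1])
    keep_t.append(temps[-1])
    return keep_d, keep_t

def resample_1m(depths, temps):
    """Linearly interpolate the profile onto a 1-metre vertical grid."""
    out_d, out_t, j = [], [], 0
    d = math.ceil(depths[0])
    while d <= depths[-1]:
        while depths[j + 1] < d:
            j += 1
        frac = (d - depths[j]) / (depths[j + 1] - depths[j])
        out_d.append(d)
        out_t.append(temps[j] + frac * (temps[j + 1] - temps[j]))
        d += 1
    return out_d, out_t

# Hypothetical raw XBT profile with one spike at 1.6 m.
depths = [0.0, 0.7, 1.6, 2.4, 3.9]
temps = [20.0, 19.9, 25.0, 19.7, 19.6]
dd, tt = despike(depths, temps)
grid_d, grid_t = resample_1m(dd, tt)
```

The spike is removed before gridding, so the interpolated 1-metre profile stays within the physically plausible range of its neighbours.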
Structured programming: Principles, notation, procedure
JOST
1978-01-01
Structured programs are best represented using a notation which gives a clear representation of the block encapsulation. In this report, a set of symbols which can be used until binding directives are published is suggested. Structured programming also permits a new design and testing procedure. Programs can be designed top-down; that is, they can start at the highest program plane and penetrate to the lowest plane by step-wise refinements. The testing methodology is also adapted to this procedure. First, the highest program plane is tested, with the programs not yet finished in the next lower plane represented by so-called dummies. These are gradually replaced by the real programs.
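The dummy-based testing procedure above can be illustrated with a small sketch (function names are hypothetical): the top plane is written and tested first, while the lower plane consists of stubs that are later replaced by the real, refined programs without changing the top plane.

```python
def report(data):
    """Highest plane: designed and tested first."""
    cleaned = clean(data)
    stats = summarize(cleaned)
    return format_output(stats)

# The next lower plane starts out as so-called dummies (stubs); each is
# later replaced by the real program while the top plane stays unchanged.
def clean(data):
    return data                    # dummy: pass the data through unchanged

def summarize(data):
    return {"n": len(data)}        # dummy: minimal fixed summary

def format_output(stats):
    return "n=%d" % stats["n"]     # dummy: simplest possible formatting
```

Because the top plane calls the lower plane only through these interfaces, each step-wise refinement can be tested in isolation.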
Invasive procedures with questionable indications
Directory of Open Access Journals (Sweden)
Sergei V. Jargin
2014-12-01
Full Text Available Insufficient coordination of medical research and partial isolation from the international scientific community can result in the application of invasive methods without sufficient indications. Presented here is an overview of renal and pancreatic biopsy studies performed in the course of operations for shunting pancreatic blood into the systemic blood flow in type 1 diabetic patients. Furthermore, a surgical procedure of lung denervation as a treatment method for asthma, as well as the use of bronchoscopy for research in asthmatics, is discussed. Today, the upturn in the Russian economy enables the acquisition of modern equipment, and medical research is on the increase. Under these circumstances, the purpose of this letter is to remind readers that, when performing surgical or other invasive procedures, the risk-to-benefit ratio should be kept as low as possible.
Aesthetic Surgical Crown Lengthening Procedure
de Oliveira, Pablo Santos; Chiarelli, Fabio; Rodrigues, José A.; Shibli, Jamil A.; Zizzari, Vincenzo Luca; Piattelli, Adriano; Iezzi, Giovanna; Perrotti, Vittoria
2015-01-01
The aim of this case report was to describe the surgical sequence of crown lengthening to apically reposition the dentogingival complex, in addition to an esthetic restorative procedure. Many different causes can be responsible for short clinical crown. In these cases, the correct execution of a restorative or prosthetic rehabilitation requires an increasing of the crown length. According to the 2003 American Academy of Periodontology (Practice Profile Survey), crown lengthening is the most habitual surgical periodontal treatment. PMID:26609452
Repair Types, Procedures - Part 1
2010-05-01
Surface Environmental Surveillance Procedures Manual
Energy Technology Data Exchange (ETDEWEB)
RW Hanf; TM Poston
2000-09-20
Environmental surveillance data are used in assessing the impact of current and past site operations on human health and the environment, demonstrating compliance with applicable local, state, and federal environmental regulations, and verifying the adequacy of containment and effluent controls. SESP sampling schedules are reviewed, revised, and published each calendar year in the Hanford Site Environmental Surveillance Master Sampling Schedule. Environmental samples are collected by SESP staff in accordance with the approved sample collection procedures documented in this manual.
Flexible Execution of Cognitive Procedures.
1987-06-30
Flexible Execution of Cognitive Procedures. Technical Report PCG-5. Kurt VanLehn and William Ball, Departments of Psychology and Computer Science, Carnegie-Mellon University, Pittsburgh, PA 15213, U.S.A. 30 June 1987. Running head: Flexible Execution of Cognitive Procedures.
Procedures for restoring vestibular disorders
2005-01-01
This paper will discuss therapeutic possibilities for disorders of the vestibular organs and the neurons involved, which confront ENT clinicians in everyday practice. Treatment of such disorders can be tackled either symptomatically or causally. The possible strategies for restoring the body's vestibular sense, visual function and co-ordination include medication, as well as physical and surgical procedures. Prophylactic or preventive measures are possible in some disorders which involve vert...
Change point analysis in cumulative sum control charts based on the bootstrap technique
Directory of Open Access Journals (Sweden)
Meral Yay
2016-12-01
Full Text Available The aim of this study is to introduce the newly developed function "ChangePoint" for the "R" programming language, used to determine the change point, and to apply it, by means of cumulative sum (CUSUM) control charts based on the bootstrap technique, to a data set of SO2 values recorded in Istanbul between 5 September and 13 November 2015. In determining the change point, the estimation methods Sm and MSE and the graphical methods V-mask and the cumulative sum chart (TCUSUM) are used, and the results are compared.
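The bootstrap CUSUM change-point idea can be sketched in a few lines. This is a minimal stdlib illustration of the general technique (compare the observed CUSUM range against randomly reordered series, in the style of Taylor's change-point analysis), not a reproduction of the "ChangePoint" function, and the series below is hypothetical.

```python
import random

def cusum(xs):
    """Cumulative sums of deviations from the overall mean (S_0 = 0)."""
    mean = sum(xs) / len(xs)
    s, out = 0.0, [0.0]
    for x in xs:
        s += x - mean
        out.append(s)
    return out

def change_point(xs, n_boot=1000, seed=5):
    """Bootstrap CUSUM change-point analysis: compare the observed CUSUM
    range with the ranges of randomly reordered series; the change point
    is estimated where |S_i| is largest."""
    s = cusum(xs)
    s_range = max(s) - min(s)
    rng = random.Random(seed)
    smaller = 0
    for _ in range(n_boot):
        xb = xs[:]
        rng.shuffle(xb)  # bootstrap by random reordering
        sb = cusum(xb)
        if max(sb) - min(sb) < s_range:
            smaller += 1
    confidence = smaller / n_boot  # fraction of reorderings with a smaller range
    k = max(range(len(s)), key=lambda i: abs(s[i]))
    return k, confidence

# Hypothetical series: the level shifts after observation 20.
series = [0.0] * 20 + [5.0] * 20
k, conf = change_point(series)
```

A confidence near 1 indicates the observed CUSUM range is far larger than chance reorderings produce, i.e. a change almost certainly occurred at the estimated index.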
Directory of Open Access Journals (Sweden)
José Antonio Pino Roque
2003-01-01
Full Text Available One of the fundamental problems of statistics is the estimation of population parameters. Researchers sometimes have only small sample sizes for their experiments, and it is here that the Bootstrap method offers a solution to the problem of estimating population parameters. The automated system Stima 1.0 meets the requirements of the Bootstrap method; it is easy-to-use software running under Windows, used to estimate parameters in agricultural mechanization research (the filling time of a trailer according to field yield, and the unloading rate of the KTP-2 combine harvester), showing better accuracy in the estimates analysed in this way.
Institute of Scientific and Technical Information of China (English)
丁晟春; 文能; 蒋婷; 孟美任
2012-01-01
In recent years, sentiment analysis of Chinese text has become more and more popular in academic research. In this paper, sentiment analysis is performed at the sentence level. The sentiment words published by HowNet are used as the seed set of evaluation words, and a large number of additional evaluation words are obtained by semi-supervised bootstrapping based on a CRF model. Sentiment sentences are then recognized by means of the evaluation words, and the polarity of each sentiment sentence is judged using designed semantic rules. The method was applied to Task 2 of COAE2011 (opinion-sentence recognition) and achieved good results in both evaluation-word recognition and the polarity judgment of opinion sentences.
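The semi-supervised bootstrapping of an evaluation-word lexicon can be sketched as follows. This is a toy illustration: a single conjunction pattern ("X and Y" share polarity) stands in for the paper's CRF-based expansion, and the English corpus and seed words are hypothetical stand-ins for the Chinese data.

```python
def bootstrap_lexicon(sentences, seeds, iterations=5):
    """Expand a seed polarity lexicon: a word joined to a word of known
    polarity by the conjunction 'and' inherits that polarity; repeat
    until no new words are added."""
    lex = dict(seeds)
    for _ in range(iterations):
        added = {}
        for toks in sentences:
            for i, w in enumerate(toks):
                if w == "and" and 0 < i < len(toks) - 1:
                    a, b = toks[i - 1], toks[i + 1]
                    if a in lex and b not in lex:
                        added[b] = lex[a]
                    elif b in lex and a not in lex:
                        added[a] = lex[b]
        if not added:
            break  # lexicon has stabilised
        lex.update(added)
    return lex

# Toy corpus: polarity propagates transitively across iterations.
corpus = [["good", "and", "reliable"], ["reliable", "and", "sturdy"]]
lexicon = bootstrap_lexicon(corpus, {"good": "+"})
```

Note how "sturdy" is only reachable on the second iteration, via the word labelled on the first; that iterative propagation is what "bootstrapping" refers to here.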
Correa M., Juan Carlos
2011-01-01
The 85th quantile plays an important role in transportation engineering. In this paper we present different statistical procedures for its estimation, considering both parametric and nonparametric procedures. With an example, we illustrate the difference between them.
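The contrast between a parametric and a nonparametric estimate of the 85th quantile can be sketched as follows. This is a minimal illustration of the two general approaches, not the paper's specific estimators, and the spot-speed sample is hypothetical.

```python
import statistics

def empirical_quantile(xs, q=0.85):
    """Nonparametric estimate: linear interpolation of the order statistics."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + frac * (s[hi] - s[lo])

def normal_quantile(xs, q=0.85):
    """Parametric estimate assuming the observations are normally distributed."""
    mu, sigma = statistics.mean(xs), statistics.stdev(xs)
    return mu + statistics.NormalDist().inv_cdf(q) * sigma

# Hypothetical spot speeds (km/h) at one location:
speeds = [48, 52, 55, 50, 61, 57, 49, 53, 58, 54, 62, 51, 56, 59, 47, 60]
```

The two estimates agree closely only when the speed distribution is close to normal; with skewed data they diverge, which is the difference the paper's example illustrates.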
National Ignition Facility (NIF) operations procedures plan
Energy Technology Data Exchange (ETDEWEB)
Mantrom, D.
1998-05-06
The purpose of this Operations Procedures Plan is to establish a standard procedure which outlines how NIF Operations procedures will be developed (i.e., written, edited, reviewed, approved, published, and revised) and accessed by the NIF Operations staff who must use procedures in order to accomplish their tasks. In addition, this Plan is designed to provide a guide to the NIF Project staff to assist them in planning and writing procedures. Also, resource and scheduling information is provided.
PRINCIPLES AND PROCEDURES ON FISCAL
Directory of Open Access Journals (Sweden)
Morar Ioan Dan
2011-07-01
Full Text Available In fiscal science, most analytical treatments invoke the principles reiterated by specialists in the field in various specialized works. The two components of taxation, the theoretical tax system and the practical procedures relating to tax, are marked by frequent references to and invocations of the principles underlying taxation. This paper attempts to revisit fiscal equity as a general vision: a principle often invoked and used to justify tax policies, but just as often violated by fiscal laws. It also emphasizes the importance of devising procedures that ensure equitable fiscal treatment of taxpayers. The specific approach of this paper is based on the notion that tax equity rests on equality before the tax, and on social policies of the executive that would be more effective than the use of other tax instruments. If the scientific approach justifies unequal treatment under tax law by reference to the various social problems of taxpayers, then it deviates from tax fairness, justifying itself instead by the need to promote social policies that are usually more attractive to taxpayers. Modern tax techniques are believed to be promoted especially in order to ensure an ever higher level of efficiency at the expense of taxpayers' equality before the tax law. On the other hand, tax inequities generate reactions from the first budget plan onwards, but the consequences of unfair measures cannot be quantified, and the timeline of the reaction is usually not known. Statistics nevertheless show fluctuations in budgetary revenues, and the literature often finds a relevant connection between changes in government policies, budget execution, and outcomes. The effects of tax inequity on tax procedures and budgetary revenues are difficult to quantify and are among the subjects of this work, together with the question of providing tax equity in combination with the principles of discrimination and neutrality.
Instrument Calibration and Certification Procedure
Energy Technology Data Exchange (ETDEWEB)
Davis, R. Wesley [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2016-05-31
The Amptec 640SL-2 is a 4-wire Kelvin failsafe resistance meter, designed to reliably use very low test currents for its resistance measurements. The 640SL-1 is a 2-wire version, designed to support customers using the Reynolds Industries type 311 connector. For both versions, a passive (analog) dual-function DC milliammeter/voltmeter allows the user to verify the actual 640SL output current level and the open-circuit voltage on the test leads. This procedure includes tests of essential performance parameters. Any malfunction noticed during calibration, whether specifically tested for or not, shall be corrected before calibration continues or is completed.
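The advantage of the 4-wire (Kelvin) arrangement mentioned above can be illustrated with a minimal numerical sketch. All values below are illustrative assumptions, not Amptec 640SL specifications:

```python
# Why a 4-wire (Kelvin) measurement removes test-lead resistance.
# All component values are illustrative, not instrument specs.

I_test = 1e-3        # sourced test current, A (a deliberately low level)
R_dut = 10.0         # true resistance of the device under test, ohms
R_lead = 0.5         # resistance of each current-carrying lead, ohms

# 2-wire: the meter senses the drop across the DUT plus both leads,
# so the computed resistance is biased high by 2 * R_lead.
v_2wire = I_test * (R_dut + 2 * R_lead)
r_2wire = v_2wire / I_test

# 4-wire: dedicated sense leads carry essentially no current, so the
# sensed voltage is the drop across the DUT alone.
v_4wire = I_test * R_dut
r_4wire = v_4wire / I_test

print(r_2wire)  # biased estimate including lead resistance
print(r_4wire)  # lead error eliminated
```

The bias in the 2-wire reading grows with lead resistance, which is why 4-wire sensing matters most for low-resistance measurements at low test currents.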
Forthcoming indefinite contract review procedure
Human Resources Department
2011-01-01
The vacancy notices for posts opened with a view to the award of an indefinite contract will be published in early April 2011. In the meantime, the list of posts to be opened this spring is available at the following address: Indefinite contract posts - spring 2011 A second exercise will take place in autumn 2011 and, as of 2012, the indefinite contract award procedure will only be held once a year, in autumn. For more information please consult: https://hr-recruit.web.cern.ch/hr-recruit/staff/IndefiniteContracts.asp
EFFECTIVE PROCEDURES FOR EXECUTIVES' PREPARATION
Directory of Open Access Journals (Sweden)
Elisabet Martínez Mondéjar
2014-05-01
The aim of this article is to present a methodological procedure that facilitates executives' work with teachers who are preparing for promotion to management posts from their own workplaces. It draws on a manual directed at counseling executives on how to carry out the focalization, selection, evaluation, and promotion of teachers preparing for such posts, within the work system of the different levels of management of the University of Pedagogical Sciences "Felix Varela Morales" of Villa Clara.
Linear regression in astronomy. II
Feigelson, Eric D.; Babu, Gutti J.
1992-01-01
A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
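The first class of procedures, an unweighted regression line with bootstrap resampling, can be sketched in a few lines. The data here are synthetic and purely illustrative; the paper itself works with cosmic distance scale data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: a noisy linear relation y = 2x + 1 (not from the paper)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.5, size=x.size)

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    slope, intercept = np.polyfit(x, y, deg=1)
    return slope, intercept

# Bootstrap: resample (x, y) pairs with replacement and refit the line;
# the spread of the refitted slopes estimates the slope's uncertainty.
n_boot = 2000
slopes = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, x.size, size=x.size)
    slopes[b], _ = fit_line(x[idx], y[idx])

slope_hat, intercept_hat = fit_line(x, y)
slope_err = slopes.std(ddof=1)            # bootstrap standard error
ci_lo, ci_hi = np.percentile(slopes, [2.5, 97.5])  # 95% percentile interval
print(f"slope = {slope_hat:.3f} +/- {slope_err:.3f}, 95% CI [{ci_lo:.3f}, {ci_hi:.3f}]")
```

The jackknife variant mentioned in the abstract is similar but refits the line n times, leaving out one data point at a time, rather than resampling with replacement.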