A proof of fulfillment of the strong bootstrap condition
International Nuclear Information System (INIS)
Fadin, V.S.; Papa, A.
2002-01-01
It is shown that the kernel of the BFKL equation for the octet color state of two Reggeized gluons satisfies the strong bootstrap condition in the next-to-leading order. This condition is much more restrictive than the one obtained from the requirement of the Reggeized form for the elastic scattering amplitudes in the next-to-leading approximation. It is necessary, however, for self-consistency of the assumption of the Reggeized form of the production amplitudes in multi-Regge kinematics, which are used in the derivation of the BFKL equation. The fulfillment of the strong bootstrap condition for the kernel opens the way to a rigorous proof of the BFKL equation in the next-to-leading approximation. (author)
Bootstrap support is not first-order correct.
Susko, Edward
2009-04-01
The appropriate interpretation of bootstrap support for splits and the question of what constitutes large bootstrap support have received considerable attention. One desirable interpretation, indeed the interpretation that was put forward when bootstrap support for splits was first introduced, is that 1-minus bootstrap support is a P value for the hypothesis that the split is not well resolved. As a P value, bootstrap support has been argued to be first-order correct. By obtaining the limiting distribution of bootstrap support for a split when maximum likelihood estimation is conducted, it is shown that bootstrap support is not first-order correct and insight is provided into the nature of the problem. Borrowing from earlier results, it is also shown that similar results hold when the neighbor-joining algorithm is used. Examples suggest that bootstrap support is generally conservative as a P value and give insight as to why this is usually the case. The analysis indicates that the problem is largely due to the unusual nature of tree space where boundary trees always have at least 2 neighbors.
Spurious 99% bootstrap and jackknife support for unsupported clades.
Simmons, Mark P; Freudenstein, John V
2011-10-01
Quantifying branch support using the bootstrap and/or jackknife is generally considered to be an essential component of rigorous parsimony and maximum likelihood phylogenetic analyses. Previous authors have described how application of the frequency-within-replicates approach to treating multiple equally optimal trees found in a given bootstrap pseudoreplicate can provide apparent support for otherwise unsupported clades. We demonstrate how a similar problem may occur when a non-representative subset of equally optimal trees is held per pseudoreplicate, which we term the undersampling-within-replicates artifact. We illustrate the frequency-within-replicates and undersampling-within-replicates bootstrap and jackknife artifacts using both contrived and empirical examples, demonstrate that the artifacts can occur in both parsimony and likelihood analyses, and show that the artifacts occur in outputs from multiple different phylogenetic-inference programs. Based on our results, we make the following five recommendations, which are particularly relevant to supermatrix analyses, but apply to all phylogenetic analyses. First, when two or more optimal trees are found in a given pseudoreplicate they should be summarized using the strict-consensus rather than the frequency-within-replicates approach. Second, jackknife resampling should be used rather than bootstrap resampling. Third, multiple tree searches while holding multiple trees per search should be conducted in each pseudoreplicate rather than conducting only a single search and holding only a single tree. Fourth, branches with a minimum possible optimized length of zero should be collapsed within each tree search rather than collapsing branches only if their maximum possible optimized length is zero. Fifth, resampling values should be mapped onto the strict consensus of all optimal trees found rather than simply presenting the ≥ 50% bootstrap or jackknife tree or mapping the resampling values onto a single optimal tree.
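The contrast between the frequency-within-replicates and strict-consensus summaries can be sketched in a few lines. The pseudoreplicates, trees, and split labels below are contrived toy values, not data from the paper; each tree is represented simply as a set of split labels:

```python
from collections import Counter

def frequency_within_replicates(replicates):
    """Each replicate holds >= 1 equally optimal trees (sets of splits).
    A split found in k of t optimal trees within a replicate contributes k/t."""
    support = Counter()
    for trees in replicates:
        t = len(trees)
        for tree in trees:
            for split in tree:
                support[split] += 1 / t
    n = len(replicates)
    return {s: v / n for s, v in support.items()}

def strict_consensus_within_replicates(replicates):
    """A split counts in a replicate only if it is in *every* optimal tree."""
    support = Counter()
    for trees in replicates:
        common = set.intersection(*(set(t) for t in trees))
        for split in common:
            support[split] += 1
    n = len(replicates)
    return {s: v / n for s, v in support.items()}

# Toy example: two pseudoreplicates; split "AB" appears in only one of the
# two equally optimal trees in each replicate, so it is actually unsupported.
rep1 = [{"AB", "CD"}, {"AC", "CD"}]
rep2 = [{"AB", "EF"}, {"AD", "EF"}]
replicates = [rep1, rep2]

fwr = frequency_within_replicates(replicates)
sc = strict_consensus_within_replicates(replicates)
print(fwr["AB"])           # apparent support under frequency-within-replicates
print(sc.get("AB", 0.0))   # no support under strict consensus
```

The toy run shows the artifact the authors describe: the frequency-within-replicates summary credits the unsupported split "AB" in every replicate, while the strict-consensus summary gives it zero.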
Bootstrap-based Support of HGT Inferred by Maximum Parsimony
Directory of Open Access Journals (Sweden)
Nakhleh Luay
2010-05-01
Background: Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results: In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions: We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
Bootstrap-based support of HGT inferred by maximum parsimony.
Park, Hyun Jung; Jin, Guohua; Nakhleh, Luay
2010-05-05
Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
Directory of Open Access Journals (Sweden)
Müller Kai F
2005-10-01
Background: For parsimony analyses, the most common way to estimate confidence is by resampling plans (nonparametric bootstrap, jackknife) and Bremer support (decay indices). The recent literature reveals that parameter settings that are quite commonly employed are not those that are recommended by theoretical considerations and by previous empirical studies. The optimal search strategy to be applied during resampling was previously addressed solely via standard search strategies available in PAUP*. The question of a compromise between search extensiveness and improved support accuracy for Bremer support received even less attention. A set of experiments was conducted on different datasets to find an empirical cut-off point at which increased search extensiveness no longer significantly changes Bremer support and jackknife or bootstrap proportions. Results: For the number of replicates needed for accurate estimates of support in resampling plans, a diagram is provided that helps to address the question of whether apparently different support values really differ significantly. It is shown that the use of random addition cycles and parsimony ratchet iterations during bootstrapping does not translate into higher support, nor does any extension of the search extensiveness beyond the rather moderate effort of TBR (tree bisection and reconnection) branch swapping plus saving one tree per replicate. Instead, in case of very large matrices, saving more than one shortest tree per iteration and using a strict consensus tree of these yields decreased support compared to saving only one tree. This can be interpreted as a small risk of overestimating support but should be more than compensated by other factors that counteract an enhanced type I error. With regard to Bremer support, a rule of thumb can be derived stating that not much is gained relative to the surplus computational effort when searches are extended beyond 20 ratchet iterations per
Susko, Edward
2010-07-01
The most frequent measure of phylogenetic uncertainty for splits is bootstrap support. Although large bootstrap support intuitively suggests that a split in a tree is well supported, it has not been clear how large bootstrap support needs to be to conclude that there is significant evidence that a hypothesized split is present. Indeed, recent work has shown that bootstrap support is not first-order correct and thus cannot be directly used for hypothesis testing. We present methods that adjust bootstrap support values in a maximum likelihood (ML) setting so that they have an interpretation corresponding to P values in conventional hypothesis testing; for instance, adjusted bootstrap support larger than 95% occurs only 5% of the time if the split is not present. Through examples and simulation settings, it is found that adjustments always increase the level of support. We also find that the nature of the adjustment is fairly constant across parameter settings. Finally, we consider adjustments that take into account the data-dependent nature of many hypotheses about splits: the hypothesis that they are present is being tested because they are in the tree estimated through ML. Here, in contrast, we find that bootstrap probability often needs to be adjusted downwards.
Bootstrap, Wild Bootstrap and Generalized Bootstrap
Mammen, Enno
1995-01-01
Some modifications and generalizations of the bootstrap procedure have been proposed. In this note we will consider the wild bootstrap and the generalized bootstrap and we will give two arguments why it makes sense to use these modifications instead of the original bootstrap. The first argument is that there exist examples where the generalized and wild bootstrap work, but where the original bootstrap fails and breaks down. The second argument will be based on higher order considerations. We will show...
Combining Bootstrap Aggregation with Support Vector Regression for Small Blood Pressure Measurement.
Lee, Soojeong; Ahmad, Awais; Jeon, Gwanggil
2018-02-28
Blood pressure measurement based on oscillometry is one of the most popular techniques to check the health condition of individual subjects. This paper proposes a support vector regression (SVR) fusion estimator with a bootstrap technique for oscillometric blood pressure (BP) estimation. However, some inherent problems exist with this approach. First, it is not simple to identify the best SVR estimator, and worthy information might be omitted when selecting one SVR estimator and discarding others. Additionally, our input feature data, acquired from only five BP measurements per subject, represent a very small sample size. This constitutes a critical limitation when utilizing the SVR technique and can cause overfitting or underfitting, depending on the structure of the algorithm. To overcome these challenges, a fusion with an asymptotic approach (based on combining the bootstrap with the SVR technique) is utilized to generate the pseudo features needed to predict the BP values. This ensemble estimator using the SVR technique can learn to effectively mimic the non-linear relations between the input data acquired from the oscillometry and the nurse's BPs.
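The ensemble idea here (bootstrap aggregation over a regression base learner, aimed at the small-sample problem) can be sketched with standard-library Python. To keep the sketch dependency-free it substitutes ordinary least squares for the SVR base learner, and the five (feature, BP) pairs are invented:

```python
import random
import statistics

def fit_line(xs, ys):
    """Ordinary least-squares slope/intercept (a stand-in for the SVR base
    learner in the paper; the bagging logic is what the sketch illustrates)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return b, my - b * mx

def bagged_predict(xs, ys, x_new, n_boot=200, seed=1):
    """Bootstrap aggregation: refit the base learner on resampled
    (x, y) pairs and average the predictions, easing the small-sample
    instability (here only five measurements per subject)."""
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        # Skip degenerate resamples where all x values coincide.
        if len({xs[i] for i in idx}) < 2:
            continue
        b, a = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a + b * x_new)
    return statistics.fmean(preds)

# Hypothetical oscillometric features and nurse-measured systolic BPs:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [110.0, 115.0, 118.0, 124.0, 130.0]
print(bagged_predict(xs, ys, 3.5))
```

The bagged prediction stays close to the single full-sample fit on this toy data, but its averaging over resamples is what damps the variance of the base learner when the sample is as small as five points.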
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
Bootstrapping heteroskedastic regression models: wild bootstrap vs. pairs bootstrap
Emmanuel Flachaire
2005-01-01
In regression models, appropriate bootstrap methods for inference robust to heteroskedasticity of unknown form are the wild bootstrap and the pairs bootstrap. The finite sample performance of a heteroskedastic-robust test is investigated with Monte Carlo experiments. The simulation results suggest that one specific version of the wild bootstrap outperforms the other versions of the wild bootstrap and of the pairs bootstrap. It is the only one for which the bootstrap test always gives better r...
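A minimal sketch of the two schemes for a simple linear model. The Rademacher auxiliary distribution used for the wild bootstrap below is one common version (the paper compares several), and the heteroskedastic data are made up:

```python
import random

def ols_slope(xs, ys):
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def wild_bootstrap_slopes(xs, ys, n_boot, rng):
    """Wild bootstrap: keep the regressors fixed and perturb each fitted
    residual by an independent Rademacher draw, preserving the
    heteroskedasticity pattern of the residuals."""
    b = ols_slope(xs, ys)
    a = sum(ys) / len(ys) - b * sum(xs) / len(xs)
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    out = []
    for _ in range(n_boot):
        ystar = [a + b * x + r * rng.choice((-1.0, 1.0))
                 for x, r in zip(xs, resid)]
        out.append(ols_slope(xs, ystar))
    return out

def pairs_bootstrap_slopes(xs, ys, n_boot, rng):
    """Pairs bootstrap: resample (x, y) observations jointly."""
    n = len(xs)
    out = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        if len({xs[i] for i in idx}) < 2:
            continue  # degenerate resample: slope undefined
        out.append(ols_slope([xs[i] for i in idx], [ys[i] for i in idx]))
    return out

rng = random.Random(42)
xs = [float(i) for i in range(1, 11)]
ys = [2.0 * x + rng.gauss(0.0, 0.5 * x) for x in xs]  # error variance grows with x
wild = wild_bootstrap_slopes(xs, ys, 500, rng)
pairs = pairs_bootstrap_slopes(xs, ys, 500, rng)
print(sum(wild) / len(wild), sum(pairs) / len(pairs))
```

Both bootstrap distributions center near the full-sample OLS slope; in a test one would compare a studentized statistic against the quantiles of such a bootstrap distribution rather than against asymptotic critical values.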
Pfiffner, H. J.
1969-01-01
Circuit can sample a number of transducers in sequence without drawing current from them. This bootstrap unloader uses a differential amplifier with one input connected to a circuit which is the equivalent of the circuit to be unloaded, and the other input delivering the proper unloading currents.
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
Wild bootstrap versus moment-oriented bootstrap
Sommerfeld, Volker
1997-01-01
We investigate the relative merits of a “moment-oriented” bootstrap method of Bunke (1997) in comparison with the classical wild bootstrap of Wu (1986) in nonparametric heteroscedastic regression situations. The “moment-oriented” bootstrap is a wild bootstrap based on local estimators of higher order error moments that are smoothed by kernel smoothers. In this paper we perform an asymptotic comparison of these two different bootstrap procedures. We show that the moment-oriented bootstrap is in ...
Beran, Rudolf
1994-01-01
This essay is organized around the theoretical and computational problem of constructing bootstrap confidence sets, with forays into related topics. The seven section headings are: Introduction; The Bootstrap World; Bootstrap Confidence Sets; Computing Bootstrap Confidence Sets; Quality of Bootstrap Confidence Sets; Iterated and Two-step Bootstrap; Further Resources.
Ultrafast approximation for phylogenetic bootstrap.
Minh, Bui Quang; Nguyen, Minh Anh Thi; von Haeseler, Arndt
2013-05-01
Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and the Shimodaira-Hasegawa-like approximate likelihood ratio test have been introduced to speed up the bootstrap. Here, we suggest an ultrafast bootstrap approximation approach (UFBoot) to compute the support of phylogenetic groups in maximum likelihood (ML) based trees. To achieve this, we combine the resampling estimated log-likelihood method with a simple but effective collection scheme of candidate trees. We also propose a stopping rule that assesses the convergence of branch support values to automatically determine when to stop collecting candidate trees. UFBoot achieves a median speed up of 3.1 (range: 0.66-33.3) to 10.2 (range: 1.32-41.4) compared with RAxML RBS for real DNA and amino acid alignments, respectively. Moreover, our extensive simulations show that UFBoot is robust against moderate model violations and the support values obtained appear to be relatively unbiased compared with the conservative standard bootstrap. This provides a more direct interpretation of the bootstrap support. We offer an efficient and easy-to-use software (available at http://www.cibiv.at/software/iqtree) to perform the UFBoot analysis with ML tree inference.
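The resampling-estimated-log-likelihood (RELL) ingredient that UFBoot builds on can be sketched directly: given the per-site log-likelihoods of a fixed set of candidate trees, resample sites and count how often each tree wins, instead of re-optimizing trees on every bootstrap alignment. The matrix below is a made-up toy example, not real data:

```python
import random

def rell_support(site_loglik, n_boot=1000, seed=0):
    """RELL bootstrap: site_loglik[t][s] is the log-likelihood of site s
    under candidate tree t. Resample site indices with replacement, sum
    the per-site log-likelihoods, and count how often each tree wins."""
    rng = random.Random(seed)
    n_sites = len(site_loglik[0])
    wins = [0] * len(site_loglik)
    for _ in range(n_boot):
        sites = [rng.randrange(n_sites) for _ in range(n_sites)]
        scores = [sum(row[s] for s in sites) for row in site_loglik]
        wins[scores.index(max(scores))] += 1
    return [w / n_boot for w in wins]

# Hypothetical per-site log-likelihoods for three candidate trees:
site_loglik = [
    [-1.0, -1.2, -0.9, -1.1, -1.0],   # tree 0: best at most sites
    [-1.1, -1.1, -1.3, -1.2, -1.2],   # tree 1: best at one site
    [-1.4, -1.3, -1.2, -1.3, -1.4],   # tree 2: worst everywhere
]
print(rell_support(site_loglik))
```

Because no tree search is repeated inside the loop, each pseudoreplicate costs only a resampled sum, which is the source of the large speedups reported above; UFBoot adds to this a scheme for collecting a good candidate-tree set and a convergence-based stopping rule.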
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap.This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
UFBoot2: Improving the Ultrafast Bootstrap Approximation.
Hoang, Diep Thi; Chernomor, Olga; von Haeseler, Arndt; Minh, Bui Quang; Vinh, Le Sy
2018-02-01
The standard bootstrap (SBS), despite being computationally intensive, is widely used in maximum likelihood phylogenetic analyses. We recently proposed the ultrafast bootstrap approximation (UFBoot) to reduce computing time while achieving more unbiased branch supports than SBS under mild model violations. UFBoot has been steadily adopted as an efficient alternative to SBS and other bootstrap approaches. Here, we present UFBoot2, which substantially accelerates UFBoot and reduces the risk of overestimating branch supports due to polytomies or severe model violations. Additionally, UFBoot2 provides suitable bootstrap resampling strategies for phylogenomic data. UFBoot2 is 778 times (median) faster than SBS and 8.4 times (median) faster than RAxML rapid bootstrap on tested data sets. UFBoot2 is implemented in the IQ-TREE software package version 1.6 and freely available at http://www.iqtree.org. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks Q. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons G exist as longitudinal modes V_L of the weak bosons and would couple to Q with Yukawa coupling λ_Q. With m_Q ≳ 700 GeV from the LHC, the strong λ_Q ≳ 4 could lead to deeply bound QQ̄ states. We postulate that the leading “collapsed state,” the color-singlet (heavy isotriplet), pseudoscalar QQ̄ meson π_1, is G itself, and a gap equation without a Higgs is constructed. Dynamical symmetry breaking is effected via strong λ_Q, generating m_Q while self-consistently justifying treating G as massless in the loop; hence, “bootstrap.” Solving such a gap equation, we find that m_Q should be several TeV, or λ_Q ≳ 4π, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find analogy with the π–N system, by which we conjecture the possible annihilation phenomena of QQ̄ → n V_L with high multiplicity, the search of which might be aided by Yukawa-bound QQ̄ resonances.
Adsorbate-mediated strong metal-support interactions in oxide-supported Rh catalysts.
Matsubu, John C; Zhang, Shuyi; DeRita, Leo; Marinkovic, Nebojsa S; Chen, Jingguang G; Graham, George W; Pan, Xiaoqing; Christopher, Phillip
2017-02-01
The optimization of supported metal catalysts predominantly focuses on engineering the metal site, for which physical insights based on extensive theoretical and experimental contributions have enabled the rational design of active sites. Although it is well known that supports can influence the catalytic properties of metals, insights into how metal-support interactions can be exploited to optimize metal active-site properties are lacking. Here we utilize in situ spectroscopy and microscopy to identify and characterize a support effect in oxide-supported heterogeneous Rh catalysts. This effect is characterized by strongly bound adsorbates (HCOx) on reducible oxide supports (TiO2 and Nb2O5) that induce oxygen-vacancy formation in the support and cause HCOx-functionalized encapsulation of Rh nanoparticles by the support. The encapsulation layer is permeable to reactants, stable under the reaction conditions and strongly influences the catalytic properties of Rh, which enables rational and dynamic tuning of CO2-reduction selectivity.
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
… power-law avalanches, while the continuous transition is characterized by truncated avalanches in a related sequential bootstrap process. We explain this behaviour on the basis of an analytical and numerical study of the avalanche distributions on ...
Energy Technology Data Exchange (ETDEWEB)
Castedo Echeverri, Alejandro [SISSA, Trieste (Italy); INFN, Trieste (Italy); Harling, Benedict von [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Serone, Marco [SISSA, Trieste (Italy); INFN, Trieste (Italy); ICTP, Trieste (Italy)
2016-06-15
We study the numerical bounds obtained using a conformal-bootstrap method where different points in the plane of conformal cross ratios z and z̄ are sampled. In contrast to the most used method based on derivatives evaluated at the symmetric point z = z̄ = 1/2, we can consistently “integrate out” higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this “effective” bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n = 2, 3, 4, which have not yet been computed using bootstrap techniques.
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger
… new resampling method, the local fractional bootstrap, relies on simulating an auxiliary fractional Brownian motion that mimics the fine properties of high-frequency differences of the Brownian semistationary process under the null hypothesis. We prove the first-order validity of the bootstrap method … to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data.
Energy Technology Data Exchange (ETDEWEB)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group
2017-02-15
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models. A theory that saturates this bound is not known yet.
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2013-08-16
We implement the conformal bootstrap for N=4 superconformal field theories in four dimensions. The consistency of the four-point function of the stress-energy tensor multiplet imposes significant upper bounds for the scaling dimensions of unprotected local operators as functions of the central charge of the theory. At the threshold of exclusion, a particular operator spectrum appears to be singled out by the bootstrap constraints. We conjecture that this extremal spectrum is that of N=4 supersymmetric Yang-Mills theory at an S-duality invariant value of the complexified gauge coupling.
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
by presenting an analytic and numerical study of the problem on a Bethe lattice. The Bethe lattice does not capture all the complexities of bootstrap dynamics on periodic lattices but it does provide useful insight into what makes the avalanche distributions different in the two cases. The following presentation is self-contained ...
The bootstrap and Bayesian bootstrap method in assessing bioequivalence
International Nuclear Information System (INIS)
Wan Jianping; Zhang Kongsheng; Chen Hui
2009-01-01
Parametric methods for assessing individual bioequivalence (IBE) rely on the hypothesis that the PK responses are normal. A nonparametric alternative for evaluating IBE is the bootstrap method. In 2001, the United States Food and Drug Administration (FDA) proposed a draft guidance. The purpose of this article is to evaluate the IBE between a test drug and a reference drug by the bootstrap and Bayesian bootstrap methods. We study the power of the bootstrap test procedures and of the parametric test procedures in FDA (2001). We find that the Bayesian bootstrap method performs best.
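The two resampling schemes can be sketched side by side. The Bayesian bootstrap (Rubin's construction) draws flat Dirichlet weights as the spacings of sorted Uniform(0,1) draws instead of integer resampling counts; the PK response values below are invented:

```python
import random

def bootstrap_means(data, n_rep, rng):
    """Ordinary nonparametric bootstrap of the sample mean."""
    n = len(data)
    return [sum(rng.choice(data) for _ in range(n)) / n for _ in range(n_rep)]

def bayesian_bootstrap_means(data, n_rep, rng):
    """Bayesian bootstrap: weight the observations with a flat Dirichlet
    draw, obtained as the gaps between sorted Uniform(0,1) variates."""
    n = len(data)
    out = []
    for _ in range(n_rep):
        cuts = sorted(rng.random() for _ in range(n - 1))
        gaps = [b - a for a, b in zip([0.0] + cuts, cuts + [1.0])]
        out.append(sum(w * x for w, x in zip(gaps, data)))
    return out

rng = random.Random(7)
auc = [98.0, 105.0, 101.0, 110.0, 95.0, 103.0]  # hypothetical PK responses
bb = bayesian_bootstrap_means(auc, 2000, rng)
nb = bootstrap_means(auc, 2000, rng)
print(sum(bb) / len(bb), sum(nb) / len(nb))
```

Both posterior/bootstrap distributions center on the sample mean; in an IBE analysis one would apply such resampling to the aggregate bioequivalence criterion rather than to a plain mean, but the weighting-versus-counting contrast between the two schemes is the same.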
Effects of parameter estimation on maximum-likelihood bootstrap analysis.
Ripplinger, Jennifer; Abdo, Zaid; Sullivan, Jack
2010-08-01
Bipartition support in maximum-likelihood (ML) analysis is most commonly assessed using the nonparametric bootstrap. Although bootstrap replicates should theoretically be analyzed in the same manner as the original data, model selection is almost never conducted for bootstrap replicates, substitution-model parameters are often fixed to their maximum-likelihood estimates (MLEs) for the empirical data, and bootstrap replicates may be subjected to less rigorous heuristic search strategies than the original data set. Even though this approach may increase computational tractability, it may also lead to the recovery of suboptimal tree topologies and affect bootstrap values. However, since well-supported bipartitions are often recovered regardless of method, use of a less intensive bootstrap procedure may not significantly affect the results. In this study, we investigate the impact of parameter estimation (i.e., assessment of substitution-model parameters and tree topology) on ML bootstrap analysis. We find that while forgoing model selection and/or setting substitution-model parameters to their empirical MLEs may lead to significantly different bootstrap values, it probably would not change their biological interpretation. Similarly, even though the use of reduced search methods often results in significant differences among bootstrap values, only omitting branch swapping is likely to change any biological inferences drawn from the data. Copyright 2010 Elsevier Inc. All rights reserved.
Hawkins, Jennifer S; Ramachandran, Dhanushya; Henderson, Ashley; Freeman, Jasmine; Carlise, Michael; Harris, Alex; Willison-Headley, Zachary
2015-08-01
Sorghum is an essential grain crop whose evolutionary placement within the Andropogoneae has been the subject of scrutiny for decades. Early studies using cytogenetic and morphological data point to a poly- or paraphyletic origin of the genus; however, acceptance of poly- or paraphyly has been met with resistance. This study aimed to address the species relationships within Sorghum, in addition to the placement of Sorghum within the tribe, using a phylogenetic approach and employing broad taxon sampling. From 16 diverse Sorghum species, eight low-copy nuclear loci were sequenced that are known to play a role in morphological diversity and have been previously used to study evolutionary relationships in grasses. Further, the data for four of these loci were combined with those from 57 members of the Andropogoneae in order to determine the placement of Sorghum within the tribe. Both maximum likelihood and Bayesian analyses were performed on multilocus concatenated data matrices. The Sorghum-specific topology provides strong support for two major lineages, in alignment with earlier studies employing chloroplast and internal transcribed spacer (ITS) markers. Clade I is composed of the Eu-, Chaeto- and Heterosorghum, while clade II contains the Stipo- and Parasorghum. When combined with data from the Andropogoneae, Clade II resolves as sister to a clade containing Miscanthus and Saccharum with high posterior probability and bootstrap support, and to the exclusion of Clade I. The results provide compelling evidence for a two-lineage polyphyletic ancestry of Sorghum within the larger Andropogoneae, i.e. the derivation of the two major Sorghum clades from a unique common ancestor. Rejection of monophyly in previous molecular studies is probably due to limited taxon sampling outside of the genus. The clade consisting of Para- and Stiposorghum resolves as sister to Miscanthus and Saccharum with strong node support. © The Author 2015. Published by Oxford University Press on
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
transition. The first-order transition encountered in bootstrap problems often has a mixed character in the sense that the discontinuous drop in magnetization is ... $$P_n = p \sum_{k=0}^{z-1} \binom{z-1}{k} \, [P_{n+1}]^k \, [1-P_{n+1}]^{z-1-k} \, p_{k+1}, \qquad p_{k+1} = \begin{cases} 1 & \text{if } k+1 \ge m, \\ 0 & \text{if } k+1 < m. \end{cases} \quad (1)$$ The rationale for the above equation is as follows.
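The recursion in Eq. (1) can be iterated numerically on a Bethe lattice with coordination number z and culling threshold m; only the terms with p_{k+1} = 1, i.e. k + 1 ≥ m, survive the sum. A minimal sketch (the function name is ours):

```python
from math import comb

def surviving_fraction(p, z, m, n_iter=200):
    """Iterate Eq. (1): P <- p * sum_{k=m-1}^{z-1} C(z-1, k) P^k (1-P)^(z-1-k).
    Terms with k + 1 < m drop out because p_{k+1} = 0 there."""
    P = p  # boundary condition: start from the bare occupation probability
    for _ in range(n_iter):
        P = p * sum(comb(z - 1, k) * P**k * (1 - P)**(z - 1 - k)
                    for k in range(m - 1, z))
    return P
```

Scanning p for fixed z and m then locates the transition discussed above: below the threshold the iteration flows to P = 0, above it to a finite fixed point.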
Promoting Strong ISO 50001 Outcomes with Supportive National Infrastructure
Energy Technology Data Exchange (ETDEWEB)
McKane, Aimee, T.; Siciliano, Graziella; de los Reyes, Pamela
2015-08-04
The ISO 50001 standard is a key mechanism for reducing greenhouse gas emissions and improving energy efficiency globally. An increasing number of companies are seeking certification, creating the need for personnel that are competent to conduct ISO 50001 certification audits. The growth of ISO 50001 is expected to accelerate as more companies integrate ISO 50001 into their corporate sustainability strategies and supplier requirements. Robust implementation of ISO 50001 represents an important tool for countries with climate change mitigation goals. Because of its dual focus on continual improvement of an organization’s energy management system (EnMS) and its energy performance improvement, ISO 50001 requires skills of both implementers and certification auditors that are not well-supported by current credentials and training. This paper describes an effort to address skill gaps of certification auditors, a critical factor to ensure that ISO 50001 implementations are robust and result in continued energy performance improvement. A collaboration of governments through the Energy Management Working Group (EMWG), formerly under Global Superior Energy Performance (GSEP), has formed to build workforce capacity for ISO 50001 certification audits. The EMWG is leading the development of an internationally-relevant certification scheme for ISO 50001 Lead Auditor that meets requirements for ISO/IEC 17024 accreditation and ISO 50003 for defining ISO 50001 Lead Auditor competency. Wider availability of competent ISO 50001 Lead Auditors will ultimately increase the impact and market value of ISO 50001 certification and improve consistency of ISO 50001 certification outcomes by establishing a standardized and high level of knowledge and skills globally.
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Explorations in Statistics: the Bootstrap
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
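The empirical approach described here amounts to resampling the observed data with replacement and recomputing the statistic on each resample; a minimal sketch (the function name is ours):

```python
import random
import statistics

def bootstrap_se(sample, stat=statistics.mean, n_boot=2000, seed=42):
    """Bootstrap standard error: the spread of a statistic across
    resamples drawn with replacement from the observed data."""
    rng = random.Random(seed)
    n = len(sample)
    replicates = [stat([rng.choice(sample) for _ in range(n)])
                  for _ in range(n_boot)]
    return statistics.stdev(replicates)
```

For the sample mean, the result should land close to the textbook standard error s/√n, which is one way to check the machinery before applying it to statistics with no closed-form formula.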
The wild tapered block bootstrap
DEFF Research Database (Denmark)
Hounyo, Ulrich
In this paper, a new resampling procedure, called the wild tapered block bootstrap, is introduced as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent heterogeneous data. The method consists in tapering each overlapping block ... -based method in terms of asymptotic accuracy of variance estimation and distribution approximation. For stationary time series, the asymptotic validity and the favorable bias properties of the new bootstrap method are shown in two important cases: smooth functions of means, and M-estimators. The first-order asymptotic validity of the tapered block bootstrap as well as the wild tapered block bootstrap approximation to the actual distribution of the sample mean is also established when data are assumed to satisfy a near epoch dependent condition. The consistency of the bootstrap variance estimator for the sample ...
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tensor ...
Heptagons from the Steinmann cluster bootstrap
Energy Technology Data Exchange (ETDEWEB)
Dixon, Lance J.; McLeod, Andrew J. [Stanford Univ., CA (United States). SLAC National Accelerator Lab.; Drummond, James [Southampton Univ. (United Kingdom). School of Physics and Astronomy; Harrington, Thomas; Spradlin, Marcus [Brown Univ., Providence, RI (United States). Dept. of Physics; Papathanasiou, Georgios [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Stanford Univ., CA (United States). SLAC National Accelerator Lab.
2016-12-15
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar N=4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal (anti Q) relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
MPBoot: fast phylogenetic maximum parsimony tree inference and bootstrap approximation.
Hoang, Diep Thi; Vinh, Le Sy; Flouri, Tomáš; Stamatakis, Alexandros; von Haeseler, Arndt; Minh, Bui Quang
2018-02-02
The nonparametric bootstrap is widely used to measure the branch support of phylogenetic trees. However, bootstrapping is computationally expensive and remains a bottleneck in phylogenetic analyses. Recently, an ultrafast bootstrap approximation (UFBoot) approach was proposed for maximum likelihood analyses. However, such an approach is still missing for maximum parsimony. To close this gap we present MPBoot, an adaptation and extension of UFBoot to compute branch supports under the maximum parsimony principle. MPBoot works for both uniform and non-uniform cost matrices. Our analyses on biological DNA and protein data showed that under uniform cost matrices, MPBoot runs on average 4.7 (DNA) to 7 times (protein data) (range: 1.2-20.7) faster than the standard parsimony bootstrap implemented in PAUP*, but 1.6 (DNA) to 4.1 times (protein data) slower than the standard bootstrap with a fast search routine in TNT (fast-TNT). However, for non-uniform cost matrices MPBoot is 5 (DNA) to 13 times (protein data) (range: 0.3-63.9) faster than fast-TNT. We note that MPBoot achieves better scores more frequently than PAUP* and fast-TNT. However, this effect is less pronounced if an intensive but slower search in TNT is invoked. Moreover, experiments on large-scale simulated data show that while both PAUP* and TNT bootstrap estimates are too conservative, MPBoot bootstrap estimates appear more unbiased. MPBoot provides an efficient alternative to the standard maximum parsimony bootstrap procedure. It shows favorable performance in terms of run time, the capability of finding a maximum parsimony tree, and high bootstrap accuracy on simulated as well as empirical data sets. MPBoot is easy-to-use, open-source and available at http://www.cibiv.at/software/mpboot .
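The nonparametric bootstrap that these tools accelerate resamples alignment columns with replacement and re-infers a tree from each pseudo-replicate; branch support is the fraction of replicate trees that recover the branch. A toy sketch of the resampling step (a hypothetical distance-based criterion stands in for a real tree search; all names are ours):

```python
import random

def resample_columns(alignment, rng):
    """One bootstrap pseudo-replicate: draw alignment columns with replacement."""
    n_sites = len(next(iter(alignment.values())))
    cols = [rng.randrange(n_sites) for _ in range(n_sites)]
    return {name: "".join(seq[c] for c in cols) for name, seq in alignment.items()}

def branch_support(alignment, branch_recovered, n_boot=200, seed=1):
    """Fraction of pseudo-replicates whose inferred tree contains the branch."""
    rng = random.Random(seed)
    hits = sum(branch_recovered(resample_columns(alignment, rng))
               for _ in range(n_boot))
    return hits / n_boot

def hamming(a, b):
    """Number of differing sites between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))
```

With three taxa where A and B are nearly identical and C is distant, a criterion such as `hamming(a["A"], a["B"]) < hamming(a["A"], a["C"])` is recovered in essentially every replicate, giving support near 1.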
Bootstrapping quarks and gluons
Energy Technology Data Exchange (ETDEWEB)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrapping phylogenies inferred from rearrangement data
Directory of Open Access Journals (Sweden)
Lin Yu
2012-08-01
Background: Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. Results: We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions: Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale and its receiver ...
Can bootstrapping explain concept learning?
Beck, Jacob
2017-01-01
Susan Carey's account of Quinean bootstrapping has been heavily criticized. While it purports to explain how important new concepts are learned, many commentators complain that it is unclear just what bootstrapping is supposed to be or how it is supposed to work. Others allege that bootstrapping falls prey to the circularity challenge: it cannot explain how new concepts are learned without presupposing that learners already have those very concepts. Drawing on discussions of concept learning from the philosophical literature, this article develops a detailed interpretation of bootstrapping that can answer the circularity challenge. The key to this interpretation is the recognition of computational constraints, both internal and external to the mind, which can endow empty symbols with new conceptual roles and thus new contents. Copyright © 2016 Elsevier B.V. All rights reserved.
Bootstrapping pronunciation dictionaries: practical issues
CSIR Research Space (South Africa)
Davel, MH
2005-09-01
... entries, increasing the size of the dictionary in an incremental fashion. 2.1. The bootstrapping process The bootstrapping system is initialised with a large word list (containing no pronunciation information), or with a pre-existing pronunciation ... The rule set is extracted in a straightforward fashion: for every letter (grapheme), a default phoneme is derived as the phoneme to which the letter is most likely to map. 'Exceptional' cases - words for which the expected phoneme is not correct ...
Lowell, Anne; Kildea, Sue; Liddle, Marlene; Cox, Barbara; Paterson, Barbara
2015-02-05
The Strong Women, Strong Babies, Strong Culture Program (the Program) evolved from a recognition of the value of Aboriginal knowledge and practice in promoting maternal and child health (MCH) in remote communities of the Northern Territory (NT) of Australia. Commencing in 1993 it continues to operate today. In 2008, the NT Department of Health commissioned an evaluation to identify enabling factors and barriers to successful implementation of the Program, and to identify potential pathways for future development. In this paper we focus on the evaluation findings related specifically to the role of Aboriginal cultural knowledge and practice within the Program. A qualitative evaluation utilised purposive sampling to maximise diversity in program history and Aboriginal culture. Semi-structured, in-depth interviews with 76 participants were recorded in their preferred language with a registered Interpreter when required. Thematic analysis of data was verified or modified through further discussions with participants and members of the evaluation team. Although the importance of Aboriginal knowledge and practice as a fundamental component of the Program is widely acknowledged, there has been considerable variation across time and location in the extent to which these cultural dimensions have been included in practice. Factors contributing to this variation are complex and relate to a number of broad themes including: location of control over Program activities; recognition and respect for Aboriginal knowledge and practice as a legitimate component of health care; working in partnership; communication within and beyond the Program; access to transport and working space; and governance and organisational support. We suggest that inclusion of Aboriginal knowledge and practice as a fundamental component of the Program is key to its survival over more than twenty years despite serious challenges. Respect for the legitimacy of Aboriginal knowledge and practice within health ...
Probabilistic tractography using Lasso bootstrap.
Ye, Chuyang; Prince, Jerry L
2017-01-01
Diffusion magnetic resonance imaging (dMRI) can be used for noninvasive imaging of white matter tracts. Using fiber tracking, which propagates fiber streamlines according to fiber orientations (FOs) computed from dMRI, white matter tracts can be reconstructed for investigation of brain diseases and the brain connectome. Because of image noise, probabilistic tractography has been proposed to characterize uncertainties in FO estimation. Bootstrap provides a nonparametric approach to the estimation of FO uncertainties and residual bootstrap has been used for developing probabilistic tractography. However, recently developed models have incorporated sparsity regularization to reduce the required number of gradient directions to resolve crossing FOs, and the residual bootstrap used in previous methods is not applicable to these models. In this work, we propose a probabilistic tractography algorithm named Lasso bootstrap tractography (LBT) for the models that incorporate sparsity. Using a fixed tensor basis and a sparsity assumption, diffusion signals are modeled using a Lasso formulation. With the residuals from the Lasso model, a distribution of diffusion signals is obtained according to a modified Lasso bootstrap strategy. FOs are then estimated from the synthesized diffusion signals by an algorithm that improves FO estimation by enforcing spatial consistency of FOs. Finally, streamlining fiber tracking is performed with the computed FOs. The LBT algorithm was evaluated on simulated and real dMRI data both qualitatively and quantitatively. Results demonstrate that LBT outperforms state-of-the-art algorithms. Copyright © 2016 Elsevier B.V. All rights reserved.
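The residual-bootstrap idea on which such methods build can be sketched generically: fit a model, resample the fitted residuals, add them back to the fitted values, and refit to propagate the noise into the parameter of interest. A minimal least-squares sketch (ordinary simple regression stands in for the Lasso model of the paper; all names are ours):

```python
import random

def fit_slope(xs, ys):
    """Least-squares slope of a simple linear regression."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def residual_bootstrap_slopes(xs, ys, n_boot=1000, seed=3):
    """Residual bootstrap: resample residuals with replacement, add them to
    the fitted values, refit, and collect the refitted slopes."""
    rng = random.Random(seed)
    b = fit_slope(xs, ys)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    fitted = [my + b * (x - mx) for x in xs]
    residuals = [y - f for y, f in zip(ys, fitted)]
    return [fit_slope(xs, [f + rng.choice(residuals) for f in fitted])
            for _ in range(n_boot)]
```

The spread of the returned slopes quantifies the estimation uncertainty, which is exactly the quantity that probabilistic tractography needs per fiber orientation.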
Spectral asymptotics of a strong δ′ interaction supported by a surface
International Nuclear Information System (INIS)
Exner, Pavel; Jex, Michal
2014-01-01
Highlights: • Attractive δ′ interactions supported by a smooth surface are considered. • Surfaces can be either infinite and asymptotically planar, or compact and closed. • Spectral asymptotics is determined by the geometry of the interaction support. - Abstract: We derive an asymptotic expansion for the spectrum of Hamiltonians with a strong attractive δ′ interaction supported by a smooth surface in ℝ³, either infinite and asymptotically planar, or compact and closed. Its second term is found to be determined by a Schrödinger-type operator with an effective potential expressed in terms of the interaction support curvatures.
Double-bootstrap methods that use a single double-bootstrap simulation
Chang, Jinyuan; Hall, Peter
2014-01-01
We show that, when the double bootstrap is used to improve performance of bootstrap methods for bias correction, techniques based on using a single double-bootstrap sample for each single-bootstrap sample can be particularly effective. In particular, they produce third-order accuracy for much less computational expense than is required by conventional double-bootstrap methods. However, this improved level of performance is not available for the single double-bootstrap methods that have been s...
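The computational saving can be illustrated with bias correction of the plug-in variance, drawing a single inner resample per outer resample (a sketch in the spirit of the scheme described above, not the authors' exact algorithm; the combination 3θ̂ − 3·mean(θ*) + mean(θ**) is the textbook two-level iterated-bootstrap bias correction):

```python
import random

def plugin_var(xs):
    """Plug-in variance (divides by n); biased downward by sigma^2 / n."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def double_bootstrap_var(xs, n_boot=4000, seed=0):
    """Two-level bias-corrected variance, with only ONE inner resample
    drawn per outer resample instead of a full inner bootstrap loop."""
    rng = random.Random(seed)
    n = len(xs)
    theta = plugin_var(xs)
    outer_sum = inner_sum = 0.0
    for _ in range(n_boot):
        xstar = [rng.choice(xs) for _ in range(n)]          # outer resample
        outer_sum += plugin_var(xstar)
        xstarstar = [rng.choice(xstar) for _ in range(n)]   # single inner resample
        inner_sum += plugin_var(xstarstar)
    return 3 * theta - 3 * outer_sum / n_boot + inner_sum / n_boot
```

The single-level correction reduces the O(1/n) bias of the plug-in estimator to O(1/n²); the second level pushes it to O(1/n³), at roughly the cost of one extra resample per outer draw rather than a full inner loop.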
On using the bootstrap for multiple comparisons.
Westfall, Peter H
2011-11-01
There are many ways to bootstrap data for multiple comparisons procedures. Methods described here include (i) bootstrap (parametric and nonparametric) as a generalization of classical normal-based MaxT methods, (ii) bootstrap as an approximation to exact permutation methods, (iii) bootstrap as a generator of realistic null data sets, and (iv) bootstrap as a generator of realistic non-null data sets. Resampling of MinP versus MaxT is discussed, and the use of the bootstrap for closed testing is also presented. Applications to biopharmaceutical statistics are given.
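A minimal sketch of the MaxT idea with nonparametric resampling: center each group to enforce the complete null, bootstrap the maximum |t| across groups, and compare each observed |t| to that joint null distribution (a simplified illustration, not the paper's full procedure; all names are ours):

```python
import random
import statistics

def maxt_adjusted_pvalues(groups, null_value=0.0, n_boot=2000, seed=7):
    """MaxT-style adjusted p-values: each group's |t| statistic is referred
    to the bootstrap distribution of the maximum |t| under the joint null."""
    rng = random.Random(seed)

    def abs_t(xs, mu):
        return abs((statistics.mean(xs) - mu)
                   / (statistics.stdev(xs) / len(xs) ** 0.5))

    t_obs = [abs_t(g, null_value) for g in groups]
    # centering each group makes the resampling distribution a null distribution
    centered = [[x - statistics.mean(g) for x in g] for g in groups]
    max_t_null = []
    for _ in range(n_boot):
        ts = [abs_t([rng.choice(g) for _ in range(len(g))], 0.0)
              for g in centered]
        max_t_null.append(max(ts))
    # adjusted p-value: share of bootstrap max-|t| values at least as large
    return [sum(m >= t for m in max_t_null) / n_boot for t in t_obs]
```

Because every hypothesis is compared against the same max-|t| distribution, the adjustment controls the familywise error rate while exploiting the dependence between test statistics.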
Czech Academy of Sciences Publication Activity Database
Exner, Pavel; Pankrashkin, K.
2014-01-01
Roč. 39, č. 2 (2014), s. 193-212 ISSN 0360-5302 R&D Projects: GA ČR GAP203/11/0701 Institutional support: RVO:61389005 Keywords : Eigenvalue * Schrödinger operator * singular interaction * strong coupling * 35Q40 * 35P15 * 35J10 Subject RIV: BE - Theoretical Physics Impact factor: 1.013, year: 2014
Spectral asymptotics of a strong δ′ interaction supported by a surface
Czech Academy of Sciences Publication Activity Database
Exner, Pavel; Jex, M.
2014-01-01
Roč. 378, 30-31 (2014), s. 2091-2095 ISSN 0375-9601 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords : delta ' surface interaction * strong coupling expansion Subject RIV: BE - Theoretical Physics Impact factor: 1.683, year: 2014
On eigenvalue asymptotics for strong delta-interactions supported by surfaces with boundaries
Czech Academy of Sciences Publication Activity Database
Dittrich, Jaroslav; Exner, Pavel; Kuhn, C.; Pankrashkin, K.
2016-01-01
Roč. 97, 1-2 (2016), s. 1-25 ISSN 0921-7134 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords : singular Schrodinger operator * delta-interaction * strong coupling * eigenvalue Subject RIV: BE - Theoretical Physics Impact factor: 0.933, year: 2016
Preparation of supported Au–Pd and Cu–Pd by the combined strong ...
Indian Academy of Sciences (India)
BOONTIDA PONGTHAWORNSAKUN
2017-10-25
Abstract. TiO2-supported Au–Pd and Cu–Pd catalysts were prepared by strong electrostatic adsorption (SEA) of Pd followed by electroless deposition (ED) of a second metal with incremental surface coverages of Au or Cu. High dispersion of small Pd particles on the Pd/TiO2 prepared by SEA led to the ...
Beta limits of a completely bootstrapped tokamak
International Nuclear Information System (INIS)
Weening, R.H.; Bondeson, A.
1992-03-01
A beta limit is given for a completely bootstrapped tokamak. The beta limit is sensitive to the achievable Troyon factor and depends directly upon the strength of the tokamak bootstrap effect. (author) 16 refs
Weighted bootstrapping: a correction method for assessing the robustness of phylogenetic trees
Directory of Open Access Journals (Sweden)
Makarenkov Vladimir
2010-08-01
Background: Non-parametric bootstrapping is a widely used statistical procedure for assessing confidence of model parameters based on the empirical distribution of the observed data [1] and, as such, it has become a common method for assessing tree confidence in phylogenetics [2]. Traditional non-parametric bootstrapping does not weigh each tree inferred from resampled (i.e., pseudo-replicated) sequences. Hence, the quality of these trees is not taken into account when computing bootstrap scores associated with the clades of the original phylogeny. As a consequence, trees with different bootstrap support, or those providing a different fit to the corresponding pseudo-replicated sequences (the fit quality can be expressed through the LS, ML or parsimony score), traditionally contribute in the same way to the computation of the bootstrap support of the original phylogeny. Results: In this article, we discuss the idea of applying weighted bootstrapping to phylogenetic reconstruction by weighting each phylogeny inferred from resampled sequences. Tree weights can be based either on the least-squares (LS) tree estimate or on the average secondary bootstrap score (SBS) associated with each resampled tree. Secondary bootstrapping consists of the estimation of bootstrap scores of the trees inferred from resampled data. The LS- and SBS-based bootstrapping procedures were designed to take into account the quality of each "pseudo-replicated" phylogeny in the final tree estimation. A simulation study was carried out to evaluate the performance of the five weighting strategies, which are as follows: LS- and SBS-based bootstrapping, LS- and SBS-based bootstrapping with data normalization, and the traditional unweighted bootstrapping. Conclusions: The simulations conducted with two real data sets and the five weighting strategies suggest that the SBS-based bootstrapping with data normalization usually exhibits larger bootstrap scores and a higher robustness ...
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interests were nonnormal Likert-type and binary items.…
Energy Technology Data Exchange (ETDEWEB)
Chen, Zhaoting; Wang, Rong Hui; Chen, Li; Dong, Chung Uang [School of Civil Engineering and Transportation, South China University of Technology, Guangzhou (China)
2016-08-15
This article investigates the strongly nonlinear free vibration of stiffened plates, simply supported on four edges, with geometric imperfections. The von Karman nonlinear strain-displacement relationships are applied. The nonlinear vibration of the stiffened plate is reduced to a one-degree-of-freedom nonlinear system by assuming mode shapes. The multiple scales Lindstedt-Poincare method (MSLP) and the modified Lindstedt-Poincare method (MLP) are used to solve the governing equations of vibration. Numerical examples for stiffened plates with different initial geometric imperfections are presented in order to discuss their influence on the strongly nonlinear free vibration of the stiffened plate. The results showed that the frequency ratio decreased as the initial geometric imperfection of the plate increased, indicating that a larger initial geometric imperfection weakens the nonlinear effect; moreover, comparison with the results calculated by the MSLP method showed that using the MS method to study strongly nonlinear vibration can lead to serious mistakes.
Dynamic Mode Decomposition based on Bootstrapping Extended Kalman Filter Application to Noisy data
Nonomura, Taku; Shibata, Hisaichi; Takaki, Ryoji
2017-11-01
In this study, dynamic mode decomposition (DMD) based on a bootstrapping extended Kalman filter is proposed for time-series data. In this framework, the state variables (x and y) are filtered in addition to the parameter estimation (a_ij) conducted in conventional DMD and standard Kalman-filter-based DMD. Filtering the state variables enables highly accurate estimation of the system's eigenvalues under strong noise. The formulation, advantages, and disadvantages are discussed. This research is partially supported by PRESTO, JST (JPMJPR1678).
Bootstrapping N=3 superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena; Liendo, Pedro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Meneghelli, Carlo [Stony Brook Univ., Stony Brook, NY (United States). Simons Center for Geometry and Physics; Mitev, Vladimir [Mainz Univ. (Germany). PRISMA Cluster of Excellence
2016-12-15
We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.
Mobile first design : using Bootstrap
Bhusal, Bipin
2017-01-01
The aim of this project was to design and build a website for a company based in Australia. The business offers remedial massage therapy to its clients. It is a small business which works on the basis of calls and message reservation. The business currently has a temporary website designed with Wix, a cloud-based web development platform. The new website was built with responsive design using Bootstrap. This website was intended for the customers using mobile internet browsers. This design is...
More N =4 superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C.
2017-08-01
In this long overdue second installment, we continue to develop the conformal bootstrap program for N=4 superconformal field theories (SCFTs) in four dimensions via an analysis of the correlation function of four stress-tensor supermultiplets. We review analytic results for this correlator and make contact with the SCFT/chiral algebra correspondence of Beem et al. [Commun. Math. Phys. 336, 1359 (2015), 10.1007/s00220-014-2272-x]. We demonstrate that the constraints of unitarity and crossing symmetry require the central charge c to be greater than or equal to 3/4 in any interacting N=4 SCFT. We apply numerical bootstrap methods to derive upper bounds on scaling dimensions and operator product expansion coefficients for several low-lying, unprotected operators as a function of the central charge. We interpret our bounds in the context of N=4 super Yang-Mills theories, formulating a series of conjectures regarding the embedding of the conformal manifold, parametrized by the complexified gauge coupling, into the space of scaling dimensions and operator product expansion coefficients. Our conjectures assign a distinguished role to points on the conformal manifold that are self-dual under a subgroup of the S-duality group. This paper contains a more detailed exposition of a number of results previously reported in Beem et al. [Phys. Rev. Lett. 111, 071601 (2013), 10.1103/PhysRevLett.111.071601] in addition to new results.
Greenblatt, Richard L.; And Others
1992-01-01
The diagnostic accuracy of nonadjusted and bootstrapped diagnosis was compared using a sample of 1,455 psychiatric patients who completed the Millon Clinical Multiaxial Inventory. The usefulness of bootstrapping depended on the criteria for accuracy. Conditions under which bootstrapping might increase diagnostic accuracy are detailed. (SLD)
Modified bootstrap consistency rates for U-quantiles
JANSSEN, Paul; SWANEPOEL, Jan; VERAVERBEKE, Noel
2001-01-01
We show that, compared to the classical bootstrap, the modified bootstrap provides faster consistency rates for the bootstrap distribution of U-quantiles. This shows that the modified bootstrap is useful, not only in cases where the classical bootstrap fails, but also in situations where it is valid.
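A common form of modified bootstrap is the m-out-of-n bootstrap, which draws m < n observations per replicate instead of the full sample size. A hedged Python sketch of that idea applied to a sample quantile (the choice m = n^(2/3) and the toy data are illustrative assumptions, not taken from the paper):

```python
import random

def m_out_of_n_bootstrap(data, stat, m, n_boot=1000, seed=2):
    """Modified (m-out-of-n) bootstrap: resample m < n observations with
    replacement and recompute the statistic on each smaller resample."""
    rng = random.Random(seed)
    n = len(data)
    return [stat([data[rng.randrange(n)] for _ in range(m)])
            for _ in range(n_boot)]

median = lambda xs: sorted(xs)[len(xs) // 2]
data = list(range(100))             # toy sample
m = int(len(data) ** (2 / 3))       # a typical m = n^(2/3) choice (assumption)
reps = m_out_of_n_bootstrap(data, median, m)
print(min(reps), max(reps))
```

The classical bootstrap is recovered by setting m = n; the point of the paper is that suitable smaller m can improve the consistency rate for quantile-type statistics.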
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
2012-01-01
The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics, the J-statistic for overidentifying
Comparing bootstrap and posterior probability values in the four-taxon case.
Cummings, Michael P; Handley, Scott A; Myers, Daniel S; Reed, David L; Rokas, Antonis; Winka, Katarina
2003-08-01
Assessment of the reliability of a given phylogenetic hypothesis is an important step in phylogenetic analysis. Historically, the nonparametric bootstrap procedure has been the most frequently used method for assessing the support for specific phylogenetic relationships. The recent employment of Bayesian methods for phylogenetic inference problems has resulted in clade support being expressed in terms of posterior probabilities. We used simulated data and the four-taxon case to explore the relationship between nonparametric bootstrap values (as inferred by maximum likelihood) and posterior probabilities (as inferred by Bayesian analysis). The results suggest a complex association between the two measures. Three general regions of tree space can be identified: (1) the neutral zone, where differences between mean bootstrap and mean posterior probability values are not significant, (2) near the two-branch corner, and (3) deep in the two-branch corner. In the last two regions, significant differences occur between mean bootstrap and mean posterior probability values. Whether bootstrap or posterior probability values are higher depends on the data in support of alternative topologies. Examination of star topologies revealed that both bootstrap and posterior probability values differ significantly from theoretical expectations; in particular, there are more posterior probability values in the range 0.85-1 than expected by theory. Therefore, our results corroborate the findings of others that posterior probability values are excessively high. Our results also suggest that extrapolations from single topology branch-length studies are unlikely to provide any general conclusions regarding the relationship between bootstrap and posterior probability values.
How to Bootstrap Anonymous Communication
DEFF Research Database (Denmark)
Jakobsen, Sune K.; Orlandi, Claudio
2015-01-01
…formal study in this direction. To solve this problem, we introduce the concept of anonymous steganography: think of a leaker Lea who wants to leak a large document to Joe the journalist. Using anonymous steganography, Lea can embed this document in innocent-looking communication on some popular website (such as cat videos on YouTube or funny memes on 9GAG). Then Lea provides Joe with a short key k which, when applied to the entire website, recovers the document while hiding the identity of Lea among the large number of users of the website. Our contributions include: introducing and formally defining anonymous steganography; a construction showing that anonymous steganography is possible (which uses recent results in circuit obfuscation); and a lower bound on the number of bits needed to bootstrap anonymous communication.
Inverse bootstrapping conformal field theories
Li, Wenliang
2018-01-01
We propose a novel approach to study conformal field theories (CFTs) in general dimensions. In the conformal bootstrap program, one usually searches for consistent CFT data that satisfy crossing symmetry. In the new method, we reverse the logic and interpret manifestly crossing-symmetric functions as generating functions of conformal data. Physical CFTs can be obtained by scanning the space of crossing-symmetric functions. By truncating the fusion rules, we are able to concentrate on the low-lying operators and derive some approximate relations for their conformal data. It turns out that the free scalar theory, the 2d minimal model CFTs, the ϕ^4 Wilson-Fisher CFT, the Lee-Yang CFTs and the Ising CFTs are consistent with the universal relations from the minimal fusion rule ϕ_1 × ϕ_1 = I + ϕ_2 + T, where ϕ_1, ϕ_2 are scalar operators, I is the identity operator and T is the stress tensor.
Bootstrapping SCFTs with Four Supercharges
Bobev, Nikolay; Mazac, Dalimil; Paulos, Miguel F
2015-01-01
We study the constraints imposed by superconformal symmetry, crossing symmetry, and unitarity for theories with four supercharges in spacetime dimension 2≤d≤4. We show how superconformal algebras with four Poincaré supercharges can be treated in a formalism applicable to any, in principle continuous, value of d and use this to construct the superconformal blocks for any d≤4. We then use numerical bootstrap techniques to derive upper bounds on the conformal dimension of the first unprotected operator appearing in the OPE of a chiral and an anti-chiral superconformal primary. We obtain an intriguing structure of three distinct kinks. We argue that one of the kinks smoothly interpolates between the d=2, N=(2,2) minimal model with central charge c=1 and the theory of a free chiral multiplet in d=4, passing through the critical Wess-Zumino model with cubic superpotential in intermediate dimensions.
Bootstrapping SCFTs with four supercharges
International Nuclear Information System (INIS)
Bobev, Nikolay; El-Showk, Sheer; Mazáč, Dalimil; Paulos, Miguel F.
2015-01-01
We study the constraints imposed by superconformal symmetry, crossing symmetry, and unitarity for theories with four supercharges in spacetime dimension 2≤d≤4. We show how superconformal algebras with four Poincaré supercharges can be treated in a formalism applicable to any, in principle continuous, value of d and use this to construct the superconformal blocks for any d≤4. We then use numerical bootstrap techniques to derive upper bounds on the conformal dimension of the first unprotected operator appearing in the OPE of a chiral and an anti-chiral superconformal primary. We obtain an intriguing structure of three distinct kinks. We argue that one of the kinks smoothly interpolates between the d=2, N=(2,2) minimal model with central charge c=1 and the theory of a free chiral multiplet in d=4, passing through the critical Wess-Zumino model with cubic superpotential in intermediate dimensions.
Better Confidence Intervals: The Double Bootstrap with No Pivot
David Letson; B.D. McCullough
1998-01-01
The double bootstrap is an important advance in confidence interval generation because it converges faster than the already popular single bootstrap. Yet the usual double bootstrap requires a stable pivot that is not always available, e.g., when estimating flexibilities or substitution elasticities. A recently developed double bootstrap does not require a pivot. A Monte Carlo analysis with the Waugh data finds the double bootstrap achieves nominal coverage whereas the single bootstrap does no...
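The core double-bootstrap idea, using an inner bootstrap level to calibrate the outer interval, can be sketched as follows. This is a simplified illustration in Python, not the authors' pivotless procedure: it only estimates the true coverage of a nominal percentile interval, which one would then use to adjust the nominal level until the estimated coverage matches the target.

```python
import random
import statistics

def estimated_coverage(data, stat, alpha=0.05, B1=100, B2=100, seed=3):
    """Double-bootstrap coverage estimate: on each outer resample, build a
    nominal (1 - alpha) inner percentile interval; the fraction of those
    intervals that cover the full-sample estimate approximates the
    interval's true coverage (a simplified calibration step; assumption)."""
    rng = random.Random(seed)
    n = len(data)
    theta = stat(data)
    draw = lambda xs: [xs[rng.randrange(n)] for _ in range(n)]
    hits = 0
    for _ in range(B1):
        outer = draw(data)
        inner = sorted(stat(draw(outer)) for _ in range(B2))
        lo = inner[int(alpha / 2 * B2)]
        hi = inner[int((1 - alpha / 2) * B2) - 1]
        hits += lo <= theta <= hi
    return hits / B1

data = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.9, 3.6, 4.5, 4.0, 4.8]  # toy sample
cov = estimated_coverage(data, statistics.mean)
print(cov)
```

B1 and B2 are kept small here for illustration; real applications need far more replicates, which is why the double bootstrap is computationally expensive.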
JuliBootS: a hands-on guide to the conformal bootstrap
Paulos, Miguel F
2014-01-01
We introduce {\\tt JuliBootS}, a package for numerical conformal bootstrap computations coded in {\\tt Julia}. The centre-piece of {\\tt JuliBootS} is an implementation of Dantzig's simplex method capable of handling arbitrary precision linear programming problems with continuous search spaces. Current supported features include conformal dimension bounds, OPE bounds, and bootstrap with or without global symmetries. The code is trivially parallelizable on one or multiple machines. We exemplify usage extensively with several real-world applications. In passing we give a pedagogical introduction to the numerical bootstrap methods.
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 70's, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information in strongly interacting theories, such as the 3D Ising model, the O(N) vector model, and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world-record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
Strong mitochondrial DNA support for a Cretaceous origin of modern avian lineages
Directory of Open Access Journals (Sweden)
Sorenson Michael D
2008-01-01
…speciation events or the K-Pg boundary that could systematically mislead inferences from genetic data. Conclusion: The 'rock-clock' gap has been interpreted by some to be a result of the vagaries of molecular genetic divergence time estimates. However, despite measures to explore different forms of uncertainty in several key parameters, we fail to reconcile molecular genetic divergence time estimates with dates taken from the fossil record; instead, we find strong support for an ancient origin of modern bird lineages, with many extant orders and families arising in the mid-Cretaceous, consistent with previous molecular estimates. Although there is ample room for improvement on both sides of the 'rock-clock' divide (e.g. accounting for 'ghost' lineages in the fossil record and developing more realistic models of rate evolution for molecular genetic sequences), the consistent and conspicuous disagreement between these two sources of data more likely reflects a genuine difference between estimated ages of (i) stem-group origins and (ii) crown-group morphological diversifications, respectively. Further progress on this problem will benefit from greater communication between paleontologists and molecular phylogeneticists in accounting for error in avian lineage age estimates.
Definition of total bootstrap current in tokamaks
International Nuclear Information System (INIS)
Ross, D.W.
1995-01-01
Alternative definitions of the total bootstrap current are compared. An analogous comparison is given for the ohmic and auxiliary currents. It is argued that different definitions than those usually employed lead to simpler analyses of tokamak operating scenarios
Moving Block Bootstrap for Analyzing Longitudinal Data.
Ju, Hyunsu
In a longitudinal study subjects are followed over time. I focus on a case where the number of replications over time is large relative to the number of subjects in the study. I investigate the use of moving block bootstrap methods for analyzing such data. Asymptotic properties of the bootstrap methods in this setting are derived. The effectiveness of these resampling methods is also demonstrated through a simulation study.
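The moving block bootstrap itself is straightforward to sketch: overlapping blocks of consecutive observations are resampled and concatenated, preserving short-range dependence that i.i.d. resampling would destroy. A minimal Python illustration on a toy autocorrelated series (the AR(1)-style data and block length are assumptions for illustration, not the paper's setup):

```python
import random
import statistics

def moving_block_bootstrap(series, block_len, n_boot=500, seed=4):
    """Moving block bootstrap: form all overlapping blocks of consecutive
    observations, then build each replicate by concatenating randomly
    chosen blocks until the original series length is reached."""
    rng = random.Random(seed)
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    reps = []
    for _ in range(n_boot):
        sample = []
        while len(sample) < n:
            sample.extend(blocks[rng.randrange(len(blocks))])
        reps.append(statistics.mean(sample[:n]))  # statistic of interest
    return reps

# Toy autocorrelated series (AR(1)-style; an illustrative assumption).
gen = random.Random(0)
x, series = 0.0, []
for _ in range(200):
    x = 0.6 * x + gen.gauss(0, 1)
    series.append(x)

reps = moving_block_bootstrap(series, block_len=10)
print(round(statistics.stdev(reps), 3))  # bootstrap standard error of the mean
```

The block length trades off dependence preservation (longer blocks) against the number of distinct blocks available (shorter blocks).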
A bootstrap lunar base: Preliminary design review 2
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
Prior Pronunciation Knowledge Bootstraps Word Learning
Directory of Open Access Journals (Sweden)
Khia Anne Johnson
2018-02-01
Learners often struggle with L2 sounds, yet little is known about the role of prior pronunciation knowledge and explicit articulatory training in language acquisition. This study asks if existing pronunciation knowledge can bootstrap word learning, and whether short-term audiovisual articulatory training for tongue position with and without a production component has an effect on lexical retention. Participants were trained and tested on stimuli with perceptually salient segments that are challenging to produce. Results indicate that pronunciation knowledge plays an important role in word learning. While much about the extent and shape of this role remains unclear, this study sheds light in three main areas. First, prior pronunciation knowledge leads to increased accuracy in word learning, as all groups trended toward lower accuracy on pseudowords with two novel segments, when compared with those with one or none. Second, all training and control conditions followed similar patterns, with training neither aiding nor inhibiting retention; this is a noteworthy result as previous work has found that the inclusion of production in training leads to decreased performance when testing for retention. Finally, higher production accuracy during practice led to higher retention after the word-learning task, indicating that individual differences and successful training are potentially important indicators of retention. This study provides support for the claim that pronunciation matters in L2 word learning.
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Roberto S. Flowers-Cano; Ruperto Ortiz-Gómez; Jesús Enrique León-Jiménez; Raúl López Rivera; Luis A. Perera Cruz
2018-01-01
Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distributi...
Analyzing large datasets with bootstrap penalization.
Fang, Kuangnan; Ma, Shuangge
2017-03-01
Data with a large p (number of covariates) and/or a large n (sample size) are now commonly encountered. For many problems, regularization especially penalization is adopted for estimation and variable selection. The straightforward application of penalization to large datasets demands a "big computer" with high computational power. To improve computational feasibility, we develop bootstrap penalization, which dissects a big penalized estimation into a set of small ones, which can be executed in a highly parallel manner and each only demands a "small computer". The proposed approach takes different strategies for data with different characteristics. For data with a large p but a small to moderate n, covariates are first clustered into relatively homogeneous blocks. The proposed approach consists of two sequential steps. In each step and for each bootstrap sample, we select blocks of covariates and run penalization. The results from multiple bootstrap samples are pooled to generate the final estimate. For data with a large n but a small to moderate p, we bootstrap a small number of subjects, apply penalized estimation, and then conduct a weighted average over multiple bootstrap samples. For data with a large p and a large n, the natural marriage of the previous two methods is applied. Numerical studies, including simulations and data analysis, show that the proposed approach has computational and numerical advantages over the straightforward application of penalization. An R package has been developed to implement the proposed methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
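The large-n strategy described above (bootstrap a small subsample, fit a penalized estimate, then average) can be sketched as follows. This is a hedged illustration, not the authors' exact method: the "penalized estimate" here is a soft-thresholded simple-regression slope standing in for a full lasso fit, and the data and tuning values are assumptions.

```python
import random

def soft_threshold(b, lam):
    """Lasso-style shrinkage of a coefficient toward zero."""
    return (abs(b) - lam) * (1 if b > 0 else -1) if abs(b) > lam else 0.0

def bootstrap_penalized_slope(xs, ys, n_small=50, lam=0.05, n_boot=100, seed=5):
    """Large-n bootstrap penalization sketch: repeatedly bootstrap a small
    subsample, compute a penalized estimate on it, and average the results.
    Each small fit is cheap, and the fits are embarrassingly parallel."""
    rng = random.Random(seed)
    n = len(xs)
    est = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n_small)]
        sx = [xs[i] for i in idx]
        sy = [ys[i] for i in idx]
        mx = sum(sx) / n_small
        my = sum(sy) / n_small
        sxx = sum((x - mx) ** 2 for x in sx)
        sxy = sum((x - mx) * (y - my) for x, y in zip(sx, sy))
        est.append(soft_threshold(sxy / sxx, lam))
    return sum(est) / n_boot   # simple (unweighted) average for the sketch

# Simulated large-n data with true slope 2.0 (an illustrative assumption).
gen = random.Random(1)
xs = [gen.gauss(0, 1) for _ in range(5000)]
ys = [2.0 * x + gen.gauss(0, 1) for x in xs]
slope = bootstrap_penalized_slope(xs, ys)
print(round(slope, 2))
```

The paper's weighted average over replicates, and the block-of-covariates strategy for large p, follow the same pattern with a more elaborate per-replicate fit.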
Czech Academy of Sciences Publication Activity Database
Hanková, Libuše; Holub, Ladislav; Jeřábek, Karel
2006-01-01
Vol. 66, No. 6 (2006), pp. 592-598 ISSN 1381-5148 R&D Projects: GA ČR(CZ) GA104/02/1104 Institutional research plan: CEZ:AV0Z40720504 Keywords : catalysis * polymer supports * resin Subject RIV: CI - Industrial Chemistry, Chemical Engineering Impact factor: 1.561, year: 2006
Cohen-Adad, Julien; Descoteaux, Maxime; Wald, Lawrence L
2011-05-01
To develop a bootstrap method to assess the quality of High Angular Resolution Diffusion Imaging (HARDI) data using Q-Ball imaging (QBI) reconstruction. HARDI data were re-shuffled using regular bootstrap with jackknife sampling. For each bootstrap dataset, the diffusion orientation distribution function (ODF) was estimated voxel-wise using QBI reconstruction based on spherical harmonics functions. The reproducibility of the ODF was assessed using the Jensen-Shannon divergence (JSD) and the angular confidence interval was derived for the first and the second ODF maxima. The sensitivity of the bootstrap method was evaluated on a human subject by adding synthetic noise to the data, by acquiring a map of image signal-to-noise ratio (SNR) and by varying the echo time and the b-value. The JSD was directly linked to the image SNR. The impact of echo times and b-values was reflected by both the JSD and the angular confidence interval, proving the usefulness of the bootstrap method to evaluate specific features of HARDI data. The bootstrap method can effectively assess the quality of HARDI data and can be used to evaluate new hardware and pulse sequences, perform multifiber probabilistic tractography, and provide reliability metrics to support clinical studies. Copyright © 2011 Wiley-Liss, Inc.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the areas of the bootstrap, the jackknife, and Monte Carlo tests. Addressing both the novice and the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in disciplines such as biometry, epidemiology, computer science, economics, and sociology, but also theoretical researchers, should consult the book to be informed of the state of the art in this area.
Early Stop Criterion from the Bootstrap Ensemble
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan; Fog, Torben L.
1997-01-01
This paper addresses the problem of generalization error estimation in neural networks. A new early-stop criterion based on a bootstrap estimate of the generalization error is suggested. The estimate does not require the network to be trained to the minimum of the cost function, as required by other methods based on asymptotic theory. Moreover, in contrast to methods based on cross-validation, which require data to be left out for testing and thus bias the estimate, the bootstrap technique does not have this disadvantage. The potential of the suggested technique is demonstrated on various time…
Bayesian inference and the parametric bootstrap
Efron, Bradley
2013-01-01
The parametric bootstrap can be used for the efficient computation of Bayes posterior distributions. Importance sampling formulas take on an easy form relating to the deviance in exponential families, and are particularly simple starting from Jeffreys invariant prior. Because of the i.i.d. nature of bootstrap sampling, familiar formulas describe the computational accuracy of the Bayes estimates. Besides computational methods, the theory provides a connection between Bayesian and frequentist analysis. Efficient algorithms for the frequentist accuracy of Bayesian inferences are developed and demonstrated in a model selection example. PMID:23843930
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker (1989). In many cases validity of bootstrap-based inference procedures is found to depend crucially on whether the bandwidth sequence satisfies a particular (asymptotic linearity) condition. An exception to this rule occurs for inference procedures involving a studentized estimator employing a "robust…
Bootstrap percolation: a renormalisation group approach
International Nuclear Information System (INIS)
Branco, N.S.; Santos, Raimundo R. dos; Queiroz, S.L.A. de.
1984-02-01
In bootstrap percolation, sites are occupied at random with probability p, but each site is considered active only if at least m of its neighbours are also active. Within an approximate position-space renormalization group framework on a square lattice we obtain the behaviour of the critical concentration p_c and of the critical exponents ν and β for m = 0 (ordinary percolation), 1, 2 and 3. We find that the bootstrap percolation problem can be cast into different universality classes, characterized by the values of m. (author)
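The model described above is easy to simulate directly. A minimal Python sketch of (culling) bootstrap percolation on a square lattice, where occupied sites with fewer than m active nearest neighbours are repeatedly deactivated until the configuration is stable; the lattice size, p, and m below are arbitrary illustrative choices:

```python
import random

def bootstrap_percolation(L, p, m, seed=6):
    """Bootstrap percolation on an L x L square lattice: occupy sites with
    probability p, then iteratively remove active sites having fewer than
    m active nearest neighbours until a fixed point is reached.
    m = 0 reduces to ordinary site percolation (nothing is removed)."""
    rng = random.Random(seed)
    active = {(i, j) for i in range(L) for j in range(L) if rng.random() < p}
    changed = True
    while changed:
        changed = False
        for (i, j) in list(active):
            nbrs = sum((i + di, j + dj) in active
                       for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            if nbrs < m:
                active.remove((i, j))
                changed = True
    return active

final = bootstrap_percolation(L=40, p=0.7, m=2)
print(len(final))
```

The final configuration is order-independent: it is the unique maximal subset of the initial occupation in which every site has at least m active neighbours.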
Pierre Ambroise-Thomas: a loyal friend and a strong supporter of tropical medicine in Brazil
Directory of Open Access Journals (Sweden)
Cláudio Tadeu Daniel-Ribeiro
Our colleagues at the Sociedade Brasileira de Medicina Tropical have been informed of the demise of Professor Pierre Ambroise-Thomas (1937-2014). However, considering that the tribute we paid to him in 2015 - at the 20th anniversary of the Seminário Laveran & Deane sobre Malária - is equally true today, it is worth sharing it with the readers of the RSBMT, in recognition of his many virtues. Pierre Ambroise-Thomas (MD in 1963 and DSc in 1969) was Honorary Professor of Parasitology and Tropical Medicine at the Faculté de Médecine de Grenoble (France), Honorary President of the Académie Nationale de Médecine, member of the Académie Nationale de Pharmacie and Officier dans l'Ordre de la Légion d'Honneur. In addition to his important contributions to tropical medicine and parasitology, working in France during his long and productive career (50 years and 300 publications), Ambroise-Thomas became an admirer and supporter of Brazilian activities related to research, teaching and information in tropical medicine.
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
A Bootstrap Procedure of Propensity Score Estimation
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
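One way to read "bootstrap propensity score" is as averaging model-based propensity estimates over bootstrap refits. The following is a hedged illustration, not the author's exact procedure: a one-covariate logistic model fit by plain gradient descent on each bootstrap sample, with simulated data (all names and tuning values are assumptions):

```python
import random
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def bootstrap_propensity_scores(X, z, n_boot=20, epochs=150, lr=0.5, seed=7):
    """Average predicted propensity scores over bootstrap refits of a
    one-covariate logistic model (fit by full-batch gradient descent;
    an illustrative stand-in for the estimation step before matching)."""
    rng = random.Random(seed)
    n = len(X)
    avg = [0.0] * n
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        a = b = 0.0                                  # intercept, slope
        for _ in range(epochs):
            ga = gb = 0.0
            for i in idx:
                err = sigmoid(a + b * X[i]) - z[i]   # gradient of log-loss
                ga += err
                gb += err * X[i]
            a -= lr * ga / n
            b -= lr * gb / n
        for i in range(n):
            avg[i] += sigmoid(a + b * X[i]) / n_boot
    return avg

# Simulated data: treatment probability increases with the covariate.
gen = random.Random(1)
X = [gen.gauss(0, 1) for _ in range(120)]
z = [1 if gen.random() < sigmoid(1.5 * x) else 0 for x in X]
scores = bootstrap_propensity_scores(X, z)
```

The averaged scores would then feed into nearest neighbor or caliper matching in place of single-fit propensity scores.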
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Bootstrapping Kernel-Based Semiparametric Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael
…by accommodating a non-negligible bias. A noteworthy feature of the assumptions under which the result is obtained is that reliance on a commonly employed stochastic equicontinuity condition is avoided. The second main result shows that the bootstrap provides an automatic method of correcting for the bias even when it is non-negligible.
Climate time series analysis classical statistical and bootstrap methods
Mudelsee, Manfred
2010-01-01
This book presents bootstrap resampling as a computationally intensive method able to meet the challenges posed by the complexities of analysing climate data. It shows how the bootstrap performs reliably in the most important statistical estimation techniques.
Efficient generation of pronunciation dictionaries: human factors during bootstrapping
CSIR Research Space (South Africa)
Davel, MH
2004-10-01
Bootstrapping techniques have significant potential for the efficient generation of linguistic resources such as electronic pronunciation dictionaries. The authors describe a system and an approach to bootstrapping for the development...
Bootstrapping Relational Affordances of Object Pairs using Transfer
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert
2018-01-01
leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new...
Higher Order Bootstrap Likelihood
African Journals Online (AJOL)
Ogbonmwam
In this work, higher order optimal window width is used to generate bootstrap kernel density likelihood. A simulated study is conducted to compare the distributions of the higher order bootstrap likelihoods with the exact (empirical) bootstrap likelihood. Our results indicate that the optimal window width of orders 2 and 4 ...
The use of the bootstrap in the analysis of case-control studies with missing data
DEFF Research Database (Denmark)
Siersma, Volkert Dirk; Johansen, Christoffer
2004-01-01
nonparametric bootstrap, bootstrap confidence intervals, missing values, multiple imputation, matched case-control study
Murukesan, Gayathri; Leino, Hannu; Mäenpää, Pirkko; Ståhle, Kurt; Raksajit, Wuttinun; Lehto, Harry J.; Allahverdiyeva-Rinne, Yagut; Lehto, Kirsi
2016-03-01
Survival of crews during future missions to Mars will depend on reliable and adequate supplies of essential life support materials, i.e. oxygen, food, clean water, and fuel. The most economical and sustainable (and, in the long term, the only viable) way to provide these supplies on Martian bases is via bio-regenerative systems, using local resources to drive oxygenic photosynthesis. Selected cyanobacteria, grown in adequately protective containment, could serve as pioneer species to produce life-sustaining substrates for higher organisms. The very high (95.3 %) CO2 content in the Martian atmosphere would provide an abundant carbon source for photo-assimilation, but nitrogen would be a strongly limiting substrate for bio-assimilation in this environment and would need to be supplemented by nitrogen fertilization. This very high supply of carbon, combined with a rate-limiting supply of nitrogen, strongly affects the growth and the metabolic pathways of photosynthetic organisms. Here we show that a modified, Martian-like atmospheric composition (nearly 100 % CO2) under various low-pressure conditions (from 50 mbar, to maintain liquid water, up to 200 mbar) supports strong cellular growth. Under a high CO2 / low N2 ratio, the filamentous cyanobacteria produce significant amounts of H2 in the light, owing to differentiation of a large number of heterocysts.
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship between the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a sensitivity distinct from that of these parameters. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
Aspect Ratio Scaling of Ideal No-wall Stability Limits in High Bootstrap Fraction Tokamak Plasmas
International Nuclear Information System (INIS)
Menard, J.E.; Bell, M.G.; Bell, R.E.; Gates, D.A.; Kaye, S.M.; LeBlanc, B.P.; Maingi, R.; Sabbagh, S.A.; Soukhanovskii, V.; Stutman, D.
2003-01-01
Recent experiments in the low aspect ratio National Spherical Torus Experiment (NSTX) [M. Ono et al., Nucl. Fusion 40 (2000) 557] have achieved normalized beta values twice the conventional tokamak limit at low internal inductance and with significant bootstrap current. These experimental results have motivated a computational re-examination of the plasma aspect ratio dependence of ideal no-wall magnetohydrodynamic stability limits. These calculations find that the profile-optimized no-wall stability limit in high bootstrap fraction regimes is well described by a nearly aspect ratio invariant normalized beta parameter utilizing the total magnetic field energy density inside the plasma. However, the scaling of normalized beta with internal inductance is found to be strongly aspect ratio dependent at sufficiently low aspect ratio. These calculations and detailed stability analyses of experimental equilibria indicate that the nonrotating plasma no-wall stability limit has been exceeded by as much as 30% in NSTX in a high bootstrap fraction regime
A Bootstrap Approach to an Affordable Exploration Program
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has had a history of flight programs that have high development and operational costs. Since Apollo, human exploration has had very constrained budgets and they are expected to be constrained in the future. Due to their high operations costs it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics cost. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of resources available. The approach involves the development and deployment of a core suite of technologies that have minimum initial needs yet are able to expand upon initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows and becomes self sustaining and eventually begins producing the energy, products and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and
Bootstrapping a change-point Cox model for survival data
Xu, Gongjun; Sen, Bodhisattva; Ying, Zhiliang
2014-01-01
This paper investigates the (in)-consistency of various bootstrap methods for making inference on a change-point in time in the Cox model with right censored survival data. A criterion is established for the consistency of any bootstrap method. It is shown that the usual nonparametric bootstrap is inconsistent for the maximum partial likelihood estimation of the change-point. A new model-based bootstrap approach is proposed and its consistency established. Simulation studies are carried out to assess the performance of various bootstrap schemes. PMID:25400719
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
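The cluster-level resampling scheme this abstract describes can be sketched as follows. This is a minimal illustration with hypothetical data and function names; a real GEE analysis would refit the regression model on each resample rather than recompute a simple mean:

```python
import numpy as np

def cluster_bootstrap(data, cluster_ids, n_boot=1000, stat=np.mean, seed=0):
    """Resample whole clusters with replacement and recompute a statistic.

    Resampling clusters, rather than individual rows, preserves the
    within-cluster dependence of clustered/longitudinal data.
    """
    rng = np.random.default_rng(seed)
    clusters = np.unique(cluster_ids)
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        chosen = rng.choice(clusters, size=len(clusters), replace=True)
        # concatenate all rows of every chosen cluster (duplicates allowed)
        rows = np.concatenate([data[cluster_ids == c] for c in chosen])
        estimates[b] = stat(rows)
    return estimates

# toy example: 5 clusters of 4 correlated observations each
rng = np.random.default_rng(1)
cluster_ids = np.repeat(np.arange(5), 4)
cluster_effects = np.repeat(rng.normal(size=5), 4)   # shared random effect
data = cluster_effects + rng.normal(scale=0.5, size=20)
boot = cluster_bootstrap(data, cluster_ids, n_boot=500)
ci = np.percentile(boot, [2.5, 97.5])                # percentile bootstrap CI
```

The resulting interval reflects between-cluster variability, which an observation-level bootstrap would understate.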
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
Full Text Available The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with huge participation of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to the energetic planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. In this paper we show the problems in fitting these models by the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. This is followed by a proposal of a new approach to set both the model order and the parameter estimation of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to a better use of water resources in the energy operation planning.
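The idea of using the bootstrap to put confidence intervals on autoregressive parameters can be sketched as follows, simplified from the monthly PAR(p) setting to a single AR(1) series; the data and function names are hypothetical:

```python
import numpy as np

def fit_ar1(x):
    """Least-squares estimate of the AR(1) coefficient (mean removed)."""
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])

def ar1_residual_bootstrap(x, n_boot=1000, seed=0):
    """Residual bootstrap confidence interval for an AR(1) parameter."""
    rng = np.random.default_rng(seed)
    phi = fit_ar1(x)
    xc = x - x.mean()
    resid = xc[1:] - phi * xc[:-1]
    resid = resid - resid.mean()
    phis = np.empty(n_boot)
    for b in range(n_boot):
        # rebuild a synthetic series from resampled residuals
        e = rng.choice(resid, size=len(x), replace=True)
        xb = np.empty(len(x))
        xb[0] = xc[0]
        for t in range(1, len(x)):
            xb[t] = phi * xb[t - 1] + e[t]
        phis[b] = fit_ar1(xb)
    return phi, np.percentile(phis, [2.5, 97.5])

# simulate a toy inflow-like series with true coefficient 0.7
rng = np.random.default_rng(3)
n = 200
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()
phi_hat, ci = ar1_residual_bootstrap(x, n_boot=500)
```

In the PAR(p) case the same loop runs once per calendar month, and the bootstrap intervals can also guide the choice of order p by checking whether higher-order coefficients are distinguishable from zero.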
Bootstrap for the case-cohort design.
Huang, Yijian
2014-06-01
The case-cohort design facilitates economical investigation of risk factors in a large survival study, with covariate data collected only from the cases and a simple random subset of the full cohort. Methods that accommodate the design have been developed for various semiparametric models, but most inference procedures are based on asymptotic distribution theory. Such inference can be cumbersome to derive and implement, and does not permit confidence band construction. While bootstrap is an obvious alternative, how to resample is unclear because of complications from the two-stage sampling design. We establish an equivalent sampling scheme, and propose a novel and versatile nonparametric bootstrap for robust inference with an appealingly simple single-stage resampling. Theoretical justification and numerical assessment are provided for a number of procedures under the proportional hazards model.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
A Bootstrap Test for Conditional Symmetry
Liangjun Su; Sainan Jin
2005-01-01
This paper proposes a simple consistent nonparametric test of conditional symmetry based on the principle of characteristic functions. The test statistic is shown to be asymptotically normal under the null hypothesis of conditional symmetry and consistent against any conditional asymmetric distributions. We also study the power against local alternatives, propose a bootstrap version of the test, and conduct a small Monte Carlo simulation to evaluate the finite-sample performance of the test.
Estimasi Regresi Wavelet Thresholding Dengan Metode Bootstrap
Suparti, Suparti; Mustofa, Achmad; Rusgiyono, Agus
2007-01-01
A wavelet is a function with certain characteristic properties; for example, it oscillates about zero, is localized in the time and frequency domains, and constructs orthogonal bases in the L2(R) space. One application of wavelets is to estimate a nonparametric regression function. There are two kinds of wavelet estimators, i.e., linear and nonlinear wavelet estimators. The nonlinear wavelet estimator is called a thresholding wavelet estimator. The application of the bootstrap method...
DEFF Research Database (Denmark)
Davis, Jerrold I.; Stevenson, Dennis W.; Petersen, Gitte
2004-01-01
elements of Xyridaceae. A comparison was conducted of jackknife and bootstrap values, as computed using strict-consensus (SC) and frequency-within-replicates (FWR) approaches. Jackknife values tend to be higher than bootstrap values, and for each of these methods support values obtained with the FWR...
Bootstrap inference when using multiple imputation.
Schomaker, Michael; Heumann, Christian
2018-04-16
Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains however unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
Maximum non-extensive entropy block bootstrap for non-stationary processes
Czech Academy of Sciences Publication Activity Database
Bergamelli, M.; Novotný, Jan; Urga, G.
2015-01-01
Roč. 91, 1/2 (2015), s. 115-139 ISSN 0001-771X R&D Projects: GA ČR(CZ) GA14-27047S Institutional support: RVO:67985998 Keywords : maximum entropy * bootstrap * Monte Carlo simulations Subject RIV: AH - Economics
Abrupt change in mean using block bootstrap and avoiding variance estimation
Czech Academy of Sciences Publication Activity Database
Peštová, Barbora; Pešta, M.
2018-01-01
Roč. 33, č. 1 (2018), s. 413-441 ISSN 0943-4062 Grant - others:GA ČR(CZ) GJ15-04774Y Institutional support: RVO:67985807 Keywords : Block bootstrap * Change in mean * Change point * Hypothesis testing * Ratio type statistics * Robustness Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.434, year: 2016
Neoclassical tearing dynamo and self-sustainment of a bootstrapped tokamak
International Nuclear Information System (INIS)
Bhattacharjee, A.; Yuan, Y.
1993-01-01
It has been suggested by Boozer that a completely bootstrapped tokamak which requires no seed current is possible due to the "dynamo effect" caused by tearing modes. Numerical calculations have been carried out by Weening and Boozer confirming the feasibility of a completely bootstrapped tokamak. These calculations use the resistive MHD model, with the pressure profile held arbitrarily fixed. Several questions naturally arise. Is resistive MHD a good model in the low-collisionality regime of present-day tokamaks in which large bootstrap currents have been observed? Is it consistent to rely on pressure gradients to provide the bootstrap current, but then omit pressure gradients in investigating the tearing instabilities that provide the dynamo effect? And how realistic is it to assume that a strong pressure gradient is sustainable in the central region where current relaxation is expected to produce a dynamo effect? In this paper, the authors investigate the dynamo effect in a bootstrapped tokamak within the framework of the neoclassical MHD model, which is more realistic than resistive MHD for the regime in question. Since neoclassical MHD includes trapped-particle effects, it can, in principle, provide an additional mechanism for exciting tearing modes, which are known to be stabilized by temperature gradients. They investigate the properties of the dynamo field ε and find that the original definition ε = ⟨v₁ × b₁⟩ used in incompressible resistive MHD is no longer adequate; neoclassical MHD forces a redefinition of ε due to the requirements imposed by the helicity conservation constraint. Thus a completely steady-state bootstrapped tokamak sustained by a neoclassical tearing dynamo is realizable. However, they are pessimistic that such a tokamak, even if it were resistively stable, would be stable to ideal kink modes
Phylogenomics provides strong evidence for relationships of butterflies and moths.
Kawahara, Akito Y; Breinholt, Jesse W
2014-08-07
Butterflies and moths constitute some of the most popular and charismatic insects. Lepidoptera include approximately 160 000 described species, many of which are important model organisms. Previous studies on the evolution of Lepidoptera did not confidently place butterflies, and many relationships among superfamilies in the megadiverse clade Ditrysia remain largely uncertain. We generated a molecular dataset with 46 taxa, combining 33 new transcriptomes with 13 available genomes, transcriptomes and expressed sequence tags (ESTs). Using HaMStR with a Lepidoptera-specific core-orthologue set of single copy loci, we identified 2696 genes for inclusion into the phylogenomic analysis. Nucleotides and amino acids of the all-gene, all-taxon dataset yielded nearly identical, well-supported trees. Monophyly of butterflies (Papilionoidea) was strongly supported, and the group included skippers (Hesperiidae) and the enigmatic butterfly-moths (Hedylidae). Butterflies were placed sister to the remaining obtectomeran Lepidoptera, and the latter was grouped with greater than or equal to 87% bootstrap support. Establishing confident relationships among the four most diverse macroheteroceran superfamilies was previously challenging, but we recovered 100% bootstrap support for the following relationships: ((Geometroidea, Noctuoidea), (Bombycoidea, Lasiocampoidea)). We present the first robust, transcriptome-based tree of Lepidoptera that strongly contradicts historical placement of butterflies, and provide an evolutionary framework for genomic, developmental and ecological studies on this diverse insect order. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
Roberto S. Flowers-Cano
2018-02-01
Full Text Available Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA), and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distribution function was fit to the random samples that were generated. In other cases, a distribution function different to the mother distribution was fit to the samples. When the fitted distribution had three parameters, and was the same as the mother distribution, the intervals constructed with the four techniques had acceptable coverage. However, the bootstrap techniques failed in several of the cases in which the fitted distribution had two parameters.
Towards bootstrapping QED{sub 3}
Energy Technology Data Exchange (ETDEWEB)
Chester, Shai M.; Pufu, Silviu S. [Joseph Henry Laboratories, Princeton University,Princeton, NJ 08544 (United States)
2016-08-02
We initiate the conformal bootstrap study of Quantum Electrodynamics in 2+1 space-time dimensions (QED{sub 3}) with N flavors of charged fermions by focusing on the 4-point function of four monopole operators with the lowest unit of topological charge. We obtain upper bounds on the scaling dimension of the doubly-charged monopole operator, with and without assuming other gaps in the operator spectrum. Intriguingly, we find a (gap-dependent) kink in these bounds that comes reasonably close to the large N extrapolation of the scaling dimensions of the singly-charged and doubly-charged monopole operators down to N=4 and N=6.
A tauberian theorem for the conformal bootstrap
Qiao, Jiaxin; Rychkov, Slava
2017-12-01
For expansions in one-dimensional conformal blocks, we provide a rigorous link between the asymptotics of the spectral density of exchanged primaries and the leading singularity in the crossed channel. Our result has a direct application to systems of SL(2, ℝ)-invariant correlators (also known as 1d CFTs). It also puts on solid ground a part of the lightcone bootstrap analysis of the spectrum of operators of high spin and bounded twist in CFTs in d > 2. In addition, a similar argument controls the spectral density asymptotics in large N gauge theories.
A bootstrap approach to bump hunting
Silverman, B. W.
1982-01-01
An important question in cluster analysis and pattern recognition is the determination of the number of clusters into which a given population should be divided. Frequently, particularly when certain specific clustering methods are being used, the number of clusters is taken to be equal to the number of modes, or local maxima, in the probability density function underlying the given data set. The use of kernel density estimates in mode estimation is discussed. The test statistic to be used is defined and a bootstrap technique for assessing significance is given. An illustrative application is followed by an examination of the asymptotic behavior of the test statistic.
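The mode-counting bootstrap test described above can be sketched as follows. The grid evaluation, bisection bounds, and sample data are illustrative assumptions, and the classical variance-rescaling refinement of the smoothed bootstrap is omitted for brevity:

```python
import numpy as np

def count_modes(x, h):
    """Number of local maxima of a Gaussian kernel density estimate on a grid."""
    grid = np.linspace(x.min() - 3 * h, x.max() + 3 * h, 512)
    dens = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    d = np.diff(dens)
    return int(np.sum((d[:-1] > 0) & (d[1:] <= 0)))

def critical_bandwidth(x, k=1, n_iter=40):
    """Bisect for the smallest bandwidth at which the KDE has at most k modes."""
    lo, hi = 1e-3, 4.0 * x.std()
    for _ in range(n_iter):
        mid = 0.5 * (lo + hi)
        if count_modes(x, mid) <= k:
            hi = mid
        else:
            lo = mid
    return hi

def silverman_test(x, k=1, n_boot=100, seed=0):
    """Bootstrap p-value for H0: the underlying density has at most k modes."""
    rng = np.random.default_rng(seed)
    h_crit = critical_bandwidth(x, k)
    n = len(x)
    exceed = 0
    for _ in range(n_boot):
        # smoothed bootstrap: resample and add kernel noise at h_crit
        xb = rng.choice(x, size=n, replace=True) + h_crit * rng.normal(size=n)
        if count_modes(xb, h_crit) > k:
            exceed += 1
    return h_crit, exceed / n_boot

# clearly bimodal toy data
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3, 1, 100), rng.normal(3, 1, 100)])
h_crit, p_value = silverman_test(x, k=1)
```

A small p-value indicates that an implausibly large bandwidth is needed to smooth the data down to k modes, i.e. evidence for more than k modes.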
General bootstrap equations in 4D CFTs
Cuomo, Gabriel Francisco; Karateev, Denis; Kravchuk, Petr
2018-01-01
We provide a framework for generic 4D conformal bootstrap computations. It is based on the unification of two independent approaches, the covariant (embedding) formalism and the non-covariant (conformal frame) formalism. We construct their main ingredients (tensor structures and differential operators) and establish a precise connection between them. We supplement the discussion by additional details like classification of tensor structures of n-point functions, normalization of 2-point functions and seed conformal blocks, Casimir differential operators and treatment of conserved operators and permutation symmetries. Finally, we implement our framework in a Mathematica package and make it freely available.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things
Directory of Open Access Journals (Sweden)
Dan Garcia-Carrillo
2016-03-01
Full Text Available The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking into account these aspects by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial applications and personal applications, such as medical e-health, smart cities, etc. The research into protocols and security aspects related to this area is continuously advancing in making these networks more reliable and secure, taking into account these aspects by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Evidence of Bootstrap Financing among Small Start-Up Firms
Howard E. Van Auken; Lynn Neeley
1996-01-01
This study examines the use of bootstrap financing for a sample of 78 firms in a Midwestern state. The results show that traditional sources of capital accounted for 65% of the firms' start-up capital and 35% of the start-up capital was obtained from bootstrap sources. A Chi-squared analysis indicates a significant difference between the percentage of (1) sole proprietorships versus other firms and (2) construction/manufacturing versus other types of firms using bootstrap financing as compared...
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric, Student's t confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
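The two interval constructions being compared can be sketched as follows; the sample data are hypothetical, and a full comparison would repeat this over many simulated samples to estimate coverage:

```python
import numpy as np
from scipy import stats

def t_interval(x, alpha=0.05):
    """Classical Student's t confidence interval for the mean."""
    n = len(x)
    half = stats.t.ppf(1 - alpha / 2, n - 1) * x.std(ddof=1) / np.sqrt(n)
    return x.mean() - half, x.mean() + half

def percentile_bootstrap_interval(x, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean."""
    rng = np.random.default_rng(seed)
    means = np.array([rng.choice(x, size=len(x), replace=True).mean()
                      for _ in range(n_boot)])
    return tuple(np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# small symmetric sample, the setting where the t interval does well
rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=15)
t_lo, t_hi = t_interval(x)
b_lo, b_hi = percentile_bootstrap_interval(x)
```

For small symmetric samples the percentile bootstrap interval tends to be narrower than the t interval and to undercover, which is consistent with the simulation finding reported above.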
Using the bootstrap in a multivariate data problem: An example
International Nuclear Information System (INIS)
Glosup, J.G.; Axelrod, M.C.
1995-01-01
The use of the bootstrap in the multivariate version of the paired t-test is considered and demonstrated through an example. The problem of interest involves comparing two different techniques for measuring the chemical constituents of a sample item. The bootstrap is used to form an empirical significance level for Hotelling's one-sample T-squared statistic. The bootstrap was selected to determine empirical significance levels because the implicit assumption of multivariate normality in the classic Hotelling's one-sample test might not hold. The results of both the classic and bootstrap tests are presented and contrasted
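A sketch of the scheme the abstract describes: Hotelling's one-sample T² is computed on the paired differences, and an empirical significance level is obtained by resampling from null-centred data so that every bootstrap sample satisfies the null hypothesis by construction. The data and effect sizes are hypothetical:

```python
import numpy as np

def hotelling_t2(D):
    """One-sample Hotelling T^2 statistic for H0: mean difference vector = 0."""
    n, p = D.shape
    dbar = D.mean(axis=0)
    S = np.cov(D, rowvar=False)
    return n * dbar @ np.linalg.solve(S, dbar)

def bootstrap_t2_pvalue(D, n_boot=500, seed=0):
    """Empirical significance level for T^2 without assuming normality."""
    rng = np.random.default_rng(seed)
    t_obs = hotelling_t2(D)
    D0 = D - D.mean(axis=0)          # centre so the null holds exactly
    n = D.shape[0]
    count = sum(hotelling_t2(D0[rng.integers(0, n, size=n)]) >= t_obs
                for _ in range(n_boot))
    return (count + 1) / (n_boot + 1)

# hypothetical paired differences between two measurement techniques
# on 4 chemical constituents, with a systematic offset
rng = np.random.default_rng(0)
bias = np.array([0.8, 0.0, -0.5, 0.3])
D = bias + 0.5 * rng.normal(size=(25, 4))
p_value = bootstrap_t2_pvalue(D)
```

The empirical p-value can then be contrasted with the classical F-distribution p-value for the same T² statistic.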
Fast, Exact Bootstrap Principal Component Analysis for p > 1 Million.
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low-dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low-dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods.
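The subspace trick described above can be sketched as follows. This is a simplified illustration, not the authors' implementation: since every bootstrap resample of subjects lies in the column space of the original p × n data matrix, the SVD is taken of an n × n coordinate matrix, and components are mapped back to p dimensions only at the end. Centering and the sign alignment needed for real standard errors are omitted:

```python
import numpy as np

def fast_bootstrap_pca(X, n_boot=50, n_comp=3, seed=0):
    """Exact bootstrap principal components via the n-dimensional subspace.

    X is p x n (measurements x subjects) with p >> n.  Writing
    X = U @ coords with U orthonormal (p x n), a resample of subjects
    is U @ coords[:, idx], so its left singular vectors are U @ Ub
    where Ub comes from the SVD of the small n x n matrix.
    """
    p, n = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)   # U: p x n, orthonormal
    coords = s[:, None] * Vt                           # n x n coordinates of X
    rng = np.random.default_rng(seed)
    out = np.empty((n_boot, p, n_comp))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)               # resample subjects
        Ub = np.linalg.svd(coords[:, idx], full_matrices=False)[0]
        out[b] = U @ Ub[:, :n_comp]                    # exact p-dim components
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))                         # toy data: p = 200, n = 20
boot_pcs = fast_bootstrap_pca(X, n_boot=10, n_comp=3)
```

Each SVD inside the loop costs O(n³) instead of O(p n²), which is what makes the p ≈ 3 million case tractable.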
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.
Establishing Keypoint Matches on Multimodal Images with Bootstrap Strategy and Global Information.
Li, Yong; Jin, Hongbin; Wu, Jiatao; Liu, Jie
2017-04-19
This paper proposes an algorithm for building keypoint matches on multimodal images by combining a bootstrap process with global information. The proportion of correct keypoint matches built from descriptors alone is typically very low on multimodal images with large spectral differences. To identify correct matches, global information is utilized to evaluate keypoint matches, and a bootstrap technique is employed to reduce the computational cost. A keypoint match determines a transformation T and a similarity metric between the reference image and the test image transformed by T. The similarity metric encodes global information over the entire images, so a higher similarity indicates that the match brings more image content into alignment, implying it is more likely to be correct. Unfortunately, exhaustively searching triplets or quadruples of matches for affine or projective transformations is computationally intractable when the number of keypoints is large. To reduce the computational cost, a bootstrap technique is employed that starts from single matches for a translation-and-rotation model and progresses in stages up to quadruples of matches for a projective model. The global information screens for "good" matches at each stage, and the bootstrap strategy makes the screening process computationally feasible. Experimental results show that the proposed method can establish reliable keypoint matches on challenging multimodal images with strong multimodality.
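A minimal sketch of the first (translation-only) screening stage, assuming normalized cross-correlation as the global similarity metric and using np.roll as a stand-in for proper image warping; the descriptor matching and the rotation, affine, and projective stages of the paper are omitted:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation: the global similarity used to score a match."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def screen_translations(ref, test, matches, keep=5):
    """First bootstrap stage: each single keypoint match ((xr, yr), (xt, yt))
    proposes a translation; keep only the translations whose shifted test image
    is globally most similar to the reference."""
    scored = []
    for (xr, yr), (xt, yt) in matches:
        dy, dx = yr - yt, xr - xt                       # translation implied by the match
        warped = np.roll(np.roll(test, dy, axis=0), dx, axis=1)
        scored.append((ncc(ref, warped), (dy, dx)))
    scored.sort(reverse=True)
    return scored[:keep]
```

Surviving translations would seed the next stage (rotation), then pairs, triples, and quadruples of matches for richer models, in the spirit of the staged screening the abstract describes.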
Wedenberg, Minna
2013-11-15
To apply a statistical bootstrap analysis to assess the uncertainty in the dose-response relations for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. The bootstrap method assesses the uncertainty of the estimated population-based dose-response relation due to sample variability, which reflects the uncertainty due to the limited number of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint for each study. Two dose-response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. The bootstrap analysis generates a family of curves representing the range of plausible dose-response relations, and the 95% bootstrap confidence intervals give estimated upper and lower toxicity risks. The curve families for the 2 dose-response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distribution. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose-response relation for myelopathy and pneumonitis. It suggests likely values of the model parameters, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate to what extent the data support one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model.
Bootstrap confidence intervals for the process capability index under half-logistic distribution
Wararit Panichkitkosolkul
2012-01-01
This study concerns the construction of bootstrap confidence intervals for the process capability index in the case of the half-logistic distribution. The bootstrap confidence intervals applied consist of the standard bootstrap confidence interval, the percentile bootstrap confidence interval, and the bias-corrected percentile bootstrap confidence interval. Using Monte Carlo simulations, the estimated coverage probabilities and average widths of the bootstrap confidence intervals are compared, with results showing ...
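The three interval types can be sketched as follows, with the capability index C_p = (USL - LSL)/(6s) as the statistic. The specification limits, the data, and the use of the plain BC (not BCa) bias correction are illustrative assumptions, not the paper's setup:

```python
import random
import statistics

def boot_cis(data, stat, B=2000, alpha=0.05, seed=1):
    """Standard, percentile, and bias-corrected percentile bootstrap CIs for stat(data)."""
    rng = random.Random(seed)
    theta = stat(data)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(B))
    nd = statistics.NormalDist()
    z = nd.inv_cdf(1 - alpha / 2)

    # Standard bootstrap CI: normal interval with the bootstrap standard error.
    se = statistics.stdev(reps)
    standard = (theta - z * se, theta + z * se)

    # Percentile CI: empirical quantiles of the bootstrap distribution.
    percentile = (reps[int(B * alpha / 2)], reps[int(B * (1 - alpha / 2)) - 1])

    # Bias-corrected percentile CI: quantiles shifted by the bias term z0.
    prop = sum(r < theta for r in reps) / B
    prop = min(max(prop, 1 / B), 1 - 1 / B)      # guard against 0 or 1
    z0 = nd.inv_cdf(prop)
    lo, hi = nd.cdf(2 * z0 - z), nd.cdf(2 * z0 + z)
    bias_corrected = (reps[int(B * lo)], reps[min(int(B * hi), B - 1)])
    return standard, percentile, bias_corrected

# Hypothetical process data and specification limits (illustrative only).
USL, LSL = 6.0, 3.0
data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.0, 5.1, 4.3,
        4.7, 4.5, 3.9, 4.8, 4.2, 5.3, 4.6, 4.4, 5.0, 4.1]
cp = lambda xs: (USL - LSL) / (6 * statistics.stdev(xs))
std_ci, pct_ci, bc_ci = boot_cis(data, cp)
```

The standard interval relies on approximate normality of the bootstrap distribution, while the percentile and bias-corrected variants read limits directly off the resampled distribution, which matters for a skewed statistic like C_p.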
Bootstrap Estimates of Standard Errors in Generalizability Theory
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
Higher-order Gaussian kernel in bootstrap boosting algorithm ...
African Journals Online (AJOL)
The bootstrap boosting algorithm is a bias reduction scheme. The adoption of a higher-order Gaussian kernel in a bootstrap boosting algorithm for kernel density estimation was investigated. The algorithm uses the higher-order Gaussian kernel instead of the regular fixed kernels. A comparison of the scheme with existing ...
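For reference, a common fourth-order Gaussian kernel is K4(u) = (3 - u^2)/2 * phi(u): it still integrates to one but its second moment vanishes, which is the property that drives bias reduction. A minimal density estimator using it (the boosting algorithm itself is not reproduced here):

```python
from math import exp, pi, sqrt

def phi(u):
    """Standard Gaussian density."""
    return exp(-0.5 * u * u) / sqrt(2 * pi)

def kde(x, data, h, order=2):
    """Kernel density estimate at x with bandwidth h.  order=2 is the usual
    Gaussian kernel; order=4 is the fourth-order Gaussian kernel
    K4(u) = (3 - u**2) / 2 * phi(u), which integrates to one but has a
    vanishing second moment."""
    def k(u):
        return phi(u) if order == 2 else 0.5 * (3 - u * u) * phi(u)
    return sum(k((x - d) / h) for d in data) / (len(data) * h)

# Unlike a fixed (second-order) kernel estimate, a fourth-order estimate
# can be locally negative in the tails.
f4 = kde(1.0, [0.0, 1.0, 2.0], 0.5, order=4)
```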
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
Bootstrapping the O(N) Archipelago
Kos, Filip; Simmons-Duffin, David; Vichi, Alessandro
2015-01-01
We study 3d CFTs with an $O(N)$ global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension $O(N)$ vector $\phi_i$ and the lowest dimension $O(N)$ singlet $s$, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions $(\Delta_\phi, \Delta_s)$ to lie inside small islands. We also make rigorous determinations of current two-point functions in the $O(2)$ and $O(3)$ models, with applications to transport in condensed matter systems.
The analytic bootstrap in fermionic CFTs
van Loon, Mark
2018-01-01
We apply the method of the large spin bootstrap to analyse fermionic conformal field theories with weakly broken higher spin symmetry. Through the study of correlators of composite operators, we find the anomalous dimensions and OPE coefficients in the Gross-Neveu model in d = 2 + ɛ dimensions and the Gross-Neveu-Yukawa model in d = 4 - ɛ dimensions, based only on crossing symmetry. Furthermore, a non-trivial solution in the d = 2 + ɛ expansion is found for a fermionic theory in which the fundamental field is not part of the spectrum. The results are perturbative in ɛ and valid to all orders in the spin, reproducing known results for operator dimensions and providing some new results for operator dimensions and OPE coefficients.
Simplifying large spin bootstrap in Mellin space
Dey, Parijat; Ghosh, Kausik; Sinha, Aninda
2018-01-01
We set up the conventional conformal bootstrap equations in Mellin space and analyse the anomalous dimensions and OPE coefficients of large spin double trace operators. By decomposing the equations in terms of continuous Hahn polynomials, we derive explicit expressions as an asymptotic expansion in inverse conformal spin to any order, reproducing the contribution of any primary operator and its descendants in the crossed channel. The expressions are in terms of known mathematical functions and involve generalized Bernoulli (Nørlund) polynomials and the Mack polynomials and enable us to derive certain universal properties. Comparing with the recently introduced reformulated equations in terms of crossing symmetric tree level exchange Witten diagrams, we show that to leading order in anomalous dimension but to all orders in inverse conformal spin, the equations are the same as in the conventional formulation. At the next order, the polynomial ambiguity in the Witten diagram basis is needed for the equivalence and we derive the necessary constraints for the same.
Bootstrapping 3D fermions with global symmetries
Iliesiu, Luca; Kos, Filip; Poland, David; Pufu, Silviu S.; Simmons-Duffin, David
2018-01-01
We study the conformal bootstrap for 4-point functions of fermions 〈 ψ i ψ j ψ k ψ ℓ 〉 in parity-preserving 3d CFTs, where ψ i transforms as a vector under an O( N ) global symmetry. We compute bounds on scaling dimensions and central charges, finding features in our bounds that appear to coincide with the O( N ) symmetric Gross-Neveu-Yukawa fixed points. Our computations are in perfect agreement with the 1 /N expansion at large N and allow us to make nontrivial predictions at small N . For values of N for which the Gross-Neveu-Yukawa universality classes are relevant to condensed-matter systems, we compare our results to previous analytic and numerical results.
The ${\mathcal N}=2$ superconformal bootstrap
Beem, Christopher; Liendo, Pedro; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
In this work we initiate the conformal bootstrap program for ${\mathcal N}=2$ superconformal field theories in four dimensions. We promote an abstract operator-algebraic viewpoint in order to unify the description of Lagrangian and non-Lagrangian theories, and formulate various conjectures concerning the landscape of theories. We analyze in detail the four-point functions of flavor symmetry current multiplets and of ${\mathcal N}=2$ chiral operators. For both correlation functions we review the solution of the superconformal Ward identities and describe their superconformal block decompositions. This provides the foundation for an extensive numerical analysis discussed in the second half of the paper. We find a large number of constraints for operator dimensions, OPE coefficients, and central charges that must hold for any ${\mathcal N}=2$ superconformal field theory.
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
BOOTSTRAP-BASED INFERENCE FOR GROUPED DATA
Directory of Open Access Journals (Sweden)
Jorge Iván Vélez
2015-07-01
Grouped data refers to continuous variables that have been partitioned into intervals, not necessarily of the same length, to facilitate interpretation. Unlike with ungrouped data, estimating simple summary statistics such as the mean and mode, or more complex ones such as a percentile or the coefficient of variation, is a difficult endeavour with grouped data. When the probability distribution generating the data is unknown, inference for ungrouped data is carried out using parametric or nonparametric resampling methods. However, there are no equivalent methods for grouped data. Here, a bootstrap-based procedure to estimate the parameters of an unknown distribution from grouped data is proposed, described and illustrated.
Uncertainty estimation in diffusion MRI using the nonlocal bootstrap.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2014-08-01
In this paper, we propose a new bootstrap scheme, called the nonlocal bootstrap (NLB) for uncertainty estimation. In contrast to the residual bootstrap, which relies on a data model, or the repetition bootstrap, which requires repeated signal measurements, NLB is not restricted by the data structure imposed by a data model and obviates the need for time-consuming multiple acquisitions. NLB hinges on the observation that local imaging information recurs in an image. This self-similarity implies that imaging information coming from spatially distant (nonlocal) regions can be exploited for more effective estimation of statistics of interest. Evaluations using in silico data indicate that NLB produces distribution estimates that are in closer agreement with those generated using Monte Carlo simulations, compared with the conventional residual bootstrap. Evaluations using in vivo data demonstrate that NLB produces results that are in agreement with our knowledge on white matter architecture.
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of the bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models.
Bootstrapping Relational Affordances of Object Pairs Using Transfers
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert
2018-01-01
a tool to retrieve a desired object. We investigate how these relational affordances could be learned by a robot from its own action experience. A major challenge in this approach is to reduce the number of training samples needed to achieve accuracy, and hence we investigate an approach which can leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn random forest-based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach [direct bootstrapping (DB)], the state-space for a new...
Conformal bootstrap in the Regge limit
Li, Daliang; Meltzer, David; Poland, David
2017-12-01
We analytically solve the conformal bootstrap equations in the Regge limit for large N conformal field theories. For theories with a parametrically large gap, the amplitude is dominated by spin-2 exchanges and we show how the crossing equations naturally lead to the construction of AdS exchange Witten diagrams. We also show how this is encoded in the anomalous dimensions of double-trace operators of large spin and large twist. We use the chaos bound to prove that the anomalous dimensions are negative. Extending these results to correlators containing two scalars and two conserved currents, we show how to reproduce the CEMZ constraint that the three-point function between two currents and one stress tensor only contains the structure given by Einstein-Maxwell theory in AdS, up to small corrections. Finally, we consider the case where operators of unbounded spin contribute to the Regge amplitude, whose net effect is captured by summing the leading Regge trajectory. We compute the resulting anomalous dimensions and corrections to OPE coefficients in the crossed channel and use the chaos bound to show that both are negative.
Quantum bootstrapping via compressed quantum Hamiltonian learning
International Nuclear Information System (INIS)
Wiebe, Nathan; Granade, Christopher; Cory, D G
2015-01-01
A major problem facing the development of quantum computers or large-scale quantum simulators is that general methods for characterizing and controlling them are intractable. We provide a new approach to this problem that uses small quantum simulators to efficiently characterize and learn control models for larger devices. Our protocol achieves this by using Bayesian inference in concert with Lieb–Robinson bounds and interactive quantum learning methods to achieve compressed simulations for characterization. We also show that the Lieb–Robinson velocity is epistemic for our protocol, meaning that information propagates at a rate that depends on the uncertainty in the system Hamiltonian. We illustrate the efficiency of our bootstrapping protocol by showing numerically that an 8 qubit Ising model simulator can be used to calibrate and control a 50 qubit Ising simulator while using only about 750 kilobits of experimental data. Finally, we provide upper bounds for the Fisher information that show that the number of experiments needed to characterize a system rapidly diverges as the duration of the experiments used in the characterization shrinks, which motivates the use of methods such as ours that do not require short evolution times. (fast track communication)
Directory of Open Access Journals (Sweden)
José Antonio Castorina
2005-12-01
This paper examines Carey's theory of conceptual change. First, it describes the problem of conceptual reorganization in cognitive psychology and the author's position. Second, it analyzes the epistemic conditions that children's "theories" must fulfil for conceptual restructuring to be possible, as well as the forms this restructuring takes. Third, it presents findings from studies verifying conceptual change among children's intuitive theories of biology. Fourth, it discusses the difficulties faced by other theories of conceptual change, and then sets out the features of bootstrapping as an alternative mechanism and its relevance for interpreting the results of the aforementioned studies. Finally, it assesses the originality of the bootstrapping theory within contemporary debates; in particular, it outlines a possible rapprochement with Piaget's dialectical theses.
Climate time series analysis classical statistical and bootstrap methods
Mudelsee, Manfred
2014-01-01
Written for climatologists and applied statisticians, this book explains the bootstrap algorithms (including novel adaptations) and methods for confidence interval construction. The accuracy of the algorithms is tested by means of Monte Carlo experiments.
'Bootstrap' Configuration for Multistage Pulse-Tube Coolers
Nguyen, Bich; Nguyen, Lauren
2008-01-01
A bootstrap configuration has been proposed for multistage pulse-tube coolers that, for instance, provide final-stage cooling to temperatures as low as 20 K. The bootstrap configuration supplants the conventional configuration, in which customarily the warm heat exchangers of all stages reject heat at ambient temperature. In the bootstrap configuration, the warm heat exchanger, the inertance tube, and the reservoir of each stage would be thermally anchored to the cold heat exchanger of the next warmer stage. The bootstrapped configuration is superior to the conventional setup, in some cases increasing the 20 K cooler's coefficient of performance two-fold over that of an otherwise equivalent conventional layout. The increased efficiency could translate into less power consumption, less cooler mass, and/or lower cost for a given amount of cooling.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995)) is valid only when volatility is constant. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure...
Bootstrap prediction and Bayesian prediction under misspecified models
Fushiki, Tadayoshi
2005-01-01
We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...
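A minimal sketch of bootstrap (bagged) prediction in the sense described, with a least-squares line as the plug-in predictor; the predictor and the data are illustrative choices, not the paper's model:

```python
import random
import statistics

def fit_line(xs, ys):
    """Plug-in predictor: least-squares line through the data."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    den = sum((x - mx) ** 2 for x in xs)
    if den == 0:                         # degenerate resample: fall back to the mean
        return lambda x: my
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / den
    a = my - b * mx
    return lambda x: a + b * x

def bagged_predict(xs, ys, x_new, base_fit, B=500, seed=0):
    """Breiman-style bagging: average the plug-in prediction over bootstrap fits."""
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(B):
        idx = [rng.randrange(n) for _ in range(n)]        # bootstrap resample
        model = base_fit([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(model(x_new))
    return statistics.fmean(preds)

xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0.1, 0.9, 2.2, 2.8, 4.1, 5.2, 5.9, 7.1]
yhat = bagged_predict(xs, ys, 3.5, fit_line)
```

Averaging the plug-in prediction over bootstrap resamples is exactly the sense in which bootstrap prediction approximates averaging over the posterior, as the abstract notes.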
Efficient generation of pronunciation dictionaries: machine learning factors during bootstrapping
CSIR Research Space (South Africa)
Davel, MH
2004-10-01
Several factors affect the efficiency of bootstrapping approaches to the generation of pronunciation dictionaries. We focus on factors related to the underlying rule-extraction algorithms, and demonstrate variants of the Dynamically Expanding Context algorithm, which are beneficial...
Generalized bootstrap equations and possible implications for the NLO Odderon
Energy Technology Data Exchange (ETDEWEB)
Bartels, J. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Vacca, G.P. [INFN, Sezione di Bologna (Italy)
2013-07-15
We formulate and discuss generalized bootstrap equations in nonabelian gauge theories. They are shown to hold in the leading logarithmic approximation. Since their validity is related to the self-consistency of the Steinmann relations for inelastic production amplitudes they can be expected to be valid also in NLO. Specializing to the N=4 SYM, we show that the validity in NLO of these generalized bootstrap equations allows to find the NLO Odderon solution with intercept exactly at one.
Bootstrap current calculations for TJ-II stellarator
Martinell, Julio J.; Camacho, Katia
2016-10-01
Bootstrap current in stellarators is usually very small since they operate solely with the magnetic confinement provided by the external currents. Since plasma pressure gradients are always present, the bootstrap current is always finite, but the magnetic design can be optimized to minimize it. In the heliac configuration there is no such optimization, and therefore it is important to estimate the actual bootstrap current generated by given pressure profiles. Here, we use the configuration of the TJ-II heliac to calculate the bootstrap current for various density regimes using the kinetic code DKES. We compute the monoenergetic transport coefficients D11 and D13 to find first the thermal ambipolar diffusion coefficients and the corresponding radial electric field, and then the respective bootstrap current. This is done using experimental density and electron and ion temperature profiles. In spite of the convergence problems of DKES at low collisionality, we can obtain bootstrap current values with acceptable uncertainties, without using Monte Carlo methods. The results are compared with axisymmetric neoclassical computations. The resulting rotational transform is used to obtain the location of rational surfaces and predict the transport barriers observed in the experiments. Funded by projects PAPIIT IN109115 and Conacyt 152905.
A Bootstrap Approach to Martian Manufacturing
Dorais, Gregory A.
2004-01-01
In-Situ Resource Utilization (ISRU) is an essential element of any affordable strategy for a sustained human presence on Mars. Ideally, Martian habitats would be extremely massive to allow plenty of room to comfortably live and work, as well as to protect the occupants from the environment. Moreover, transportation and power generation systems would also require significant mass if affordable. For our approach to ISRU, we use the industrialization of the U.S. as a metaphor. The 19th century started with small blacksmith shops and ended with massive steel mills primarily accomplished by blacksmiths increasing their production capacity and product size to create larger shops, which produced small mills, which produced the large steel mills that industrialized the country. Most of the mass of a steel mill is comprised of steel in simple shapes, which are produced and repaired with few pieces of equipment also mostly made of steel in basic shapes. Due to this simplicity, we expect that the 19th century manufacturing growth can be repeated on Mars in the 21st century using robots as the primary labor force. We suggest a "bootstrap" approach to manufacturing on Mars that uses a "seed" manufacturing system that uses regolith to create major structural components and spare parts. The regolith would be melted, foamed, and sintered as needed to fabricate parts using casting and solid freeform fabrication techniques. Complex components, such as electronics, would be brought from Earth and integrated as needed. These parts would be assembled to create additional manufacturing systems, which can be both more capable and higher capacity. These subsequent manufacturing systems could refine vast amounts of raw materials to create large components, as well as assemble equipment, habitats, pressure vessels, cranes, pipelines, railways, trains, power generation stations, and other facilities needed to economically maintain a sustained human presence on Mars.
Energy Technology Data Exchange (ETDEWEB)
Urbanski, P.; Kowalska, E.
1997-12-31
The principle of the bootstrap methodology applied to the assessment of parameters and prediction ability of linear regression models was presented. Application of this method was shown on the example of calibration of a radioisotope sulphuric acid concentration gauge. The bootstrap method allows one to determine not only the numerical values of the regression coefficients, but also to investigate their distributions. (author). 11 refs, 12 figs, 3 tabs.
Czech Academy of Sciences Publication Activity Database
Exner, Pavel; Kondej, S.
2016-01-01
Roč. 77, č. 1 (2016), s. 1-17 ISSN 0034-4877 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords : singular perturbations * eigenvalue asymptotics Subject RIV: BE - Theoretical Physics Impact factor: 0.604, year: 2016
Assessment of bootstrap resampling performance for PET data.
Markiewicz, P J; Reader, A J; Matthews, J C
2015-01-07
Bootstrap resampling has been successfully used for estimation of statistical uncertainty of parameters such as tissue metabolism, blood flow or displacement fields for image registration. The performance of bootstrap resampling as applied to PET list-mode data of the human brain and dedicated phantoms is assessed in a novel and systematic way such that: (1) the assessment is carried out in two resampling stages: the 'real world' stage where multiple reference datasets of varying statistical level are generated and the 'bootstrap world' stage where corresponding bootstrap replicates are generated from the reference datasets. (2) All resampled datasets were reconstructed, yielding images from which multiple voxel and regions of interest (ROI) values were extracted to form corresponding distributions between the two stages. (3) The difference between the distributions from both stages was quantified using the Jensen-Shannon divergence and the first four moments. It was found that the bootstrap distributions are consistently different from the real world distributions across the statistical levels. The difference was explained by a shift in the mean (up to 33% for voxels and 14% for ROIs) being proportional to the inverse square root of the statistical level (number of counts). Other moments were well replicated by the bootstrap, although for very low statistical levels the estimation of the variance was poor. Therefore, the bootstrap method should be used with care when estimating systematic errors (bias) and variance when very low statistical levels are present, such as in early time frames of dynamic acquisitions, when the underlying population may not be sufficiently represented.
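The Jensen-Shannon divergence used in step (3) has a short closed form for discrete (histogram) distributions; a minimal sketch:

```python
from math import log2

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete distributions,
    each given as a list of probabilities summing to one."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        # Kullback-Leibler divergence, with the convention 0 * log(0/x) = 0.
        return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical histograms diverge by 0 bits; disjoint ones by exactly 1 bit.
d = js_divergence([0.25, 0.25, 0.25, 0.25], [0.7, 0.1, 0.1, 0.1])
```

Being symmetric and bounded between 0 and 1 bit, it is a convenient single number for comparing the 'real world' and 'bootstrap world' distributions of a voxel or ROI value.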
Song, Hangke; Liu, Zhi; Du, Huan; Sun, Guangling; Le Meur, Olivier; Ren, Tongwei
2017-09-01
This paper proposes a novel depth-aware salient object detection and segmentation framework via multiscale discriminative saliency fusion (MDSF) and bootstrap learning for RGBD images (RGB color images with corresponding depth maps) and stereoscopic images. By exploiting low-level feature contrasts, mid-level feature weighted factors and high-level location priors, various saliency measures on four classes of features are calculated based on multiscale region segmentation. A random forest regressor is learned to perform the discriminative saliency fusion (DSF) and generate the DSF saliency map at each scale, and DSF saliency maps across multiple scales are combined to produce the MDSF saliency map. Furthermore, we propose an effective bootstrap learning-based salient object segmentation method, which is bootstrapped with samples based on the MDSF saliency map and learns multiple kernel support vector machines. Experimental results on two large datasets show how various categories of features contribute to the saliency detection performance and demonstrate that the proposed framework achieves better performance in both saliency detection and salient object segmentation.
Neural Network Computed Bootstrap Current for Real Time Control in DIII-D
Tema Biwole, Arsene; Smith, Sterling P.; Meneghini, Orso; Belli, Emily; Candy, Jeff
2017-10-01
In an effort to provide a fast and accurate calculation of the bootstrap current density for use as a constraint in real-time equilibrium reconstructions, we have developed a neural network (NN) non-linear regression of the NEO code calculated bootstrap current jBS. A new formulation for jBS in NEO allows for a determination of the coefficients on the density and temperature scale lengths. The new formulation reduces the number of inputs to the NN, and the number of output coefficients is 2 times the number of species (including electrons). The NN can reproduce the NEO and Sauter coefficients to a high degree of accuracy. The bootstrap current density calculated in NEO has been used as a constraint in an offline equilibrium reconstruction for comparison to the NN calculation. The computational time of this method (μs) makes it ideal for real time calculation in DIII-D. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656, DE-FC02-06ER54873.
Cole, Jonathan J.; Carpenter, Stephen R.; Kitchell, Jim; Pace, Michael L.; Solomon, Christopher T.; Weidel, Brian
2011-01-01
Cross-ecosystem subsidies to food webs can alter metabolic balances in the receiving (subsidized) system and free the food web, or particular consumers, from the energetic constraints of local primary production. Although cross-ecosystem subsidies between terrestrial and aquatic systems have been well recognized for benthic organisms in streams, rivers, and the littoral zones of lakes, terrestrial subsidies to pelagic consumers are more difficult to demonstrate and remain controversial. Here, we adopt a unique approach by using stable isotopes of H, C, and N to estimate terrestrial support to zooplankton in two contrasting lakes. Zooplankton (Holopedium, Daphnia, and Leptodiaptomus) are comprised of ≈20–40% of organic material of terrestrial origin. These estimates are as high as, or higher than, prior measures obtained by experimentally manipulating the inorganic 13C content of these lakes to augment the small, natural contrast in 13C between terrestrial and algal photosynthesis. Our study gives credence to a growing literature, which we review here, suggesting that significant terrestrial support of pelagic crustaceans (zooplankton) is widespread. PMID:21245299
Syntactic bootstrapping in children with Down syndrome: the impact of bilingualism.
Cleave, Patricia L; Kay-Raining Bird, Elizabeth; Trudeau, Natacha; Sutton, Ann
2014-01-01
The purpose of the study was to add to our knowledge of bilingual learning in children with Down syndrome (DS) using a syntactic bootstrapping task. Four groups of children and youth matched on non-verbal mental age participated. There were 14 bilingual participants with DS (DS-B, mean age 12;5), 12 monolingual participants with DS (DS-M, mean age 10;10), 9 bilingual typically developing children (TD-B; mean age 4;1) and 11 monolingual typically developing children (TD-M; mean age 4;1). The participants completed a computerized syntactic bootstrapping task involving unfamiliar nouns and verbs. The syntactic cues employed were "a" for the nouns and "-ing" for the verbs. Performance was better on nouns than verbs. There was also a main effect for group. Follow-up t-tests revealed that there were no significant differences between the TD-M and TD-B or between the DS-M and DS-B groups. However, the DS-M group performed more poorly than the TD-M group with a large effect size. Analyses at the individual level revealed a similar pattern of results. There was evidence that Down syndrome impacted performance; there was no evidence that bilingualism negatively affected the syntactic bootstrapping skills of individuals with DS. These results from a dynamic language task are consistent with those of previous studies that used static or product measures. Thus, the results are consistent with the position that parents should be supported in their decision to provide bilingual input to their children with DS. Readers of this article will identify (1) research evidence regarding bilingual development in children with Down syndrome and (2) syntactic bootstrapping skills in monolingual and bilingual children who are typically developing or who have Down syndrome. Copyright © 2014 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Trang Chu
Upon invading the host erythrocyte, the human malaria parasite P. falciparum lives and replicates within a membrane bound compartment referred to as the parasitophorous vacuole. Recently, interest in this compartment and its protein content has grown, due to the important roles these play in parasite egress and protein traffic to the host cell. Surprisingly, the function of many proteins within this compartment has not been experimentally addressed. Here, we study the importance of one of these proteins, termed PfPV1, for intra-erythrocytic parasite survival. Despite numerous attempts to inactivate the gene encoding PfPV1, we were unable to recover deletion mutants. Control experiments verified that the pv1 gene locus was per se open for gene targeting experiments, allowing us to exclude technical limitations in our experimental strategy. Our data provide strong genetic evidence that PfPV1 is essential for survival of blood stage P. falciparum, and further highlight the importance of parasitophorous vacuole proteins in this part of the parasite's life cycle.
Locality, bulk equations of motion and the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Kabat, Daniel [Department of Physics and Astronomy, Lehman College, City University of New York,250 Bedford Park Blvd. W, Bronx NY 10468 (United States); Lifschytz, Gilad [Department of Mathematics, Faculty of Natural Science, University of Haifa,199 Aba Khoushy Ave., Haifa 31905 (Israel)
2016-10-18
We develop an approach to construct local bulk operators in a CFT to order 1/N². Since 4-point functions are not fixed by conformal invariance we use the OPE to categorize possible forms for a bulk operator. Using previous results on 3-point functions we construct a local bulk operator in each OPE channel. We then impose the condition that the bulk operators constructed in different channels agree, and hence give rise to a well-defined bulk operator. We refer to this condition as the "bulk bootstrap." We argue and explicitly show in some examples that the bulk bootstrap leads to some of the same results as the regular conformal bootstrap. In fact the bulk bootstrap provides an easier way to determine some CFT data, since it does not require knowing the form of the conformal blocks. This analysis clarifies previous results on the relation between bulk locality and the bootstrap for theories with a 1/N expansion, and it identifies a simple and direct way in which OPE coefficients and anomalous dimensions determine the bulk equations of motion to order 1/N².
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and they require specialised statistical methods to properly account for their particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations originating from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
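The imputation-within-resampling idea can be sketched as follows. This is a minimal illustration with hypothetical data and a naive uniform imputation below the detection limit; it does not reproduce the paper's model-based, log-ratio-compliant procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical concentrations (ppm); np.nan marks nondetects below the
# detection limit DL.
DL = 1.0
obs = np.array([np.nan, 1.4, 2.1, np.nan, 3.3, 1.8, 2.7, np.nan, 4.0, 2.2])

def impute_nondetects(x, dl, rng):
    """Replace nondetects with random draws below the detection limit
    (a simple uniform imputation, for illustration only)."""
    x = x.copy()
    nd = np.isnan(x)
    x[nd] = rng.uniform(0.0, dl, size=nd.sum())
    return x

def bootstrap_geomean(x, dl, n_boot, rng):
    """Bootstrap the geometric mean with an imputation step inside each
    resample, so imputation uncertainty propagates into the estimate."""
    stats = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(x, size=len(x), replace=True)  # keeps nondetect labels
        filled = impute_nondetects(resample, dl, rng)        # impute per resample
        stats[b] = np.exp(np.mean(np.log(filled)))
    return stats

boot = bootstrap_geomean(obs, DL, n_boot=2000, rng=rng)
ci = np.percentile(boot, [2.5, 97.5])   # percentile confidence interval
```

Because the imputation is redone inside every resample, the resulting interval reflects both sampling variability and nondetect uncertainty.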
Unbiased bootstrap error estimation for linear discriminant analysis.
Vu, Thang; Sima, Chao; Braga-Neto, Ulisses M; Dougherty, Edward R
2014-12-01
Convex bootstrap error estimation is a popular tool for classifier error estimation in gene expression studies. A basic question is how to determine the weight for the convex combination between the basic bootstrap estimator and the resubstitution estimator such that the resulting estimator is unbiased at finite sample sizes. The well-known 0.632 bootstrap error estimator uses asymptotic arguments to propose a fixed 0.632 weight, whereas the more recent 0.632+ bootstrap error estimator attempts to set the weight adaptively. In this paper, we study the finite sample problem in the case of linear discriminant analysis under Gaussian populations. We derive exact expressions for the weight that guarantee unbiasedness of the convex bootstrap error estimator in the univariate and multivariate cases, without making asymptotic simplifications. Using exact computation in the univariate case and an accurate approximation in the multivariate case, we obtain the required weight and show that it can deviate significantly from the constant 0.632 weight, depending on the sample size and Bayes error for the problem. The methodology is illustrated by application on data from a well-known cancer classification study.
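A minimal sketch of the convex bootstrap estimator with the fixed 0.632 weight, using a nearest-class-mean rule as a stand-in for LDA (in one dimension with equal variances the two coincide). All data here are synthetic; the paper's contribution is replacing the fixed weight with exact, sample-size-dependent weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class 1D Gaussian data.
x0 = rng.normal(0.0, 1.0, 30)
x1 = rng.normal(1.5, 1.0, 30)
x = np.concatenate([x0, x1])
y = np.array([0] * 30 + [1] * 30)

def error_rate(train_x, train_y, test_x, test_y):
    """Nearest-class-mean classification error."""
    m0, m1 = train_x[train_y == 0].mean(), train_x[train_y == 1].mean()
    pred = (np.abs(test_x - m1) < np.abs(test_x - m0)).astype(int)
    return np.mean(pred != test_y)

# Resubstitution error: train and test on the same data (optimistic).
err_resub = error_rate(x, y, x, y)

# Basic bootstrap error: average out-of-bag error over resamples (pessimistic).
n_boot, oob_errs = 500, []
for _ in range(n_boot):
    idx = rng.integers(0, len(x), len(x))
    oob = np.setdiff1d(np.arange(len(x)), idx)
    if len(oob) == 0 or len(set(y[idx])) < 2:
        continue
    oob_errs.append(error_rate(x[idx], y[idx], x[oob], y[oob]))
err_boot = float(np.mean(oob_errs))

# Convex combination with the classic fixed weight w = 0.632.
w = 0.632
err_632 = (1 - w) * err_resub + w * err_boot
```

The paper shows that the weight guaranteeing unbiasedness depends on sample size and Bayes error, so the fixed 0.632 above should be read as the conventional default, not the unbiased choice.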
Bootstrap testing for cross-correlation under low firing activity.
González-Montoro, Aldana M; Cao, Ricardo; Espinosa, Nelson; Cudeiro, Javier; Mariño, Jorge
2015-06-01
A new cross-correlation synchrony index for neural activity is proposed. The index is based on the integration of the kernel estimation of the cross-correlation function. It is used to test for the dynamic synchronization levels of spontaneous neural activity under two induced brain states: sleep-like and awake-like. Two bootstrap resampling plans are proposed to approximate the distribution of the test statistics. The results of the first bootstrap method indicate that it is useful to discern significant differences in the synchronization dynamics of brain states characterized by a neural activity with low firing rate. The second bootstrap method is useful to unveil subtle differences in the synchronization levels of the awake-like state, depending on the activation pathway.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Energy Technology Data Exchange (ETDEWEB)
Wedenberg, Minna, E-mail: minna.wedenberg@raysearchlabs.com
2013-11-15
Purpose: To apply a statistical bootstrap analysis to assess the uncertainty in the dose–response relation for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. Methods and Materials: The bootstrap method assesses the uncertainty of the estimated population-based dose–response relation due to sample variability, which reflects the uncertainty due to limited numbers of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint, for each study. Two dose–response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. Results: The bootstrap analysis generates a family of curves representing the range of plausible dose–response relations, and the 95% bootstrap confidence intervals give an estimated upper and lower toxicity risk. The curve families for the 2 dose–response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distribution. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. Conclusions: The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose–response relation for myelopathy and pneumonitis. It suggests likely model parameter values, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate to what extent the data support one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model.
Investigations of dipole localization accuracy in MEG using the bootstrap.
Darvas, F; Rautiainen, M; Pantazis, D; Baillet, S; Benali, H; Mosher, J C; Garnero, L; Leahy, R M
2005-04-01
We describe the use of the nonparametric bootstrap to investigate the accuracy of current dipole localization from magnetoencephalography (MEG) studies of event-related neural activity. The bootstrap is well suited to the analysis of event-related MEG data since the experiments are repeated tens or even hundreds of times and averaged to achieve acceptable signal-to-noise ratios (SNRs). The set of repetitions or epochs can be viewed as a set of independent realizations of the brain's response to the experiment. Bootstrap resamples can be generated by sampling with replacement from these epochs and averaging. In this study, we applied the bootstrap resampling technique to MEG data from somatotopic experimental and simulated data. Four fingers of the right and left hand of a healthy subject were electrically stimulated, and about 400 trials per stimulation were recorded and averaged in order to measure the somatotopic mapping of the fingers in the S1 area of the brain. Based on single-trial recordings for each finger we performed 5000 bootstrap resamples. We reconstructed dipoles from these resampled averages using the Recursively Applied and Projected (RAP)-MUSIC source localization algorithm. We also performed a simulation for two dipolar sources with overlapping time courses embedded in realistic background brain activity generated using the prestimulus segments of the somatotopic data. To find correspondences between multiple sources in each bootstrap, sample dipoles with similar time series and forward fields were assumed to represent the same source. These dipoles were then clustered by a Gaussian Mixture Model (GMM) clustering algorithm using their combined normalized time series and topographies as feature vectors. The mean and standard deviation of the dipole position and the dipole time series in each cluster were computed to provide estimates of the accuracy of the reconstructed source locations and time series.
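The epoch-resampling idea is simple to sketch. In the toy below, the "epochs" are synthetic and the peak latency of the resampled average stands in for the dipole fit, which in the real analysis requires a forward model and the RAP-MUSIC algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic event-related data: 400 epochs of 100 samples, with an evoked
# Gaussian-shaped peak at sample 50 buried in noise.
n_epochs, n_times = 400, 100
signal = np.exp(-0.5 * ((np.arange(n_times) - 50) / 5.0) ** 2)
epochs = signal + rng.normal(0.0, 3.0, (n_epochs, n_times))

def bootstrap_peak_latency(epochs, n_boot, rng):
    """Resample epochs with replacement, average, and record the latency of
    the peak in each resampled evoked response."""
    latencies = np.empty(n_boot, dtype=int)
    for b in range(n_boot):
        idx = rng.integers(0, len(epochs), len(epochs))
        avg = epochs[idx].mean(axis=0)       # resampled average
        latencies[b] = int(np.argmax(avg))
    return latencies

lat = bootstrap_peak_latency(epochs, n_boot=300, rng=rng)
lat_mean, lat_sd = lat.mean(), lat.std()     # spread = uncertainty estimate
```

The spread of the statistic across resampled averages plays the same role as the spread of reconstructed dipole positions in the study.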
Testing Process Factor Analysis Models Using the Parametric Bootstrap.
Zhang, Guangjian
2018-01-01
Process factor analysis (PFA) is a latent variable model for intensive longitudinal data. It combines P-technique factor analysis and time series analysis. The goodness-of-fit test in PFA is currently unavailable. In the paper, we propose a parametric bootstrap method for assessing model fit in PFA. We illustrate the test with an empirical data set in which 22 participants rated their affect every day over a period of 90 days. We also explore Type I error and power of the parametric bootstrap test with simulated data.
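The parametric bootstrap test follows a generic recipe: fit the model, simulate replicate data sets from the fitted model, refit each replicate, and compare the observed fit statistic to its bootstrap distribution. The sketch below applies the recipe to a deliberately simple model (i.i.d. normal) rather than a PFA model, which would require specialized software:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)
data = rng.normal(5.0, 2.0, 90)   # hypothetical 90-day series

def norm_cdf(z):
    """Standard normal CDF via math.erf (no SciPy needed)."""
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))

def fit_stat(x):
    """Kolmogorov-style discrepancy between empirical and fitted normal CDF."""
    mu, sd = x.mean(), x.std(ddof=1)
    xs = np.sort(x)
    ecdf = np.arange(1, len(x) + 1) / len(x)
    return np.max(np.abs(ecdf - norm_cdf((xs - mu) / sd)))

t_obs = fit_stat(data)

# Parametric bootstrap: simulate from the fitted model, recompute the
# statistic; the p-value is the fraction of replicates at least as extreme.
mu_hat, sd_hat = data.mean(), data.std(ddof=1)
n_boot = 500
t_boot = np.array([fit_stat(rng.normal(mu_hat, sd_hat, len(data)))
                   for _ in range(n_boot)])
p_value = np.mean(t_boot >= t_obs)
```

In the paper's setting, simulating from the fitted PFA model replaces the `rng.normal(...)` step, and the likelihood-based fit statistic replaces `fit_stat`.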
A critique of astrophysical applications of Hagedorn's bootstrap
Nahm, W
1980-01-01
It has been shown that Hagedorn's bootstrap should not be applied to hadronic matter at densities large against nuclear densities. The correct predictions of the thermodynamical model do not use any relation between the mass of the fireballs and their size, whereas the astrophysical applications depend on the unreasonable assumption that the size is independent of the mass. The most spectacular prediction of the bootstrap, namely violent black hole explosions yielding 10¹⁵ g in the last millisecond, is shown to be completely unfounded, even if such an assumption is made. (21 refs).
Bootstrapped efficiency measures of oil blocks in Angola
International Nuclear Information System (INIS)
Barros, C.P.; Assaf, A.
2009-01-01
This paper investigates the technical efficiency of Angola oil blocks over the period 2002-2007. A double bootstrap data envelopment analysis (DEA) model is adopted composed in the first stage of a DEA-variable returns to scale (VRS) model and then followed in the second stage by a bootstrapped truncated regression. Results showed that on average, the technical efficiency has fluctuated over the period of study, but deep and ultradeep oil blocks have generally maintained a consistent efficiency level. Policy implications are derived.
Wang, Huai-Chun; Susko, Edward; Roger, Andrew J
2016-12-01
Assessing the robustness of an inferred phylogeny is an important element of phylogenetics. This is typically done with measures of stabilities at the internal branches and the variation of the positions of the leaf nodes. The bootstrap support for branches in maximum parsimony, distance and maximum likelihood estimation, or posterior probabilities in Bayesian inference, measure the uncertainty about a branch due to the sampling of the sites from genes or sampling genes from genomes. However, these measures do not reveal how taxon sampling affects branch support and the effects of taxon sampling on the estimated phylogeny. An internal branch in a phylogenetic tree can be viewed as a split that separates the taxa into two nonempty complementary subsets. We develop several split-specific measures of stability determined from bootstrap support for quartets. These include BPtaxon_split (average bootstrap percentage [BP] for all quartets involving a taxon within a split), BPsplit (BPtaxon_split averaged over taxa), BPtaxon (BPtaxon_split averaged over splits) and RBIC-taxon (average BP over all splits after removing a taxon). We also develop a pruned-tree distance metric. Application of our measures to empirical and simulated data illustrate that existing measures of overall stability can fail to detect taxa that are the primary source of a split-specific instability. Moreover, we show that the use of many reduced sets of quartets is important in being able to detect the influence of joint sets of taxa rather than individual taxa. These new measures are valuable diagnostic tools to guide taxon sampling in phylogenetic experimental design. Copyright © 2016 Elsevier Inc. All rights reserved.
Bootstrap Approach to Comparison of Alternative Methods of ...
African Journals Online (AJOL)
A bootstrap simulation approach was used to generate values for endogenous variables of a simultaneous equation model popularly known as Keynesian Model of Income Determination. Three sample sizes 20, 30 and 40 each replicated 10, 20 and 30 times were considered. Four different estimation techniques: Ordinary ...
Sieve bootstrapping in the Lee-Carter model
Heinemann, A.
2013-01-01
This paper studies an alternative approach to construct confidence intervals for parameter estimates of the Lee-Carter model. First, the procedure of obtaining confidence intervals using regular nonparametric i.i.d. bootstrap is specified. Empirical evidence seems to invalidate this approach as it
Properties of bootstrap tests for N-of-1 studies.
Lin, Sharon X; Morrison, Leanne; Smith, Peter W F; Hargood, Charlie; Weal, Mark; Yardley, Lucy
2016-11-01
N-of-1 study designs involve the collection and analysis of repeated measures data from an individual not using an intervention and using an intervention. This study explores the use of semi-parametric and parametric bootstrap tests in the analysis of N-of-1 studies under a single time series framework in the presence of autocorrelation. When the Type I error rates of bootstrap tests are compared to Wald tests, our results show that the bootstrap tests have more desirable properties. We compare the results for normally distributed errors with those for contaminated normally distributed errors and find that, except when there is relatively large autocorrelation, there is little difference between the power of the parametric and semi-parametric bootstrap tests. We also experiment with two intervention designs: ABAB and AB, and show the ABAB design has more power. The results provide guidelines for designing N-of-1 studies, in the sense of how many observations and how many intervention changes are needed to achieve a certain level of power and which test should be performed. © 2016 The Authors British Journal of Mathematical and Statistical Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
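A parametric bootstrap test for a simple AB design with AR(1) errors might be sketched as follows. The data are synthetic, and the null model, test statistic, and design sizes are illustrative choices, not the authors':

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical AB design: 30 baseline (A) then 30 intervention (B) points.
phase = np.array([0] * 30 + [1] * 30)

def simulate(effect, phi, rng, n=60):
    """Phase effect plus AR(1) errors."""
    e = np.empty(n)
    e[0] = rng.normal()
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal()
    return effect * phase + e

y = simulate(effect=1.0, phi=0.3, rng=rng)

def stat(y):
    """Test statistic: mean difference between phases."""
    return y[phase == 1].mean() - y[phase == 0].mean()

# Fit AR(1) to residuals under the null model (no intervention effect).
resid = y - y.mean()
phi_hat = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
sd_hat = np.std(resid[1:] - phi_hat * resid[:-1], ddof=1)

# Parametric bootstrap: regenerate series under the null and recompute stat.
t_obs, n_boot = stat(y), 500
t_boot = np.empty(n_boot)
for b in range(n_boot):
    e = np.empty(len(y))
    e[0] = rng.normal(0.0, sd_hat / np.sqrt(1.0 - phi_hat ** 2))
    for t in range(1, len(y)):
        e[t] = phi_hat * e[t - 1] + rng.normal(0.0, sd_hat)
    t_boot[b] = stat(y.mean() + e)
p_value = np.mean(np.abs(t_boot) >= abs(t_obs))
```

A semi-parametric variant would resample the estimated innovations instead of drawing fresh normal ones, which is what makes it robust to contaminated error distributions.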
Finite-Size Effects for Some Bootstrap Percolation Models
Enter, A.C.D. van; Adler, Joan; Duarte, J.A.M.S.
The consequences of Schonmann's new proof that the critical threshold is unity for certain bootstrap percolation models are explored. It is shown that this proof provides an upper bound for the finite-size scaling in these systems. Comparison with data for one case demonstrates that this scaling
Bootstrap quantification of estimation uncertainties in network degree distributions.
Gel, Yulia R; Lyubchich, Vyacheslav; Ramirez Ramirez, L Leticia
2017-07-19
We propose a new method of nonparametric bootstrap to quantify estimation uncertainties in functions of network degree distribution in large ultra sparse networks. Both network degree distribution and network order are assumed to be unknown. The key idea is based on adaptation of the "blocking" argument, developed for bootstrapping of time series and re-tiling of spatial data, to random networks. We first sample a set of multiple ego networks of varying orders that form a patch, or a network block analogue, and then resample the data within patches. To select an optimal patch size, we develop a new computationally efficient and data-driven cross-validation algorithm. The proposed fast patchwork bootstrap (FPB) methodology further extends the ideas for a case of network mean degree, to inference on a degree distribution. In addition, the FPB is substantially less computationally expensive, requires less information on a graph, and is free from nuisance parameters. In our simulation study, we show that the new bootstrap method outperforms competing approaches by providing sharper and better-calibrated confidence intervals for functions of a network degree distribution than other available approaches, including the cases of networks in an ultra sparse regime. We illustrate the FPB in application to collaboration networks in statistics and computer science and to Wikipedia networks.
Automatic shape model building based on principal geodesic analysis bootstrapping
DEFF Research Database (Denmark)
Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M
2008-01-01
shape representation is deformed into the training shapes followed by computation of the shape mean and modes of shape variation. In the first iteration, a generic shape model is used as starting point - in the following iterations in the bootstrap method, the resulting mean and modes from the previous...
Metastability thresholds for anisotropic bootstrap percolation in three dimensions
Van Enter, A.C.D.; Fey, A.
2012-01-01
In this paper we analyze several anisotropic bootstrap percolation models in three dimensions. We present the order of magnitude for the metastability thresholds for a fairly general class of models. In our proofs, we use an adaptation of the technique of dimensional reduction. We find that the
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also a synthetic but following a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for its construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless if the standard error is estimated from a parametric equation or from bootstrap. © 2010 International Association for Mathematical Geosciences.
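The LU-decomposition resampling step can be sketched with a Cholesky factor (the special case of LU for symmetric positive-definite matrices). The exponential covariance model and all parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(11)

# 1D transect of 50 locations and an exponential covariance model,
# cov(h) = sill * exp(-3h / range), i.e. sill minus the semivariogram.
coords = np.linspace(0.0, 10.0, 50)
sill, corr_range = 1.0, 4.0
h = np.abs(coords[:, None] - coords[None, :])
cov = sill * np.exp(-3.0 * h / corr_range)

# Cholesky factor (jitter added for numerical stability).
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(coords)))

def correlated_resample(L, rng):
    """One zero-mean resample honoring the modeled spatial covariance:
    multiply the factor by i.i.d. standard normals."""
    return L @ rng.standard_normal(L.shape[0])

# Bootstrap distribution of the spatial mean, preserving correlation.
means = np.array([correlated_resample(L, rng).mean() for _ in range(1000)])
ci = np.percentile(means, [2.5, 97.5])   # percentile confidence interval
```

Because the resamples carry the full covariance structure, the resulting intervals are wider than those from naive i.i.d. resampling, which is the paper's central point.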
A Bootstrap Cointegration Rank Test for Panels of VAR Models
DEFF Research Database (Denmark)
Callot, Laurent
functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample...
Finite-size effects for anisotropic bootstrap percolation : Logarithmic corrections
van Enter, Aernout C. D.; Hulshof, Tim
In this note we analyse an anisotropic, two-dimensional bootstrap percolation model introduced by Gravner and Griffeath. We present upper and lower bounds on the finite-size effects. We discuss the similarities with the semi-oriented model introduced by Duarte.
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages...
Bootstrap confidence intervals for model-based surveys | Ouma ...
African Journals Online (AJOL)
To deal with the problem, Chambers and Dorfman (1994) suggested an alternative method based on the bootstrap methodology. Their method is meant for model-based surveys. It starts by assuming a simple linear regression model as a working model in which the ratio estimator is optimal for estimating the population total.
Bootstrap confidence intervals for three-way methods
Kiers, Henk A.L.
Results from exploratory three-way analysis techniques such as CANDECOMP/PARAFAC and Tucker3 analysis are usually presented without giving insight into uncertainties due to sampling. Here a bootstrap procedure is proposed that produces percentile intervals for all output parameters. Special
Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method
Directory of Open Access Journals (Sweden)
Yi-Ming Hu
2013-01-01
The hydrological frequency analysis (HFA) is the foundation for the hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series to the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most of hydrological extreme data obtained in practical application, the size of the samples is usually small, for example, in China about 40~50 years. Generally, samples with small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of design value is constructed; based on the sampling distribution, the uncertainty of quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimation of a design value but also quantitative evaluation on uncertainties of the estimation.
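A compact sketch of the scheme (the annual-maximum series is synthetic, and Gumbel fitting by the method of moments is one common choice, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical annual-maximum flood series, 45 "years" of data.
floods = rng.gumbel(loc=1000.0, scale=300.0, size=45)

EULER_GAMMA = 0.5772156649

def gumbel_quantile(x, T):
    """Method-of-moments Gumbel fit and the T-year return-period quantile:
    beta = s*sqrt(6)/pi, mu = mean - gamma*beta,
    x_T = mu - beta*ln(-ln(1 - 1/T))."""
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu = x.mean() - EULER_GAMMA * beta
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

q100_hat = gumbel_quantile(floods, T=100)   # point estimate of design flood

# Bootstrap: resample the series, re-estimate the quantile each time, and
# use the resulting sampling distribution to quantify uncertainty.
n_boot = 2000
q_boot = np.array([gumbel_quantile(rng.choice(floods, len(floods), replace=True), 100)
                   for _ in range(n_boot)])
ci = np.percentile(q_boot, [2.5, 97.5])     # uncertainty of the 100-year flood
```

The width of `ci` is exactly the sampling-uncertainty information that the point estimate alone hides.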
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in
Sidecoin: a snapshot mechanism for bootstrapping a blockchain
Krug, Joseph; Peterson, Jack
2015-01-01
Sidecoin is a mechanism that allows a snapshot to be taken of Bitcoin's blockchain. We compile a list of Bitcoin's unspent transaction outputs, then use these outputs and their corresponding balances to bootstrap a new blockchain. This allows the preservation of Bitcoin's economic state in the context of a new blockchain, which may provide new features and technical innovations.
Adaptive Kernel In The Bootstrap Boosting Algorithm In KDE ...
African Journals Online (AJOL)
This paper proposes the use of adaptive kernel in a bootstrap boosting algorithm in kernel density estimation. The algorithm is a bias reduction scheme like other existing schemes but uses adaptive kernel instead of the regular fixed kernels. An empirical study for this scheme is conducted and the findings are comparatively ...
Assessing blood flow control through a bootstrap method
Simpson, D.M.; Panerai, R.B.; Ramos, E.G.; Lopes, J.M.A.; Villar Marinatto, M.N.; Nadal, J.; Evans, D.H.
2004-01-01
In order to assess blood flow control, the relationship between blood pressure and blood flow can be modeled by linear filters. We present a bootstrap method, which allows the statistical analysis of an index of blood flow control that is obtained from constrained system identification using an established set of pre-defined filters.
Integrable deformations of conformal theories and bootstrap trees
International Nuclear Information System (INIS)
Mussardo, G.
1991-01-01
I present recent results in the study of massive integrable quantum field theories in (1+1) dimensions considered as perturbed conformal minimal models. The on mass-shell properties of such theories, with a particular emphasis on the bootstrap principle, are investigated. (orig.)
Kim, Ji Dang; Choi, Myong Yong; Choi, Hyun Chul
2017-10-01
Graphene-oxide-supported Pt (GO-Pt) nanoparticles were prepared by performing diimide-activated amidation and used in an electrocatalyst for hydrazine electro-oxidation in 0.5 M H2SO4 solution. The physico-chemical properties of the GO-Pt nanoparticles were characterized with various techniques, which revealed that highly dispersed Pt nanoparticles with an average size of 2.6 nm were densely deposited on the amidated GO due to their strong adhesion. Cyclic voltammograms were obtained and demonstrate that the GO-Pt catalyst exhibits significantly improved catalytic activity and long-term stability in hydrazine electro-oxidation in a strong acidic solution when compared to commercial Pt/C and Pt metal electrodes. These enhanced electrochemical properties are attributed to the large electrochemically active surface area that results from the smaller size and excellent dispersion of the Pt nanoparticles on amidated GO.
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
traffic between a tenant's IaaS resources. The scripts use the OpenStack API for IP address information and then build configurations for the Linux... provision secrets and keys, tenant operators can instead use a dedicated secret management system that supports the full lifecycle of cryptographic
Embodied Language Learning and Cognitive Bootstrapping
DEFF Research Database (Denmark)
Lyon, C.E.; Nehaniv, C. L.; Saunders, Joe
2016-01-01
Co-development of action, conceptualization and social interaction mutually scaffold and support each other within a virtuous feedback cycle in the development of human language in children. Within this framework, the purpose of this article is to bring together diverse but complementary accounts...
Spinella, Sarah
2011-01-01
Because result replicability is essential to science and difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
RANDOM QUADRATIC-FORMS AND THE BOOTSTRAP FOR U-STATISTICS
DEHLING, H; MIKOSCH, T
1994-01-01
We study the bootstrap distribution for U-statistics with special emphasis on the degenerate case. For the Efron bootstrap we give a short proof of the consistency using Mallows' metrics. We also study the i.i.d. weighted bootstrap [GRAPHICS] where (X(i)) and (xi(i)) are two i.i.d. sequences,
International Nuclear Information System (INIS)
Wang, Guodong; He, Zhen; Xue, Li; Cui, Qingan; Lv, Shanshan; Zhou, Panpan
2017-01-01
Factors which significantly affect product reliability are of great interest to reliability practitioners. This paper proposes a bootstrap-based methodology for identifying significant factors when both the location and scale parameters of the smallest extreme value distribution vary over experimental factors. An industrial thermostat experiment is presented, analyzed, and discussed as an illustrative example. The analysis results show that 1) misspecifying a constant scale parameter may lead to the identification of spurious effects; 2) the important factors identified by different bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping) differ; and 3) the number of factors that significantly affect the 10th percentile lifetime is smaller than the number of important factors identified at the 63.21st percentile. - Highlights: • Product reliability is improved by design of experiments when both the scale and location parameters of the smallest extreme value distribution vary with experimental factors. • A bootstrap-based methodology is proposed to identify important factors which significantly affect the 100p-th lifetime percentile. • Bootstrap confidence intervals associated with the experimental factors are obtained using three bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping). • The important factors identified by different bootstrap methods differ. • The number of factors that significantly affect the 10th percentile is smaller than the number of important factors identified at the 63.21st percentile.
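The simplest of the three interval methods named in this record, the percentile bootstrap, can be sketched for a lifetime percentile such as B10. This is an assumed, self-contained toy (synthetic Weibull lifetimes, empirical percentile instead of a fitted smallest-extreme-value model), not the paper's methodology.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical thermostat lifetimes (hours) from one experimental run.
life = 500.0 * rng.weibull(2.0, 40)

def b10(sample):
    """Empirical 10th-percentile (B10) lifetime."""
    return np.percentile(sample, 10)

# Percentile-bootstrap confidence interval for the B10 life.
boot = np.array([
    b10(rng.choice(life, size=life.size, replace=True)) for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"B10 = {b10(life):.0f} h, 95% percentile-bootstrap CI [{lo:.0f}, {hi:.0f}] h")
```

Bias-corrected (BC) and bias-corrected and accelerated (BCa) variants adjust the percentile cut-points of `boot` rather than changing the resampling itself.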
Li, Siwei; Xu, Yao; Chen, Yifu; Li, Weizhen; Lin, Lili; Li, Mengzhu; Deng, Yuchen; Wang, Xiaoping; Ge, Binghui; Yang, Ce; Yao, Siyu; Xie, Jinglin; Li, Yongwang; Liu, Xi; Ma, Ding
2017-08-28
A one-step ligand-free method based on an adsorption-precipitation process was developed to fabricate iridium/cerium oxide (Ir/CeO2) nanocatalysts. Ir species demonstrated a strong metal-support interaction (SMSI) with the CeO2 substrate. The chemical state of Ir could be finely tuned by altering the loading of the metal. In the carbon dioxide (CO2) hydrogenation reaction it was shown that the chemical state of Ir species, induced by the SMSI, has a major impact on the reaction selectivity. Direct evidence is provided indicating that a single-site catalyst is not a prerequisite for inhibition of methanation and sole production of carbon monoxide (CO) in CO2 hydrogenation. Instead, modulation of the chemical state of metal species by a strong metal-support interaction is more important for regulation of the observed selectivity (metallic Ir particles select for methane while partially oxidized Ir species select for CO production). The study provides insight into heterogeneous catalysts at nano, sub-nano, and atomic scales. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jeffries, Mark; Phipps, Denham; Howard, Rachel L; Avery, Anthony; Rodgers, Sarah; Ashcroft, Darren
2017-05-10
Using strong structuration theory, we aimed to understand the adoption and implementation of an electronic clinical audit and feedback tool to support medicine optimisation for patients in primary care. This is a qualitative study informed by strong structuration theory. The analysis was thematic, using a template approach. An a priori set of thematic codes, based on strong structuration theory, was developed from the literature and applied to the transcripts. The coding template was then modified through successive readings of the data. Clinical commissioning group in the south of England. Four focus groups and five semi-structured interviews were conducted with 18 participants purposively sampled from a range of stakeholder groups (general practitioners, pharmacists, patients and commissioners). Using the system could lead to improved medication safety, but use was determined by broad institutional contexts; by the perceptions, dispositions and skills of users; and by the structures embedded within the technology. These included perceptions of the system as new and requiring technical competence and skill; the adoption of the system for information gathering; and interactions and relationships that involved individual, shared or collective use. The dynamics between these external, internal and technological structures affected the adoption and implementation of the system. Successful implementation of information technology interventions for medicine optimisation will depend on a combination of the infrastructure within primary care, social structures embedded in the technology and the conventions, norms and dispositions of those utilising it. Future interventions, using electronic audit and feedback tools to improve medication safety, should consider the complexity of the social and organisational contexts and how internal and external structures can affect the use of the technology in order to support effective implementation. © Article author(s) (or their
Bootstrapping the Three-Loop Hexagon
Energy Technology Data Exchange (ETDEWEB)
Dixon, Lance J.; /CERN /SLAC; Drummond, James M.; /CERN /Annecy, LAPTH; Henn, Johannes M.; /Humboldt U., Berlin /Santa Barbara, KITP
2011-11-08
We consider the hexagonal Wilson loop dual to the six-point MHV amplitude in planar N = 4 super Yang-Mills theory. We apply constraints from the operator product expansion in the near-collinear limit to the symbol of the remainder function at three loops. Using these constraints, and assuming a natural ansatz for the symbol's entries, we determine the symbol up to just two undetermined constants. In the multi-Regge limit, both constants drop out from the symbol, enabling us to make a non-trivial confirmation of the BFKL prediction for the leading-log approximation. This result provides a strong consistency check of both our ansatz for the symbol and the duality between Wilson loops and MHV amplitudes. Furthermore, we predict the form of the full three-loop remainder function in the multi-Regge limit, beyond the leading-log approximation, up to a few constants representing terms not detected by the symbol. Our results confirm an all-loop prediction for the real part of the remainder function in multi-Regge 3 → 3 scattering. In the multi-Regge limit, our result for the remainder function can be expressed entirely in terms of classical polylogarithms. For generic six-point kinematics other functions are required.
Impurities in a non-axisymmetric plasma: Transport and effect on bootstrap current
Energy Technology Data Exchange (ETDEWEB)
Mollén, A., E-mail: albertm@chalmers.se [Department of Applied Physics, Chalmers University of Technology, Göteborg (Sweden); Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Landreman, M. [Institute for Research in Electronics and Applied Physics, University of Maryland, College Park, Maryland 20742 (United States); Smith, H. M.; Helander, P. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Braun, S. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); German Aerospace Center, Institute of Engineering Thermodynamics, Pfaffenwaldring 38-40, D-70569 Stuttgart (Germany)
2015-11-15
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21, 042503 (2014)] which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν-scaling of the inter-species radial transport coefficient at low collisionality, arising due to the field term in the inter-species collision operator, and which is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We also use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in plasma effective charge Z_eff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
On Comparison of Stochastic Reserving Methods with Bootstrapping
Directory of Open Access Journals (Sweden)
Liivika Tee
2017-01-01
Full Text Available We consider the well-known stochastic reserve estimation methods on the basis of generalized linear models, such as the (over-dispersed) Poisson model, the gamma model and the log-normal model. For the likely variability of the claims reserve, the bootstrap method is considered. In the bootstrapping framework, we discuss the choice of residuals, namely the Pearson residuals, the deviance residuals and the Anscombe residuals. In addition, several possible residual adjustments are discussed and compared in a case study. We carry out a practical implementation and comparison of methods using real-life insurance data to estimate reserves and their prediction errors. We propose to consider proper scoring rules for model validation, and the assessments will be drawn from an extensive case study.
A Double Parametric Bootstrap Test for Topic Models
Seto, Skyler; Tan, Sarah; Hooker, Giles; Wells, Martin T.
2017-01-01
Non-negative matrix factorization (NMF) is a technique for finding latent representations of data. The method has been applied to corpora to construct topic models. However, NMF has likelihood assumptions which are often violated by real document corpora. We present a double parametric bootstrap test for evaluating the fit of an NMF-based topic model based on the duality of the KL divergence and Poisson maximum likelihood estimation. The test correctly identifies whether a topic model based o...
Higgs Critical Exponents and Conformal Bootstrap in Four Dimensions
DEFF Research Database (Denmark)
Antipin, Oleg; Mølgaard, Esben; Sannino, Francesco
2015-01-01
We investigate relevant properties of composite operators emerging in nonsupersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points within a precise framework. The theories investigated in this work are structurally similar to the standard model of particle int...... bootstrap results are then compared to precise four dimensional conformal field theoretical results. To accomplish this, it was necessary to calculate explicitly the crossing symmetry relations for the global symmetry group SU($N$)$\\times$SU($N$)....
'Bootstrap' charging of surfaces composed of multiple materials
Stannard, P. R.; Katz, I.; Parks, D. E.
1981-01-01
The paper examines the charging of a checkerboard array of two materials, only one of which tends to acquire a negative potential alone, using the NASA Charging Analyzer Program (NASCAP). The influence of the charging material's field causes the otherwise 'non-charging' material to acquire a negative potential due to the suppression of its secondary emission ('bootstrap' charging). The NASCAP predictions for the equilibrium potential difference between the two materials are compared to results based on an analytical model.
TruSDN: Bootstrapping Trust in Cloud Network Infrastructure
Paladi, Nicolae; Gehrmann, Christian
2017-01-01
Software-Defined Networking (SDN) is a novel architectural model for cloud network infrastructure, improving resource utilization, scalability and administration. SDN deployments increasingly rely on virtual switches executing on commodity operating systems with large code bases, which are prime targets for adversaries attacking the network infrastructure. We describe and implement TruSDN, a framework for bootstrapping trust in SDN infrastructure using Intel Software Guard Extensions (SGX),...
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the krigged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM results in more kriging predictions (9/10) compared to GBM (4/10). Prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various type of waters to migratory animals, food products, and humans.
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio [PLR] co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates...... of the underlying VAR model which obtain under the reduced rank null hypothesis. They propose methods based on an i.i.d. bootstrap re-sampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate...... the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap re-sampling scheme, when time-varying behaviour is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and...
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-08-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g. velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify. In many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is computer-intensive because it requires repeating the original data analysis procedure several times using replicate datasets; it is commonly called the bootstrap method in the literature. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are mostly small and depend chiefly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs they are larger than the acceleration itself.
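The idea of bootstrapping a manually measured height-time track can be sketched as follows. The synthetic measurement points and the linear fit are assumptions for illustration; the catalog analysis also fits higher-order models to extract acceleration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical height-time points for one CME: height (solar radii) vs time (hours).
t = np.linspace(0.0, 5.0, 12)
h = 3.0 + 0.9 * t + rng.normal(0.0, 0.15, t.size)   # linear rise plus measurement noise

def fit_velocity(time, height):
    """Velocity as the slope of a linear height-time fit."""
    return np.polyfit(time, height, 1)[0]

# Resample the measurement points with replacement and refit each replicate.
speeds = []
for _ in range(1000):
    idx = rng.integers(0, t.size, t.size)
    speeds.append(fit_velocity(t[idx], h[idx]))
speeds = np.array(speeds)
print(f"velocity = {fit_velocity(t, h):.3f} ± {speeds.std(ddof=1):.3f} (bootstrap s.e.)")
```

The spread of the refit slopes estimates the error introduced by the subjective point measurements, without any analytic error model.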
Necessary Condition for Emergent Symmetry from the Conformal Bootstrap.
Nakayama, Yu; Ohtsuki, Tomoki
2016-09-23
We use the conformal bootstrap program to derive the necessary conditions for emergent symmetry enhancement from discrete symmetry (e.g., Z_{n}) to continuous symmetry [e.g., U(1)] under the renormalization group flow. In three dimensions, in order for Z_{2} symmetry to be enhanced to U(1) symmetry, the conformal bootstrap program predicts that the scaling dimension of the order parameter field at the infrared conformal fixed point must satisfy Δ_{1}>1.08. We also obtain the similar necessary conditions for Z_{3} symmetry with Δ_{1}>0.580 and Z_{4} symmetry with Δ_{1}>0.504 from the simultaneous conformal bootstrap analysis of multiple four-point functions. As applications, we show that our necessary conditions impose severe constraints on the nature of the chiral phase transition in QCD, the deconfinement criticality in Néel valence bond solid transitions, and anisotropic deformations in critical O(n) models. We prove that some fixed points proposed in the literature are unstable under the perturbation that cannot be forbidden by the discrete symmetry. In these situations, the second-order phase transition with enhanced symmetry cannot happen.
Bootstrap method of interior-branch test for phylogenetic trees.
Sitnikova, T
1996-04-01
Statistical properties of the bootstrap test of interior branch lengths of phylogenetic trees have been studied and compared with those of the standard interior-branch test in computer simulations. Examination of the properties of the tests under the null hypothesis showed that both tests for an interior branch of a predetermined topology are quite reliable when the distribution of the branch length estimate approaches a normal distribution. Unlike the standard interior-branch test, the bootstrap test appears to retain this property even when the substitution rate varies among sites. In this case, the distribution of the branch length estimate deviates from a normal distribution, and the standard interior-branch test gives conservative confidence probability values. A simple correction method was developed for both interior-branch tests to be applied for testing the reliability of tree topologies estimated from sequence data. This correction for the standard interior-branch test appears to be as effective as that obtained in our previous study, though it is much simpler. The bootstrap and standard interior-branch tests for estimated topologies become conservative as the number of sequence groups in a star-like tree increases.
Generalised block bootstrap and its use in meteorology
Directory of Open Access Journals (Sweden)
L. Varga
2017-06-01
Full Text Available In an earlier paper, Rakonczai et al. (2014) emphasised the importance of investigating the effective sample size in the case of autocorrelated data. The simulations were based on the block bootstrap methodology. However, the discreteness of the usual block size did not allow for exact calculations. In this paper we propose a new generalisation of the block bootstrap methodology, which allows any positive real number as the expected block size. We relate it to the existing optimisation procedures and apply it to a temperature data set. Our other focus is on statistical tests, where quite often the actual sample size plays an important role, even in the case of relatively large samples. This is especially the case for copulas, which are used for investigating the dependencies among data sets. As in quite a few real applications the time dependence cannot be neglected, we investigated the effect of this phenomenon on the test statistic used. The critical value can be computed by the proposed new block bootstrap simulation, where the block size is determined by fitting a VAR model to the observations. The results are illustrated for models of the temperature data used.
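One standard way to allow a real-valued expected block size is to draw geometric block lengths, as in the stationary bootstrap; the sketch below uses that device as an assumed stand-in for the paper's generalisation, on a synthetic autocorrelated series.

```python
import numpy as np

rng = np.random.default_rng(1)

def stationary_bootstrap(x, expected_block, rng):
    """One bootstrap replicate with geometric block lengths whose mean is
    `expected_block`, which may be any positive real number."""
    n = len(x)
    p = 1.0 / expected_block
    out = np.empty(n)
    i = 0
    while i < n:
        start = rng.integers(0, n)        # random block start
        length = rng.geometric(p)         # random block length, mean 1/p
        for j in range(length):
            if i >= n:
                break
            out[i] = x[(start + j) % n]   # circular wrap-around
            i += 1
    return out

# Example: AR(1)-like "temperature anomaly" series with autocorrelation 0.6.
x = np.zeros(500)
for k in range(1, 500):
    x[k] = 0.6 * x[k - 1] + rng.normal()
reps = np.array([stationary_bootstrap(x, 7.5, rng).mean() for _ in range(500)])
print("bootstrap s.e. of the mean:", reps.std(ddof=1))
```

Resampling in blocks preserves the short-range time dependence that an i.i.d. bootstrap would destroy, which is why the resulting standard error exceeds the naive one.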
Soybean yield modeling using bootstrap methods for small samples
Energy Technology Data Exchange (ETDEWEB)
Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.
2016-11-01
One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, a small sample may compromise the analysis because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not need to guess or know the probability distribution that generated the original sample. In this work we used a small set of soybean yield data together with physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, construct the confidence intervals of the parameters, and identify the points that had great influence on the estimated parameters. (Author)
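The case (pairs) bootstrap for regression coefficients described in this record can be sketched as follows. The dataset, predictors, and selection rule are all assumptions for illustration, not the authors' data or exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical small agronomic dataset (n = 30): yield vs two soil properties.
n = 30
clay = rng.normal(30.0, 5.0, n)     # clay content (%)
ph = rng.normal(5.5, 0.4, n)        # soil pH
y = 1.2 + 0.05 * clay + 0.3 * ph + rng.normal(0.0, 0.2, n)
X = np.column_stack([np.ones(n), clay, ph])

def ols(X, y):
    """Least-squares coefficient estimates."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Pairs bootstrap: resample observations with replacement, refit each time.
betas = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    betas.append(ols(X[idx], y[idx]))
betas = np.array(betas)

# Percentile confidence intervals; a coefficient whose interval excludes
# zero is retained (a simple bootstrap variable-selection rule).
for name, col in zip(["intercept", "clay", "pH"], betas.T):
    lo, hi = np.percentile(col, [2.5, 97.5])
    keep = "keep" if (lo > 0 or hi < 0) else "drop"
    print(f"{name}: 95% CI [{lo:.3f}, {hi:.3f}] -> {keep}")
```

Because the intervals come from resampling rather than asymptotic theory, they remain usable at sample sizes where normal-theory intervals would be unreliable, which is the point the abstract makes.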
Stock Price Simulation Using Bootstrap and Monte Carlo
Directory of Open Access Journals (Sweden)
Pažický Martin
2017-06-01
Full Text Available In this paper, an attempt is made to assess and compare bootstrap and Monte Carlo experiments for stock price simulation. Since the future evolution of a stock price is extremely important for investors, we try to find the best method to determine the future stock price of the BNP Paribas bank. The aim of the paper is to define the value of European and Asian options on BNP Paribas stock at the maturity date. Four different methods are employed for the simulation. The first is a bootstrap experiment with a homoscedastic error term, the second is a block bootstrap experiment with a heteroscedastic error term, the third is a Monte Carlo simulation with a heteroscedastic error term, and the last is a Monte Carlo simulation with a homoscedastic error term. In the last method it is necessary to model the volatility using an econometric GARCH model. The main purpose of the paper is to compare these methods and select the most reliable. A further aim of the paper is to examine the difference between the classical European option and the exotic Asian option on the basis of the experimental results.
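The contrast between the first and last of the four methods can be sketched side by side. The historical return series, spot, strike, and horizon are assumptions; the payoffs are undiscounted and no GARCH volatility model is included.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical daily log-returns standing in for the BNP Paribas history.
hist = rng.normal(0.0002, 0.015, 1000)
s0, strike, horizon = 50.0, 52.0, 250   # spot, strike, trading days to maturity

def terminal_prices(method, n_paths=5000):
    """Simulate terminal stock prices either by resampling historical
    returns (bootstrap) or by drawing normal returns (Monte Carlo)."""
    if method == "bootstrap":
        r = rng.choice(hist, size=(n_paths, horizon), replace=True)
    else:
        r = rng.normal(hist.mean(), hist.std(ddof=1), size=(n_paths, horizon))
    return s0 * np.exp(r.sum(axis=1))

for method in ("bootstrap", "montecarlo"):
    st = terminal_prices(method)
    payoff = np.maximum(st - strike, 0.0).mean()  # undiscounted European call
    print(f"{method}: mean European call payoff {payoff:.3f}")
```

The bootstrap path inherits whatever skew and fat tails the historical returns carry, while the Monte Carlo path imposes normality; an Asian payoff would replace the terminal price with the average price along each path.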
Hensing, G; Holmgren, K; Rohdén, H
2015-01-01
Profound changes are taking place in the Swedish welfare state. The general population's attitudes are important insofar as changes must be perceived as fair and effective to become implemented. The aim was to study attitudes to the strictness of the sick-leave rules, relocation to other work tasks after 3 months of sick leave, and applications for new jobs after 6 months of sick leave. Eligible for this questionnaire study were 1,140 individuals aged 19 to 64 years. Their attitudes were analyzed in relation to age, gender, political ideology and health status. Health status was measured as sick-leave experience, self-reported health and level of symptoms. The results showed that 42% considered the sick-leave rules to be too strict, 60% found relocation to other work tasks to be good, while 35% found that applying for new work was good. In logistic regression analyses, high sick-leave experience was associated with increased odds of finding the sick-leave rules too strict and of disagreeing with relocation to other work tasks or application for new jobs. In conclusion, strong support was found for relocation to other work tasks with the present employer. Earlier research on returning to work has found workplace interventions to be efficient. From a policy perspective it seems relevant to promote such interventions, given the strong public opinion in their favor.
Directory of Open Access Journals (Sweden)
Gogarten J Peter
2002-02-01
Full Text Available Abstract Background Horizontal gene transfer (HGT played an important role in shaping microbial genomes. In addition to genes under sporadic selection, HGT also affects housekeeping genes and those involved in information processing, even ribosomal RNA encoding genes. Here we describe tools that provide an assessment and graphic illustration of the mosaic nature of microbial genomes. Results We adapted the Maximum Likelihood (ML mapping to the analyses of all detected quartets of orthologous genes found in four genomes. We have automated the assembly and analyses of these quartets of orthologs given the selection of four genomes. We compared the ML-mapping approach to more rigorous Bayesian probability and Bootstrap mapping techniques. The latter two approaches appear to be more conservative than the ML-mapping approach, but qualitatively all three approaches give equivalent results. All three tools were tested on mitochondrial genomes, which presumably were inherited as a single linkage group. Conclusions In some instances of interphylum relationships we find nearly equal numbers of quartets strongly supporting the three possible topologies. In contrast, our analyses of genome quartets containing the cyanobacterium Synechocystis sp. indicate that a large part of the cyanobacterial genome is related to that of low GC Gram positives. Other groups that had been suggested as sister groups to the cyanobacteria contain many fewer genes that group with the Synechocystis orthologs. Interdomain comparisons of genome quartets containing the archaeon Halobacterium sp. revealed that Halobacterium sp. shares more genes with Bacteria that live in the same environment than with Bacteria that are more closely related based on rRNA phylogeny. Many of these genes encode proteins involved in substrate transport and metabolism and in information storage and processing. The performed analyses demonstrate that relationships among prokaryotes cannot be accurately
Im, Subin; Min, Soonhong
2013-04-01
Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle
2013-01-01
A version of the nonparametric bootstrap, which resamples the entire subjects from original data, called the case bootstrap, has been increasingly used for estimating uncertainty of parameters in mixed-effects models. It is usually applied to obtain more robust estimates of the parameters and more realistic confidence intervals (CIs). Alternative bootstrap methods, such as residual bootstrap and parametric bootstrap that resample both random effects and residuals, have been proposed to better take into account the hierarchical structure of multi-level and longitudinal data. However, few studies have been performed to compare these different approaches. In this study, we used simulation to evaluate bootstrap methods proposed for linear mixed-effect models. We also compared the results obtained by maximum likelihood (ML) and restricted maximum likelihood (REML). Our simulation studies evidenced the good performance of the case bootstrap as well as the bootstraps of both random effects and residuals. On the other hand, the bootstrap methods that resample only the residuals and the bootstraps combining case and residuals performed poorly. REML and ML provided similar bootstrap estimates of uncertainty, but there was slightly more bias and poorer coverage rate for variance parameters with ML in the sparse design. We applied the proposed methods to a real dataset from a study investigating the natural evolution of Parkinson's disease and were able to confirm that the methods provide plausible estimates of uncertainty. Given that most real-life datasets tend to exhibit heterogeneity in sampling schedules, the residual bootstraps would be expected to perform better than the case bootstrap. Copyright © 2013 John Wiley & Sons, Ltd.
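A minimal sketch of the case bootstrap described above, on synthetic data: the model, the 20 subjects and all parameter values are hypothetical, and the statistic is a simple grand mean rather than a fitted mixed-effects model, so this only illustrates the resampling unit, not the paper's estimation:

```python
import random
import statistics

random.seed(1)

# Hypothetical longitudinal data: 20 subjects, 5 repeated measurements each,
# generated from y_ij = 10 + b_i + e_ij with a subject-level random effect b_i.
subjects = []
for _ in range(20):
    b_i = random.gauss(0.0, 1.0)
    subjects.append([10.0 + b_i + random.gauss(0.0, 0.5) for _ in range(5)])

def grand_mean(data):
    return statistics.mean(x for subj in data for x in subj)

def case_bootstrap(data, n_boot=1000):
    """Resample entire subjects with replacement, keeping each subject's
    measurements together so the hierarchical structure is preserved."""
    estimates = []
    for _ in range(n_boot):
        resample = [random.choice(data) for _ in range(len(data))]
        estimates.append(grand_mean(resample))
    return sorted(estimates)

boot = case_bootstrap(subjects)
se = statistics.stdev(boot)    # bootstrap standard error of the grand mean
ci = (boot[25], boot[975])     # ~95% percentile confidence interval
```

The residual and random-effects bootstraps compared in the paper differ only in what gets resampled; the subject-level resampling shown here is what distinguishes the case bootstrap.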
Fronstin, Paul; Helman, Ruth
2009-07-01
PUBLIC SUPPORT FOR HEALTH REFORM: Findings from the 2009 Health Confidence Survey--the 12th annual HCS--indicate that Americans have already formed strong opinions regarding various aspects of health reform, even before details have been released regarding various key factors. These issues include health insurance market reform, the availability of a public plan option, mandates on employers and individuals, subsidized coverage for the low-income population, changes to the tax treatment of job-based health benefits, and regulatory oversight of health care. These opinions may change as details surface, especially as they concern financing options. In the absence of such details, the 2009 HCS finds generally strong support for the concepts of health reform options that are currently on the table. U.S. HEALTH SYSTEM GETS POOR MARKS, BUT SO DOES A MAJOR OVERHAUL: A majority rate the nation's health care system as fair (30 percent) or poor (29 percent). Only a small minority rate it excellent (6 percent) or very good (10 percent). While 14 percent of Americans think the health care system needs a major overhaul, 51 percent agree with the statement "there are some good things about our health care system, but major changes are needed." NATIONAL HEALTH PLAN ELEMENTS RATED HIGHLY: Between 68 percent and 88 percent of Americans either strongly or somewhat support health reform ideas such as national health plans, a public plan option, guaranteed issue, expansion of Medicare and Medicaid, and employer and individual mandates. MIXED REACTION TO HEALTH BENEFITS TAX CAP: Reaction to capping the current tax exclusion of employment-based health benefits is mixed. Nearly one-half of Americans (47 percent) would switch to a lower-cost plan if the tax exclusion were capped, 38 percent would stay on their current plan and pay the additional taxes, and 9 percent don't know. CONTINUED FAITH IN EMPLOYMENT-BASED BENEFITS, BUT DOUBTS ON AFFORDABILITY: Individuals with employment
Two novel applications of bootstrap currents: Snakes and jitter stabilization
Thyagaraja, A.; Haas, F. A.
1993-09-01
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of noninductive current are considered in this work. In the first, an earlier model of the nonlinearly saturated m=1 tearing mode is extended to include the stabilizing effect of a bootstrap current inside the island. This is used to explain several observed features of the so-called "snake" reported in the Joint European Torus (JET) [R. D. Gill, A. W. Edwards, D. Pasini, and A. Weller, Nucl. Fusion 32, 723 (1992)]. The second application involves an alternating current (ac) form of bootstrap current, produced by pressure-gradient fluctuations. It is suggested that a time-dependent (in the plasma frame), radio-frequency (rf) power source can be used to produce localized pressure fluctuations of suitable frequency and amplitude to implement the dynamic stabilization method for suppressing gross modes in tokamaks suggested in a recent paper [A. Thyagaraja, R. D. Hazeltine, and A. Y. Aydemir, Phys. Fluids B 4, 2733 (1992)]. This method works by "detuning" the resonant layer by rapid current/shear fluctuations. Estimates made for the power source requirements both for small machines such as COMPASS and for larger machines like JET suggest that the method could be practically feasible. This "jitter" (i.e., dynamic) stabilization method could provide a useful form of active instability control to avoid both gross/disruptive and fine-scale/transportive instabilities, which may set severe operating/safety constraints in the reactor regime. The results are also capable, in principle, of throwing considerable light on the local properties of current generation and diffusion in tokamaks, which may be enhanced by turbulence, as has been suggested recently by several researchers.
Pak, Alexander J.; Hwang, Gyeong S.
2016-09-01
One important attribute of graphene that makes it attractive for high-performance electronics is its inherently large thermal conductivity (κ) for the purposes of thermal management. Using a combined density-functional theory and classical molecular-dynamics approach, we predict that the κ of graphene supported on hexagonal boron nitride (h-BN) can be as large as 90% of the κ of suspended graphene, in contrast to the significant suppression of κ (more than 70% reduction) on amorphous silica. Interestingly, we find that this enhanced thermal transport is largely attributed to increased lifetimes of the in-plane acoustic phonon modes, which is a notable contrast from the dominant contribution of out-of-plane acoustic modes in suspended graphene. This behavior is possible due to the charge polarization throughout graphene that induces strong interlayer adhesion between graphene and h-BN. These findings highlight the potential benefit of layered dielectric substrates such as h-BN for graphene-based thermal management, in addition to their electronic advantages. Furthermore, our study brings attention to the importance of understanding the interlayer interactions of graphene with layered dielectric materials which may offer an alternative technological platform for substrates in electronics.
DEFF Research Database (Denmark)
Hounyo, Ulrich
We propose a bootstrap method for estimating the distribution (and functionals of it, such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method, suggested for the pre-averaged realized volatility estimator, to the integrated covariance estimator. As an application of our results, we also consider the bootstrap for regression coefficients. We show that the wild blocks of blocks bootstrap, appropriately centered, is able to mimic both the dependence and heterogeneity of the scores, thus justifying the construction of bootstrap percentile intervals. For studentized statistics, our results justify using the bootstrap to estimate the covariance matrix of a broad class of covolatility estimators. The bootstrap variance estimator is positive semi-definite by construction, an appealing feature that is not always shared by existing variance estimators of the integrated covariance estimator.
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, G.; Rahbek, Anders; Taylor, A.M.R.
2014-01-01
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates of the underlying vector autoregressive (VAR) model which obtain under the reduced rank null hypothesis. They propose methods based on an independent and identically distributed (i.i.d.) bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behavior is present in either the conditional or unconditional variance of the innovations.
The S-matrix bootstrap II: two dimensional amplitudes
Paulos, Miguel F.; Penedones, Joao; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-01
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Check of the bootstrap conditions for the gluon Reggeization
International Nuclear Information System (INIS)
Papa, A.
2000-01-01
The property of gluon Reggeization plays an essential role in the derivation of the Balitsky-Fadin-Kuraev-Lipatov (BFKL) equation for the cross sections at high energy √s in perturbative QCD. This property has been proved to all orders of perturbation theory in the leading logarithmic approximation and it is assumed to be valid also in the next-to-leading logarithmic approximation, where it has been checked only to the first three orders of perturbation theory. From s-channel unitarity, however, very stringent 'bootstrap' conditions can be derived which, if fulfilled, leave no doubt that gluon Reggeization holds
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and
Dimensional Reduction via Noncommutative Spacetime: Bootstrap and Holography
Li, Miao
2002-05-01
Unlike noncommutative space, when space and time are noncommutative, it seems necessary to modify the usual scheme of quantum mechanics. We propose in this paper a simple generalization of the time evolution equation in quantum mechanics to incorporate the feature of a noncommutative spacetime. This equation is much more constraining than the usual Schrödinger equation in that the spatial dimension noncommuting with time is effectively reduced to a point in low energy. We thus call the new evolution equation the spacetime bootstrap equation; the dimensional reduction called for by this evolution seems close to what is required by the holographic principle. We will discuss several examples to demonstrate this point.
DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations
National Aeronautics and Space Administration — DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations in polar stereographic projection currently include Defense Meteorological Satellite...
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
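The small-sample comparison the author draws can be reproduced in a few lines; the sample below is synthetic and n = 15 is chosen only to be "small":

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical small sample (n = 15); all values are illustrative.
sample = [random.gauss(100, 15) for _ in range(15)]
n = len(sample)
xbar = statistics.mean(sample)
s = statistics.stdev(sample)

# Classical t-interval; 2.145 is the two-sided 95% critical value for df = 14.
t_ci = (xbar - 2.145 * s / math.sqrt(n), xbar + 2.145 * s / math.sqrt(n))

# Nonparametric bootstrap percentile interval for the mean.
boot_means = sorted(
    statistics.mean(random.choices(sample, k=n)) for _ in range(4000)
)
pct_ci = (boot_means[100], boot_means[3899])  # 2.5% and 97.5% quantiles
```

For samples this small the percentile interval typically comes out narrower than the t-interval, which is the undercoverage issue the abstract describes.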
Assessing statistical reliability of phylogenetic trees via a speedy double bootstrap method.
Ren, Aizhen; Ishida, Takashi; Akiyama, Yutaka
2013-05-01
Evaluating the reliability of estimated phylogenetic trees is of critical importance in the field of molecular phylogenetics, and for other endeavors that depend on accurate phylogenetic reconstruction. The bootstrap method is a well-known computational approach to phylogenetic tree assessment, and more generally for assessing the reliability of statistical models. However, it is known to be biased under certain circumstances, calling into question the accuracy of the method. Several advanced bootstrap methods have been developed to achieve higher accuracy, one of which is the double bootstrap approach, but the computational burden of this method has precluded its application to practical problems of phylogenetic tree selection. We address this issue by proposing a simple method called the speedy double bootstrap, which circumvents the second-tier resampling step in the regular double bootstrap approach. We also develop an implementation of the regular double bootstrap for comparison with our speedy method. The speedy double bootstrap suffers no significant loss of accuracy compared with the regular double bootstrap, while performing calculations significantly more rapidly (at minimum around 371 times faster, based on analysis of mammalian mitochondrial amino acid sequences and 12S and 16S rRNA genes). Our method thus enables, for the first time, the practical application of the double bootstrap technique in the context of molecular phylogenetics. The approach can also be used more generally for model selection problems wherever the maximum likelihood criterion is used. Copyright © 2013 Elsevier Inc. All rights reserved.
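A toy version of the regular double bootstrap makes the computational burden concrete. Everything here is hypothetical: the data, the one-sided test of the mean and the tier sizes are invented for illustration, whereas the paper's setting is phylogenetic tree selection under maximum likelihood:

```python
import random
import statistics

random.seed(11)

# Hypothetical sample; H0: the true mean is <= 0.
data = [random.gauss(0.3, 1.0) for _ in range(30)]

def boot_pvalue(sample, n_boot=200):
    """First-tier bootstrap p-value: resample from the sample centered at 0
    (imposing H0) and count how often the resampled mean reaches the observed one."""
    obs = statistics.mean(sample)
    centered = [x - obs for x in sample]
    hits = sum(
        statistics.mean(random.choices(centered, k=len(sample))) >= obs
        for _ in range(n_boot)
    )
    return hits / n_boot

p1 = boot_pvalue(data)

# Regular double bootstrap: recompute the p-value on second-tier resamples of
# the data and calibrate p1 against that distribution. This nested loop is
# exactly the second-tier cost that the speedy variant circumvents.
inner = [boot_pvalue(random.choices(data, k=len(data)), n_boot=100)
         for _ in range(100)]
p_adj = sum(q <= p1 for q in inner) / len(inner)
```

Even at these toy tier sizes the inner loop multiplies the work a hundredfold, which is why skipping it matters in practice.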
Optical Flow of Small Objects Using Wavelets, Bootstrap Methods, and Synthetic Discriminant Filters
National Research Council Canada - National Science Library
Hewer, Gary
1997-01-01
...) targets in highly cluttered and noisy environments. In this paper, we present a novel wavelet detection algorithm which incorporates adaptive CFAR detection statistics using the bootstrap method...
Energy confinement of tokamak plasma with consideration of bootstrap current effect
International Nuclear Information System (INIS)
Yuan Ying; Gao Qingdi
1992-01-01
Based on the η_i-mode-induced anomalous transport model of Lee et al., the energy confinement of tokamak plasmas with auxiliary heating is investigated with consideration of the bootstrap current effect. The results indicate that the energy confinement time increases with plasma current and tokamak major radius, and decreases with heating power, toroidal field and minor radius. This is in reasonable agreement with the Kaye-Goldston empirical scaling law. Bootstrap current always leads to an improvement of energy confinement and the contraction of the inversion radius. When γ, the ratio between bootstrap current and total plasma current, is small, the part of the energy confinement time contributed by the bootstrap current will be about γ/2.
Interaction of bootstrap-current-driven magnetic islands
International Nuclear Information System (INIS)
Hegna, C.C.; Callen, J.D.
1991-10-01
The formation and interaction of fluctuating neoclassical pressure gradient driven magnetic islands is examined. The interaction of magnetic islands produces a stochastic region around the separatrices of the islands. This interaction causes the island pressure profile to be broadened, reducing the island bootstrap current and drive for the magnetic island. A model is presented that describes the magnetic topology as a bath of interacting magnetic islands with low to medium poloidal mode number (m ≈ 3-30). The islands grow by the bootstrap current effect and damp due to the flattening of the pressure profile near the island separatrix caused by the interaction of the magnetic islands. The effect of this sporadic growth and decay of the islands ("magnetic bubbling") is not normally addressed in theories of plasma transport due to magnetic fluctuations. The nature of the transport differs from statistical approaches to magnetic turbulence since the radial step size of the plasma transport is now given by the characteristic island width. This model suggests that tokamak experiments have relatively short-lived, coherent, long wavelength magnetic oscillations present in the steep pressure-gradient regions of the plasma. 42 refs
Quantifying uncertainty on sediment loads using bootstrap confidence intervals
Slaets, Johanna I. F.; Piepho, Hans-Peter; Schmitter, Petra; Hilger, Thomas; Cadisch, Georg
2017-01-01
Load estimates are more informative than constituent concentrations alone, as they allow quantification of on- and off-site impacts of environmental processes concerning pollutants, nutrients and sediment, such as soil fertility loss, reservoir sedimentation and irrigation channel siltation. While statistical models used to predict constituent concentrations have been developed considerably over the last few years, measures of uncertainty on constituent loads are rarely reported. Loads are the product of two predictions, constituent concentration and discharge, integrated over a time period, which does not make it straightforward to produce a standard error or a confidence interval. In this paper, a linear mixed model is used to estimate sediment concentrations. A bootstrap method is then developed that accounts for the uncertainty in the concentration and discharge predictions, allowing temporal correlation in the constituent data, and can be used when data transformations are required. The method was tested for a small watershed in Northwest Vietnam for the period 2010-2011. The results showed that confidence intervals were asymmetric, with the highest uncertainty in the upper limit, and that a load of 6262 Mg year-1 had a 95 % confidence interval of (4331, 12 267) in 2010 and a load of 5543 Mg year-1 had an interval of (3593, 8975) in 2011. Additionally, the approach demonstrated that direct estimates from the data were biased downwards compared to bootstrap median estimates. These results imply that constituent loads predicted from regression-type water quality models could frequently be underestimating sediment yields and their environmental impact.
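The basic load bootstrap can be sketched as follows. The discharge and concentration series are synthetic, and this simple pairs resampling deliberately ignores the temporal correlation and data transformations that the paper's mixed-model bootstrap handles:

```python
import random

random.seed(7)

# Hypothetical record: discharge (m3/s) and sediment concentration (g/L)
# for 100 equal time steps; concentration rises with discharge.
discharge = [max(0.1, random.gauss(5.0, 2.0)) for _ in range(100)]
conc = [max(0.01, 0.05 * q + random.gauss(0.0, 0.05)) for q in discharge]

def total_load(q, c):
    # Load = discharge x concentration summed over the time steps (arbitrary units).
    return sum(qi * ci for qi, ci in zip(q, c))

obs_load = total_load(discharge, conc)

# Pairs bootstrap: resample time steps with replacement and recompute the load.
n = len(discharge)
boot = []
for _ in range(2000):
    pick = random.choices(range(n), k=n)
    boot.append(total_load([discharge[i] for i in pick],
                           [conc[i] for i in pick]))
boot.sort()
ci = (boot[50], boot[1949])   # ~95% percentile confidence interval on the load
median_load = boot[1000]      # bootstrap median estimate of the load
```

Comparing `obs_load` with `median_load` mirrors the paper's check for downward bias in direct load estimates, though with synthetic data no bias is built in here.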
A wild bootstrap approach for the Aalen-Johansen estimator.
Bluhmki, Tobias; Schmoor, Claudia; Dobler, Dennis; Pauly, Markus; Finke, Juergen; Schumacher, Martin; Beyersmann, Jan
2018-02-16
We suggest a wild bootstrap resampling technique for nonparametric inference on transition probabilities in a general time-inhomogeneous Markov multistate model. We first approximate the limiting distribution of the Nelson-Aalen estimator by repeatedly generating standard normal wild bootstrap variates, while the data is kept fixed. Next, a transformation using a functional delta method argument is applied. The approach is conceptually easier than direct resampling for the transition probabilities. It is used to investigate a non-standard time-to-event outcome, currently being alive without immunosuppressive treatment, with data from a recent study of prophylactic treatment in allogeneic transplanted leukemia patients. Due to non-monotonic outcome probabilities in time, neither standard survival nor competing risks techniques apply, which highlights the need for the present methodology. Finite sample performance of time-simultaneous confidence bands for the outcome probabilities is assessed in an extensive simulation study motivated by the clinical trial data. Example code is provided in the web-based Supplementary Materials. © 2018, The International Biometric Society.
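The core wild bootstrap move, perturbing residuals with standard normal multipliers while the data is kept fixed, can be sketched on a toy regression. The model and all values are hypothetical, and the paper applies the idea to the Nelson-Aalen estimator in a multistate model, not to regression:

```python
import random
import statistics

random.seed(3)

# Hypothetical heteroskedastic regression through the origin:
# y = 2x + noise whose spread grows with x.
x = [i / 10.0 for i in range(1, 51)]
y = [2.0 * xi + random.gauss(0.0, 0.5 * xi) for xi in x]

def slope(xs, ys):
    # No-intercept least-squares slope.
    return sum(a * b for a, b in zip(xs, ys)) / sum(a * a for a in xs)

b_hat = slope(x, y)
resid = [yi - b_hat * xi for xi, yi in zip(x, y)]

# Wild bootstrap: multiply each residual by an independent standard-normal
# variate, preserving its magnitude and hence the heteroskedasticity pattern.
boot_slopes = [
    slope(x, [b_hat * xi + ri * random.gauss(0.0, 1.0)
              for xi, ri in zip(x, resid)])
    for _ in range(2000)
]
se = statistics.stdev(boot_slopes)   # wild bootstrap standard error
```

Because only the multipliers are random, the scheme needs no model for the error variance, which is what makes it attractive when direct resampling is awkward.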
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in the last decades and it has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives and none of these measures are associated to individual statements, on which the interpretation is based. This paper presents an innovative approach of bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers understand better their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may add or subtract strength to particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design. PMID:26845694
Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.
Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G
2016-01-01
Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
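A stripped-down illustration of selecting a variable when its bootstrap percentile interval excludes zero: plain no-intercept OLS slopes on synthetic data stand in here for the paper's penalized-regression estimates, and the function and variable names are invented:

```python
import random

random.seed(5)

# Hypothetical data: y depends on x1 but not on x2.
n = 80
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
y = [1.5 * a + random.gauss(0, 1) for a in x1]

def slope(xs, ys):
    # No-intercept least-squares slope.
    return sum(a * b for a, b in zip(xs, ys)) / sum(a * a for a in xs)

def qnt_select(xs, ys, n_boot=1000, alpha=0.05):
    """Select a predictor if its bootstrap percentile CI excludes zero."""
    idx = list(range(len(ys)))
    boots = []
    for _ in range(n_boot):
        pick = [random.choice(idx) for _ in idx]
        boots.append(slope([xs[i] for i in pick], [ys[i] for i in pick]))
    boots.sort()
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot)]
    return not (lo <= 0.0 <= hi)

selected_x1 = qnt_select(x1, y)   # true signal: expect selection
selected_x2 = qnt_select(x2, y)   # pure noise: usually not selected
```

With a penalized estimator in place of `slope`, the same quantile rule yields the sparser, more stable selections the paper reports.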
Bootstrap equations for N=4 SYM with defects
Energy Technology Data Exchange (ETDEWEB)
Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Meneghelli, Carlo [Simons Center for Geometry and Physics, Stony Brook University,Stony Brook, NY 11794-3636 (United States)
2017-01-27
This paper focuses on the analysis of 4d N=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4d N=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4d N=4 superconformal theories with a line defect, 3d N=4 superconformal theories with no defect, and OSP(4*|4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
Bootstrap equations for N=4 SYM with defects
International Nuclear Information System (INIS)
Liendo, Pedro; Meneghelli, Carlo
2017-01-01
This paper focuses on the analysis of 4dN=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4dN=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4dN=4 superconformal theories with a line defect, 3dN=4 superconformal theories with no defect, and OSP(4 ∗ |4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate for addressing questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in recent decades and has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives, and none of these measures is associated with the individual statements on which the interpretation is based. This paper presents an innovative approach to bootstrapping Q in order to obtain additional and more detailed measures of variability, which helps researchers better understand their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may strengthen or weaken particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.
Strongly Agree or Strongly Disagree?
DEFF Research Database (Denmark)
Carrizosa, Emilio; Nogales-Gómez, Amaya; Morales, Dolores Romero
2016-01-01
In linear classifiers, such as the Support Vector Machine (SVM), a score is associated with each feature and objects are assigned to classes based on the linear combination of the scores and the values of the features. Inspired by discrete psychometric scales, which measure the extent to which...... a factor is in agreement with a statement, we propose the Discrete Level Support Vector Machine (DILSVM) where the feature scores can only take on a discrete number of values, defined by the so-called feature rating levels. The DILSVM classifier benefits from interpretability and it has visual appeal...
Kim, Se-Kang
The effect of bootstrapping was studied by examining whether major profile patterns were replicated when sample sizes were reduced. Profile patterns estimated from the original sample (n=645) of the Wechsler Preschool and Primary Scale of Intelligence-Third Edition (WPPSI-III) Standardization Data were considered major profiles. For bootstrapping,…
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Internal validation of risk models in clustered data: a comparison of bootstrap schemes
Bouwmeester, W.; Moons, K.G.M.; Kappen, T.H.; van Klei, W.A.; Twisk, J.W.R.; Eijkemans, M.J.C.; Vergouwe, Y.
2013-01-01
Internal validity of a risk model can be studied efficiently with bootstrapping to assess possible optimism in model performance. Assumptions of the regular bootstrap are violated when the development data are clustered. We compared alternative resampling schemes in clustered data for the estimation
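The cluster-resampling idea discussed above can be sketched in a few lines of Python. This is a generic illustration, not code from the paper; the data and the choice of the mean as the statistic are invented for the example:

```python
import random

def cluster_bootstrap(clusters, n_boot=1000, stat=None, seed=0):
    """Resample whole clusters with replacement and compute a statistic.

    clusters: list of lists, each inner list holding one cluster's observations.
    stat: maps a flat list of observations to a number (default: the mean).
    """
    if stat is None:
        stat = lambda xs: sum(xs) / len(xs)
    rng = random.Random(seed)
    k = len(clusters)
    results = []
    for _ in range(n_boot):
        # Draw k clusters with replacement, keeping each drawn cluster intact,
        # so within-cluster dependence is preserved in every pseudo-sample.
        draw = [clusters[rng.randrange(k)] for _ in range(k)]
        flat = [x for c in draw for x in c]
        results.append(stat(flat))
    return results

# Three hypothetical clusters (e.g. patients within hospitals).
data = [[1.0, 1.2, 0.9], [3.1, 2.8], [2.0, 2.2, 1.9, 2.1]]
boots = cluster_bootstrap(data, n_boot=500)
```

Resampling clusters rather than individual observations keeps the within-cluster correlation intact in every pseudo-sample, which is exactly what the regular bootstrap violates when the development data are clustered.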
On the consistency of bootstrap testing for a parameter on the boundary of the parameter space
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Heino Bohn; Rahbek, Anders
2017-01-01
It is well known that with a parameter on the boundary of the parameter space, such as in the classic cases of testing for a zero location parameter or no autoregressive conditional heteroskedasticity (ARCH) effects, the classic nonparametric bootstrap – based on unrestricted parameter estimates...... the standard and bootstrap Lagrange multiplier tests as well as the asymptotic quasi-likelihood ratio test....
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
A bootstrap method for estimating uncertainty of water quality trends
Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura
2015-01-01
Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS-estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
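The idea of attaching bootstrap uncertainty to a trend estimate can be illustrated with a much simpler estimator than WRTDS. The sketch below uses a plain OLS slope and a pairs bootstrap on hypothetical annual data; it is not the WBT procedure itself:

```python
import random

def ols_slope(t, y):
    """Ordinary least-squares slope of y on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    num = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
    den = sum((ti - mt) ** 2 for ti in t)
    return num / den

def bootstrap_slope_ci(t, y, n_boot=2000, alpha=0.1, seed=0):
    """Percentile pairs-bootstrap confidence interval for a linear trend slope."""
    rng = random.Random(seed)
    n = len(t)
    slopes = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        tb = [t[i] for i in idx]
        if tb.count(tb[0]) == n:  # degenerate resample: slope undefined
            continue
        slopes.append(ols_slope(tb, [y[i] for i in idx]))
    slopes.sort()
    m = len(slopes)
    return slopes[int(alpha / 2 * m)], slopes[int((1 - alpha / 2) * m) - 1]

# Hypothetical annual series: trend of 0.5 per year plus alternating noise.
years = list(range(2000, 2020))
vals = [0.5 * (yr - 2000) + (1.0 if yr % 2 else -1.0) for yr in years]
lo, hi = bootstrap_slope_ci(years, vals)
```

A trend would be reported as "likely" when zero lies outside the interval; here the interval straddles the true slope of 0.5.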
Integral equations of hadronic correlation functions: a functional-bootstrap approach
Manesis, E K
1974-01-01
A reasonable 'microscopic' foundation of the Feynman hadron-liquid analogy is offered, based on a class of models for hadron production. In an external field formalism, the equivalence (complementarity) of the exclusive and inclusive descriptions of hadronic reactions is specifically expressed in a functional-bootstrap form, and integral equations between inclusive and exclusive correlation functions are derived. Using the latest CERN-ISR data on the two-pion inclusive correlation function, and assuming rapidity translational invariance for the exclusive one, the simplest integral equation is solved in the 'central region' and an exclusive correlation length in rapidity predicted. An explanation is also offered for the unexpected similarity observed between π⁺π⁻ and π⁻π⁻ inclusive correlations. (31 refs).
Bootstrapping the QCD soft anomalous dimension
arXiv
Almelid, Øyvind; Gardi, Einan; McLeod, Andrew; White, Chris D.
2017-09-18
The soft anomalous dimension governs the infrared singularities of scattering amplitudes to all orders in perturbative quantum field theory, and is a crucial ingredient in both formal and phenomenological applications of non-abelian gauge theories. It has recently been computed at three-loop order for massless partons by explicit evaluation of all relevant Feynman diagrams. In this paper, we show how the same result can be obtained, up to an overall numerical factor, using a bootstrap procedure. We first give a geometrical argument for the fact that the result can be expressed in terms of single-valued harmonic polylogarithms. We then use symmetry considerations as well as known properties of scattering amplitudes in collinear and high-energy (Regge) limits to constrain an ansatz of basis functions. This is a highly non-trivial cross-check of the result, and our methods pave the way for greatly simplified higher-order calculations.
Performance of Bootstrap MCEWMA: Study case of Sukuk Musyarakah data
Safiih, L. Muhamad; Hila, Z. Nurul
2014-07-01
Sukuk Musyarakah is one of several instruments of Islamic bond investment in Malaysia; this sukuk is essentially a restructuring of a conventional bond into a Syariah-compliant one. Syariah compliance rests on the prohibition of usury and of any fixed return or benefit. Accordingly, daily sukuk returns are non-fixed and, statistically, form a time series that is dependent and autocorrelated. Such data pose a serious problem in both statistics and finance. Sukuk returns can be viewed through their volatility: high volatility indicates dramatic price changes and marks the bond as risky. This problem has, however, received far less attention from researchers than conventional bonds have. The MCEWMA chart in Statistical Process Control (SPC) is mainly used to monitor autocorrelated data, and its application to daily returns of securities investments has gained widespread attention among statisticians. This chart is nonetheless prone to inaccurate estimation, whether of its base model or of its limits, producing large errors and a high probability of signalling an out-of-control process as a false alarm. To overcome this problem, a bootstrap approach is used in this study: hybridizing it with the MCEWMA base model yields a new chart, the Bootstrap MCEWMA (BMCEWMA) chart. The hybrid model is applied to daily returns of sukuk Musyarakah for Rantau Abang Capital Bhd. The BMCEWMA base model proved more effective than the original MCEWMA, with smaller estimation error, shorter confidence intervals and fewer false alarms. In other words, the hybrid chart reduces variability, as shown by the smaller error and false-alarm rate. It is concluded that BMCEWMA performs better than MCEWMA.
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. A bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus...... be used to detect outliers in the data, since outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as an alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q...... The bootstrap could also offer confidence limits for contribution plots with acceptable fault diagnostic power. The performance of bootstrap-based and asymptotic confidence limits was compared in batch MSPC (Paper III). Real and simulated batch process datasets were used to build the limits for five PCA......
Bootstrapping realized volatility and realized beta under a local Gaussianity assumption
DEFF Research Database (Denmark)
Hounyo, Ulrich
The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap...... matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able...
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
We provide a new resampling procedure - the local stable bootstrap - that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity...... index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H₀: β=2 against the alternative H₁: β<2...... bootstrap power variations, activity index...... estimator, and diffusion test for H₀. Moreover, the finite sample size and power properties of the proposed diffusion test are compared to those of benchmark tests using Monte Carlo simulations. Unlike existing procedures, our bootstrap test is correctly sized in general settings. Finally, we illustrate use...
EBW-Bootstrap Current Synergy in the National Spherical Torus Experiment (NSTX)
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
Current driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code, to determine the degree of synergy between them. A target β = 40% NSTX plasma is examined. A simple bootstrap model in the CQL3D Fokker-Planck code is used in these studies: the transiting electron distributions are connected in velocity space at the trapped-passing boundary to trapped-electron distributions which are displaced radially by a half-banana width outwards/inwards for the co-/counter-passing regions. This model agrees well with standard bootstrap current calculations over the outer 60% of the plasma radius. Relatively little synergy in the net bootstrap current is obtained for EBW power up to 4 MW. Locally, the bootstrap current density increases in proportion to the increased plasma pressure, and this effect can significantly affect the radial profile of the driven current.
The Finite Population Bootstrap - From the Maximum Likelihood to the Horvitz-Thompson Approach
Directory of Open Access Journals (Sweden)
Andreas Quatember
2014-06-01
The finite population bootstrap method is used as a computer-intensive alternative to estimate the sampling distribution of a sample statistic. The generation of a so-called "bootstrap population" is the necessary step between the original sample drawn and the resamples needed to mimic this distribution. The most important question for researchers to answer is how to create an adequate bootstrap population, which may serve as a close-to-reality basis for the resampling process. In this paper, a review of some approaches to answer this fundamental question is presented. Moreover, an approach based on the idea behind the Horvitz-Thompson estimator, allowing not only whole units in the bootstrap population but also parts of whole units, is proposed. In a simulation study, this method is compared with a more heuristic technique from the bootstrap literature.
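The "bootstrap population" idea can be sketched as follows. This is a simplified illustration of the general scheme, not the authors' method: each sampled unit is replicated about 1/πᵢ times, and the fractional remainder is handled here by randomized rounding, one crude way to let "parts of whole units" enter the population:

```python
import math
import random

def bootstrap_population(sample, inclusion_probs, seed=0):
    """Build a pseudo-population by replicating unit i roughly 1/pi_i times.

    The fractional part of 1/pi_i is handled by randomized rounding, so a
    "partial unit" is kept with probability equal to that fraction.
    """
    rng = random.Random(seed)
    pop = []
    for y, pi in zip(sample, inclusion_probs):
        w = 1.0 / pi
        whole = int(math.floor(w))
        pop.extend([y] * whole)
        if rng.random() < w - whole:  # keep a partial unit with prob = fraction
            pop.append(y)
    return pop

# Hypothetical equal-probability sample: 5 units, each with pi = 0.1,
# expands to a pseudo-population of about 50 units.
sample = [12.0, 7.5, 9.0, 11.0, 8.5]
pop = bootstrap_population(sample, [0.1] * 5)

# Resampling from the bootstrap population mimics the original design.
rng = random.Random(1)
boot_means = [sum(rng.choices(pop, k=5)) / 5 for _ in range(1000)]
```

The spread of `boot_means` then serves as an estimate of the sampling distribution of the sample mean under the original design.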
Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
A Smooth Bootstrap Procedure towards Deriving Confidence Intervals for the Relative Risk.
Wang, Dongliang; Hutson, Alan D
Given a pair of sample estimators of two independent proportions, bootstrap methods are a common strategy towards deriving the associated confidence interval for the relative risk. We develop a new smooth bootstrap procedure, which generates pseudo-samples from a continuous quantile function. Under a variety of settings, our simulation studies show that our method possesses a better or equal performance in comparison with asymptotic theory based and existing bootstrap methods, particularly for heavily unbalanced data in terms of coverage probability and power. We illustrate our procedure as applied to several published data sets.
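For contrast with the smooth procedure described above, a plain (non-smooth) percentile bootstrap for the relative risk is easy to sketch; the counts below are hypothetical:

```python
import random

def rr_bootstrap_ci(x1, n1, x2, n2, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the relative risk (x1/n1) / (x2/n2).

    x1/n1 and x2/n2 are events/trials in two independent groups. Resamples
    with zero events in group 2 are skipped (the ratio is undefined there).
    """
    rng = random.Random(seed)
    g1 = [1] * x1 + [0] * (n1 - x1)
    g2 = [1] * x2 + [0] * (n2 - x2)
    rrs = []
    while len(rrs) < n_boot:
        s1 = sum(rng.choice(g1) for _ in range(n1))
        s2 = sum(rng.choice(g2) for _ in range(n2))
        if s2 == 0:
            continue
        rrs.append((s1 / n1) / (s2 / n2))
    rrs.sort()
    return rrs[int(alpha / 2 * n_boot)], rrs[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical data: 30/100 events vs 15/100 events, point estimate RR = 2.0.
lo, hi = rr_bootstrap_ci(30, 100, 15, 100)
```

The smooth variant in the paper instead draws pseudo-samples from a continuous quantile function, which matters most for heavily unbalanced data.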
Validation of Nonparametric Two-Sample Bootstrap in ROC Analysis on Large Datasets.
Wu, Jin Chu; Martin, Alvin F; Kacker, Raghu N
The nonparametric two-sample bootstrap is applied to computing uncertainties of measures in ROC analysis on large datasets in areas such as biometrics, speaker recognition, etc., when the analytical method cannot be used. Its validation was studied by computing the SE of the area under ROC curve using the well-established analytical Mann-Whitney-statistic method and also using the bootstrap. The analytical result is unique. The bootstrap results are expressed as a probability distribution due to its stochastic nature. The comparisons were carried out using relative errors and hypothesis testing. They match very well. This validation provides a sound foundation for such computations.
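A minimal sketch of the computation being validated: the AUC as a normalized Mann-Whitney statistic, with its SE estimated by the nonparametric two-sample bootstrap (each score set resampled independently). The scores are invented, and the analytical variance step is omitted here:

```python
import random
import statistics

def auc_mann_whitney(pos, neg):
    """AUC as the normalized Mann-Whitney U statistic (ties count 1/2)."""
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))

def auc_bootstrap_se(pos, neg, n_boot=1000, seed=0):
    """Two-sample bootstrap SE: resample each score set independently."""
    rng = random.Random(seed)
    aucs = []
    for _ in range(n_boot):
        bp = [rng.choice(pos) for _ in pos]
        bn = [rng.choice(neg) for _ in neg]
        aucs.append(auc_mann_whitney(bp, bn))
    return statistics.stdev(aucs)

# Invented genuine (pos) and impostor (neg) scores.
pos = [0.9, 0.8, 0.8, 0.7, 0.6]
neg = [0.5, 0.5, 0.4, 0.3, 0.6]
auc = auc_mann_whitney(pos, neg)  # 0.98 for these scores
se = auc_bootstrap_se(pos, neg, n_boot=500)
```

Validation in the paper amounts to checking that the distribution of bootstrap SEs like `se` is consistent with the analytical Mann-Whitney value.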
A Bootstrap Approach to Principal Components Analysis
AKTÜKÜN, Dr. Aylin
2011-01-01
In this study, we present the process of applying bootstrap methods to principal components analysis. Using a hypothetical data set, we show how some of the confidence intervals used in principal components analysis can be obtained with bootstrap methods. All bootstrap procedures in the article were carried out with a program we wrote in the Mathematica language.
DEFF Research Database (Denmark)
Aro, A R; Nyberg, N; Absetz, P
2001-01-01
The association of socioeconomic factors, health-related factors, and social support with depressive symptoms has been extensively studied. However, most epidemiological studies have focused on a few factors such as marital status, social class, and employment. In this study of middle-aged women we....... Socioeconomic, health-related, and social support factors were all measured with single items. All variables, except level of urbanization, were significantly associated with depressive symptoms in univariate analyses. Multivariate associations were examined with standard multiple regression analyses in three...
A voltage biased superconducting quantum interference device bootstrap circuit
International Nuclear Information System (INIS)
Xie Xiaoming; Wang Huiwu; Wang Yongliang; Dong Hui; Jiang Mianheng; Zhang Yi; Krause, Hans-Joachim; Braginski, Alex I; Offenhaeusser, Andreas; Mueck, Michael
2010-01-01
We present a dc superconducting quantum interference device (SQUID) readout circuit operating in the voltage bias mode and called a SQUID bootstrap circuit (SBC). The SBC is an alternative implementation of two existing methods for suppression of room-temperature amplifier noise: additional voltage feedback and current feedback. Two circuit branches are connected in parallel. In the dc SQUID branch, an inductively coupled coil connected in series provides the bias current feedback for enhancing the flux-to-current coefficient. The circuit branch parallel to the dc SQUID branch contains an inductively coupled voltage feedback coil with a shunt resistor in series for suppressing the preamplifier noise current by increasing the dynamic resistance. We show that the SBC effectively reduces the preamplifier noise to below the SQUID intrinsic noise. For a helium-cooled planar SQUID magnetometer with a SQUID inductance of 350 pH, a flux noise of about 3 μΦ₀/√Hz and a magnetic field resolution of less than 3 fT/√Hz were obtained. The SBC leads to a convenient direct readout electronics for a dc SQUID with a wider adjustment tolerance than other feedback schemes.
High efficiency fusion reactor based on bootstrap current
International Nuclear Information System (INIS)
Kikuchi, Mitsuru
1990-01-01
Recent research has largely advanced the establishment of steady-state operation technology, which has been the largest subject in research on fusion reactors based on the tokamak confinement principle, and the concept of a power reactor that can operate steadily with high efficiency has been established. The key is to positively utilize the bootstrap current that flows naturally in tokamak plasma. This power reactor can be realized with near-future technologies, and it may become a clear target for developing fusion reactors as electric power generation plants. This report centers on the high-efficiency fusion reactor SSTR, for which the Japan Atomic Energy Research Institute is advancing a conceptual design as a prototype power reactor. Fusion energy generates no CO₂ gas, essentially cannot run away, and produces comparatively little radioactive waste; it is therefore expected to become a powerful substitute energy source. The main parameters and features of the steady-state tokamak fusion power reactor (SSTR) are reported. (K.I.)
N = 4 superconformal bootstrap of the K3 CFT
Lin, Ying-Hsuan; Shao, Shu-Heng; Simmons-Duffin, David; Wang, Yifan; Yin, Xi
2017-05-01
We study two-dimensional (4, 4) superconformal field theories of central charge c = 6, corresponding to nonlinear sigma models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N = 4 superconformal blocks with c = 6 and bosonic Virasoro conformal blocks with c = 28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A₁ N = 4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence between a class of BPS N = 2 superconformal blocks and Virasoro conformal blocks in two dimensions, and an upper bound on the four-point functions of operators of sufficiently low scaling dimension in three and four dimensional CFTs.
N=4 Superconformal Bootstrap of the K3 CFT
CERN. Geneva
2015-01-01
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear σ models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A1 N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence...
Bias Reversal Technique in SQUID Bootstrap Circuit (SBC) Scheme
Rong, Liangliang; Zhang, Yi; Zhang, Guofeng; Wu, Jun; Dong, Hui; Qiu, Longqing; Xie, Xiaoming; Offenhäusser, Andreas
Recently, a SQUID direct-readout scheme called the voltage-biased SQUID Bootstrap Circuit (SBC) was introduced to reduce the preamplifier noise contribution. In this paper, we describe a concept of SBC with a bias reversal technique that can suppress the SQUID's intrinsic 1/f noise. When a symmetric rectangular voltage is applied across the SBC, two I-Φ characteristics appear at the amplifier output. To return to a single I-Φ curve, a demodulation technique is required. Because of the asymmetry of the typical SBC I-Φ curve, demodulation is realized by flux compensation with a half-Φ0 flux shift. The output signal is then filtered and returned to one I-Φ curve for ordinary FLL readout. It was found that the reversal frequency fR can be dramatically enhanced by using a preamplifier consisting of two operational amplifiers. A planar Nb SQUID magnetometer with a loop inductance of 350 pH, fR = 50 kHz, and a second-order low-pass filter with a 10 kHz cut-off frequency was employed in our experiment. The results prove the feasibility of the SBC bias reversal method. Comparative experiments on noise performance will be carried out in further studies.
Czech Academy of Sciences Publication Activity Database
Calore, L.; Cavinato, G.; Canton, P.; Peruzzo, L.; Banavali, R.; Jeřábek, Karel; Corain, B.
2012-01-01
Vol. 391, 30 Aug 2012, pp. 114-120. ISSN 0020-1693. Institutional support: RVO:67985858. Keywords: strongly acidic cross-linked polymer; frameworks; gold(0) nanoclusters. Subject RIV: CI - Industrial Chemistry, Chemical Engineering. Impact factor: 1.687, year: 2012
BoCluSt: Bootstrap Clustering Stability Algorithm for Community Detection.
Garcia, Carlos
2016-01-01
The identification of modules or communities in sets of related variables is a key step in the analysis and modeling of biological systems. Procedures for this identification are usually designed to allow fast analyses of very large datasets and may produce suboptimal results when these sets are of a small to moderate size. This article introduces BoCluSt, a new, somewhat more computationally intensive, community detection procedure that is based on combining a clustering algorithm with a measure of stability under bootstrap resampling. Both computer simulation and analyses of experimental data showed that BoCluSt can outperform current procedures in the identification of multiple modules in data sets with a moderate number of variables. In addition, the procedure provides users with a null distribution of results to evaluate the support for the existence of community structure in the data. BoCluSt takes individual measures for a set of variables as input, and may be a valuable and robust exploratory tool of network analysis, as it provides 1) an estimation of the best partition of variables into modules, 2) a measure of the support for the existence of modular structures, and 3) an overall description of the whole structure, which may reveal hierarchical modular situations, in which modules are composed of smaller sub-modules.
Bootstrapped Discovery and Ranking of Relevant Services and Information in Context-aware Systems
Directory of Open Access Journals (Sweden)
Preeti Bhargava
2015-08-01
A context-aware system uses context to provide relevant information and services to the user, where relevancy depends on the user’s situation. This relevant information could include a wide range of heterogeneous content. Many existing context-aware systems determine this information based on pre-defined ontologies or rules. In addition, they rely on users’ context history to filter it. Moreover, they often provide domain-specific information. Such systems are not applicable to a large and varied set of user situations and information needs, and may suffer from cold start for new users. In this paper, we address these limitations and propose a novel, general and flexible approach for bootstrapped discovery and ranking of heterogeneous relevant services and information in context-aware systems. We design and implement four variations of a base algorithm that ranks candidate relevant services, and the information to be retrieved from them, based on the semantic relatedness between the information provided by the services and the user’s situation description. We conduct a live deployment with 14 subjects to evaluate the efficacy of our algorithms. We demonstrate that they have strong positive correlation with human supplied relevance rankings and can be used as an effective means to discover and rank relevant services and information. We also show that our approach is applicable to a wide set of users’ situations and to new users without requiring any user interaction history.
Kantar, E.; Deviren, B.; Keskin, M.
2011-11-01
We present a study, within the scope of econophysics, of the hierarchical structure of 98 of the largest international companies, including 18 of the largest Turkish companies, from the Banks, Automobile, Software-Hardware, Telecommunication Services, Energy and Oil-Gas sectors, viewed as a network of interacting companies. We analyze daily time series data from the Boerse-Frankfurt and the Istanbul Stock Exchange. We examine the topological properties among the companies over the period 2006-2010 using hierarchical structure methods (the minimal spanning tree (MST) and the hierarchical tree (HT)). The period is divided into three subperiods, namely 2006-2007, 2008 (the year of the global economic crisis), and 2009-2010, in order to test various time windows and observe temporal evolution. We carry out bootstrap analyses to attach a value of statistical reliability to the links of the MSTs and HTs. We also use average linkage clustering analysis (ALCA) in order to better observe the cluster structure. From these studies, we find that the interactions between the Banks/Energy sectors and the other sectors were reduced after the global economic crisis; hence the effects of the Banks and Energy sectors on the correlations of all companies decreased. Telecommunication Services were also greatly affected by the crisis. We also observed that the Automobile and Banks sectors, including Turkish companies as well as some companies from the USA, Japan and Germany, were strongly correlated with each other in all periods.
Isenberg, James
2017-01-01
The Hawking-Penrose theorems tell us that solutions of Einstein's equations are generally singular, in the sense of the incompleteness of causal geodesics (the paths of physical observers). These singularities might be marked by the blowup of curvature and therefore crushing tidal forces, or by the breakdown of physical determinism. Penrose has conjectured (in his "Strong Cosmic Censorship Conjecture") that it is generically unbounded curvature that causes singularities, rather than causal breakdown. The verification that "AVTD behavior" (marked by the domination of time derivatives over space derivatives) is generically present in a family of solutions has proven to be a useful tool for studying model versions of Strong Cosmic Censorship in that family. I discuss some of the history of Strong Cosmic Censorship, and then discuss what is known about AVTD behavior and Strong Cosmic Censorship in families of solutions defined by varying degrees of isometry, and discuss recent results which we believe will extend this knowledge and provide new support for Strong Cosmic Censorship. I also comment on some of the recent work on "Weak Null Singularities", and how this relates to Strong Cosmic Censorship.
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
National Research Council Canada - National Science Library
Lack, Lindsey
2003-01-01
.... Early tiger teams recognized the possibility of this design and compared it to the two-card bootstrap loader used in mainframes since both exhibit the characteristics of compactness and adaptability...
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
Kumar, Sricharan; Srivastava, Ashok N.
2012-01-01
Prediction intervals quantify the range in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine whether an observed output is anomalous, conditioned on the input. In this paper, a procedure for determining prediction intervals for the outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals, with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is proven theoretically. Subsequently, the validity of the bootstrap-based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
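The resampling idea in the abstract above can be sketched in a few lines. This is a hedged illustration, not the paper's procedure: it assumes a simple nearest-neighbour regression, resampling of (x, y) pairs plus a resampled residual, and a percentile interval; all names and data are hypothetical.

```python
# Sketch: percentile bootstrap prediction interval for a simple nonparametric
# (nearest-neighbour mean) regression. The model, data, and interval level are
# illustrative assumptions, not the paper's exact method.
import random

def knn_predict(xs, ys, x0, k=5):
    # Mean of the k training responses whose inputs are closest to x0.
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:k]
    return sum(ys[i] for i in nearest) / k

def bootstrap_prediction_interval(xs, ys, x0, level=0.9, n_boot=500, seed=0):
    rng = random.Random(seed)
    n = len(xs)
    fitted = [knn_predict(xs, ys, x) for x in xs]
    residuals = [y - f for y, f in zip(ys, fitted)]
    preds = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]          # resample pairs
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        # predicted mean at x0 plus a resampled residual: a prediction
        # (not confidence) interval accounts for the noise in a new output
        preds.append(knn_predict(bx, by, x0) + rng.choice(residuals))
    preds.sort()
    lo = preds[int((1 - level) / 2 * n_boot)]
    hi = preds[int((1 + level) / 2 * n_boot)]
    return lo, hi

if __name__ == "__main__":
    rng = random.Random(1)
    xs = [i / 10 for i in range(100)]
    ys = [x * x + rng.gauss(0, 0.1) for x in xs]
    lo, hi = bootstrap_prediction_interval(xs, ys, x0=5.0)
    print(lo, hi)   # an observation far outside (lo, hi) would be flagged anomalous
```

An observed output falling outside the interval would be flagged as anomalous, which is the detection rule the abstract describes.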
The non-local bootstrap--estimation of uncertainty in diffusion MRI.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2013-01-01
Diffusion MRI is a noninvasive imaging modality that allows for the estimation and visualization of white matter connectivity patterns in the human brain. However, due to the low signal-to-noise ratio (SNR) of diffusion data, deriving useful statistics from the data is adversely affected by different sources of measurement noise. This is aggravated by the fact that the sampling distribution of the statistic of interest is often complex and unknown. In such situations, the bootstrap, due to its distribution-independent nature, is an appealing tool for estimating the variability of almost any statistic, without relying on complicated theoretical calculations, but purely on computer simulation. In this work, we present new bootstrap strategies for estimating the variability of diffusion statistics in association with noise. In contrast to the residual bootstrap, which relies on a predetermined data model, or the repetition bootstrap, which requires repeated signal measurements, our approach, called the non-local bootstrap (NLB), is non-parametric and obviates the need for time-consuming multiple acquisitions. The key assumption of NLB is that local image structures recur in the image. We exploit this self-similarity via a multivariate non-parametric kernel regression framework for bootstrap estimation of uncertainty. Evaluation of NLB using a set of high-resolution diffusion-weighted images, with lower than usual SNR due to the small voxel size, indicates that NLB is markedly more robust to noise and results in more accurate inferences.
In vivo precision of bootstrap algorithms applied to diffusion tensor imaging data.
Vorburger, Robert S; Reischauer, Carolin; Dikaiou, Katerina; Boesiger, Peter
2012-10-01
To determine the precision, for in vivo applications, of model-based and non-model-based bootstrap algorithms for estimating the measurement uncertainty of diffusion parameters derived from diffusion tensor imaging data. Four different bootstrap methods were applied to diffusion datasets acquired during 10 repeated imaging sessions. Measurement uncertainty was derived in eight manually selected regions of interest and in the entire brain white matter and gray matter. The precision of the bootstrap methods was analyzed using coefficients of variation and intra-class correlation coefficients. Comprehensive simulations were performed to validate the results. All bootstrap algorithms showed similar precision, which varied slightly depending on the selected region of interest. The averaged coefficient of variation in the selected regions of interest was 13.81%, 12.35%, and 17.93% with respect to the apparent diffusion coefficient, the fractional anisotropy value, and the cone of uncertainty, respectively. The repeated measurements showed very high similarity, with intra-class correlation coefficients larger than 0.96. The simulations confirmed most of the in vivo findings. All investigated bootstrap methods perform with a similar, high precision in deriving the measurement uncertainty of diffusion parameters. Thus, the time-efficient model-based bootstrap approaches should be the method of choice in clinical practice. Copyright © 2012 Wiley Periodicals, Inc.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
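The cluster bootstrap described above, resampling whole clusters with replacement so that within-cluster correlation is preserved, can be sketched generically. This is a hedged illustration with a plain mean standing in for a Cox model coefficient; the data-generating model and names are assumptions.

```python
# Sketch of the cluster bootstrap: resample entire clusters with replacement
# and recompute the statistic on the pooled resampled data. A simple mean
# replaces the Cox regression coefficient; data are illustrative.
import random
import statistics

def cluster_bootstrap_se(clusters, stat, n_boot=1000, seed=0):
    # clusters: list of lists of observations; stat: function of pooled data
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resampled = [rng.choice(clusters) for _ in clusters]  # clusters only
        pooled = [y for c in resampled for y in c]
        reps.append(stat(pooled))
    return statistics.stdev(reps)

if __name__ == "__main__":
    rng = random.Random(42)
    # 20 clusters sharing a latent random effect -> correlated within cluster
    clusters = []
    for _ in range(20):
        u = rng.gauss(0, 1.0)                     # cluster-level random effect
        clusters.append([u + rng.gauss(0, 0.5) for _ in range(10)])
    se_cluster = cluster_bootstrap_se(clusters, statistics.mean)
    # naive SE from the i.i.d. formula, ignoring clustering, for comparison
    pooled = [y for c in clusters for y in c]
    se_naive = statistics.stdev(pooled) / len(pooled) ** 0.5
    print(se_cluster, se_naive)
```

The comparison mirrors the abstract's point: ignoring the cluster structure seriously under-estimates the variance, while resampling clusters recovers it.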
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, which use population-at-risk data to generate expected values, produce hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. In contrast, this study generates expected values for local hotspots from past occurrences rather than population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminological and epidemiological surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of the integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, A.M. Robert
Empirical evidence from time series methods which assume the usual I(0)/I(1) paradigm suggests that the efficient market hypothesis, stating that spot and futures prices of a commodity should cointegrate with a unit slope on futures prices, does not hold. However, these statistical methods...... fractionally integrated model we are able to find a body of evidence in support of the efficient market hypothesis for a number of commodities. Our new tests are wild bootstrap implementations of score-based tests for the order of integration of a fractionally integrated time series. These tests are designed...... principle do. A Monte Carlo simulation study demonstrates that very significant improvements in finite sample behaviour can be obtained by the bootstrap vis-à-vis the corresponding asymptotic tests in both heteroskedastic and homoskedastic environments.
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
Energy Technology Data Exchange (ETDEWEB)
Neitzel, P.L.; Walther, W. [Dresden University of Technology, Institute for Groundwater Management, Dresden (Germany); Nestler, W. [Institute for Technology and Economics, Department of Civil Engineering and Architecture, Dresden (Germany)
1998-06-01
Strongly polar organic substances like halogenated acetic acids have been analyzed in surface water and groundwater in the catchment area of the upper Elbe river in Saxony since 1992. Coming directly from anthropogenic sources like industry and agriculture, and indirectly from rainfall, their concentrations can increase up to 100 µg/L in the aquatic environment of this catchment area. A new static headspace GC-MSD method without a manual pre-concentration step is presented to analyze the chlorinated acetic acids relevant to the Elbe river as their volatile methyl esters. Using an ion-pairing agent as modifier for the in-situ methylation of the analytes by dimethylsulfate, a minimal detection limit of 1 µg/L can be achieved. Problems like the thermal degradation of chlorinated acetic acids to halogenated hydrocarbons and changing reaction yields during the headspace methylation could be effectively reduced. The method has been successfully applied to monitoring bank infiltrate, surface water, groundwater and water works pumped raw water according to health provision principles. (orig.) With 3 figs., 2 tabs., 29 refs.
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower due to the increased time spent generating weight matrices via multinomial sampling.
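The multinomial-weighting formulation described above can be sketched in scalar form. This is an illustrative analogue, not the paper's R implementation: bootstrap replications of Pearson's correlation are computed from multinomial weights over the observations rather than from materialized resampled data sets.

```python
# Sketch of the multinomial sampling formulation of the bootstrap: draw
# multinomial counts over the n observations and compute weighted sample
# moments, instead of building each resampled data set explicitly.
import random

def multinomial_counts(n, rng):
    # n draws over n equally likely cells -> bootstrap resampling weights
    counts = [0] * n
    for _ in range(n):
        counts[rng.randrange(n)] += 1
    return counts

def bootstrap_correlations(x, y, n_boot=200, seed=0):
    rng = random.Random(seed)
    n = len(x)
    reps = []
    for _ in range(n_boot):
        w = multinomial_counts(n, rng)
        # weighted first and second sample moments; weights sum to n
        mx = sum(wi * xi for wi, xi in zip(w, x)) / n
        my = sum(wi * yi for wi, yi in zip(w, y)) / n
        sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / n
        syy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / n
        sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / n
        reps.append(sxy / (sxx * syy) ** 0.5)   # weighted Pearson correlation
    return reps

if __name__ == "__main__":
    rng = random.Random(7)
    x = [rng.gauss(0, 1) for _ in range(50)]
    y = [0.8 * xi + rng.gauss(0, 0.6) for xi in x]
    reps = bootstrap_correlations(x, y)
    print(min(reps), max(reps))   # spread of the bootstrap replications
```

In the paper's setting the weight vectors form a matrix and the moment sums become matrix products, which is where the speed-up for R comes from.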
Ibaraki, Masanobu; Matsubara, Keisuke; Nakamura, Kazuhiro; Yamaguchi, Hiroshi; Kinoshita, Toshibumi
2014-02-01
Accurate and validated methods for estimating regional PET image noise are helpful for optimizing image processing. The bootstrap is a data-based simulation method for statistical inference, which can be used to estimate PET image noise without repeated measurements. The aim of this study was to experimentally validate bootstrap-based methods as a tool for estimating PET image noise and to demonstrate their usefulness for evaluating image reconstruction algorithms. Two bootstrap-based methods, the list-mode data bootstrap (LMBS) and the sinogram bootstrap (SNBS), were implemented on a clinical PET scanner. A uniform cylindrical phantom filled with (18)F solution was scanned using list-mode acquisition. A reference standard deviation (SD) map was calculated from 60 statistically independent measured list-mode data sets. Using one of the 60 list-mode data sets, 60 bootstrap replicates were generated and used to calculate bootstrap SD maps. Brain (18)F-FDG data from a healthy volunteer were also processed as an example of the bootstrap application. Three reconstruction algorithms, FBP 2D and both 2D and 3D versions of the dynamic row-action maximum likelihood algorithm (DRAMA), were assessed. For all the reconstruction algorithms used, the bootstrap SD maps agreed well with the reference SD map, confirming the validity of the bootstrap methods for assessing image noise. The two bootstrap methods were equivalent with respect to the performance of image noise estimation. The bootstrap analysis of the FDG data showed a better contrast-noise relation curve for DRAMA 3D compared to DRAMA 2D and FBP 2D. The bootstrap methods provide estimates of image noise for various reconstruction algorithms with reasonable accuracy, require only a single measurement rather than repeated measures, and are therefore applicable to human PET studies.
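The resampling step of a sinogram-type bootstrap can be sketched as redrawing the detected events over sinogram bins. This is a hedged toy version under simplifying assumptions: bins stand in for image voxels, reconstruction is omitted, and the noise estimate is the per-bin SD across replicates.

```python
# Toy sketch of the sinogram-bootstrap idea: treat the N detected events as
# draws over sinogram bins, resample N events with replacement according to
# the measured bin fractions, and estimate per-bin noise from the replicates.
import math
import random

def bootstrap_sinogram(counts, rng):
    # one bootstrap replicate: resample all events over bins with replacement
    n = sum(counts)
    rep = [0] * len(counts)
    for b in rng.choices(range(len(counts)), weights=counts, k=n):
        rep[b] += 1
    return rep

def bin_sd(counts, n_boot=200, seed=0):
    # per-bin standard deviation across bootstrap replicates
    rng = random.Random(seed)
    reps = [bootstrap_sinogram(counts, rng) for _ in range(n_boot)]
    sds = []
    for b in range(len(counts)):
        vals = [r[b] for r in reps]
        m = sum(vals) / n_boot
        sds.append((sum((v - m) ** 2 for v in vals) / (n_boot - 1)) ** 0.5)
    return sds

if __name__ == "__main__":
    counts = [100, 400, 900, 400, 100]        # measured sinogram bin counts
    sds = bin_sd(counts)
    # multinomial theory predicts sd ~ sqrt(n p (1 - p)) for each bin
    n = sum(counts)
    p = counts[0] / n
    print(sds[0], math.sqrt(n * p * (1 - p)))
```

In the actual method each replicate would be fed through the reconstruction algorithm, and the SD map would be computed voxel-wise over the reconstructed replicates.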
Volpp, Kevin G; Troxel, Andrea B; Mehta, Shivan J; Norton, Laurie; Zhu, Jingsan; Lim, Raymond; Wang, Wenli; Marcus, Noora; Terwiesch, Christian; Caldarella, Kristen; Levin, Tova; Relish, Mike; Negin, Nathan; Smith-McLallen, Aaron; Snyder, Richard; Spettell, Claire M; Drachman, Brian; Kolansky, Daniel; Asch, David A
2017-08-01
Adherence to medications prescribed after acute myocardial infarction (AMI) is low. Wireless technology and behavioral economic approaches have shown promise in improving health behaviors. To determine whether a system of medication reminders using financial incentives and social support delays subsequent vascular events in patients following AMI compared with usual care. Two-arm, randomized clinical trial with a 12-month intervention conducted from 2013 through 2016. Investigators were blinded to study group, but participants were not. Design was a health plan-intermediated intervention for members of several health plans. We recruited 1509 participants from 7179 contacted AMI survivors (insured with 5 large US insurers nationally or with Medicare fee-for-service at the University of Pennsylvania Health System). Patients aged 18 to 80 years were eligible if currently prescribed at least 2 of 4 study medications (statin, aspirin, β-blocker, antiplatelet agent), and were hospital inpatients for 1 to 180 days and discharged home with a principal diagnosis of AMI. Patients were randomized 2:1 to an intervention using electronic pill bottles combined with lottery incentives and social support for medication adherence (1003 patients), or to usual care (506 patients). Primary outcome was time to first vascular rehospitalization or death. Secondary outcomes were time to first all-cause rehospitalization, total number of repeated hospitalizations, medication adherence, and total medical costs. A total of 35.5% of participants were female (n = 536); mean (SD) age was 61.0 (10.3) years. There were no statistically significant differences between study arms in time to first rehospitalization for a vascular event or death (hazard ratio, 1.04; 95% CI, 0.71 to 1.52; P = .84), time to first all-cause rehospitalization (hazard ratio, 0.89; 95% CI, 0.73 to 1.09; P = .27), or total number of repeated hospitalizations (hazard ratio, 0.94; 95% CI, 0.60 to 1.48; P
El-Showk, Sheer; Poland, David; Rychkov, Slava; Simmons-Duffin, David; Vichi, Alessandro
2014-01-01
We use the conformal bootstrap to perform a precision study of the operator spectrum of the critical 3d Ising model. We conjecture that the 3d Ising spectrum minimizes the central charge c in the space of unitary solutions to crossing symmetry. Because extremal solutions to crossing symmetry are uniquely determined, we are able to precisely reconstruct the first several Z2-even operator dimensions and their OPE coefficients. We observe that a sharp transition in the operator spectrum occurs at the 3d Ising dimension Delta_sigma=0.518154(15), and find strong numerical evidence that operators decouple from the spectrum as one approaches the 3d Ising point. We compare this behavior to the analogous situation in 2d, where the disappearance of operators can be understood in terms of degenerate Virasoro representations.
Garnier-Laplace, J; Vandenhove, H; Beresford, N; Muikku, M; Real, A
2018-03-01
The ALLIANCE Strategic Research Agenda (SRA), initiated by the STAR Network of Excellence and integrated in the research strategy implemented by the COMET consortium, defines a long-term vision of the needs for, and implementation of, research in radioecology. This reference document, reflecting views from many stakeholder groups and researchers, serves as an input to those responsible for defining EU research call topics through the ALLIANCE SRA statement delivered each year to the EJP-CONCERT (2015-2020). This statement highlights a focused number of priorities for funding. Research in radioecology and related sciences is justified by various drivers, such as policy changes, scientific advances and knowledge gaps, radiological risk perception by the public, and a growing awareness of interconnections between human and ecosystem health. The SRA is being complemented by topical roadmaps that have been initiated by the COMET EC-funded project, with the help and endorsement of the ALLIANCE. The strategy underlying roadmap development is driven by the need for improved mechanistic understanding across radioecology. By meeting this need, we can provide fit-for-purpose human and environmental impact/risk assessments in support of the protection of man and the environment in interaction with society and for the three exposure situations defined by the ICRP (i.e., planned, existing and emergency). Within the framework of the EJP-CONCERT, the development of a joint roadmap is under discussion among all the European research platforms and will highlight the major research needs for the whole radiation protection field and how these are likely to be addressed by 2030.
Liu, Yuan; Yan, Haijing; Zhou, Xiaoguang; Li, Mingxia; Fu, Honggang
2015-12-07
The anchoring of small-sized WN (tungsten nitride) nanoparticles (NPs) with good dispersion on carbon nanotubes (CNTs) offers an effective means of obtaining promising materials for use in electrocatalysis. Herein, an effective method based on a grinding treatment followed by a nitridation process is proposed to realize this goal. In the synthesis, a solution containing H4[SiO4(W3O9)4] (SiW12) and CNTs modified with polyethylenimine (PEI-CNTs) was ground to dryness. Small-sized WN NPs were anchored onto the CNTs with good dispersion after calcination under NH3. Under hydrothermal assembly conditions (absence of grinding), WN particles of larger size and with inferior dispersion were obtained, demonstrating the important role of the grinding process. The benefit of the small-sized WN has been demonstrated by using WN/CNTs as a support for Pt to catalyze the methanol electro-oxidation reaction. The mass activity of Pt-WN/CNTs-G-70 (where G denotes the grinding treatment, and 70 is the loading amount (%) of WN in the WN/CNTs) was evaluated as about 817 mA mg(-1) Pt, better than those of commercial Pt/C (340 mA mg(-1) Pt) and Pt/CNTs (162 mA mg(-1) Pt). The Pt-WN/CNTs-G also displayed good CO tolerance. In contrast, Pt-WN/CNTs prepared without the grinding process displayed an activity of 344 mA mg(-1) Pt, verifying the key role of the grinding treatment in the preparation of WN/CNTs with a good co-catalytic effect. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has small sample size limitations. We used a pooled method in the nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling method to parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means as compared with the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability for all conditions except the Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling method for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
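The pooled resampling idea can be sketched as follows. This is a hedged illustration, not the authors' exact algorithm: both bootstrap groups are drawn from the pooled sample (so the null of a common distribution holds by construction), and the observed Welch-type t statistic is referred to the resulting bootstrap null distribution.

```python
# Sketch of a bootstrap t-test with pooled resampling: under H0 both groups
# share one distribution, so bootstrap samples for each group are drawn from
# the pooled data. Details (statistic, centering) are illustrative choices.
import math
import random

def t_stat(a, b):
    # Welch-type two-sample t statistic (unequal variances allowed)
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

def pooled_bootstrap_test(a, b, n_boot=2000, seed=0):
    rng = random.Random(seed)
    observed = abs(t_stat(a, b))
    pooled = a + b
    exceed = 0
    for _ in range(n_boot):
        ra = [rng.choice(pooled) for _ in a]   # both groups drawn from the pool
        rb = [rng.choice(pooled) for _ in b]
        if abs(t_stat(ra, rb)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)         # add-one Monte Carlo p-value

if __name__ == "__main__":
    rng = random.Random(3)
    a = [rng.gauss(0, 1) for _ in range(8)]    # small samples, as in the paper
    b = [rng.gauss(2, 1) for _ in range(8)]
    print(pooled_bootstrap_test(a, b))         # small p-value: means differ
```

Pooling keeps the bootstrap null calibrated even when the group sizes are too small for the asymptotic t distribution to be trusted.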
Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle
2014-02-01
Bootstrap methods are used in many disciplines to estimate the uncertainty of parameters, including in multi-level or linear mixed-effects models. Residual-based bootstrap methods, which resample both random effects and residuals, are an alternative to the case bootstrap, which resamples the individuals. Most PKPD applications use the case bootstrap, for which software is available. In this study, we evaluated the performance of three bootstrap methods (case bootstrap, nonparametric residual bootstrap and parametric bootstrap) in a simulation study and compared them to an asymptotic method (Asym) for estimating the uncertainty of parameters in nonlinear mixed-effects models (NLMEM) with heteroscedastic error. The simulation used, as an example, the PK model for aflibercept, an anti-angiogenic drug. As expected, we found that the bootstrap methods provided better estimates of uncertainty for parameters in NLMEM with high nonlinearity and balanced designs compared to the Asym, as implemented in MONOLIX. Overall, the parametric bootstrap performed better than the case bootstrap, as the true model and variance distribution were used. However, the case bootstrap is faster and simpler, as it makes no assumptions on the model and preserves both between-subject and residual variability in one resampling step. The performance of the nonparametric residual bootstrap was found to be limited when applied to NLMEM, due to its failure to reflate the variance before resampling in unbalanced designs, where the Asym and the parametric bootstrap performed well and better than the case bootstrap, even with stratification.
Automated modal parameter estimation using correlation analysis and bootstrap sampling
Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.
2018-02-01
The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models of different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and non-unique eigenvectors of coalescent modes simultaneously. The algorithm commences with the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides the bases for two subspaces: one for likely physical modes of the tested system and one for its complement, dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset, and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
Buonaccorsi, John P; Romeo, Giovanni; Thoresen, Magne
2018-03-01
When fitting regression models, measurement error in any of the predictors typically leads to biased coefficients and incorrect inferences. A plethora of methods have been proposed to correct for this. Obtaining standard errors and confidence intervals using the corrected estimators can be challenging and, in addition, there is concern about remaining bias in the corrected estimators. The bootstrap, which is one option to address these problems, has received limited attention in this context. It has usually been employed by simply resampling observations, which, while suitable in some situations, is not always formally justified. In addition, the simple bootstrap does not allow for estimating bias in non-linear models, including logistic regression. Model-based bootstrapping, which can potentially estimate bias and is robust to the original sampling design and to whether the measurement error variance is constant or not, has received limited attention. However, it faces challenges that are not present in handling regression models with no measurement error. This article develops new methods for model-based bootstrapping when correcting for measurement error in logistic regression with replicate measures. The methodology is illustrated using two examples, and a series of simulations is carried out to assess and compare the simple and model-based bootstrap methods, as well as other standard methods. While not always perfect, the model-based approaches offer some distinct improvements over the other methods. © 2017, The International Biometric Society.
Study of the separate exposure method for bootstrap sensitometry on X-ray cine film
International Nuclear Information System (INIS)
Matsuda, Eiji; Sanada, Taizo; Hitomi, Go; Kakuba, Koki; Kangai, Yoshiharu; Ishii, Koushi
1997-01-01
We developed a new method for bootstrap sensitometry that obtains the characteristic curve over a wide range with a smaller number of aluminum steps than the conventional bootstrap method. In this method, the density-density curve is obtained from standard and multiplied exposures of the aluminum step wedge and used for the bootstrap manipulation. The curve is acquired from two separate regions that are added together, e.g., the lower and higher photographic density regions. In this study, we evaluated the usefulness of the new method for cinefluorography in comparison with N.D. filter sensitometry. The shapes of the characteristic curve and the gradient curve obtained with the new method were highly similar to those obtained with N.D. filter sensitometry. Also, the average gradient obtained with the new bootstrap sensitometry method was not significantly different from that obtained by the N.D. filter method. The study revealed that the reliability of the characteristic curve was improved by increasing the number of measured values used to calculate the density-density curve. This new method was useful for obtaining a characteristic curve with a sufficient density range, and the results suggested that it could be applied to specific systems to which the conventional bootstrap method is not applicable. (author)
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructs the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
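The procedure described here, observed counts from the original data but expected counts from an MLE refitted on a bootstrap sample, can be sketched as follows; the exponential model, the bin edges, and the data are assumptions chosen only for illustration:

```python
import math
import random

random.seed(1)

def bootstrap_pearson_stat(data, edges):
    """Pearson statistic with the MLE refitted on a bootstrap sample:
    observed bin counts come from the original data, while expected
    counts use the MLE computed from a bootstrap resample of the data."""
    n = len(data)
    boot = [random.choice(data) for _ in range(n)]
    rate = n / sum(boot)  # exponential-rate MLE from the bootstrap sample

    def cdf(x):
        return 1.0 - math.exp(-rate * x)

    stat = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        observed = sum(lo <= v < hi for v in data)
        expected = n * (cdf(hi) - cdf(lo))
        stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative data and bins; `stat` is compared to chi-squared with k - 1 df.
data = [random.expovariate(2.0) for _ in range(500)]
edges = [0.0, 0.2, 0.4, 0.7, 1.2, float("inf")]
stat = bootstrap_pearson_stat(data, edges)
```

The only change from the classical test is which sample the MLE is fitted to; the binning and the statistic itself are unchanged.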
Bootstrap Restricted Likelihood Ratio Test for the Detection of Rare Variants
Zeng, Ping; Wang, Ting
2015-01-01
In this paper the detection of rare-variant associations with continuous phenotypes of interest is investigated via the likelihood-ratio-based variance component test under the framework of linear mixed models. The hypothesis testing is challenging and nonstandard, since under the null the variance component lies on the boundary of its parameter space. In this situation the usual asymptotic chi-square distribution of the likelihood ratio statistic does not necessarily hold. To circumvent the derivation of the null distribution we resort to the bootstrap method due to its generic applicability and ease of implementation. Both parametric and nonparametric bootstrap likelihood ratio tests are studied. Numerical studies are implemented to evaluate the performance of the proposed bootstrap likelihood ratio test and compare it to some existing methods for the identification of rare variants. To reduce the computational time of the bootstrap likelihood ratio test we propose an effective mixture approximation to the bootstrap null distribution. The GAW17 data are used to illustrate the proposed test. PMID:26069459
Electron Bernstein wave-bootstrap current synergy in the National Spherical Torus Experiment
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
Currents driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code to determine the degree of synergy between them. A target β=40% NSTX [M. Ono, S. Kaye, M. Peng et al., Proceedings of the 17th IAEA Fusion Energy Conference, edited by M. Spak (IAEA, Vienna, Austria, 1999), Vol. 3, p. 1135] plasma is examined. A simple bootstrap model in the collisional-quasilinear CQL3D Fokker-Planck code (National Technical Information Service document No. DE93002962) is used in these studies: the transiting electron distributions are connected in velocity space at the trapped-passing boundary to trapped-electron distributions that are displaced radially by a half-banana-width outwards/inwards for the co-passing/counter-passing regions. This model agrees well with standard bootstrap current calculations over the outer 60% of the plasma radius. A relatively small synergistic net bootstrap current is obtained for EBW power up to 4 MW. Locally, the bootstrap current density increases in proportion to the increased plasma pressure, and this effect can significantly affect the radial profile of the driven current.
Austin, Peter C
2008-10-01
Researchers have proposed using bootstrap resampling in conjunction with automated variable selection methods to identify predictors of an outcome and to develop parsimonious regression models. Using this method, multiple bootstrap samples are drawn from the original data set. Traditional backward variable elimination is used in each bootstrap sample, and the proportion of bootstrap samples in which each candidate variable is identified as an independent predictor of the outcome is determined. The performance of this method for identifying predictor variables has not been examined. Monte Carlo simulation methods were used to determine the ability of bootstrap model selection methods to correctly identify predictors of an outcome when those variables that are selected for inclusion in at least 50% of the bootstrap samples are included in the final regression model. We compared the performance of the bootstrap model selection method to that of conventional backward variable elimination. Bootstrap model selection tended to identify the true regression model in approximately the same proportion of simulated data sets as conventional backward variable elimination. Overall, bootstrap model selection performed comparably to backward variable elimination for identifying the true predictors of a binary outcome.
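The 50%-inclusion rule can be sketched as below. Since a full backward-elimination fit is beyond a short example, a predictor counts as "selected" in a bootstrap sample when its absolute correlation with the outcome exceeds a cutoff; this stand-in, the cutoff, and the toy data are all assumptions, not the paper's procedure:

```python
import random

random.seed(42)

def corr(a, b):
    """Pearson correlation, implemented inline to stay dependency-free."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    den = (sum((u - ma) ** 2 for u in a) * sum((v - mb) ** 2 for v in b)) ** 0.5
    return num / den

def bootstrap_inclusion(x_cols, y, n_boot=200, cutoff=0.2):
    """Fraction of bootstrap samples in which each predictor is 'selected'.
    Selection here is a simplified stand-in for backward elimination: a
    predictor is kept when |correlation with outcome| exceeds `cutoff`."""
    n = len(y)
    hits = [0] * len(x_cols)
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        yb = [y[i] for i in idx]
        for j, col in enumerate(x_cols):
            if abs(corr([col[i] for i in idx], yb)) > cutoff:
                hits[j] += 1
    return [h / n_boot for h in hits]

# Toy data: x1 drives y, x2 is pure noise (both hypothetical).
x1 = [random.gauss(0, 1) for _ in range(100)]
x2 = [random.gauss(0, 1) for _ in range(100)]
y = [v + random.gauss(0, 0.5) for v in x1]
props = bootstrap_inclusion([x1, x2], y)
selected = [j for j, p in enumerate(props) if p >= 0.5]  # the 50% rule
```

The final model would then be refitted using only the predictors in `selected`.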
Introduction of Bootstrap Current Reduction in the Stellarator Optimization Using the Algorithm DAB
International Nuclear Information System (INIS)
Castejón, F.; Gómez-Iglesias, A.; Velasco, J. L.
2015-01-01
This work introduces a new optimization criterion in the DAB (Distributed Asynchronous Bees) code. With this new criterion, DAB now includes the equilibrium and Mercier stability criteria, the minimization of the Bxgrad(B) criterion, which ensures the reduction of neoclassical transport and the improvement of the confinement of fast particles, and the reduction of the bootstrap current. We started from a neoclassically optimised configuration of the helias type and imposed the reduction of the bootstrap current. The resulting configuration presents only a modest reduction of the total bootstrap current, but the local current density is reduced along the minor radius. Further investigations are under way to understand the reason for this modest improvement.
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based on assumptions that do not always hold in practice. The aim of this thesis was to illustrate the concept of bootstrap-based confidence estimation in PCA and MSPC. It particularly shows how to build bootstrap-based confidence limits in these areas to be used as alternatives to the traditional/asymptotic limits. ……the recommended decisions) to build rational confidence limits were given. Two NIR datasets were used to study the effect of outliers and bimodal distributions on the bootstrap-based limits. The results showed that bootstrapping can give a reasonable estimate of distributions for scores and loadings. It can also……
The S-matrix bootstrap. Part I: QFT in AdS
Paulos, Miguel F.; Penedones, Joao; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-01
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Constructing Optimal Prediction Intervals by Using Neural Networks and Bootstrap Method.
Khosravi, Abbas; Nahavandi, Saeid; Srinivasan, Dipti; Khosravi, Rihanna
2015-08-01
This brief proposes an efficient technique for the construction of optimized prediction intervals (PIs) by using the bootstrap technique. The method employs an innovative PI-based cost function in the training of neural networks (NNs) used for estimation of the target variance in the bootstrap method. An optimization algorithm is developed for minimization of the cost function and adjustment of NN parameters. The performance of the optimized bootstrap method is examined for seven synthetic and real-world case studies. It is shown that application of the proposed method improves the quality of constructed PIs by more than 28% over the existing technique, leading to narrower PIs with a coverage probability greater than the nominal confidence level.
arXiv The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F.; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-21
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Closure of the operator product expansion in the non-unitary bootstrap
Energy Technology Data Exchange (ETDEWEB)
Esterlis, Ilya [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States); Fitzpatrick, A. Liam [Department of Physics, Boston University,Commonwealth Ave, Boston, MA, 02215 (United States); Ramirez, David M. [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States)
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in http://arxiv.org/abs/1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
A bootstrap test for comparing two variances: simulation of size and power in small samples.
Sun, Jiajing; Chernick, Michael R; LaBudde, Robert A
2011-11-01
An F statistic was proposed by Good and Chernick (1993) in an unpublished paper, to test the hypothesis of the equality of variances from two independent groups using the bootstrap; see Hall and Padmanabhan (1997) for a published reference where Good and Chernick (1993) is discussed. We look at various forms of bootstrap tests that use the F statistic to see whether any or all of them maintain the nominal size of the test over a variety of population distributions when the sample size is small. Chernick and LaBudde (2010) and Schenker (1985) showed that bootstrap confidence intervals for variances tend to provide considerably less coverage than their theoretical asymptotic coverage for skewed population distributions such as a chi-squared with 10 degrees of freedom or less or a log-normal distribution. The same difficulties may also be expected when looking at the ratio of two variances. Since bootstrap tests are related to constructing confidence intervals for the ratio of variances, we simulated the performance of these tests when the population distributions are gamma(2,3), uniform(0,1), Student's t distribution with 10 degrees of freedom (df), normal(0,1), and log-normal(0,1), similar to those used in Chernick and LaBudde (2010). We find, surprisingly, that the results for the size of the tests are valid (reasonably close to the asymptotic value) for all the various bootstrap tests. Hence we also conducted a power comparison, and we find that bootstrap tests appear to have reasonable power for testing equivalence of variances.
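One common way to bootstrap such a test, pooling the mean-centered samples so that the null of equal variances holds in the resampling distribution, can be sketched as follows (the exact variants compared in the paper differ in detail; the pooling scheme and the data here are assumptions):

```python
import random
import statistics

random.seed(7)

def var_ratio(a, b):
    return statistics.variance(a) / statistics.variance(b)

def bootstrap_variance_test(x, y, n_boot=999):
    """Two-sided bootstrap test of equal variances via the F-like ratio.

    Null resamples are drawn from the pooled, mean-centered observations,
    which enforces equal variances under resampling (one of several ways
    to impose the null; the paper compares multiple variants)."""
    obs = var_ratio(x, y)
    pooled = ([v - statistics.mean(x) for v in x] +
              [v - statistics.mean(y) for v in y])
    count = 0
    for _ in range(n_boot):
        xb = [random.choice(pooled) for _ in range(len(x))]
        yb = [random.choice(pooled) for _ in range(len(y))]
        r = var_ratio(xb, yb)
        # two-sided: a resampled ratio at least as extreme as observed
        if max(r, 1 / r) >= max(obs, 1 / obs):
            count += 1
    return (count + 1) / (n_boot + 1)

x = [random.gauss(0, 1) for _ in range(25)]
y = [random.gauss(0, 1) for _ in range(25)]
p = bootstrap_variance_test(x, y)
```

Simulating the rejection rate of this p-value over many generated data sets, as the paper does, estimates the actual size of the test.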
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio
2014-01-01
……measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components, as we show by tests on a hollow cylinder workpiece.
Improving Web Learning through model Optimization using Bootstrap for a Tour-Guide Robot
Directory of Open Access Journals (Sweden)
Rafael León
2012-09-01
We perform a review of Web mining techniques and describe a bootstrap statistics methodology applied to pattern model classifier optimization and verification for supervised learning in tour-guide robot knowledge repository management. It is virtually impossible to test Web page classifiers and many other Internet applications thoroughly with purely empirical data, due to the need for human intervention to generate training sets and test sets. We propose using the computer-based bootstrap paradigm to design a test environment in which classifiers are checked with better reliability.
Orthogonal projections and bootstrap resampling procedures in the study of infraspecific variation
Directory of Open Access Journals (Sweden)
Luiza Carla Duarte
1998-12-01
The effect of an increase in quantitative continuous characters resulting from indeterminate growth upon the analysis of population differentiation was investigated using, as an example, a set of continuous characters measured as distance variables in 10 populations of a rodent species. The data before and after correction for allometric size effects using orthogonal projections were analyzed with a parametric bootstrap resampling procedure applied to canonical variate analysis. The variance component of the distance measures attributable to indeterminate growth within the populations was found to be substantial, although the ordination of the populations was not affected, as evidenced by the relative and absolute positions of the centroids. The covariance pattern of the distance variables used to infer the nature of the morphological differences was strongly influenced by indeterminate growth. The uncorrected data produced a misleading picture of morphological differentiation by indicating that groups of populations differed in size. However, the data corrected for allometric effects clearly demonstrated that populations differed morphologically both in size and shape. These results are discussed in terms of the analysis of morphological differentiation among populations and the definition of infraspecific geographic units.
Stability of response characteristics of a Delphi panel: application of bootstrap data expansion
Directory of Open Access Journals (Sweden)
Cole Bryan R
2005-12-01
Background: Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. Methods: The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. Results: The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. Conclusion: Panels of similarly trained experts (who possess a general understanding in the field of interest) provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making.
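The bootstrap data-expansion step, resampling a small panel's responses with replacement and summarizing the response characteristics, can be sketched for a single item; the 23 panel ratings below are hypothetical Likert-type scores:

```python
import random
import statistics

random.seed(0)

def bootstrap_expand(responses, n_iter=1000):
    """Augment a small Delphi panel by resampling with replacement and
    summarise the distribution of the resampled item mean, mirroring the
    stability check in the study (item values are hypothetical)."""
    means = []
    for _ in range(n_iter):
        sample = [random.choice(responses) for _ in responses]
        means.append(statistics.mean(sample))
    means.sort()
    return {
        "mean": statistics.mean(means),
        "sd": statistics.stdev(means),
        "ci95": (means[int(0.025 * n_iter)], means[int(0.975 * n_iter)]),
    }

# 23 hypothetical expert ratings for one survey item.
panel = [5, 6, 6, 7, 7, 7, 8, 8, 8, 8, 9, 9, 9, 9, 9,
         10, 10, 10, 10, 10, 10, 10, 10]
summary = bootstrap_expand(panel)
```

Comparing `summary` across 1000 and 2000 iterations against the raw panel statistics is the stability comparison the study performs for each of its 54 items.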
Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters
Kim, T.; Kim, Y. S.
2017-12-01
The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data usually assumes that the observation data are statistically stationary, and a parametric method based on the parameters of a probability distribution is applied. A parametric method requires a sufficiently large sample of reliable data; in Korea, however, snowfall observations must be supplemented because the number of days with snowfall observations and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted a frequency analysis for snowfall using the bootstrap method and the SIR algorithm, resampling methods that can overcome the problems of insufficient data. For the 58 meteorological stations distributed evenly across Korea, the probable snowfall depth was estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreases at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can perform frequency analysis of snowfall depth from insufficient observed samples, and that the approach can be applied to the interpretation of other natural disasters with seasonal characteristics, such as summer typhoons. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
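A minimal non-parametric version of such a frequency analysis, bootstrapping an empirical return level, might look like the sketch below; the return period, the quantile rule, and the synthetic "snowfall" values are assumptions, and the SIR algorithm is not implemented here:

```python
import random

random.seed(3)

def bootstrap_return_level(annual_max, return_period=100, n_boot=2000):
    """Non-parametric bootstrap estimate of a T-year return level.

    Each resample's empirical quantile at 1 - 1/T is recorded; the mean
    of these is the point estimate and the 2.5/97.5 percentiles give a
    confidence interval. Data values are made up for illustration."""
    q = 1.0 - 1.0 / return_period
    n = len(annual_max)
    levels = []
    for _ in range(n_boot):
        sample = sorted(random.choice(annual_max) for _ in range(n))
        levels.append(sample[min(int(q * n), n - 1)])  # empirical quantile
    levels.sort()
    est = sum(levels) / n_boot
    return est, (levels[int(0.025 * n_boot)], levels[int(0.975 * n_boot)])

# 40 hypothetical years of annual maximum daily snowfall depth (cm).
snow = [random.uniform(5, 60) for _ in range(40)]
est, ci = bootstrap_return_level(snow)
```

With only 40 years of record the empirical quantile is crude; this is exactly the short-sample situation the resampling approach is meant to probe.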
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping); common examples include logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
A model for bootstrap current calculations with bounce averaged Fokker-Planck codes
Westerhof, E.; Peeters, A.G.
1996-01-01
A model is presented that allows the calculation of the neoclassical bootstrap current originating from the radial electron density and pressure gradients in standard (2+1)D bounce averaged Fokker-Planck codes. The model leads to an electron momentum source located almost exclusively at the
Barrera, Begoña Barrios; Figalli, Alessio; Valdinoci, Enrico
2012-01-01
We prove that $C^{1,\\alpha}$ $s$-minimal surfaces are automatically $C^\\infty$. For this, we develop a new bootstrap regularity theory for solutions of integro-differential equations of very general type, which we believe is of independent interest.
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Bootstrapping Malmquist indices for Danish seiners in the North Sea and Skagerrak
DEFF Research Database (Denmark)
Hoff, Ayoe
2006-01-01
DEA scores or related parameters. The bootstrap method for estimating confidence intervals of deterministic parameters can however be applied to estimate confidence intervals for DEA scores. This method is applied in the present paper for assessing TFP changes between 1987 and 1999 for the fleet...
Bootstrapping Multifractals: Surrogate Data from Random Cascades on Wavelet Dyadic Trees
Czech Academy of Sciences Publication Activity Database
Paluš, Milan
2008-01-01
Roč. 101, č. 13 (2008), 134101-1-134101-4 ISSN 0031-9007 EU Projects: European Commission(XE) 517133 - BRACCIA Grant - others:GA AV ČR(CZ) 1ET110190504 Institutional research plan: CEZ:AV0Z10300504 Keywords : multifractal * bootstrap * hypothesis testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 7.180, year: 2008
Bootstrap Approach To Compare the Slopes of Two Calibrations When Few Standards Are Available.
Estévez-Pérez, Graciela; Andrade, Jose M; Wilcox, Rand R
2016-02-16
Comparing the slopes of aqueous-based and standard-addition calibration procedures is almost a daily task in analytical laboratories. As the usual protocols imply very few standards, sound statistical inference and conclusions are hard to obtain with the current classical tests (e.g., the t-test), which may greatly affect decision-making. Thus, there is a need for robust statistics that are not distorted by small samples of experimental values obtained from analytical studies. Several promising alternatives based on bootstrapping are studied in this paper under the typical constraints common in laboratory work. The impact of the number of standards, homoscedasticity or heteroscedasticity, three variance patterns, and three error distributions on least-squares fits was considered (in total, 144 simulation scenarios). The Student's t-test is the most valuable procedure when the normality assumption is true and homoscedasticity is present, although it can be highly affected by outliers. A wild bootstrap method leads to average rejection percentages that are closer to the nominal level in almost every situation, and it is recommended for laboratories working with a small number of standards. Finally, it was seen that the Theil-Sen percentile bootstrap statistic is very robust but its rejection percentages depart from the nominal ones (…bootstrap principles to compare the slopes of two calibration lines.
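A percentile-bootstrap comparison of two calibration slopes, a simplified relative of the wild bootstrap the paper recommends, can be sketched as follows; the pair-resampling scheme and the synthetic calibration data are assumptions for illustration:

```python
import random
import statistics

random.seed(11)

def slope(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y)) /
            sum((a - mx) ** 2 for a in x))

def resample_idx(n):
    while True:
        idx = [random.randrange(n) for _ in range(n)]
        if len(set(idx)) > 1:  # avoid degenerate resamples with one x value
            return idx

def boot_slope_diff_ci(x1, y1, x2, y2, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap CI for the difference of two calibration
    slopes, resampling (x, y) pairs within each line. The slopes are
    judged different when the interval excludes zero."""
    diffs = []
    for _ in range(n_boot):
        i1, i2 = resample_idx(len(x1)), resample_idx(len(x2))
        diffs.append(slope([x1[i] for i in i1], [y1[i] for i in i1]) -
                     slope([x2[i] for i in i2], [y2[i] for i in i2]))
    diffs.sort()
    return diffs[int(alpha / 2 * n_boot)], diffs[int((1 - alpha / 2) * n_boot)]

conc = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]  # few standards, as is typical
aq = [0.02 + 1.00 * c + random.gauss(0, 0.03) for c in conc]  # aqueous line
sa = [0.05 + 1.40 * c + random.gauss(0, 0.03) for c in conc]  # standard addition
lo_ci, hi_ci = boot_slope_diff_ci(conc, aq, conc, sa)
```

The wild bootstrap studied in the paper instead perturbs residuals at the fixed design points, which better respects the small, fixed set of standards.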
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
A common-gate bootstrapped CMOS rectifier for VHF isolated DC-DC converter
Pan, Dongfang; Zhang, Feng; Huang, Lu; Li, Jinliang
2017-06-01
A common-gate bootstrapped CMOS rectifier dedicated to VHF (very high frequency) isolated DC-DC converters is proposed. It uses the common-gate bootstrap technique to compensate for the power loss due to the threshold voltage and to solve the reflux problem of the conventional rectifier circuit. As a result, it improves the power conversion efficiency (PCE) and the voltage conversion ratio (VCR). The design saves almost 90% of the area compared to a previously reported double-capacitor structure. In addition, we compare the previous rectifier with the proposed common-gate bootstrapped rectifier at the same area; simulation results show that the PCE and VCR of the proposed structure are superior to those of other structures. The proposed common-gate bootstrapped rectifier was fabricated in a CSMC 0.5 μm BCD process. The measured maximum PCE is 86% and the VCR reaches 77% at an operating frequency of 20 MHz. The average PCE is about 79% and the average VCR reaches 71% over the frequency range of 30-70 MHz. The measured PCE and VCR are improved compared to previous results.
Forward Kinematic Analysis of Tip-Tilt-Piston Parallel Manipulator using Secant-Bootstrap Method
Majidian, A.; Amani, A.; Golipour, M.; Amraei, A.
2014-01-01
This paper deals with the application of the Secant-Bootstrap Method (SBM) to solve the closed-form forward kinematics of a new three-degree-of-freedom (DOF) parallel manipulator with inextensible limbs and base-mounted actuators. The manipulator has higher resolution and precision than the existing
Czech Academy of Sciences Publication Activity Database
Kyselý, Jan
2008-01-01
Roč. 47, č. 12 (2008), s. 3236-3251 ISSN 1558-8424 R&D Projects: GA ČR GA205/06/1535 Institutional research plan: CEZ:AV0Z30420517 Keywords : bootstrap * resampling * extreme value analysis * Generalized extreme value distribution * Gumbel distribution Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 1.761, year: 2008
A bootstrap procedure to select hyperspectral wavebands related to tannin content
Ferwerda, J.G.; Skidmore, A.K.; Stein, A.
2006-01-01
Detection of hydrocarbons in plants with hyperspectral remote sensing is hampered by overlapping absorption pits, while the `optimal' wavebands for detecting some surface characteristics (e.g. chlorophyll, lignin, tannin) may shift. We combined a phased regression with a bootstrap procedure to find
Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B
2017-08-01
Objectives: Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods: The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the score CI (as implemented in the StatXact, SAS, and R PropCI packages), and (4) the bootstrapping technique. Results: The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0, 0.073), StatXact (0, 0.064), SAS score method (0, 0.057), R PropCI (0, 0.062), and bootstrap (0, 0.048). Similar trends were observed for other sample sizes. Conclusions: When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and accompanying free open-source R package were developed to yield realistic estimates easily.
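The key step, locating the lowest population sensitivity whose binomial median equals the observed count and then bootstrapping the negative LR from it, can be sketched as below. This is a sketch in the spirit of bootLR, not the package's exact algorithm:

```python
import random

random.seed(5)

def neg_lr_upper(n_dis, n_nondis, spec_hat, n_boot=5000, level=0.95):
    """Bootstrap upper bound for the negative LR when the sample
    sensitivity is 100% (sketch only; bootLR's exact algorithm differs).

    The lowest population sensitivity whose binomial median equals n_dis
    satisfies p**n_dis = 0.5, i.e. p = 0.5 ** (1 / n_dis)."""
    p_sens = 0.5 ** (1.0 / n_dis)
    draws = []
    for _ in range(n_boot):
        sens = sum(random.random() < p_sens for _ in range(n_dis)) / n_dis
        spec = sum(random.random() < spec_hat for _ in range(n_nondis)) / n_nondis
        if spec > 0:
            draws.append((1.0 - sens) / spec)
    draws.sort()
    return p_sens, draws[int(level * len(draws))]

# Mirror the abstract's example: 100 diseased, 100 non-diseased, spec 60%.
p_min, upper = neg_lr_upper(100, 100, 0.60)
# p_min is about 0.9931, matching the 99.31% quoted in the Results.
```

The lower bound of the interval is taken as 0, since the observed sample never produces a false negative.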
Calculating Power by Bootstrap, with an Application to Cluster-Randomized Trials.
Kleinman, Ken; Huang, Susan S
2016-01-01
A key requirement for a useful power calculation is that the calculation mimic the data analysis that will be performed on the actual data, once that data is observed. Close approximations may be difficult to achieve using analytic solutions, however, and thus Monte Carlo approaches, including both simulation and bootstrap resampling, are often attractive. One setting in which this is particularly true is cluster-randomized trial designs. However, Monte Carlo approaches are useful in many additional settings as well. Calculating power for cluster-randomized trials using analytic or simulation-based methods is frequently unsatisfactory due to the complexity of the data analysis methods to be employed and to the sparseness of data to inform the choice of important parameters in these methods. We propose that among Monte Carlo methods, bootstrap approaches are most likely to generate data similar to the observed data. In bootstrap approaches, real data are resampled to build complete data sets based on real data that resemble the data for the intended analyses. In contrast, simulation methods would use the real data to estimate parameters for the data, and would then simulate data using these parameters. We describe means of implementing bootstrap power calculation. We demonstrate bootstrap power calculation for a cluster-randomized trial with a censored survival outcome and a baseline observation period. Bootstrap power calculation is a natural application of resampling methods. It provides a relatively simple solution to power calculation that is likely to be more accurate than analytic solutions or simulation-based calculations, in the sense that the bootstrap approach does not rely on the assumptions inherent in analytic calculations. This method of calculation has several important strengths. Notably, it is simple to achieve great fidelity to the proposed data analysis method and there is no requirement for parameter estimates, or estimates of their variability
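A minimal sketch of the bootstrap power calculation idea, for a simple two-arm comparison rather than the authors' clustered survival setting; the function name and the plain z-test analysis are illustrative assumptions:

```python
import numpy as np

def bootstrap_power(pilot, effect, n_per_arm, n_boot=2000, seed=1):
    """Estimate power by resampling a pilot sample.

    Each iteration builds a control and a treatment arm by resampling
    the pilot data with replacement, shifts the treatment arm by the
    hypothesized effect, and applies the planned analysis (here a plain
    two-sided z-test at the 5% level). Power is the rejection fraction.
    """
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_boot):
        ctrl = rng.choice(pilot, n_per_arm, replace=True)
        trt = rng.choice(pilot, n_per_arm, replace=True) + effect
        se = np.sqrt(ctrl.var(ddof=1) / n_per_arm + trt.var(ddof=1) / n_per_arm)
        if abs(trt.mean() - ctrl.mean()) / se > 1.96:
            rejections += 1
    return rejections / n_boot

pilot = np.random.default_rng(0).normal(10.0, 2.0, size=50)  # stand-in pilot data
power = bootstrap_power(pilot, effect=1.0, n_per_arm=100)
```

Because the arms are rebuilt from real (here, simulated stand-in) pilot data rather than from a fitted parametric model, the resamples inherit whatever distributional quirks the pilot data contain, which is the method's main selling point.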
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors.
Loop equations and bootstrap methods in the lattice
Directory of Open Access Journals (Sweden)
Peter D. Anderson
2017-08-01
Full Text Available Pure gauge theories can be formulated in terms of Wilson Loops by means of the loop equation. In the large-N limit this equation closes in the expectation value of single loops. In particular, using the lattice as a regulator, it becomes a well defined equation for a discrete set of loops. In this paper we study different numerical approaches to solving this equation. Previous ideas gave good results in the strong coupling region. Here we propose an alternative method based on the observation that certain matrices ρ̂ of Wilson loop expectation values are positive definite. They also have unit trace (ρ̂ ⪰ 0, Tr ρ̂ = 1); in fact, they can be defined as reduced density matrices in the space of open loops after tracing over color indices, and can be used to define an entropy associated with the loss of information due to such a trace, S_WL = −Tr[ρ̂ ln ρ̂]. The condition that such matrices are positive definite allows us to study the weak coupling region which is relevant for the continuum limit. In the exactly solvable case of two dimensions this approach gives very good results by considering just a few loops. In four dimensions it gives good results in the weak coupling region and therefore is complementary to the strong coupling expansion. We compare the results with standard Monte Carlo simulations.

Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071
Bootstrapping Domain Knowledge Exploration using Conceptual Mapping of Wikipedia
Mai Eldefrawi; Ahmed Sharaf eldin Ahmed; Adel Elsayed
2013-01-01
Wikipedia is one of the largest online encyclopedias that exist in a hypertext form. This hypertext nature prevents Wikipedia's potential from being fully discovered. Therefore, the focus of this paper is on the role of domain knowledge in supporting the exploration of classical encyclopedic content, which in this case is Wikipedia. A main contribution of this work is a methodology for identifying the nature, the form and the role of domain knowledge expressed in conceptual form. It's ...
Loop equation in Lattice gauge theories and bootstrap methods
Directory of Open Access Journals (Sweden)
Anderson Peter
2018-01-01
Full Text Available In principle the loop equation provides a complete formulation of a gauge theory purely in terms of Wilson loops. In the case of lattice gauge theories the loop equation is a well defined equation for a discrete set of quantities and can be easily solved at strong coupling either numerically or by series expansion. At weak coupling, however, we argue that the equations are not well defined unless a certain set of positivity constraints is imposed. Using semi-definite programming we show numerically that, for a pure Yang-Mills theory in two, three and four dimensions, these constraints lead to good results for the mean value of the energy at weak coupling. Further, the positivity constraints imply the existence of a positive definite matrix whose entries are expectation values of Wilson loops. This matrix allows us to define a certain entropy associated with the Wilson loops. We compute this entropy numerically and describe some of its properties. Finally we discuss some preliminary ideas for extending the results to supersymmetric N = 4 SYM.
Varouchakis, Emmanouil; Hristopulos, Dionissios
2015-04-01
Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs
Testing strong interaction theories
International Nuclear Information System (INIS)
Ellis, J.
1979-01-01
The author discusses possible tests of the current theories of the strong interaction, in particular, quantum chromodynamics. High energy e⁺e⁻ interactions should provide an excellent means of studying the strong force. (W.D.L.)
Estimating Parameter Uncertainty in Binding-Energy Models by the Frequency-Domain Bootstrap
Bertsch, G. F.; Bingham, Derek
2017-12-01
We propose using the frequency-domain bootstrap (FDB) to estimate errors of modeling parameters when the modeling error is itself a major source of uncertainty. Unlike the usual bootstrap or the simple χ2 analysis, the FDB can take into account correlations between errors. It is also very fast compared to the Gaussian process Bayesian estimate as often implemented for computer model calibration. The method is illustrated with a simple example, the liquid drop model of nuclear binding energies. We find that the FDB gives a more conservative estimate of the uncertainty in liquid drop parameters than the χ2 method, and is in fair accord with more empirical estimates. For the nuclear physics application, there are no apparent obstacles to apply the method to the more accurate and detailed models based on density-functional theory.
Bootstrapping six-gluon scattering in planar ${\cal N}=4$ super-Yang-Mills theory
Dixon, Lance J; Duhr, Claude; von Hippel, Matt; Pennington, Jeffrey
2014-01-01
We describe the hexagon function bootstrap for solving for six-gluon scattering amplitudes in the large $N_c$ limit of ${\cal N}=4$ super-Yang-Mills theory. In this method, an ansatz for the finite part of these amplitudes is constrained at the level of amplitudes, not integrands, using boundary information. In the near-collinear limit, the dual picture of the amplitudes as Wilson loops leads to an operator product expansion which has been solved using integrability by Basso, Sever and Vieira. Factorization of the amplitudes in the multi-Regge limit provides additional boundary data. This bootstrap has been applied successfully through four loops for the maximally helicity violating (MHV) configuration of gluon helicities, and through three loops for the non-MHV case.
Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut
2017-05-01
This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. Most previous studies have applied conventional data envelopment analysis (DEA) models; however, DEA is a deterministic method that does not allow identification of the environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies by applying a double-bootstrap DEA model. Results show that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, the percentage of non-revenue water and customer density were found to influence the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
A bootstrap test for instrument validity in heterogeneous treatment effect models
Kitagawa, Toru
2013-01-01
This paper develops a specification test for the instrument validity conditions in the heterogeneous treatment effect model with a binary treatment and a discrete instrument. A necessary testable implication for the joint restriction of instrument exogeneity and instrument monotonicity is given by nonnegativity of point-identifiable complier's outcome densities. Our specification test infers this testable implication using a Kolmogorov-Smirnov type test statistic. We provide a bootstrap algor...
Asymptotic Expansions and Bootstrapping Distributions for Dependent Variables: A Martingale Approach
Mykland, Per Aslak
1992-01-01
The paper develops a one-step triangular array asymptotic expansion for continuous martingales which are asymptotically normal. Mixing conditions are not required, but the quadratic variations of the martingales must satisfy a law of large numbers and a central limit type condition. From this result we derive expansions for the distributions of estimators in asymptotically ergodic differential equation models, and also for the bootstrapping estimators of these distributions.
Parametric bootstrap methods for testing multiplicative terms in GGE and AMMI models.
Forkman, Johannes; Piepho, Hans-Peter
2014-09-01
The genotype main effects and genotype-by-environment interaction effects (GGE) model and the additive main effects and multiplicative interaction (AMMI) model are two common models for analysis of genotype-by-environment data. These models are frequently used by agronomists, plant breeders, geneticists and statisticians for analysis of multi-environment trials. In such trials, a set of genotypes, for example, crop cultivars, are compared across a range of environments, for example, locations. The GGE and AMMI models use singular value decomposition to partition genotype-by-environment interaction into an ordered sum of multiplicative terms. This article deals with the problem of testing the significance of these multiplicative terms in order to decide how many terms to retain in the final model. We propose parametric bootstrap methods for this problem. Models with fixed main effects, fixed multiplicative terms and random normally distributed errors are considered. Two methods are derived: a full and a simple parametric bootstrap method. These are compared with the alternatives of using approximate F-tests and cross-validation. In a simulation study based on four multi-environment trials, both bootstrap methods performed well with regard to Type I error rate and power. The simple parametric bootstrap method is particularly easy to use, since it only involves repeated sampling of standard normally distributed values. This method is recommended for selecting the number of multiplicative terms in GGE and AMMI models. The proposed methods can also be used for testing components in principal component analysis. © 2014, The International Biometric Society.
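The "simple parametric bootstrap" idea above, which only involves repeated sampling of standard normally distributed values, can be sketched roughly as follows. The function name and the choice of test statistic are illustrative; the authors' exact statistic and degrees-of-freedom handling may differ:

```python
import numpy as np

def first_term_pvalue(E, n_boot=2000, seed=2):
    """Simple parametric bootstrap test for the first multiplicative
    term of an interaction matrix E (residuals after main effects).

    Test statistic: share of the interaction sum of squares captured by
    the first singular value. The null reference distribution is built
    by repeatedly sampling standard normal matrices of the same shape.
    """
    rng = np.random.default_rng(seed)
    g, e = E.shape

    def stat(M):
        s = np.linalg.svd(M, compute_uv=False)
        return s[0] ** 2 / np.sum(s ** 2)

    obs = stat(E)
    null = np.array([stat(rng.standard_normal((g, e))) for _ in range(n_boot)])
    return np.mean(null >= obs)  # bootstrap p-value
```

A matrix with a strong rank-one (multiplicative) component yields a small p-value, while pure noise does not, which is exactly the decision rule for retaining a term.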
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogeneous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62 %, and the mean scale efficiency is 88 %, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Forecasting Model for IPTV Service in Korea Using Bootstrap Ridge Regression Analysis
Lee, Byoung Chul; Kee, Seho; Kim, Jae Bum; Kim, Yun Bae
The telecom firms in Korea are taking new steps to prepare for the next generation of convergence services, IPTV. In this paper we describe our analysis of effective methods for demand forecasting for IPTV broadcasting. We examined three scenarios based on aspects of the potential IPTV market and compared the results. The forecasting method used in this paper is the multi-generation substitution model with bootstrap ridge regression analysis.
Gaussian process regression bootstrapping: exploring the effects of uncertainty in time course data.
Kirk, Paul D W; Stumpf, Michael P H
2009-05-15
Although widely accepted that high-throughput biological data are typically highly noisy, the effects that this uncertainty has upon the conclusions we draw from these data are often overlooked. However, in order to assign any degree of confidence to our conclusions, we must quantify these effects. Bootstrap resampling is one method by which this may be achieved. Here, we present a parametric bootstrapping approach for time-course data, in which Gaussian process regression (GPR) is used to fit a probabilistic model from which replicates may then be drawn. This approach implicitly allows the time dependence of the data to be taken into account, and is applicable to a wide range of problems. We apply GPR bootstrapping to two datasets from the literature. In the first example, we show how the approach may be used to investigate the effects of data uncertainty upon the estimation of parameters in an ordinary differential equations (ODE) model of a cell signalling pathway. Although we find that the parameter estimates inferred from the original dataset are relatively robust to data uncertainty, we also identify a distinct second set of estimates. In the second example, we use our method to show that the topology of networks constructed from time-course gene expression data appears to be sensitive to data uncertainty, although there may be individual edges in the network that are robust in light of present data. Matlab code for performing GPR bootstrapping is available from our web site: http://www3.imperial.ac.uk/theoreticalsystemsbiology/data-software/.
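A rough numerical sketch of the GPR bootstrapping idea: fit a Gaussian process to a time course, then draw replicate noisy data sets from the posterior. Hyperparameters are fixed for brevity (the authors' Matlab code would fit them), and all names are illustrative:

```python
import numpy as np

def gpr_bootstrap_replicates(t, y, n_rep=100, ell=1.0, sf=1.0,
                             noise=0.1, seed=3):
    """Draw parametric-bootstrap replicates of a time course.

    A GP with an RBF kernel (length scale ell, signal scale sf,
    observation noise) is fit to (t, y); replicate noisy data sets are
    then sampled from the posterior at the same time points.
    """
    rng = np.random.default_rng(seed)
    t = np.asarray(t, dtype=float)
    n = t.size
    K = sf ** 2 * np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / ell ** 2)
    Ky = K + noise ** 2 * np.eye(n)
    mean = K @ np.linalg.solve(Ky, y)            # posterior mean
    cov = K - K @ np.linalg.solve(Ky, K)         # posterior covariance
    cov += noise ** 2 * np.eye(n)                # re-add observation noise
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(n))
    return mean + (L @ rng.standard_normal((n, n_rep))).T  # (n_rep, n)
```

Each returned row is one bootstrap replicate of the full time course; downstream analyses (ODE parameter fits, network reconstruction) are then re-run on each replicate to propagate the data uncertainty.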
Embodied Language Learning and Cognitive Bootstrapping: Methods and Design Principles
Directory of Open Access Journals (Sweden)
Caroline Lyon
2016-05-01
Full Text Available Co-development of action, conceptualization and social interaction mutually scaffold and support each other within a virtuous feedback cycle in the development of human language in children. Within this framework, the purpose of this article is to bring together diverse but complementary accounts of research methods that jointly contribute to our understanding of cognitive development and in particular, language acquisition in robots. Thus, we include research pertaining to developmental robotics, cognitive science, psychology, linguistics and neuroscience, as well as practical computer science and engineering. The different studies are not at this stage all connected into a cohesive whole; rather, they are presented to illuminate the need for multiple different approaches that complement each other in the pursuit of understanding cognitive development in robots. Extensive experiments involving the humanoid robot iCub are reported, while human learning relevant to developmental robotics has also contributed useful results. Disparate approaches are brought together via common underlying design principles. Without claiming to model human language acquisition directly, we are nonetheless inspired by analogous development in humans and consequently, our investigations include the parallel co-development of action, conceptualization and social interaction. Though these different approaches need to ultimately be integrated into a coherent, unified body of knowledge, progress is currently also being made by pursuing individual methods.
Neuroeconomics and Health Economics
DEFF Research Database (Denmark)
Larsen, Torben
2009-01-01
activation of the amygdala - a key center of emotional arousal (limbic system) - as shaped in the late Stone Age with its many acute threats. II. In general, the Hawthorne-effect of management is explained as the result of supportive job-relations reinforcing the homeostatic properties of the limbic system...... with de-stressing benefits such as reduced anxiety, less use of stimulants and a reduction of blood pressure, which all increase life-expectancy. Conclusion: Neuroeconomics helps economists to identify dominant health economic interventions that may be overlooked by traditional disciplines [i] This part
Statistical error estimation of the Feynman-α method using the bootstrap method
International Nuclear Information System (INIS)
Endo, Tomohiro; Yamamoto, Akio; Yagi, Takahiro; Pyeon, Cheol Ho
2016-01-01
Applicability of the bootstrap method is investigated to estimate the statistical error of the Feynman-α method, which is one of the subcritical measurement techniques based on reactor noise analysis. In the Feynman-α method, the statistical error can be estimated simply from multiple measurements of reactor noise; however, this requires additional measurement time for the repeated measurements. Using a resampling technique called the 'bootstrap method', the standard deviation and confidence interval of results obtained by the Feynman-α method can be estimated as the statistical error using only a single measurement of reactor noise. In order to validate our proposed technique, we carried out a passive measurement of reactor noise without any external source, i.e., with only the inherent neutron source from spontaneous fission and (α,n) reactions in the nuclear fuel, at the Kyoto University Criticality Assembly. Through this measurement, it is confirmed that the bootstrap method is applicable for approximately estimating the statistical error of results obtained by the Feynman-α method. (author)
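The resampling step can be sketched as follows, assuming the single noise measurement has already been reduced to counts per time gate. The helper name is hypothetical and the variance-to-mean form of the Feynman-Y statistic is the standard one, but this is not the authors' code:

```python
import numpy as np

def feynman_y_with_error(counts, n_boot=2000, seed=4):
    """Feynman-Y value (variance-to-mean ratio minus one) and its
    bootstrap statistical error from a single noise measurement,
    given the counts recorded in each time gate."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts, dtype=float)

    def y(c):
        return c.var(ddof=1) / c.mean() - 1.0

    boot = np.array([y(rng.choice(counts, counts.size, replace=True))
                     for _ in range(n_boot)])
    return y(counts), boot.std(ddof=1)  # estimate, bootstrap std. error
```

For a purely Poisson (uncorrelated) source the Y value is near zero, and the bootstrap standard deviation plays the role that the spread over repeated measurements would otherwise play.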
A wild bootstrap approach for the selection of biomarkers in early diagnostic trials.
Zapf, Antonia; Brunner, Edgar; Konietschke, Frank
2015-05-01
In early diagnostic trials, particularly in biomarker studies, the aim is often to select diagnostic tests among several methods. In case of metric, discrete, or even ordered categorical data, the area under the receiver operating characteristic (ROC) curve (denoted by AUC) is an appropriate overall accuracy measure for the selection, because the AUC is independent of cut-off points. For selection of biomarkers the individual AUC's are compared with a pre-defined threshold. To keep the overall coverage probability or the multiple type-I error rate, simultaneous confidence intervals and multiple contrast tests are considered. We propose a purely nonparametric approach for the estimation of the AUC's with the corresponding confidence intervals and statistical tests. This approach uses the correlation among the statistics to account for multiplicity. For small sample sizes, a Wild-Bootstrap approach is presented. It is shown that the corresponding intervals and tests are asymptotically exact. Extensive simulation studies indicate that the derived Wild-Bootstrap approach keeps and exploits the nominal type-I error at best, even for high accuracies and in case of small samples sizes. The strength of the correlation, the type of covariance structure, a skewed distribution, and also a moderate imbalanced case-control ratio do not have any impact on the behavior of the approach. A real data set illustrates the application of the proposed methods. We recommend the new Wild Bootstrap approach for the selection of biomarkers in early diagnostic trials, especially for high accuracies and small samples sizes.
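The wild-bootstrap ingredient, perturbing residuals with random signs so that each residual's own variance (and hence heteroscedasticity) is preserved, can be sketched generically; this is not the authors' AUC-specific procedure, and all names are illustrative:

```python
import numpy as np

def wild_bootstrap(residuals, statistic, n_boot=2000, seed=5):
    """Generic wild bootstrap: each resample multiplies the residuals
    by independent Rademacher signs (+1/-1), which preserves their
    individual variances (heteroscedasticity), then recomputes the
    statistic of interest."""
    rng = np.random.default_rng(seed)
    r = np.asarray(residuals, dtype=float)
    signs = rng.choice([-1.0, 1.0], size=(n_boot, r.size))
    return np.array([statistic(r * w) for w in signs])
```

The resulting distribution of the statistic is centered at its null value, so it can be used directly to calibrate tests or simultaneous confidence intervals in small samples.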
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
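The tree bootstrap's recruitment-tree resampling can be sketched as follows, assuming the trees are stored as a parent-to-recruits mapping; the function name and data layout are illustrative, not the authors' implementation:

```python
import numpy as np

def tree_bootstrap_sample(children, seeds, rng):
    """One tree-bootstrap resample of RDS recruitment trees.

    `children` maps each respondent to the list of respondents they
    recruited; `seeds` are the tree roots. Seeds are resampled with
    replacement, then within each tree the recruits of every visited
    node are themselves resampled with replacement, recursively.
    """
    sample = []
    stack = list(rng.choice(seeds, size=len(seeds), replace=True))
    while stack:
        node = int(stack.pop())
        sample.append(node)
        kids = children.get(node, [])
        if kids:
            stack.extend(rng.choice(kids, size=len(kids), replace=True))
    return sample
```

Repeating this many times and recomputing the RDS estimator on each resample yields the bootstrap distribution; note the procedure uses only the tree structure, never the measured attributes, which is why correlations between attributes can also be estimated.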
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures including the following: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap methods imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates were minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status using multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
Anderst, William J
2015-05-01
There is substantial inter-subject variability in intervertebral range of motion (ROM) in the cervical spine. This makes it difficult to define "normal" ROM, and to assess the effects of age, injury, and surgical procedures on spine kinematics. The objective of this study was to define normal intervertebral kinematics in the cervical spine during dynamic functional loading. Twenty-nine participants performed dynamic flexion/extension, axial rotation, and lateral bending while biplane radiographs were collected at 30 images/s. Vertebral motion was tracked with sub-millimeter accuracy using a validated volumetric model-based tracking process that matched subject-specific CT-based bone models to the radiographs. Gaussian point-by-point and bootstrap techniques were used to determine 90% prediction bands for the intervertebral kinematic curves at 1% intervals of each movement cycle. Cross validation was performed to estimate the true achieved coverage for each method. For a targeted coverage of 90%, the estimated true coverage using bootstrap prediction bands averaged 86±5%, while the estimated true coverage using Gaussian point-by-point intervals averaged 56±10% over all movements and all motion segments. Bootstrap prediction bands are recommended as the standard for evaluating full ROM cervical spine kinematic curves. The data presented here can be used to identify abnormal motion in patients presenting with neck pain, to drive computational models, and to assess the biofidelity of in vitro loading paradigms. Copyright © 2015 Elsevier Ltd. All rights reserved.
Nonparametric bootstrap technique for calibrating surgical SmartForceps: theory and application.
Azimaee, Parisa; Jafari Jozani, Mohammad; Maddahi, Yaser; Zareinia, Kourosh; Sutherland, Garnette
2017-10-01
Knowledge of forces, exerted on the brain tissue during the performance of neurosurgical tasks, is critical for quality assurance, case rehearsal, and training purposes. Quantifying the interaction forces has been made possible by developing SmartForceps, a bipolar forceps retrofitted by a set of strain gauges. The forces are estimated using voltages read from strain gauges. We therefore need to quantify the force-voltage relationship to estimate the interaction forces during microsurgery. This problem has been addressed in the literature by following the physical and deterministic properties of the force-sensing strain gauges without obtaining the precision associated with each estimate. In this paper, we employ a probabilistic methodology by using a nonparametric Bootstrap approach to obtain both point and interval estimates of the applied forces at the tool tips, while the precision associated with each estimate is provided. To show proof-of-concept, the Bootstrap technique is employed to estimate unknown forces, and construct necessary confidence intervals using observed voltages in data sets that are measured from the performance of surgical tasks on a cadaveric brain. Results indicate that the Bootstrap technique is capable of estimating tool-tissue interaction forces with acceptable level of accuracy compared to the linear regression technique under the normality assumption.
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
International Nuclear Information System (INIS)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin
2013-01-01
Surgical resection is the most effective method to recover liver function. However, because most treatment is palliative in the unresectable stage of hepatocellular carcinoma (HCC), Yttrium-90 (Y-90) has come into use as a new treatment: it can be delivered directly to the tumors and results in greater radiation exposure to the tumors than external radiation. Recently, Y-90 has received much interest and has been studied by many researchers. Imaging of Y-90 has most commonly been conducted with a gamma camera, but PET imaging is required because of the gamma camera's low sensitivity and resolution. The purpose of this study was to assess the statistical characteristics of the data and to improve the image count rate, and thereby image quality, using a nonparametric bootstrap method. PET data were improved using the nonparametric bootstrap method, as verified by improved uniformity and SNR. Uniformity improved more under low-count-rate conditions (i.e., Y-90) in the phantom study, and in the mouse study uniformity and SNR improved by 15.6% and 33.8%, respectively. The bootstrap method performed in this study increased the count rate of the PET image, so the acquisition time can be reduced. It is expected to improve diagnostic performance
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
Neuroeconomics and behavioral health economics
DEFF Research Database (Denmark)
Larsen, Torben
2009-01-01
dissemination of relaxation procedures is evident in industrialized countries since about 1970 both inside the medical healthcare system and as NGO-settings in a market-alike competition. However, a serious barrier to the dissemination of meditative de-stressing is the lack of general knowledge of the action...... for explanation of the neural dynamics of normal decision making. Secondly, the literature is reviewed for evidence on hypothesized applications of NeM in behavioral health. Results I. The present bias as documented by neuroeconomic game-trials is explained by NeM as rooted in the basal activation of Amygdala...... - a key center in our emotional arousal (limbic system) - as shaped in the elder stone-age with many acute threats. II. In general, the Hawthorne-effect of human-relations management is explained as the result of supportive job-relations relaxing Amygdala for better emotional integration...
Neuroeconomics and behavioral health economics
DEFF Research Database (Denmark)
Larsen, Torben
2009-01-01
- a key center in our emotional arousal (limbic system) - as shaped in the elder stone-age with many acute threats. II. In general, the Hawthorne-effect of human-relations management is explained as the result of supportive job-relations relaxing Amygdala for better emotional integration...... some are rooted in the religious tradition while other aim to be post-religious. Medical meditation across settings combines savings on health care costs with de-stressing benefits as reduced anxiety, less use of stimulants and a reduction of blood pressure which in all increase life...... is met by a meso-strategy aiming the formation of an international, multidisciplinary network which might organize regional workshops for representatives for all involved parties in order to prepare local implementation projects. Regarding de-stressing by medical meditation a relatively fast...
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of
Generic Patch Inference
DEFF Research Database (Denmark)
Andersen, Jesper; Lawall, Julia Laetitia
2008-01-01
A key issue in maintaining Linux device drivers is the need to update drivers in response to evolutions in Linux internal libraries. Currently, there is little tool support for performing and documenting such changes. In this paper we present a tool, spfind, that identifies common changes made...... developers can use it to extract an abstract representation of the set of changes that others have made. Our experiments on recent changes in Linux show that the inferred generic patches are more concise than the corresponding patches found in commits to the Linux source tree while being safe with respect...
Strongly Correlated Topological Insulators
2016-02-03
Strongly Correlated Topological Insulators In the past year, the grant was used for work in the field of topological phases, with emphasis on finding...surface of topological insulators. In the past 3 years, we have started a new direction, that of fractional topological insulators. These are materials...in which a topologically nontrivial quasi-flat band is fractionally filled and then subject to strong interactions. The views, opinions and/or
arXiv The S-matrix Bootstrap II: Two Dimensional Amplitudes
Paulos, Miguel F.; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-22
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Construction of prediction intervals for Palmer Drought Severity Index using bootstrap
Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan
2018-04-01
In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-months) and mid-term (six-months) drought observations. The effects of North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
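The residual-based bootstrap idea for time-series prediction intervals can be sketched for a simple AR(1) model as follows. The AR(1) form is an assumption for illustration; the paper's drought model and its covariates (NAO/AO indexes) are not reproduced here.

```python
import numpy as np

def ar1_residual_bootstrap_pi(x, h=1, n_boot=1000, alpha=0.10, rng=None):
    """Residual-bootstrap prediction interval for an AR(1) series.

    Fits x_t = c + phi*x_{t-1} + e_t by least squares, resamples the centred
    residuals, and simulates h-step-ahead paths; returns the (alpha/2,
    1-alpha/2) quantiles of the simulated values. A minimal sketch of the
    residual-based idea only.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, float)
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    c, phi = np.linalg.lstsq(X, x[1:], rcond=None)[0]
    resid = x[1:] - (c + phi * x[:-1])
    resid -= resid.mean()                     # centre the residuals
    sims = np.empty(n_boot)
    for b in range(n_boot):
        val = x[-1]
        for _ in range(h):                    # propagate h steps with resampled shocks
            val = c + phi * val + rng.choice(resid)
        sims[b] = val
    return np.quantile(sims, [alpha / 2, 1 - alpha / 2])
```

Because the interval is built from resampled residuals rather than a normality assumption, it remains valid when the innovations are non-Gaussian.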
The use of GLIM and the bootstrap in assessing a clinical trial of two drugs.
Mapleson, W W
1986-01-01
An approach is described for estimating the dose of a new drug which is equipotent to an established dose of an old drug. The approach is basically that of the parallel-line assay but it can allow for concomitant variables and, by exploiting the facilities available in the statistical computer package GLIM (generalized linear interactive modelling), the approach can be applied when the residuals conform to one of a number of distributions and, with suitable safeguards, to continuous, discrete and even 'scored' responses. In some circumstances, it is necessary to obtain confidence limits by Efron's 'bootstrap' technique. The method is illustrated with results from a trial of two premedicant drugs in children.
Directory of Open Access Journals (Sweden)
Goddard Mike E
2004-03-01
Full Text Available Abstract The ordinary, penalized, and bootstrap t-tests, least squares, and best linear unbiased prediction were compared for their false discovery rates (FDR), i.e., the fraction of falsely discovered genes, which was empirically estimated in a duplicate of the data set. The bootstrap t-test yielded up to 80% lower FDRs than the alternative statistics, and its FDR was always as good as or better than any of the alternatives. Generally, the predicted FDR from the bootstrapped P-values agreed well with the empirical estimates, except when the number of mRNA samples was smaller than 16. In a cancer data set, the bootstrap t-test discovered 200 differentially regulated genes at an FDR of 2.6%, and in a knock-out gene expression experiment 10 genes were discovered at an FDR of 3.2%. It is argued that, in the case of microarray data, control of the FDR takes sufficient account of the multiple testing, whilst being less stringent than Bonferroni-type multiple testing corrections. Extensions of the bootstrap simulations to more complicated test statistics are discussed.
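A per-gene bootstrap-t p-value of the kind compared above can be sketched generically. This centres each group so the equal-means null holds in the resampling world; it is a textbook bootstrap-t sketch, not the paper's exact recipe.

```python
import numpy as np

def bootstrap_t_pvalue(a, b, n_boot=5000, rng=None):
    """Two-sample bootstrap-t p-value for a difference in means.

    Centres each group at its own mean, resamples both groups, and compares
    the observed Welch t statistic with its bootstrap null distribution.
    A generic sketch of the bootstrap-t idea applied per gene.
    """
    rng = np.random.default_rng(rng)
    a, b = np.asarray(a, float), np.asarray(b, float)

    def tstat(x, y):
        return (x.mean() - y.mean()) / np.sqrt(
            x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))

    t_obs = tstat(a, b)
    a0, b0 = a - a.mean(), b - b.mean()       # impose the equal-means null
    t_null = np.array([
        tstat(a0[rng.integers(0, len(a), len(a))],
              b0[rng.integers(0, len(b), len(b))])
        for _ in range(n_boot)
    ])
    return float((np.abs(t_null) >= abs(t_obs)).mean())
```

Bootstrapped p-values like these can then be fed into an FDR procedure across all genes.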
Deng, Nina; Allison, Jeroan J; Fang, Hua Julia; Ash, Arlene S; Ware, John E
2013-05-31
Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. The statistical significance of the RV increased as the magnitude of the denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Increasing the sample size with a fixed denominator F-statistic, or using more bootstrap replicates (beyond 500), had minimal impact. The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.9).
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed boundary tokamak plasmas. The total plasma current profile is supposed to have contributions of the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models for different viscosity coefficient calculations and distinct forms for the Coulomb collision operator is performed for a variety of plasma parameters of the small aspect ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters in the frame of the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages or possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
Energy Technology Data Exchange (ETDEWEB)
Castejón, F.; Gómez-Iglesias, A.; Velasco, J. L.
2015-07-01
This work is devoted to introducing a new optimization criterion in the DAB (Distributed Asynchronous Bees) code. With this new criterion, DAB now includes the equilibrium and Mercier stability criteria, the minimization of the B×∇B criterion, which ensures the reduction of neoclassical transport and improved confinement of fast particles, and the reduction of the bootstrap current. We started from a neoclassically optimized configuration of the Helias type and imposed the reduction of the bootstrap current. The obtained configuration presents only a modest reduction of the total bootstrap current, but the local current density is reduced along the minor radius. Further investigations are under way to understand the reason for this modest improvement.
Tests for informative cluster size using a novel balanced bootstrap scheme.
Nevalainen, Jaakko; Oja, Hannu; Datta, Somnath
2017-07-20
Clustered data are often encountered in biomedical studies, and to date, a number of approaches have been proposed to analyze such data. However, the phenomenon of informative cluster size (ICS) is a challenging problem, and its presence has an impact on the choice of a correct analysis methodology. For example, Dutta and Datta (2015, Biometrics) presented a number of marginal distributions that could be tested. Depending on the nature and degree of informativeness of the cluster size, these marginal distributions may differ, as do the choices of the appropriate test. In particular, they applied their new test to a periodontal data set where the plausibility of the informativeness was mentioned, but no formal test for the same was conducted. We propose bootstrap tests for testing the presence of ICS. A balanced bootstrap method is developed to successfully estimate the null distribution by merging the re-sampled observations with closely matching counterparts. Relying on the assumption of exchangeability within clusters, the proposed procedure performs well in simulations even with a small number of clusters, at different distributions and against different alternative hypotheses, thus making it an omnibus test. We also explain how to extend the ICS test to a regression setting and thereby enhancing its practical utility. The methodologies are illustrated using the periodontal data set mentioned earlier. Copyright © 2017 John Wiley & Sons, Ltd.
Faster family-wise error control for neuroimaging with a parametric bootstrap.
Vandekar, Simon N; Satterthwaite, Theodore D; Rosen, Adon; Ciric, Rastko; Roalf, David R; Ruparel, Kosha; Gur, Ruben C; Gur, Raquel E; Shinohara, Russell T
2017-10-20
In neuroimaging, hundreds to hundreds of thousands of tests are performed across a set of brain regions or all locations in an image. Recent studies have shown that the most common family-wise error (FWE) controlling procedures in imaging, which rely on classical mathematical inequalities or Gaussian random field theory, yield FWE rates (FWER) that are far from the nominal level. Depending on the approach used, the FWER can be exceedingly small or grossly inflated. Given the widespread use of neuroimaging as a tool for understanding neurological and psychiatric disorders, it is imperative that reliable multiple testing procedures are available. To our knowledge, only permutation joint testing procedures have been shown to reliably control the FWER at the nominal level. However, these procedures are computationally intensive due to the increasingly available large sample sizes and dimensionality of the images, and analyses can take days to complete. Here, we develop a parametric bootstrap joint testing procedure. The parametric bootstrap procedure works directly with the test statistics, which leads to much faster estimation of adjusted p-values than resampling-based procedures while reliably controlling the FWER in sample sizes available in many neuroimaging studies. We demonstrate that the procedure controls the FWER in finite samples using simulations, and present region- and voxel-wise analyses to test for sex differences in developmental trajectories of cerebral blood flow. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
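A max-statistic joint testing procedure with a parametric bootstrap can be sketched as follows: simulate null statistics from a multivariate normal with the estimated correlation among tests, and compare each observed statistic against the simulated maximum. This is a sketch of the joint-testing idea, not the paper's exact procedure.

```python
import numpy as np

def maxT_parametric_bootstrap(z, corr, n_boot=5000, rng=None):
    """Parametric-bootstrap (max-statistic) FWER-adjusted p-values.

    z: observed z-statistics for V tests; corr: an estimated V x V
    correlation matrix for those statistics. Simulates null statistics
    from N(0, corr) and compares each |z_v| with the simulated
    distribution of max |Z| across tests.
    """
    rng = np.random.default_rng(rng)
    z = np.asarray(z, float)
    sims = rng.multivariate_normal(np.zeros(len(z)), corr, size=n_boot)
    max_null = np.abs(sims).max(axis=1)        # null distribution of the maximum
    return np.array([(max_null >= abs(zv)).mean() for zv in z])
```

Because the simulation works directly with the test statistics rather than resampling the raw images, it is far cheaper than permutation over subjects.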
Inference for Optimal Dynamic Treatment Regimes using an Adaptive m-out-of-n Bootstrap Scheme
Chakraborty, Bibhas; Laber, Eric B.; Zhao, Yingqi
2013-01-01
Summary A dynamic treatment regime consists of a set of decision rules that dictate how to individualize treatment to patients based on available treatment and covariate history. A common method for estimating an optimal dynamic treatment regime from data is Q-learning, which involves nonsmooth operations of the data. This nonsmoothness causes standard asymptotic approaches for inference, like the bootstrap or Taylor series arguments, to break down if applied without correction. Here, we consider the m-out-of-n bootstrap for constructing confidence intervals for the parameters indexing the optimal dynamic regime. We propose an adaptive choice of m and show that it produces asymptotically correct confidence sets under fixed alternatives. Furthermore, the proposed method has the advantage of being conceptually and computationally much simpler than competing methods possessing this same theoretical property. We provide an extensive simulation study to compare the proposed method with currently available inference procedures. The results suggest that the proposed method delivers nominal coverage while being less conservative than alternatives. The proposed methods are implemented in the qLearn R-package and have been made available on the Comprehensive R-Archive Network (http://cran.r-project.org/). Analysis of the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) study is used as an illustrative example. PMID:23845276
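The basic m-out-of-n idea, with a fixed m rather than the paper's adaptive choice, can be sketched generically: resample at a smaller size m, scale the centred bootstrap distribution by sqrt(m), and rescale by sqrt(n) for the interval.

```python
import numpy as np

def m_out_of_n_ci(data, stat, m, n_boot=2000, alpha=0.05, rng=None):
    """m-out-of-n bootstrap confidence interval for a possibly nonsmooth statistic.

    Draws resamples of size m < n (with replacement) and inverts the
    sqrt(m)-scaled quantiles of stat(resample) - stat(data). A generic
    fixed-m sketch; the paper's adaptive, data-driven choice of m is not shown.
    """
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    n = len(data)
    theta = stat(data)
    boot = np.array([stat(data[rng.integers(0, n, size=m)]) for _ in range(n_boot)])
    # sqrt(m)-scaling of the centred bootstrap distribution
    q_lo, q_hi = np.quantile(np.sqrt(m) * (boot - theta), [alpha / 2, 1 - alpha / 2])
    return theta - q_hi / np.sqrt(n), theta - q_lo / np.sqrt(n)
```

Taking m smaller than n restores consistency of the bootstrap for nonsmooth functionals where the ordinary n-out-of-n bootstrap fails.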
Bootstrap resampling approach to disaggregate analysis of road crashes in Hong Kong.
Pei, Xin; Sze, N N; Wong, S C; Yao, Danya
2016-10-01
Road safety affects health and development worldwide; thus, it is essential to examine the factors that influence crashes and injuries. As the relationships between crashes, crash severity, and possible risk factors can vary depending on the type of collision, we attempt to develop separate prediction models for different crash types (i.e., single- versus multi-vehicle crashes and slight injury versus killed and serious injury crashes). Taking advantage of the availability of crash and traffic data disaggregated by time and space, it is possible to identify the factors that may contribute to crash risks in Hong Kong, including traffic flow, road design, and weather conditions. To remove the effects of excess zeros on prediction performance in a highly disaggregated crash prediction model, a bootstrap resampling method is applied. The results indicate that more accurate and reliable parameter estimates, with reduced standard errors, can be obtained with the use of a bootstrap resampling method. Results revealed that factors including rainfall, geometric design, traffic control, and temporal variations all determined the crash risk and crash severity. This helps to shed light on the development of remedial engineering and traffic management and control measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
The Index of Biological Integrity and the bootstrap revisited: an example from Minnesota streams
Dolph, Christine L.; Sheshukov, Aleksey Y.; Chizinski, Christopher J.; Vondracek, Bruce C.; Wilson, Bruce
2010-01-01
Multimetric indices, such as the Index of Biological Integrity (IBI), are increasingly used by management agencies to determine whether surface water quality is impaired. However, important questions about the variability of these indices have not been thoroughly addressed in the scientific literature. In this study, we used a bootstrap approach to quantify variability associated with fish IBIs developed for streams in two Minnesota river basins. We further placed this variability into a management context by comparing it to impairment thresholds currently used in water quality determinations for Minnesota streams. We found that 95% confidence intervals ranged as high as 40 points for IBIs scored on a 0–100 point scale. However, on average, 90% of IBI scores calculated from bootstrap replicate samples for a given stream site yielded the same impairment status as the original IBI score. We suggest that sampling variability in IBI scores is related to both the number of fish and the number of rare taxa in a field collection. A comparison of the effects of different scoring methods on IBI variability indicates that a continuous scoring method may reduce the amount of bias in IBI scores.
Directory of Open Access Journals (Sweden)
Gu Xun
2007-03-01
Full Text Available Abstract Background Phylogenetically related miRNAs (miRNA families convey important information of the function and evolution of miRNAs. Due to the special sequence features of miRNAs, pair-wise sequence identity between miRNA precursors alone is often inadequate for unequivocally judging the phylogenetic relationships between miRNAs. Most of the current methods for miRNA classification rely heavily on manual inspection and lack measurements of the reliability of the results. Results In this study, we designed an analysis pipeline (the Phylogeny-Bootstrap-Cluster (PBC pipeline to identify miRNA families based on branch stability in the bootstrap trees derived from overlapping genome-wide miRNA sequence sets. We tested the PBC analysis pipeline with the miRNAs from six animal species, H. sapiens, M. musculus, G. gallus, D. rerio, D. melanogaster, and C. elegans. The resulting classification was compared with the miRNA families defined in miRBase. The two classifications were largely consistent. Conclusion The PBC analysis pipeline is an efficient method for classifying large numbers of heterogeneous miRNA sequences. It requires minimum human involvement and provides measurements of the reliability of the classification results.
Recovery of signal loss adopting the residual bootstrap method in fetal heart rate dynamics.
Lee, Sun-Kyung; Park, Young-Sun; Cha, Kyung-Joon
2018-03-19
Fetal heart rate (FHR) data obtained from a non-stress test (NST) can be presented as a time series, which is accompanied by signal loss due to physical and biological causes. To recover or estimate FHR data subject to a high rate of signal loss, time series models [second-order autoregressive (AR(2)), first-order autoregressive conditional heteroscedasticity (ARCH(1)), and empirical mode decomposition with vector autoregression (EMD-VAR)] and the residual bootstrap method were applied. The ARCH(1) model with the residual bootstrap technique was the most accurate [root mean square error (RMSE), 2.065], as it reflects the nonlinearity of the FHR data [mean absolute error (MAE) for approximate entropy (ApEn), 0.081]. As a result, the goals of predicting fetal health and identifying high-risk pregnancies could be achieved. These trials may effectively save the time and cost of repeating the NST when fetal diagnosis is impossible owing to a large amount of signal loss.
Severiano, Ana; Carriço, João A.; Robinson, D. Ashley; Ramirez, Mário; Pinto, Francisco R.
2011-01-01
Several research fields frequently deal with the analysis of diverse classification results of the same entities. This should imply an objective detection of overlaps and divergences between the formed clusters. The congruence between classifications can be quantified by clustering agreement measures, including pairwise agreement measures. Several measures have been proposed and the importance of obtaining confidence intervals for the point estimate in the comparison of these measures has been highlighted. A broad range of methods can be used for the estimation of confidence intervals. However, evidence is lacking about what are the appropriate methods for the calculation of confidence intervals for most clustering agreement measures. Here we evaluate the resampling techniques of bootstrap and jackknife for the calculation of the confidence intervals for clustering agreement measures. Contrary to what has been shown for some statistics, simulations showed that the jackknife performs better than the bootstrap at accurately estimating confidence intervals for pairwise agreement measures, especially when the agreement between partitions is low. The coverage of the jackknife confidence interval is robust to changes in cluster number and cluster size distribution. PMID:21611165
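A leave-one-out jackknife interval for a pairwise agreement statistic can be sketched as below. The simple label-match statistic in the usage example is a placeholder; the measures studied in the paper (e.g. pairwise agreement indices) would be plugged in as `stat`.

```python
import numpy as np

def jackknife_ci(x, y, stat):
    """Leave-one-out jackknife 95% confidence interval for a pairwise
    agreement statistic between two label vectors x and y.

    Uses the standard jackknife variance estimate with a normal
    approximation; a generic sketch of the resampling approach the
    study found to perform best for clustering agreement measures.
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    theta = stat(x, y)
    # recompute the statistic with each entity left out in turn
    loo = np.array([stat(np.delete(x, i), np.delete(y, i)) for i in range(n)])
    var = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
    half = 1.959964 * np.sqrt(var)             # ~97.5% normal quantile
    return theta - half, theta + half
```

Unlike the bootstrap, the jackknife needs only n recomputations and, per the study, estimates these intervals more accurately when agreement between partitions is low.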
A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.
Liang, Faming; Kim, Jinsu; Song, Qifan
2016-01-01
Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis; that is, to replace the full data log-likelihood by a Monte Carlo average of the log-likelihoods that are calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset across iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining with reversible jump MCMC and simulated annealing, respectively.
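A schematic, one-parameter reading of the BMH idea follows: replace the full-data log-likelihood with the average of (n/m)-scaled log-likelihoods over fixed bootstrap subsamples, then run an ordinary random-walk Metropolis-Hastings chain. The scaling and the flat prior are simplifying assumptions here; the paper's exact weighting and theory are richer.

```python
import numpy as np

def bmh_sample(data, loglik_sub, theta0, n_iter=3000, k=10, m=50,
               step=0.15, rng=None):
    """Bootstrap Metropolis-Hastings sketch for a one-dimensional parameter.

    loglik_sub(theta, subsample) returns the log-likelihood on one
    bootstrap subsample; the full-data log-likelihood is approximated by
    the average of (n/m)-scaled subsample log-likelihoods. A schematic
    sketch only, with a flat prior.
    """
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    n = len(data)
    subs = [data[rng.integers(0, n, size=m)] for _ in range(k)]

    def approx_logpost(theta):
        # parallel-friendly: each term depends on one bootstrap subsample only
        return np.mean([(n / m) * loglik_sub(theta, s) for s in subs])

    chain = np.empty(n_iter)
    theta, lp = theta0, approx_logpost(theta0)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = approx_logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:  # MH accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain
```

Each subsample term can be evaluated on a separate worker, which is the embarrassingly parallel structure the abstract describes.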
How efficient are Greek hospitals? A case study using a double bootstrap DEA approach.
Kounetas, Kostas; Papathanassopoulos, Fotis
2013-12-01
The purpose of this study was to measure Greek hospital performance using different input-output combinations, and to identify the factors that influence their efficiency, thus providing policy makers with valuable input for the decision-making process. Using a unique dataset, we estimated the productive efficiency of each hospital through a bootstrapped data envelopment analysis (DEA) approach. In a second stage, we explored, using a bootstrapped truncated regression, the impact of environmental factors on hospitals' technical and scale efficiency. Our results reveal that over 80% of the examined hospitals appear to have a technical efficiency lower than 0.8, while the majority appear to be scale efficient. Moreover, efficiency performance differed with the inclusion of medical examinations as an additional variable. On the other hand, bed occupancy ratio appeared to affect both technical and scale efficiency in a rather interesting way, while the adoption of advanced medical equipment and the type of hospital improve scale and technical efficiency, respectively. The findings of this study on Greek hospitals' performance are not encouraging. Furthermore, our results raise questions regarding the number of hospitals that should operate, and which type of hospital is more efficient. Finally, the results indicate the role of medical equipment in performance, confirming its misallocation in healthcare expenditure.
International Nuclear Information System (INIS)
Chu, Hsiao-Ping; Chang Tsangyao
2012-01-01
This study applies bootstrap panel Granger causality to test whether energy consumption promotes economic growth using data from the G-6 countries over the period 1971–2010. Both nuclear and oil consumption data are used in this study. Regarding the nuclear consumption–economic growth nexus, nuclear consumption causes economic growth in Japan, the UK, and the US; economic growth causes nuclear consumption in the US; nuclear consumption and economic growth show no causal relation in Canada, France and Germany. Regarding the oil consumption–economic growth nexus, we find that there is one-way causality from economic growth to oil consumption only in the US, and that oil consumption does not Granger-cause economic growth in the G-6 countries except Germany and Japan. Our results have important policy implications for the G-6 countries within the context of economic development. - Highlights: ► Bootstrap panel Granger causality tests whether energy consumption promotes economic growth. ► Data from the G-6 countries for both nuclear and oil consumption are used. ► Results have important policy implications within the context of economic development.
Gray bootstrap method for estimating frequency-varying random vibration signals with small samples
Directory of Open Access Journals (Sweden)
Wang Yanqing
2014-04-01
During environment testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in test analysis. The result shows that GBM is superior for estimating dynamic signals with small samples, and its estimated reliability is shown to be 100% at the given confidence level.
Off-critical statistical models: factorized scattering theories and bootstrap program
International Nuclear Information System (INIS)
Mussardo, G.
1992-01-01
We analyze those integrable statistical systems which originate from some relevant perturbations of the minimal models of conformal field theories. When only massive excitations are present, the systems can be efficiently characterized in terms of the relativistic scattering data. We review the general properties of the factorizable S-matrix in two dimensions with particular emphasis on the bootstrap principle. The classification program of the allowed spins of conserved currents and of the non-degenerate S-matrices is discussed and illustrated by means of some significant examples. The scattering theories of several massive perturbations of the minimal models are fully discussed. Among them are the Ising model, the tricritical Ising model, the Potts models, the series of the non-unitary minimal models M(2,2n+3), the non-unitary model M(3,5) and the scaling limit of the polymer system. The ultraviolet limit of these massive integrable theories can be investigated by means of the thermodynamic Bethe ansatz; in particular, the central charge of the original conformal theories can be recovered from the scattering data. We also consider the numerical method based on the so-called truncated conformal space approach, which confirms the theoretical results and allows a direct measurement of the scattering data, i.e. the masses and the S-matrix of the particles in bootstrap interaction. The problem of computing the off-critical correlation functions is discussed in terms of the form-factor approach.
Analytic bounds and emergence of AdS2 physics from the conformal bootstrap
Mazáč, Dalimil
2017-04-01
We study analytically the constraints of the conformal bootstrap on the low-lying spectrum of operators in field theories with global conformal symmetry in one and two spacetime dimensions. We introduce a new class of linear functionals acting on the conformal bootstrap equation. In 1D, we use the new basis to construct extremal functionals leading to the optimal upper bound on the gap above identity in the OPE of two identical primary operators of integer or half-integer scaling dimension. We also prove an upper bound on the twist gap in 2D theories with global conformal symmetry. When the external scaling dimensions are large, our functionals provide a direct point of contact between crossing in a 1D CFT and scattering of massive particles in large AdS2. In particular, CFT crossing can be shown to imply that appropriate OPE coefficients exhibit an exponential suppression characteristic of massive bound states, and that the 2D flat-space S-matrix should be analytic away from the real axis.
Espinoza, Benjamin; Gartside, Paul; Kovan-Bakan, Merve; Mamatelashvili, Ana
2012-01-01
A space is `n-strong arc connected' (n-sac) if for any n points in the space there is an arc in the space visiting them in order. A space is omega-strong arc connected (omega-sac) if it is n-sac for all n. We study these properties in finite graphs, regular continua, and rational continua. There are no 4-sac graphs, but there are 3-sac graphs and graphs which are 2-sac but not 3-sac. For every n there is an n-sac regular continuum, but no regular continuum is omega-sac. There is an omega-sac ...
Abortion: Strong's counterexamples fail
DEFF Research Database (Denmark)
Di Nucci, Ezio
2009-01-01
This paper shows that the counterexamples proposed by Strong in 2008 in the Journal of Medical Ethics to Marquis's argument against abortion fail. Strong's basic idea is that there are cases--for example, terminally ill patients--where killing an adult human being is prima facie seriously morally......'s scenarios have some valuable future or admitted that killing them is not seriously morally wrong. Finally, if "valuable future" is interpreted as referring to objective standards, one ends up with implausible and unpalatable moral claims....
Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji
Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve performance superior to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.
International Nuclear Information System (INIS)
Marier, D.
1992-01-01
This article presents the results of a financial rankings survey which shows strong economic activity in the independent energy industry. The topics of the article include advisor turnover, overseas banks, and the increase in public offerings. The article identifies the top project finance investors for new projects and restructurings, and rankings for lenders.
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on mathematical statistics using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U, k=2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
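A minimal sketch of the approach described, combining the Nordtest expanded-uncertainty formula with bootstrap resampling of the IQC data. The decomposition U = 2·sqrt(u(Rw)² + u(bias)²) with k = 2 follows the Nordtest guide; the function name, inputs, and the choice to summarize each resample by its CV are illustrative assumptions.

```python
import random
import statistics

def nordtest_mu_bootstrap(iqc, biases, u_cref, n_boot=2000, seed=7):
    """Nordtest-style expanded uncertainty (k = 2), with the
    within-lab component u(Rw) taken as the mean CV (%) over
    bootstrap resamples of the IQC data (illustrative sketch)."""
    rng = random.Random(seed)
    cvs = []
    for _ in range(n_boot):
        resample = [rng.choice(iqc) for _ in iqc]
        cvs.append(100 * statistics.stdev(resample) / statistics.mean(resample))
    u_rw = statistics.mean(cvs)
    # bias component from EQA results (% deviations from targets)
    rms_bias = (sum(b * b for b in biases) / len(biases)) ** 0.5
    u_bias = (rms_bias ** 2 + u_cref ** 2) ** 0.5
    return 2 * (u_rw ** 2 + u_bias ** 2) ** 0.5
```

Here `iqc` holds IQC results, `biases` holds percent deviations from EQA targets, and `u_cref` is the uncertainty of the EQA reference values.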
International Nuclear Information System (INIS)
Ozel, M.E.; Mayer-Hasselwander, H.
1985-01-01
This paper discusses the bootstrap scheme, which fits many astronomical applications well. It is based on the well-known sampling plan called ''sampling with replacement''. Digital computers make the method very practical for the investigation of various trends present in a limited set of data, which is usually a small fraction of the total population. The authors attempt to apply the method and demonstrate its feasibility. The study indicates that the discrete nature of high-energy gamma-ray data makes the bootstrap method especially attractive for gamma-ray astronomy. The present analysis shows that the ratio of pulse strengths is variable at the 99.8% confidence level.
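The ''sampling with replacement'' plan described above is, in its simplest modern form, the percentile bootstrap; a minimal sketch (the statistic and confidence level are arbitrary choices for illustration):

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=5000,
                 alpha=0.05, seed=0):
    """Percentile bootstrap CI: resample the data with replacement,
    recompute the statistic, and take empirical quantiles."""
    rng = random.Random(seed)
    reps = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Any statistic of the sample (a pulse-strength ratio, say) can be passed as `stat` in place of the mean.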
Tarrés, J; Fina, M; Piedrafita, J
2010-09-01
The aim of this study was to compare the goodness of fit of the threshold models with homoscedasticity or heteroscedasticity and the grouped data model for the analysis of calving ease in beef cattle by using a parametric bootstrap procedure. Field data included 8,205 records of the Bruna dels Pirineus beef cattle breed in the Pyrenean mountain areas of Catalonia (Spain). The actual distribution was 81.81% of calvings without assistance, 11.02% slightly assisted by the farmer, 5.12% strongly assisted by the farmer, 0.89% assisted by the veterinarian, and 1.16% cesarean, but these percentages were very different in the different herds. This can be explained partially by the different subjective way of scoring of each farmer. Primiparous cows had greater (P < 0.001) calving difficulty than cows with 5 or more parities (11.74 vs. 4.49% of calvings strongly assisted by the farmer or the veterinarian and 2.8 vs. 0.65% cesarean). Male calves caused greater (P < 0.001) calving difficulty than females (7.71% of male calvings strongly assisted by the farmer or the veterinarian vs. 4.25% of females and 1.83% cesarean in males vs. 0.47% in females). The month and year of calving also had a strong influence on calving ease. These data were analyzed using 3 different models: the threshold models with homoscedasticity or heteroscedasticity and the grouped data model. The bootstrap comparison among models suggested that the threshold models, even allowing for heteroscedasticity, did not fit the herd effects well. In contrast, fitting deficiencies were not observed for the grouped data model in any factor. The variance of direct effect of the calf was estimated using the 3 models, and the heritability estimate ranged from 0.165 for the grouped data model to 0.185 for the heteroscedastic threshold model. This heritability was moderate, but it would justify the inclusion of direct effects of the calf on calving ease in the breeding objective. Overall, results highlighted the
Strong Electroweak Symmetry Breaking
Grinstein, Benjamin
2011-01-01
Models of spontaneous breaking of electroweak symmetry by a strong interaction do not have a fine-tuning/hierarchy problem. They are conceptually elegant and use the only mechanism of spontaneous breaking of a gauge symmetry that is known to occur in nature. The simplest model, minimal technicolor with extended technicolor interactions, is appealing because one can calculate by scaling up from QCD. But it is ruled out on many counts: inappropriately low quark and lepton masses (or excessive FCNC), bad electroweak data fits, light scalar and vector states, etc. However, nature may not choose the minimal model, and then we are stuck: except possibly through lattice simulations, we are unable to compute and test the models. In the LHC era it therefore makes sense to abandon specific models (of strong EW breaking) and concentrate on generic features that may indicate discovery. The Technicolor Straw Man is not a model but a parametrized search strategy inspired by a remarkable generic feature of walking technicolor,...
Plasmons in strong superconductors
International Nuclear Information System (INIS)
Baldo, M.; Ducoin, C.
2011-01-01
We present a study of the possible plasmon excitations that can occur in systems where strong superconductivity is present. In these systems the plasmon energy is comparable to or smaller than the pairing gap. As a prototype of these systems we consider the proton component of neutron star matter just below the crust, when electron screening is not taken into account. For the realistic case we consider in detail the different aspects of the elementary excitations when the proton and electron components are treated within the Random-Phase Approximation generalized to the superfluid case, while the influence of the neutron component is considered only at a qualitative level. Electron screening plays a major role in modifying the proton spectrum and spectral function. At the same time the electron plasmon is strongly modified and damped by the indirect coupling with the superfluid proton component, even at moderately low values of the gap. The excitation spectrum shows the interplay of the different components and their relevance for each excitation mode. The results are relevant for neutrino physics and thermodynamical processes in neutron stars. If electron screening is neglected, the spectral properties of the proton component show some resemblance to the physical situation in high-Tc superconductors, and we briefly discuss similarities and differences in this connection. In a general prospect, the results of the study emphasize the role of the Coulomb interaction in strong superconductors.
Strong-coupling approximations
International Nuclear Information System (INIS)
Abbott, R.B.
1984-03-01
Standard path-integral techniques such as instanton calculations give good answers for weak-coupling problems, but become unreliable for strong coupling. Here we consider a method of replacing the original potential by a suitably chosen harmonic oscillator potential. Physically this is motivated by the fact that potential barriers below the level of the ground-state energy of a quantum-mechanical system have little effect. Numerically, results are good, both for quantum-mechanical problems and for massive phi^4 field theory in 1+1 dimensions. 9 references, 6 figures
International Nuclear Information System (INIS)
Ebata, T.
1981-01-01
With an assumed weak multiplet structure for bosonic hadrons, which is consistent with the ΔI = 1/2 rule, it is shown that the strong interaction effective hamiltonian is compatible with the weak SU(2) x U(1) gauge transformation. In particular, the rho-meson transforms as a triplet under SU(2)_w, and this is the origin of the rho-photon analogy. It is also shown that the existence of a non-vanishing Cabibbo angle is a necessary condition for the absence of exotic hadrons. (orig.)
Bootstrapping in a language of thought: a formal model of numerical concept learning.
Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D
2012-05-01
In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning. Copyright © 2011 Elsevier B.V. All rights reserved.
Two Bootstrap Strategies for a k-Problem up to Location-Scale with Dependent Samples
Directory of Open Access Journals (Sweden)
Jean-François Quessy
2014-01-01
This paper extends the work of Quessy and Éthier (2012), who considered tests for the k-sample problem with dependent samples. Here, the marginal distributions are allowed, under H0, to differ according to their mean and their variance; in other words, one focuses on the shape of the distributions. Although easily stated, this problem nevertheless requires a careful treatment for the computation of valid P values. To this end, two bootstrap strategies based on the multiplier central limit theorem are proposed, both exploiting a representation of the test statistics in terms of a Hadamard differentiable functional. This accounts for the fact that one works with empirically standardized data instead of the original observations. The simulations reported show the good sampling properties of the method based on Cramér-von Mises and characteristic function type statistics. The newly introduced tests are illustrated on the marginal distributions of the eight-dimensional Oil currency data set.
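The multiplier central limit theorem idea, reweighting centered observations with i.i.d. standard normal multipliers instead of physically resampling them, can be sketched for a toy two-sample mean comparison. The paper's actual statistics are Cramér-von Mises and characteristic function type functionals of standardized data; this shows only the multiplier mechanism.

```python
import random

def multiplier_bootstrap_pvalue(x, y, n_mult=2000, seed=3):
    """Two-sample test sketch via the multiplier CLT: the null
    distribution of the mean difference is approximated by
    reweighting centered observations with N(0,1) multipliers."""
    rng = random.Random(seed)
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    stat = abs(mx - my)
    cx = [v - mx for v in x]
    cy = [v - my for v in y]
    exceed = 0
    for _ in range(n_mult):
        dx = sum(rng.gauss(0, 1) * v for v in cx) / nx
        dy = sum(rng.gauss(0, 1) * v for v in cy) / ny
        if abs(dx - dy) >= stat:
            exceed += 1
    return exceed / n_mult
```

No data are resampled: only cheap Gaussian weights are redrawn, which is what makes the strategy attractive for statistics of standardized data.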
A Bootstrapping Based Approach for Open Geo-entity Relation Extraction
Directory of Open Access Journals (Sweden)
YU Li
2016-05-01
Extracting spatial relations and semantic relations between two geo-entities from Web texts calls for robust and effective solutions. This paper puts forward a novel approach: firstly, the characteristics of terms (part-of-speech, position and distance) are analyzed by means of bootstrapping. Secondly, the weight of each term is calculated and the keywords are picked out as the clues to geo-entity relations. Thirdly, the geo-entity pairs and their keywords are organized into structured information. Finally, an experiment is conducted with Baidubaike and Stanford CoreNLP. The study shows that the presented method can automatically explore part of the lexical features and find additional relational terms, requiring neither domain expert knowledge nor large-scale corpora. Moreover, compared with three classical frequency statistics methods, namely Frequency, TF-IDF and PPMI, the precision and recall are improved by about 5% and 23%, respectively.
A bootstrapped PMHT with feature measurements and a new way to derive its information matrix
Lu, Qin; Domrese, Katherine; Willett, Peter; Bar-Shalom, Yaakov; Pattipati, Krishna
2017-05-01
The probabilistic multiple-hypothesis tracker (PMHT), a tracking algorithm of considerable theoretical elegance based on the expectation-maximization (EM) algorithm, is considered for the problem of multiple target tracking (MTT) with multiple sensors in clutter. Aside from position observations, continuous measurements associated with the unique and constant feature of each target are incorporated to jointly estimate the states and features of the targets for the sake of tracking and classification, leading to a bootstrapped implementation of the PMHT. In addition, we rederive the information matrix for the big state vector that stacks the states of all the targets at all the time steps during the observation time. Simulations have been conducted for both closely spaced and well-separated scenarios, with and without feature measurements. The normalized estimation error squared (NEES) calculated using the information matrix for both scenarios, with and without feature measurements, is within the 95% probability region. In other words, the estimates are consistent with the corresponding covariances.
The use of bootstrap resampling to assess the variability of Draize tissue scores.
Worth, A P; Cronin, M T
2001-01-01
The acute dermal and ocular effects of chemicals are generally assessed by performing the Draize skin and eye tests, respectively. Because the animal data obtained in these tests are also used for the development and validation of alternative methods for skin and eye irritation, it is important to assess the inherent variability of the animal data, since this variability places an upper limit on the predictive performance that can be expected of any alternative model. The statistical method of bootstrap resampling was used to estimate the variability arising from the use of different animals and time-points, and the estimates of variability were used to determine the maximal extent to which Draize test tissue scores can be predicted.
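A hedged sketch of the kind of resampling described: bootstrapping over animals (each carrying its per-time-point scores) to estimate the variability of the study-mean tissue score. The data layout and summary statistic are illustrative assumptions, not the paper's actual protocol.

```python
import random
import statistics

def draize_score_variability(scores_by_animal, n_boot=2000, seed=11):
    """Bootstrap over animals (rows of time-point tissue scores)
    to estimate mean and spread of the study-mean score."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        panel = [rng.choice(scores_by_animal) for _ in scores_by_animal]
        means.append(statistics.mean(s for animal in panel for s in animal))
    return statistics.mean(means), statistics.stdev(means)
```

The spread of the resampled means is the kind of inherent-variability estimate that bounds how well any alternative model could be expected to predict the animal data.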
Bootstrap-Calibrated Interval Estimates for Latent Variable Scores in Item Response Theory.
Liu, Yang; Yang, Ji Seung
2017-09-06
In most item response theory applications, model parameters need to be first calibrated from sample data. Latent variable (LV) scores calculated using estimated parameters are thus subject to sampling error inherited from the calibration stage. In this article, we propose a resampling-based method, namely bootstrap calibration (BC), to reduce the impact of the carryover sampling error on the interval estimates of LV scores. BC modifies the quantile of the plug-in posterior, i.e., the posterior distribution of the LV evaluated at the estimated model parameters, to better match the corresponding quantile of the true posterior, i.e., the posterior distribution evaluated at the true model parameters, over repeated sampling of calibration data. Furthermore, to achieve better coverage of the fixed true LV score, we explore the use of BC in conjunction with Jeffreys' prior. We investigate the finite-sample performance of BC via Monte Carlo simulations and apply it to two empirical data examples.
Beyond Crossing Fibers: Bootstrap Probabilistic Tractography Using Complex Subvoxel Fiber Geometries
Campbell, Jennifer S. W.; MomayyezSiahkal, Parya; Savadjiev, Peter; Leppert, Ilana R.; Siddiqi, Kaleem; Pike, G. Bruce
2014-01-01
Diffusion magnetic resonance imaging fiber tractography is a powerful tool for investigating human white matter connectivity in vivo. However, it is prone to false positive and false negative results, making interpretation of the tractography result difficult. Optimal tractography must begin with an accurate description of the subvoxel white matter fiber structure, include quantification of the uncertainty in the fiber directions obtained, and quantify the confidence in each reconstructed fiber tract. This paper presents a novel and comprehensive pipeline for fiber tractography that meets the above requirements. The subvoxel fiber geometry is described in detail using a technique that allows not only for straight crossing fibers but also for fibers that curve and splay. This technique is repeatedly performed within a residual bootstrap statistical process in order to efficiently quantify the uncertainty in the subvoxel geometries obtained. A robust connectivity index is defined to quantify the confidence in the reconstructed connections. The tractography pipeline is demonstrated in the human brain. PMID:25389414
Hypothesis Testing of Population Percentiles via the Wald Test with Bootstrap Variance Estimates
Johnson, William D.; Romer, Jacob E.
2016-01-01
Testing the equality of percentiles (quantiles) between populations is an effective method for robust, nonparametric comparison, especially when the distributions are asymmetric or irregularly shaped. Unlike global nonparametric tests for homogeneity such as the Kolmogorov-Smirnov test, testing the equality of a set of percentiles (i.e., a percentile profile) yields an estimate of the location and extent of the differences between the populations along the entire domain. The Wald test using bootstrap estimates of variance of the order statistics provides a unified method for hypothesis testing of functions of the population percentiles. Simulation studies are conducted to show the performance of the method under various scenarios and to give suggestions on its use. Several examples are given to illustrate some useful applications to real data. PMID:27034909
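A minimal sketch of the Wald statistic with a bootstrap variance estimate for a single percentile, using a simple order-statistic estimator and a chi-square(1) reference (3.84 at the 5% level). The paper's profile version tests a vector of percentiles jointly; this one-percentile version is an illustrative reduction.

```python
import random

def boot_var_percentile(data, p, n_boot=2000, rng=None):
    """Bootstrap variance of the p-th sample percentile
    (simple order-statistic estimator; deterministic default rng)."""
    rng = rng or random.Random(5)
    n = len(data)
    reps = []
    for _ in range(n_boot):
        s = sorted(rng.choice(data) for _ in range(n))
        reps.append(s[min(n - 1, int(p * n))])
    mean = sum(reps) / n_boot
    return sum((r - mean) ** 2 for r in reps) / (n_boot - 1)

def wald_test_percentile(x, y, p=0.5):
    """Wald statistic for H0: the p-th percentiles of two
    populations are equal; compare to chi-square(1)."""
    qx = sorted(x)[min(len(x) - 1, int(p * len(x)))]
    qy = sorted(y)[min(len(y) - 1, int(p * len(y)))]
    v = boot_var_percentile(x, p) + boot_var_percentile(y, p)
    return (qx - qy) ** 2 / v
```

Extending to a percentile profile means stacking several such differences and using the bootstrap covariance matrix of the order statistics in the quadratic form.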
Bootstrapping hypercubic and hypertetrahedral theories in three dimensions arXiv
Stergiou, Andreas
There are three generalizations of the Platonic solids that exist in all dimensions, namely the hypertetrahedron, the hypercube, and the hyperoctahedron, with the latter two being dual. Conformal field theories with the associated symmetry groups as global symmetries can be argued to exist in $d=3$ spacetime dimensions if the $\varepsilon=4-d$ expansion is valid when $\varepsilon\to1$. In this paper hypercubic and hypertetrahedral theories are studied with the non-perturbative numerical conformal bootstrap. In the $N=3$ cubic case it is found that a bound with a kink is saturated by a solution with properties that cannot be reconciled with the $\varepsilon$ expansion of the cubic theory. Possible implications for cubic magnets and structural phase transitions are discussed. For the hypertetrahedral theory evidence is found that the non-conformal window that is seen with the $\varepsilon$ expansion exists in $d=3$ as well, and a rough estimate of its extent is given.
Solid oxide fuel cell power plant having a bootstrap start-up system
Energy Technology Data Exchange (ETDEWEB)
Lines, Michael T
2016-10-04
The bootstrap start-up system (42) achieves an efficient start-up of the power plant (10) that minimizes formation of soot within a reformed hydrogen-rich fuel. A burner (48) receives un-reformed fuel directly from the fuel supply (30) and combusts the fuel to heat cathode air, which then heats an electrolyte (24) within the fuel cell (12). A dilute hydrogen-forming gas (68) cycles through a sealed heat-cycling loop (66) to transfer heat and generated steam from an anode side (32) of the electrolyte (24) through fuel processing system (36) components (38, 40) and back to an anode flow field (26) until the fuel processing system components (38, 40) achieve predetermined optimal temperatures and steam content. Then, the heat-cycling loop (66) is unsealed and the un-reformed fuel is admitted into the fuel processing system (36) and anode flow field (26) to commence ordinary operation of the power plant (10).
A bootstrapped, low-noise, and high-gain photodetector for shot noise measurement
Energy Technology Data Exchange (ETDEWEB)
Zhou, Haijun; Yang, Wenhai; Li, Zhixiu; Li, Xuefeng; Zheng, Yaohui, E-mail: yhzheng@sxu.edu.cn [State Key Laboratory of Quantum Optics and Quantum Optics Devices, Institute of Opto-Electronics, Shanxi University, Taiyuan 030006 (China)
2014-01-15
We present a low-noise, high-gain photodetector based on the bootstrap structure and the L-C (inductance and capacitance) combination. Electronic characteristics of the photodetector, including electronic noise, gain and frequency response, and dynamic range, were verified with a single-frequency Nd:YVO{sub 4} laser at 1064 nm with coherent output. The measured shot noise of a 50 μW laser beam was 13 dB above the electronic noise at an analysis frequency of 2 MHz, and 10 dB at 3 MHz. A maximum clearance of 28 dB at 2 MHz was achieved when 1.52 mW of laser power illuminated the detector. In addition, the photodetector showed excellent linearity for both DC and AC amplification over the laser power range between 12.5 μW and 1.52 mW.
Energy Technology Data Exchange (ETDEWEB)
Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bugbee, Bruce [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gotseff, Peter [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2017-10-03
Capturing the technical and economic impacts of solar photovoltaics (PV) and other distributed energy resources (DERs) on electric distribution systems can require high-time-resolution (e.g. 1 minute), long-duration (e.g. 1 year) simulations. However, such simulations can be computationally prohibitive, particularly when complex control schemes are included in quasi-steady-state time series (QSTS) simulation. Various approaches have been used in the literature to down-select representative time segments (e.g. days), but these are typically best suited to lower time resolutions or consider only a single data stream (e.g. PV production) for selection. We present a statistical approach that combines stratified sampling and bootstrapping to select representative days while also providing a simple method to reassemble annual results. We describe the approach in the context of a recent study with a utility partner. This approach enables much faster QSTS analysis by simulating only a subset of days, while maintaining accurate annual estimates.
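The selection-and-reassembly idea above can be sketched in a few lines of numpy. Everything here is an illustrative assumption, not the study's data or code: the seasonal strata, the number of days per stratum, and the synthetic daily PV energy profile are all stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily PV energy totals for one year (kWh); stand-in for real data.
days = np.arange(365)
daily_energy = 20 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 365)

# Stratify the year into four seasons and pick a few representative days each.
strata = np.array_split(days, 4)
chosen = [rng.choice(s, size=8, replace=False) for s in strata]

# Reassembled annual point estimate: scale each stratum's sample mean
# up to the stratum's full length.
point = sum(daily_energy[c].mean() * len(s) for c, s in zip(chosen, strata))

# Bootstrap the chosen days within each stratum to quantify the uncertainty
# introduced by simulating only a subset of days.
boot = []
for _ in range(1000):
    est = sum(daily_energy[rng.choice(c, size=len(c), replace=True)].mean() * len(s)
              for c, s in zip(chosen, strata))
    boot.append(est)
lo, hi = np.percentile(boot, [2.5, 97.5])

true_total = daily_energy.sum()   # known here only because the data are synthetic
```

In a real QSTS workflow, `daily_energy[c]` would be replaced by running the full power-flow simulation for each chosen day; the stratified scale-up and the bootstrap band carry over unchanged.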
Application of Robust Regression and Bootstrap in Productivity Analysis of GERD Variable in EU27
Directory of Open Access Journals (Sweden)
Dagmar Blatná
2014-06-01
Full Text Available The GERD is one of Europe 2020 headline indicators being tracked within the Europe 2020 strategy. The headline indicator is the 3% target for the GERD to be reached within the EU by 2020. Eurostat defines “GERD” as total gross domestic expenditure on research and experimental development in a percentage of GDP. GERD depends on numerous factors of a general economic background, namely of employment, innovation and research, science and technology. The values of these indicators vary among the European countries, and consequently the occurrence of outliers can be anticipated in corresponding analyses. In such a case, a classical statistical approach – the least squares method – can be highly unreliable, the robust regression methods representing an acceptable and useful tool. The aim of the present paper is to demonstrate the advantages of robust regression and applicability of the bootstrap approach in regression based on both classical and robust methods.
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from, all with their own formulas, tables, diagrams, and general properties, continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
International Nuclear Information System (INIS)
Li, Ke; Lin, Boqiang
2017-01-01
This paper proposes a total-factor energy consumption performance index (TEPI) for measuring China's energy efficiency across 30 provinces during the period 1997 to 2012. The TEPI is derived by solving an improved non-radial data envelopment analysis (DEA) model, which is based on an energy distance function. The production possibility set is constructed by combining the super-efficiency and sequential DEA models to avoid the “discriminating power problem” and “technical regress”. In order to explore the impacts of technological progress on TEPI and perform statistical inference on the results, a two-stage double bootstrap approach is adopted. The important findings are that China's energy technology innovation produces a negative effect on TEPI, while technology import and imitative innovation produce positive effects on TEPI. Thus, the main contributor to TEPI improvement is technology import. These conclusions imply that technology import, especially foreign direct investment (FDI), is important for imitative innovation and can improve China's energy efficiency. In the long run, as the technical level of China approaches the frontier, energy technology innovation and its wide adoption become a sustained way to improve energy efficiency. Therefore, it is urgent for China to introduce measures such as technology translation and spillover policies as well as energy pricing reforms to support energy technology innovation. - Highlights: • A total-factor energy consumption performance index (TEPI) is introduced. • Three types of technological progress have various effects on TEPI. • FDI is the main contributor of TEPI improvement. • An improved DEA calculation method is introduced. • A two-stage double-bootstrap non-radial DEA model is used.
On strongly condensing operators
Czech Academy of Sciences Publication Activity Database
Erzakova, N.A.; Väth, Martin
2017-01-01
Roč. 196, č. 1 (2017), s. 309-323 ISSN 0373-3114 Institutional support: RVO:67985840 Keywords : asymptotic derivative * compactness * Fréchet derivative Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.864, year: 2016 http://link.springer.com/article/10.1007%2Fs10231-016-0573-8
The Bootstrap Discovery Behaviour (BDB): a new outlook on usability evaluation.
Borsci, Simone; Londei, Alessandro; Federici, Stefano
2011-02-01
The value of λ is one of the main issues debated in international usability studies. The debate is centred on the deficiencies of the mathematical return on investment model (ROI model) of Nielsen and Landauer (1993). The ROI model is discussed in order to identify the basis of another model that, while respecting Nielsen and Landauer's approach, considers a larger number of variables when estimating the number of evaluators needed for an interface. Using the bootstrap model (Efron 1979), we can take into account: (a) the interface properties, as the properties at the zero condition of evaluation, and (b) the probability that the population discovery behaviour is represented by all the possible discovery behaviours of a sample. Our alternative model, named Bootstrap Discovery Behaviour (BDB), provides an alternative estimation of the number of experts and users needed for a usability evaluation. Two experimental groups of users and experts were involved in the evaluation of a website (http://www.serviziocivile.it). Applying the BDB model to the problems identified by the two groups, we found that 13 experts and 20 users are needed to identify 80% of usability problems, instead of the 6 experts and 7 users required according to the estimation of the discovery likelihood provided by the ROI model. A consequence of the difference between the results of these models is that following the BDB increases the costs of usability evaluation, although this is justified considering that the results obtained have the best probability of representing the entire population of experts and users.
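The core of a BDB-style calculation is bootstrapping evaluators from a problem-discovery matrix and asking how large a panel must be before the expected fraction of discovered problems reaches a target. A minimal sketch under stated assumptions: the discovery matrix, per-problem detection probabilities, and the 80% target are hypothetical, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical discovery matrix: rows = evaluators, columns = usability problems;
# entry 1 means that evaluator found that problem (stand-in for real study data).
n_eval, n_prob = 30, 25
discovery = (rng.random((n_eval, n_prob)) < rng.uniform(0.1, 0.6, n_prob)).astype(int)

def evaluators_needed(matrix, target=0.8, n_boot=300, rng=rng):
    """Smallest panel size whose mean bootstrap discovery rate reaches `target`."""
    n = matrix.shape[0]
    for panel in range(1, n + 1):
        rates = []
        for _ in range(n_boot):
            pick = rng.choice(n, size=panel, replace=True)   # resample evaluators
            found = matrix[pick].any(axis=0).mean()          # share of problems found
            rates.append(found)
        if np.mean(rates) >= target:
            return panel
    return n

needed = evaluators_needed(discovery)
```

Because the estimate is driven by the observed discovery matrix rather than a single fixed discovery likelihood, panels of different interfaces naturally yield different required sample sizes, which is the point of the BDB critique of the ROI model.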
Smoothed Bootstrap Aggregation for Assessing Selection Pressure at Amino Acid Sites.
Mingrone, Joseph; Susko, Edward; Bielawski, Joseph
2016-11-01
To detect positive selection at individual amino acid sites, most methods use an empirical Bayes approach. After parameters of a Markov process of codon evolution are estimated via maximum likelihood, they are passed to Bayes formula to compute the posterior probability that a site evolved under positive selection. A difficulty with this approach is that parameter estimates with large errors can negatively impact Bayesian classification. By assigning priors to some parameters, Bayes Empirical Bayes (BEB) mitigates this problem. However, as implemented, it imposes uniform priors, which causes it to be overly conservative in some cases. When standard regularity conditions are not met and parameter estimates are unstable, inference, even under BEB, can be negatively impacted. We present an alternative to BEB called smoothed bootstrap aggregation (SBA), which bootstraps site patterns from an alignment of protein coding DNA sequences to accommodate the uncertainty in the parameter estimates. We show that deriving the correction for parameter uncertainty from the data in hand, in combination with kernel smoothing techniques, improves site specific inference of positive selection. We compare BEB to SBA by simulation and real data analysis. Simulation results show that SBA balances accuracy and power at least as well as BEB, and when parameter estimates are unstable, the performance gap between BEB and SBA can widen in favor of SBA. SBA is applicable to a wide variety of other inference problems in molecular evolution. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
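The resample-then-smooth step at the heart of SBA can be illustrated without a codon model. In this toy sketch, per-site scores and a threshold stand in for the likelihood machinery of the paper, and the kernel bandwidth choice is my assumption; only the pattern (bootstrap site patterns, jitter each replicate estimate with a small Gaussian kernel, aggregate the calls) mirrors the method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for site-pattern data: per-site scores whose mean drives a
# "positive selection" call when it exceeds a threshold (illustrative only).
site_scores = rng.normal(0.3, 1.0, 200)
threshold = 0.0

def sba_call(scores, n_boot=500, bandwidth=None, rng=rng):
    """Smoothed bootstrap aggregation: resample sites, jitter the estimate
    with a small Gaussian kernel, and aggregate the per-replicate calls."""
    n = len(scores)
    if bandwidth is None:
        bandwidth = scores.std(ddof=1) / np.sqrt(n)  # kernel scale ~ standard error
    calls = []
    for _ in range(n_boot):
        boot = rng.choice(scores, size=n, replace=True)   # bootstrap site patterns
        est = boot.mean() + rng.normal(0.0, bandwidth)    # kernel smoothing step
        calls.append(est > threshold)
    return np.mean(calls)   # aggregated support for the "positive" call

support = sba_call(site_scores)
```

The aggregation over perturbed replicates is what dampens the effect of unstable parameter estimates; in the real method each replicate would involve re-estimating the codon-model parameters rather than a sample mean.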
Bootstrapping mixed correlators in the five dimensional critical O(N) models
Energy Technology Data Exchange (ETDEWEB)
Li, Zhijin; Su, Ning [George P. and Cynthia W. Mitchell Institute for Fundamental Physics and Astronomy,Texas A& M University, College Station, TX 77843 (United States)
2017-04-18
We use the conformal bootstrap approach to explore 5D CFTs with O(N) global symmetry, which contain N scalars ϕ{sub i} transforming as an O(N) vector. Specifically, we study multiple four-point correlators of the leading O(N) vector ϕ{sub i} and the O(N) singlet σ. The crossing symmetry of the four-point functions and the unitarity condition provide nontrivial constraints on the scaling dimensions (Δ{sub ϕ}, Δ{sub σ}) of ϕ{sub i} and σ. With reasonable assumptions on the gaps between the scaling dimensions of ϕ{sub i} (σ) and the next O(N) vector ϕ{sub i}{sup ′} (singlet σ{sup ′}) scalar, we are able to isolate the scaling dimensions (Δ{sub ϕ}, Δ{sub σ}) in small islands. In particular, for large N=500, the isolated region is highly consistent with the result obtained from the large N expansion. We also study the interacting O(N) CFTs for 1≤N≤100. Isolated regions on the (Δ{sub ϕ},Δ{sub σ}) plane are obtained using the conformal bootstrap program with a lower order of derivatives Λ; however, they disappear after increasing Λ. For N=100, no solution can be found with Λ=25 under the assumptions on the scaling dimensions of the next O(N) vector Δ{sub ϕ{sub i{sup ′}}}≥5.0 (singlet Δ{sub σ{sup ′}}≥3.3). These islands are expected to correspond to interacting but nonunitary O(N) CFTs. Our results suggest a lower bound on the critical value N{sub c}>100, below which the interacting O(N) CFTs become nonunitary.
BOBA FRET: bootstrap-based analysis of single-molecule FRET data.
Directory of Open Access Journals (Sweden)
Sebastian L B König
Full Text Available Time-binned single-molecule Förster resonance energy transfer (smFRET) experiments with surface-tethered nucleic acids or proteins make it possible to follow folding and catalysis of single molecules in real time. Due to the intrinsically low signal-to-noise ratio (SNR) in smFRET time traces, research over the past years has focused on the development of new methods to extract discrete states (conformations) from noisy data. However, limited observation time typically leads to pronounced cross-sample variability, i.e., single molecules display differences in the relative population of states and the corresponding conversion rates. Quantification of cross-sample variability is necessary to perform statistical testing in order to assess whether changes observed in response to an experimental parameter (metal ion concentration, the presence of a ligand, etc.) are significant. However, such hypothesis testing has been disregarded to date, precluding robust biological interpretation. Here, we address this problem by a bootstrap-based approach to estimate the experimental variability. Simulated time traces are presented to assess the robustness of the algorithm in conjunction with approaches commonly used in thermodynamic and kinetic analysis of time-binned smFRET data. Furthermore, a pair of functionally important sequences derived from the self-cleaving group II intron Sc.ai5γ (d3'EBS1/IBS1) is used as a model system. Through statistical hypothesis testing, divalent metal ions are shown to have a statistically significant effect on both thermodynamic and kinetic aspects of their interaction. The Matlab source code used for analysis (bootstrap-based analysis of smFRET data, BOBA FRET), as well as a graphical user interface, is available via http://www.aci.uzh.ch/rna/.
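The statistical core of the approach, bootstrapping molecules rather than frames so that cross-sample variability enters the error bars, can be sketched in Python (the paper's implementation is in Matlab). The per-molecule state occupancies and the two ion-concentration conditions below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical per-molecule occupancies of the folded state under two
# conditions (e.g. low vs high divalent ion concentration); one value
# per molecule, stand-ins for values extracted from smFRET traces.
low_mg  = rng.beta(2, 3, size=40)    # mean occupancy ~ 0.40
high_mg = rng.beta(4, 2, size=40)    # mean occupancy ~ 0.67

def boot_mean_diff(a, b, n_boot=5000, rng=rng):
    """Bootstrap molecules (not frames) to capture cross-sample variability,
    returning a percentile CI for the difference of mean state occupancies."""
    diffs = [rng.choice(b, len(b)).mean() - rng.choice(a, len(a)).mean()
             for _ in range(n_boot)]
    return np.percentile(diffs, [2.5, 97.5])

lo, hi = boot_mean_diff(low_mg, high_mg)
significant = not (lo <= 0.0 <= hi)   # effect "significant" if CI excludes zero
```

The same resampling unit (whole molecules) would be used for kinetic quantities such as transition rates.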
Karain, Wael I
2017-11-28
Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need to have large enough time windows to ensure good statistical quality for the recurrence complexity measures needed to detect the transitions.
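A stripped-down version of the baseline-plus-bootstrap idea: compute a recurrence measure per time window, bootstrap a baseline band from early windows assumed stationary, and flag windows that leave the band. The 1-D toy trajectory, window length, recurrence threshold ε, and the choice of recurrence rate as the sole RQA measure are all my simplifying assumptions; the real method works on full MD trajectories with several RQA complexity measures.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy trajectory standing in for a protein coordinate: quiet dynamics for the
# first 600 steps, then a higher-variance regime (the "transition").
x = np.concatenate([rng.normal(0, 1, 600), rng.normal(0, 3, 400)])

def recurrence_rate(w, eps=1.0):
    """Fraction of point pairs in the window closer than eps (a basic RQA measure)."""
    d = np.abs(w[:, None] - w[None, :])
    return (d < eps).mean()

windows = [x[i:i + 100] for i in range(0, len(x), 100)]   # 10 non-overlapping windows
rr = np.array([recurrence_rate(w) for w in windows])

# Bootstrap a "baseline" band from the early (assumed stationary) windows.
baseline = rr[:4]
boot = [rng.choice(baseline, size=4, replace=True).mean() for _ in range(1000)]
lo, hi = np.percentile(boot, [0.5, 99.5])

# Windows whose recurrence rate leaves the baseline band flag a transition.
transitions = np.where((rr < lo) | (rr > hi))[0]
```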
Cheng, Zhaohui; Cai, Miao; Tao, Hongbing; He, Zhifei; Lin, Xiaojun; Lin, Haifeng; Zuo, Yuling
2016-01-01
Objective: Township hospitals (THs) are important components of the three-tier rural healthcare system of China. However, the efficiency and productivity of THs have been questioned since the healthcare reform was implemented in 2009. The objective of this study is to analyse the efficiency and productivity changes in THs before and after the reform process. Setting and participants: A total of 48 sample THs were selected from the Xiaogan Prefecture in Hubei Province from 2008 to 2014. Outcome measures: First, bootstrapping data envelopment analysis (DEA) was performed to estimate the technical efficiency (TE), pure technical efficiency (PTE) and scale efficiency (SE) of the sample THs during the period. Second, the bootstrapping Malmquist productivity index was used to calculate the productivity changes over time. Results: The average TE, PTE and SE of the sample THs over the 7-year period were 0.5147, 0.6373 and 0.7080, respectively. The average TE and PTE increased from 2008 to 2012 but declined considerably after 2012. In general, the sample THs experienced a negative shift in productivity from 2008 to 2014. The negative change was 2.14%, which was attributed to a 23.89% decrease in technological changes (TC). The sample THs experienced a positive productivity shift from 2008 to 2012 but experienced deterioration from 2012 to 2014. Conclusions: There was considerable space for TE improvement in the sample THs since the average TE was relatively low. From 2008 to 2014, the sample THs experienced a decrease in productivity, and the adverse alteration in TC should be emphasised. In the context of healthcare reform, the factors that influence TE and productivity of THs are complex. Results suggest that numerous quantitative and qualitative studies are necessary to explore the reasons for the changes in TE and productivity. PMID:27836870
Dvali, Gia
2009-01-01
We show that whenever a 4-dimensional theory with N particle species emerges as a consistent low energy description of a 3-brane embedded in an asymptotically-flat (4+d)-dimensional space, the holographic scale of high-dimensional gravity sets the strong coupling scale of the 4D theory. This connection persists in the limit in which gravity can be consistently decoupled. We demonstrate this effect for orbifold planes, as well as for the solitonic branes and string theoretic D-branes. In all cases the emergence of a 4D strong coupling scale from bulk holography is a persistent phenomenon. The effect turns out to be insensitive even to such extreme deformations of the brane action that seemingly shield 4D theory from the bulk gravity effects. A well understood example of such deformation is given by large 4D Einstein term in the 3-brane action, which is known to suppress the strength of 5D gravity at short distances and change the 5D Newton's law into the four-dimensional one. Nevertheless, we observe that the ...
H. T. Schreuder; M. S. Williams
2000-01-01
In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
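The percentile-bootstrap-versus-classical-t comparison described above is easy to reproduce on synthetic data. The lognormal per-plot volumes below are a stand-in for the mapped forest population (the study also examined accelerated and bootstrap-t variants, which are omitted here).

```python
import numpy as np

rng = np.random.default_rng(8)

# Skewed stand-in for per-plot timber volume; small sample as in the study.
plots = rng.lognormal(mean=3.0, sigma=0.5, size=40)

# Percentile bootstrap 95% CI for the population mean.
boots = np.array([rng.choice(plots, 40, replace=True).mean() for _ in range(2000)])
perc_lo, perc_hi = np.percentile(boots, [2.5, 97.5])

# Classical t interval for comparison (t_{0.975, 39} ≈ 2.023).
se = plots.std(ddof=1) / np.sqrt(40)
t_lo, t_hi = plots.mean() - 2.023 * se, plots.mean() + 2.023 * se
```

With skewed plot data and n = 40, the percentile interval is typically asymmetric about the mean while the t interval is forced to be symmetric, which is the behavior such simulation comparisons probe.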
Wagstaff, David A.; Elek, Elvira; Kulis, Stephen; Marsiglia, Flavio
2009-01-01
A nonparametric bootstrap was used to obtain an interval estimate of Pearson's "r," and test the null hypothesis that there was no association between 5th grade students' positive substance use expectancies and their intentions to not use substances. The students were participating in a substance use prevention program in which the unit of…
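The interval estimate described above follows the standard pairs bootstrap for a correlation. A minimal sketch with simulated stand-ins for the expectancy and intention scores (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated stand-in for paired expectancy/intention scores (not study data).
n = 120
x = rng.normal(size=n)
y = 0.4 * x + rng.normal(size=n)   # population correlation ~ 0.37

def pearson_r(a, b):
    return np.corrcoef(a, b)[0, 1]

# Nonparametric bootstrap: resample (x, y) PAIRS, keeping each pair intact.
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boots.append(pearson_r(x[idx], y[idx]))

lo, hi = np.percentile(boots, [2.5, 97.5])   # percentile 95% CI for r
# "No association" is rejected at the 5% level when the interval excludes 0.
```

Resampling whole pairs (rather than x and y separately) is what preserves the association structure being tested.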
van de Schoot, Rens; Strohmeier, Dagmar
2011-01-01
In the present paper, the application of a parametric bootstrap procedure, as described by van de Schoot, Hoijtink, and Dekovic (2010), will be applied to demonstrate that a direct test of an informative hypothesis offers more informative results compared to testing traditional null hypotheses against catch-all rivals. Also, more power can be…
Kennet-Cohen, Tamar; Kleper, Dvir; Turvall, Elliot
2018-02-01
A frequent topic of psychological research is the estimation of the correlation between two variables from a sample that underwent a selection process based on a third variable. Due to indirect range restriction, the sample correlation is a biased estimator of the population correlation, and a correction formula is used. In the past, bootstrap standard error and confidence intervals for the corrected correlations were examined with normal data. The present study proposes a large-sample estimate (an analytic method) for the standard error, and a corresponding confidence interval for the corrected correlation. Monte Carlo simulation studies involving both normal and non-normal data were conducted to examine the empirical performance of the bootstrap and analytic methods. Results indicated that with both normal and non-normal data, the bootstrap standard error and confidence interval were generally accurate across simulation conditions (restricted sample size, selection ratio, and population correlations) and outperformed estimates of the analytic method. However, with certain combinations of distribution type and model conditions, the analytic method has an advantage, offering reasonable estimates of the standard error and confidence interval without resorting to the bootstrap procedure's computer-intensive approach. We provide SAS code for the simulation studies. © 2017 The British Psychological Society.
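The bootstrap part of this procedure can be sketched as follows. For simplicity the sketch uses the Thorndike Case II correction for direct range restriction rather than the indirect-restriction correction the study examines, and the selection scenario (truncation above the median of the selection variable) is my assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def correct_r(r, u):
    """Thorndike Case II correction for direct range restriction;
    u = unrestricted SD / restricted SD of the selection variable."""
    return r * u / np.sqrt(1 + r**2 * (u**2 - 1))

# Simulate selection: keep only cases with x above its median.
n = 2000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)      # population correlation ≈ 0.45
keep = x > np.median(x)
xr, yr = x[keep], y[keep]
u = x.std(ddof=1) / xr.std(ddof=1)

r_restricted = np.corrcoef(xr, yr)[0, 1]
r_corrected = correct_r(r_restricted, u)

# Bootstrap standard error of the corrected correlation:
# resample the restricted sample, re-correct each replicate.
m = len(xr)
boots = []
for _ in range(1000):
    idx = rng.integers(0, m, size=m)
    boots.append(correct_r(np.corrcoef(xr[idx], yr[idx])[0, 1], u))
se = np.std(boots, ddof=1)
```

The analytic (delta-method) estimate studied in the paper would replace the resampling loop with a closed-form standard error.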
Czech Academy of Sciences Publication Activity Database
Kyselý, Jan
2010-01-01
Roč. 101, 3-4 (2010), s. 345-361 ISSN 0177-798X R&D Projects: GA AV ČR KJB300420801 Institutional research plan: CEZ:AV0Z30420517 Keywords : bootstrap * extreme value analysis * confidence intervals * heavy-tailed distributions * precipitation amounts Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 1.684, year: 2010
Essid, Hedi; Ouellette, Pierre; Vigeant, Stephane
2010-01-01
The objective of this paper is to measure the efficiency of high schools in Tunisia. We use a statistical data envelopment analysis (DEA)-bootstrap approach with quasi-fixed inputs to estimate the precision of our measure. To do so, we developed a statistical model serving as the foundation of the data generation process (DGP). The DGP is…
Antonella Del Rosso
2016-01-01
Twenty years of designing, building and testing a number of innovative technologies, with the strong belief that the endeavour would lead to a historic breakthrough. The Bulletin publishes an abstract of the Courier’s interview with Barry Barish, one of the founding fathers of LIGO. The plots show the signals of gravitational waves detected by the twin LIGO observatories at Livingston, Louisiana, and Hanford, Washington. (Image: Caltech/MIT/LIGO Lab) On 11 February, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo collaborations published a historic paper in which they showed a gravitational signal emitted by the merger of two black holes. These results come after 20 years of hard work by a large collaboration of scientists operating the two LIGO observatories in the US. Barry Barish, Linde Professor of Physics, Emeritus at the California Institute of Technology and former Director of the Global Design Effort for the Internat...
Cutti, A G; Parel, I; Raggi, M; Petracci, E; Pellegrini, A; Accardo, A P; Sacchetti, R; Porcellini, G
2014-03-21
Quantitative motion analysis protocols have been developed to assess the coordination between scapula and humerus. However, the application of these protocols to test whether a subject's scapula resting position or pattern of coordination is "normal", is precluded by the unavailability of reference prediction intervals and bands, respectively. The aim of this study was to present such references for the "ISEO" protocol, by using the non-parametric Bootstrap approach and two parametric Gaussian methods (based on Student's T and Normal distributions). One hundred and eleven asymptomatic subjects were divided into three groups based on their age (18-30, 31-50, and 51-70). For each group, "monolateral" prediction bands and intervals were computed for the scapulo-humeral patterns and the scapula resting orientation, respectively. A fourth group included the 36 subjects (42 ± 13 year-old) for whom the scapulo-humeral coordination was measured bilaterally, and "differential" prediction bands and intervals were computed, which describe right-to-left side differences. Bootstrap and Gaussian methods were compared using cross-validation analyses, by evaluating the coverage probability in comparison to a 90% target. Results showed a mean coverage for Bootstrap from 86% to 90%, compared to 67-70% for parametric bands and 87-88% for parametric intervals. Bootstrap prediction bands showed a distinctive change in amplitude and mean pattern related to age, with an increase toward scapula retraction, lateral rotation and posterior tilt. In conclusion, Bootstrap ensures an optimal coverage and should be preferred over parametric methods. Moreover, the stratification of "monolateral" prediction bands and intervals by age appears relevant for the correct classification of patients. Copyright © 2014 Elsevier Ltd. All rights reserved.
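A simplified numeric illustration of the bootstrap-versus-Gaussian prediction-interval comparison: the "angle" sample below is a synthetic, deliberately skewed stand-in for one age group's scapula resting orientation, and only the basic percentile bootstrap is shown (not the band construction for full scapulo-humeral patterns).

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical scapula resting-rotation angles (degrees) for one age group;
# skewed on purpose, a regime where symmetric intervals can misbehave.
sample = rng.gamma(shape=2.0, scale=5.0, size=111) - 10.0

def bootstrap_pred_interval(data, level=0.90, n_boot=2000, rng=rng):
    """Nonparametric 90% prediction interval: average the tail percentiles
    over bootstrap resamples of the original subjects."""
    alpha = (1 - level) / 2 * 100
    lows, highs = [], []
    for _ in range(n_boot):
        boot = rng.choice(data, size=len(data), replace=True)
        lows.append(np.percentile(boot, alpha))
        highs.append(np.percentile(boot, 100 - alpha))
    return np.mean(lows), np.mean(highs)

def gaussian_pred_interval(data, z=1.645):   # 90% two-sided normal interval
    return data.mean() - z * data.std(ddof=1), data.mean() + z * data.std(ddof=1)

b_lo, b_hi = bootstrap_pred_interval(sample)
g_lo, g_hi = gaussian_pred_interval(sample)

# Empirical coverage on a large fresh draw from the same skewed population.
fresh = rng.gamma(2.0, 5.0, 20000) - 10.0
b_cov = np.mean((fresh >= b_lo) & (fresh <= b_hi))
g_cov = np.mean((fresh >= g_lo) & (fresh <= g_hi))
```

Cross-validated coverage against a nominal 90% target, as in the study, is what decides between the two constructions for a given population shape.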
Figueiras, Adolfo; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen
2005-10-01
In recent years a great number of studies have applied generalised additive models (GAMs) to time series data to estimate the short-term health effects of air pollution. Lately, however, it has been found that concurvity--the non-parametric analogue of multicollinearity--might lead to underestimation of the standard errors of the effects of independent variables. Underestimation of standard errors means that, for concurvity levels commonly present in the data, the risk of committing a type I error rises by over threefold. This study developed a conditional bootstrap methodology that consists of assuming that the outcome in any observation is conditional upon the values of the set of independent variables used. It then tested this procedure by means of a simulation study using a Poisson additive model. The response variable of this model is a function of an unobserved confounding variable (that introduces trend and seasonality), real black smoke data, and temperature. Scenarios were created with different coefficients and degrees of concurvity. The conditional bootstrap provides confidence intervals with coverage close to nominal (95%), irrespective of the degree of concurvity, the number of variables in the model, or the magnitude of the coefficient to be estimated (for example, for a concurvity of 0.85, bootstrap confidence interval coverage is 95% compared with 71% in the case of the asymptotic interval obtained directly with the S-Plus gam function). The bootstrap method avoids the problem of concurvity in time series studies of air pollution, and is easily generalised to non-linear dose-risk effects. All bootstrap calculations described in this paper can be performed using S-Plus gam.boot software.
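The "conditional" idea, holding the covariates fixed and resampling only the outcome from the fitted mean, can be shown with a plain Poisson GLM in place of a GAM (the smooth seasonal term is approximated here by a known sinusoid, and all data are simulated; this is not the paper's S-Plus code).

```python
import numpy as np

rng = np.random.default_rng(6)

def fit_poisson(X, y, n_iter=50):
    """Poisson GLM (log link) fitted by IRLS; returns the coefficient vector."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        W = mu                                  # IRLS weights for Poisson
        z = X @ beta + (y - mu) / mu            # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Simulated daily counts driven by pollution plus a seasonal term that is
# correlated with pollution (mimicking concurvity between predictors).
n = 365
t = np.arange(n)
pollution = 1.0 + 0.5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.2, n)
X = np.column_stack([np.ones(n), pollution, np.sin(2 * np.pi * t / 365)])
mu_true = np.exp(0.5 + 0.3 * pollution + 0.4 * X[:, 2])
y = rng.poisson(mu_true)

beta_hat = fit_poisson(X, y)

# Conditional bootstrap: keep X fixed, resample y* ~ Poisson(mu_hat),
# refit, and collect the pollution coefficient.
mu_hat = np.exp(X @ beta_hat)
boot = [fit_poisson(X, rng.poisson(mu_hat))[1] for _ in range(300)]
ci = np.percentile(boot, [2.5, 97.5])
```

The percentile interval of the refitted coefficients reflects the conditional (fixed-covariate) sampling variability, which is the quantity the asymptotic standard errors understate under concurvity.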
PREFACE: Strongly correlated electron systems
Saxena, Siddharth S.; Littlewood, P. B.
2012-07-01
make use of 'small' electrons packed to the highest possible density. These are by definition 'strongly correlated'. For example: good photovoltaics must be efficient optical absorbers, which means that photons will generate tightly bound electron-hole pairs (excitons) that must then be ionised at a heterointerface and transported to contacts; efficient solid state refrigeration depends on substantial entropy changes in a unit cell, with large local electrical or magnetic moments; efficient lighting is in a real sense the inverse of photovoltaics; the limit of an efficient battery is a supercapacitor employing mixed valent ions; fuel cells and solar to fuel conversion require us to understand electrochemistry on the scale of a single atom; and we already know that the only prospect for effective high temperature superconductivity involves strongly correlated materials. Even novel IT technologies are now seen to have value not just for novel function but also for efficiency. While strongly correlated electron systems continue to excite researchers and the public alike due to the fundamental science issues involved, it seems increasingly likely that support for the science will be leveraged by its impact on energy and sustainability. Contents: Strongly correlated electron systems (Siddharth S Saxena and P B Littlewood); Magnetism, f-electron localization and superconductivity in 122-type heavy-fermion metals (F Steglich, J Arndt, O Stockert, S Friedemann, M Brando, C Klingner, C Krellner, C Geibel, S Wirth, S Kirchner and Q Si); High energy pseudogap and its evolution with doping in Fe-based superconductors as revealed by optical spectroscopy (N L Wang, W Z Hu, Z G Chen, R H Yuan, G Li, G F Chen and T Xiang); Structural investigations on YbRh2Si2: from the atomic to the macroscopic length scale (S Wirth, S Ernst, R Cardoso-Gil, H Borrmann, S Seiro, C Krellner, C Geibel, S Kirchner, U Burkhardt, Y Grin and F Steglich); Confinement of chiral magnetic
Wickens, F
Our friend and colleague John Strong was cruelly taken from us by a brain tumour on Monday 31st July, a few days before his 65th birthday. John started his career working with a group from Westfield College, under the leadership of Ted Bellamy. He obtained his PhD and spent the early part of his career on experiments at Rutherford Appleton Laboratory (RAL), but after the early 1970s his research was focussed on experiments at CERN. Over the years he made a number of notable contributions to experiments at CERN: The Omega spectrometer adopted a system John had originally developed for experiments at RAL using vidicon cameras to record the sparks in the spark chambers; He contributed to the success of NA1 and NA7, where he became heavily involved in the electronic trigger systems; He was responsible for the second level trigger system for the ALEPH detector and spent five years leading a team that designed and built the system, which ran for twelve years with only minor interventions. Following ALEPH he tur...
Stirring Strongly Coupled Plasma
Fadafan, Kazem Bitaghsir; Rajagopal, Krishna; Wiedemann, Urs Achim
2009-01-01
We determine the energy it takes to move a test quark along a circle of radius L with angular frequency w through the strongly coupled plasma of N=4 supersymmetric Yang-Mills (SYM) theory. We find that for most values of L and w the energy deposited by stirring the plasma in this way is governed either by the drag force acting on a test quark moving through the plasma in a straight line with speed v=Lw or by the energy radiated by a quark in circular motion in the absence of any plasma, whichever is larger. There is a continuous crossover from the drag-dominated regime to the radiation-dominated regime. In the crossover regime we find evidence for significant destructive interference between energy loss due to drag and that due to radiation as if in vacuum. The rotating quark thus serves as a model system in which the relative strength of, and interplay between, two different mechanisms of parton energy loss is accessible via a controlled classical gravity calculation. We close by speculating on the implicati...
Strong-interaction nonuniversality
International Nuclear Information System (INIS)
Volkas, R.R.; Foot, R.; He, X.; Joshi, G.C.
1989-01-01
The universal QCD color theory is extended to an SU(3){sub 1} direct product SU(3){sub 2} direct product SU(3){sub 3} gauge theory, where quarks of the ith generation transform as triplets under SU(3){sub i} and singlets under the other two factors. The usual color group is then identified with the diagonal subgroup, which remains exact after symmetry breaking. The gauge bosons associated with the 16 broken generators then form two massive octets under ordinary color. The interactions between quarks and these heavy gluonlike particles are explicitly nonuniversal, and thus an exploration of their physical implications allows us to shed light on the fundamental issue of strong-interaction universality. Nonuniversality and weak flavor mixing are shown to generate heavy-gluon-induced flavor-changing neutral currents. The phenomenology of these processes is studied, as they provide the major experimental constraint on the extended theory. Three symmetry-breaking scenarios are presented. The first has color breaking occurring at the weak scale, while the second and third divorce the two scales. The third model has the interesting feature of radiatively induced off-diagonal Kobayashi-Maskawa matrix elements
International Nuclear Information System (INIS)
Konovalov, S.V.; Mikhailovskii, A.B.; Shirokov, M.S.; Ozeki, T.; Tsypin, V.S.
2005-01-01
A study is made of the suppression of neoclassical tearing modes in tokamaks under anomalous transverse transport conditions when the magnetic well effect predominates over the bootstrap drive. It is stressed that the corresponding effect, which is called the compound suppression effect, depends strongly on the profiles of the electron and ion temperature perturbations. Account is taken of the fact that the temperature profile can be established as a result of the competition between anomalous transverse heat transport, on the one hand, and longitudinal collisional heat transport, longitudinal heat convection, longitudinal inertial transport, and transport due to the rotation of magnetic islands, on the other hand. The role of geodesic effects is discussed. The cases of competition just mentioned are described by the model sets of reduced transport equations, which are called, respectively, collisional, convective, inertial, and rotational plasmophysical models. The magnetic well is calculated with allowance for geodesic effects. It is shown that, for strong anomalous heat transport conditions, the contribution of the magnetic well to the generalized Rutherford equation for the island width W is independent of W not only in the collisional model (which has been investigated earlier) but also in the convective and inertial models and depends very weakly (logarithmically) on W in the rotational model. It is this weak dependence that gives rise to the compound effect, which is the subject of the present study. A criterion for the stabilization of neoclassical tearing modes by the compound effect at an arbitrary level of the transverse heat transport by electrons and ions is derived and is analyzed for two cases: when the electron heat transport and ion heat transport are both strong, and when the electron heat transport is strong and the ion heat transport is weak
Al-Mudhafar, W. J.
2013-12-01
Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships used to estimate properties in non-cored intervals. It also helps to accurately identify the spatial facies distribution, to build an accurate reservoir model for optimal future reservoir performance. In this paper, facies estimation has been done through multinomial logistic regression (MLR) with respect to the well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. First, a robust sequential imputation algorithm was used to impute the missing data. This algorithm starts from a complete subset of the dataset and sequentially estimates the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix. The observation is then added to the complete data matrix and the algorithm continues with the next observation with missing values. MLR was chosen to estimate the maximum likelihood and minimize the standard error for the nonlinear relationships between facies and the core and log data. MLR is used to predict the probabilities of the different possible facies given each independent variable, by constructing a linear predictor function with a set of weights that are linearly combined with the independent variables using a dot product. A beta distribution of facies was taken as prior knowledge, and the resulting predicted probability (posterior) was estimated from MLR based on Bayes' theorem, which relates the predicted probability (posterior) to the conditional probability and the prior knowledge. To assess the statistical accuracy of the model, the bootstrap was carried out to estimate the extra-sample prediction error by randomly
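The bootstrap estimate of extra-sample prediction error described above can be sketched as follows. This is a minimal illustration with simulated two-feature data and a nearest-centroid stand-in classifier (not the MLR or the actual South Rumaila logs): each bootstrap model is trained on a resample and scored on the observations left out of that resample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: two log-derived features and a binary facies
# label. (The real study used gamma ray, density, porosity, etc.)
n = 200
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

def nearest_centroid_error(X_tr, y_tr, X_te, y_te):
    """Train a nearest-centroid classifier and return its test error rate."""
    c0 = X_tr[y_tr == 0].mean(axis=0)
    c1 = X_tr[y_tr == 1].mean(axis=0)
    pred = (np.linalg.norm(X_te - c1, axis=1)
            < np.linalg.norm(X_te - c0, axis=1)).astype(int)
    return np.mean(pred != y_te)

# Bootstrap estimate of extra-sample error: train on a resample, then
# evaluate on the observations absent from that resample (out-of-bag).
B = 200
oob_errors = []
for _ in range(B):
    idx = rng.integers(0, n, size=n)
    oob = np.setdiff1d(np.arange(n), idx)
    if oob.size == 0:
        continue
    oob_errors.append(nearest_centroid_error(X[idx], y[idx], X[oob], y[oob]))

extra_sample_error = float(np.mean(oob_errors))
print(f"bootstrap out-of-bag error estimate: {extra_sample_error:.3f}")
```

The out-of-bag convention avoids the optimism of scoring a model on its own training resample.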
A Bootstrapping Model of Frequency and Context Effects in Word Learning.
Kachergis, George; Yu, Chen; Shiffrin, Richard M
2017-04-01
Prior research has shown that people can learn many nouns (i.e., word-object mappings) from a short series of ambiguous situations containing multiple words and objects. For successful cross-situational learning, people must approximately track which words and referents co-occur most frequently. This study investigates the effects of allowing some word-referent pairs to appear more frequently than others, as is true in real-world learning environments. Surprisingly, high-frequency pairs are not always learned better, but can also boost learning of other pairs. Using a recent associative model (Kachergis, Yu, & Shiffrin, 2012), we explain how mixing pairs of different frequencies can bootstrap late learning of the low-frequency pairs based on early learning of higher frequency pairs. We also manipulate contextual diversity, the number of pairs a given pair appears with across training, since it is naturalistically confounded with frequency. The associative model has competing familiarity and uncertainty biases, and their interaction is able to capture the individual and combined effects of frequency and contextual diversity on human learning. Two other recent word-learning models do not account for the behavioral findings. Copyright © 2016 Cognitive Science Society, Inc.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emissions in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emissions in both bivariate and multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences, insensitive to the time span as well as the lag length used. The empirical results show that there is unidirectional causality running from energy consumption to carbon emissions in both the bivariate model and the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks.
Harris, Daniel R; Baus, Adam D; Harper, Tamela J; Jarrett, Traci D; Pollard, Cecil R; Talbert, Jeffery C
2016-08-01
We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites.
Lee, Soojeong; Rajan, Sreeraman; Jeon, Gwanggil; Chang, Joon-Hyuk; Dajani, Hilmi R; Groza, Voicu Z
2017-06-01
Blood pressure (BP) is one of the most important vital indicators and plays a key role in determining the cardiovascular activity of patients. This paper proposes a hybrid approach consisting of the nonparametric bootstrap (NPB) and machine learning techniques to obtain the characteristic ratios (CRs) used in the blood pressure estimation algorithm, to improve the accuracy of systolic blood pressure (SBP) and diastolic blood pressure (DBP) estimates and to obtain confidence intervals (CIs). The NPB technique is used to circumvent the requirement for a large sample set when obtaining the CIs. A mixture of Gaussian densities is assumed for the CRs, and a Gaussian mixture model (GMM) is chosen to estimate the SBP and DBP ratios. The K-means clustering technique is used to obtain the mixture order of the Gaussian densities. The proposed approach achieves grade "A" under the British Society of Hypertension testing protocol and is superior to the conventional approach based on the maximum amplitude algorithm (MAA) that uses fixed CRs. The proposed approach also yields a lower mean error (ME) and standard deviation of the error (SDE) in the estimates when compared to the conventional MAA method. In addition, CIs obtained through the proposed hybrid approach are also narrower, with a lower SDE. The proposed approach combining the NPB technique with the GMM provides a methodology to derive individualized characteristic ratios. The results show that the proposed approach enhances the accuracy of SBP and DBP estimation and provides narrower confidence intervals for the estimates. Copyright © 2015 Elsevier Ltd. All rights reserved.
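A minimal sketch of the nonparametric bootstrap step, using simulated characteristic ratios rather than real oscillometric data (the GMM and K-means components of the paper are omitted here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical characteristic ratios (CRs) from a small cohort; in the
# paper these come from oscillometric waveforms, here they are simulated.
cr = rng.normal(loc=0.55, scale=0.05, size=25)   # e.g. systolic ratios

# Nonparametric bootstrap: resample the small CR sample with replacement
# to approximate the sampling distribution of the mean CR, sidestepping
# the need for a large sample.
B = 2000
boot_means = np.array([rng.choice(cr, size=cr.size, replace=True).mean()
                       for _ in range(B)])

# Percentile confidence interval for the characteristic ratio.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean CR = {cr.mean():.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```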
Effect of voltage source internal resistance on the SQUID bootstrap circuit
Dong, Hui; Zhang, Guofeng; Wang, Yongliang; Zhang, Yi; Xie, Xiaoming; Krause, Hans-Joachim; Braginski, Alex I.; Offenhäusser, Andreas
2012-01-01
The voltage-biased SQUID bootstrap circuit (SBC) is suitable for achieving simple and low-noise direct readout of dc SQUIDs. In practice, an ideal voltage bias is difficult to realize because of non-zero internal resistance Rin of the bias voltage source. In order to clearly observe the influence of Rin on the SBC parameters (namely the flux-to-current transfer coefficient (∂I/∂Φ)SBC and the dynamic resistance Rd(SBC)) and the noise performance, we introduced an additional adjustable resistor Rad at room temperature to simulate a variable Rin between the SQUID and the preamplifier. We found that the measured SQUID flux noise does not rise, even though Rad increases significantly. This result demonstrates that a highly resistive connection can be inserted between the liquid-helium-cooled SQUID and the room-temperature readout electronics in the SBC scheme, thus reducing the conductive heat loss of the system. This work will be significant for developing multichannel SBC readout systems, e.g. for biomagnetism, and systems using SQUIDs as amplifiers, for example, in TES-array readout.
The lightcone bootstrap and the spectrum of the 3d Ising CFT
Energy Technology Data Exchange (ETDEWEB)
Simmons-Duffin, David [School of Natural Sciences, Institute for Advanced Study, Princeton, New Jersey 08540 (United States); Walter Burke Institute for Theoretical Physics, Caltech, Pasadena, California 91125 (United States)
2017-03-15
We compute numerically the dimensions and OPE coefficients of several operators in the 3d Ising CFT, and then try to reverse-engineer the solution to crossing symmetry analytically. Our key tool is a set of new techniques for computing infinite sums of SL(2,ℝ) conformal blocks. Using these techniques, we solve the lightcone bootstrap to all orders in an asymptotic expansion in large spin, and suggest a strategy for going beyond the large spin limit. We carry out the first steps of this strategy for the 3d Ising CFT, deriving analytic approximations for the dimensions and OPE coefficients of several infinite families of operators in terms of the initial data {Δ_σ,Δ_ϵ,f_σ_σ_ϵ,f_ϵ_ϵ_ϵ,c_T}. The analytic results agree with numerics to high precision for about 100 low-twist operators (correctly accounting for O(1) mixing effects between large-spin families). Plugging these results back into the crossing equations, we obtain approximate analytic constraints on the initial data.
Pseudo-Bootstrap Network Analysis-an Application in Functional Connectivity Fingerprinting.
Cheng, Hu; Li, Ao; Koenigsberger, Andrea A; Huang, Chunfeng; Wang, Yang; Sheng, Jinhua; Newman, Sharlene D
2017-01-01
Brain parcellation divides the brain's spatial domain into small regions, which are represented by nodes within the network analysis framework. While template-based parcellations are widely used, the parcels on the template do not necessarily match an individual's functional nodes. A new method is developed to overcome inconsistent network analysis results by bypassing the difficulty of parcellating the brain into functionally meaningful areas. First, roughly equal-sized parcellations are obtained. Second, these random parcellations are applied to individual subjects multiple times and a pseudo-bootstrap (PBS) of the network is obtained for statistical inference. It was found that the variation of mean global network metrics from PBS sampling is smaller than the inter-subject variation or the within-subject variation between two diffusion MRI scans. Using the mean global network metrics from PBS sampling, the intra-class correlation is always higher than the average obtained using a single random parcellation. As one application, the PBS method was tested on the Human Connectome Project resting-state dataset to identify individuals across scan sessions based on the mean functional connectivity (FC), a trivial network property that carries little information about the connectivity between nodes. An accuracy rate of ∼90% was achieved by simply finding the maximum correlation of the mean FC of PBS samples between two scan sessions.
International Nuclear Information System (INIS)
Tsukamoto, Megumi; Hatabu, Asuka; Takahashi, Yoshitake; Matsuda, Hiroshi; Okamoto, Kousuke; Yamashita, Noriyuki; Takagi, Tatsuya
2013-01-01
Many of the neurodegenerative diseases associated with a decrease in regional cerebral blood flow (rCBF) are untreatable, and the appropriate therapeutic strategy is to slow the progression of the disease. Therefore, it is important that a definitive diagnosis is made as soon as possible when such diseases are suspected. Diagnostic imaging methods, such as positron emission tomography (PET) and single-photon emission computed tomography (SPECT), play an important role in such a definitive diagnosis. Since several problems arise when evaluating these images visually, a procedure to evaluate them objectively is necessary, and studies of image analyses using statistical evaluations have been suggested. However, the assumed data distribution in a statistical procedure may occasionally be inappropriate. Therefore, to evaluate the decrease of rCBF, it is important to use a statistical procedure without assumptions about the data distribution. In this study, we propose a new procedure that uses nonparametric or smoothed bootstrap methods to calculate a standardized distribution of the Z-score without assumptions about the data distribution. To test whether the judgment of the proposed procedure is equivalent to that of an evaluation based on the Z-score with a fixed threshold, the procedure was applied to a sample data set whose size was large enough to be appropriate for the assumption of the Z-score. As a result, the evaluations of the proposed procedure were equivalent to that of an evaluation based on the Z-score. (author)
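The smoothed-bootstrap idea, resampling with a small kernel perturbation so that no parametric form is assumed for the data distribution, can be sketched as follows. The control values, patient value, and bandwidth rule below are illustrative assumptions, not the study's rCBF data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical regional-CBF values for a control group (arbitrary units);
# the true distribution need not be Gaussian, here it is deliberately skewed.
controls = rng.gamma(shape=4.0, scale=10.0, size=60)
patient_value = 18.0   # an observed rCBF value to be evaluated

# Smoothed bootstrap: resample the controls with replacement and add a small
# Gaussian kernel perturbation, so replicates are drawn from a kernel-density
# estimate rather than the raw empirical distribution.
B = 4000
h = 1.06 * controls.std(ddof=1) * controls.size ** (-1 / 5)  # Silverman's rule
resamples = rng.choice(controls, size=(B, controls.size), replace=True)
resamples = resamples + rng.normal(scale=h, size=resamples.shape)

# Distribution of the standardized score under resampling, used instead of
# assuming the Z-score is exactly N(0, 1).
z_boot = (patient_value - resamples.mean(axis=1)) / resamples.std(axis=1, ddof=1)
p_lower = float(np.mean(z_boot <= -2.0))  # how often the decrease looks extreme
print(f"fraction of bootstrap Z-scores below -2: {p_lower:.3f}")
```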
Chang, T. W.; Ide, S.
2017-12-01
Slip inversion using the empirical Green's function (EGF) method has the advantage of removing the complex path and site effects that are difficult to model theoretically. The method, which uses as the Green's function one "EGF event" smaller in magnitude by more than 1.5 units, is essentially an inversion highlighting the arrival times of the waveforms. In this study, inversions of very large earthquakes were conducted with far-field data, using the non-negative least-squares method and taking the EGF selection from Baltay et al. (2014). An objective way of screening station components is applied by evaluating the radiation pattern of the earthquakes at each station. To better estimate the model error due to the use of an empirical Green's function, which is also specific to the station selection, bootstrapping is applied to the station selection process, randomly selecting waveforms from P or SH components at various stations. This gives the average of inversion trials using different data components with different Green's functions, resulting in a smoothed model with the stable features of the individual results, without explicitly applying smoothing constraints. So far, the above method has been applied to the MW 8.8 2010 Maule, Chile, and the MW 9.0 2011 Tohoku-Oki, Japan earthquakes, both giving slip patterns comparable to previous studies, although slip is concentrated in very small regions with unreasonably large amounts of slip. These results should be considered an extreme case of concentrated slip, and further physical inference is necessary to understand the real rupture process.
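The bootstrapping-over-stations idea can be sketched with a toy linear inversion. All matrices here are simulated, and plain least squares with clipping stands in for the non-negative least-squares solver of the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear slip-inversion setup: d = G m + noise, with one Green's-function
# matrix per "station" (all names and sizes are illustrative only).
n_sub, n_sta = 12, 20          # subfaults, stations
m_true = np.zeros(n_sub)
m_true[4:7] = 1.0              # a concentrated slip patch
G = [rng.normal(size=(30, n_sub)) for _ in range(n_sta)]  # per-station kernels
d = [g @ m_true + rng.normal(scale=0.1, size=30) for g in G]

# Bootstrap over station selection: each trial solves the inversion using a
# random subset of stations; averaging the trials yields a smoothed model
# without an explicit smoothing constraint.
B = 100
models = []
for _ in range(B):
    pick = rng.choice(n_sta, size=n_sta // 2, replace=False)
    Gb = np.vstack([G[i] for i in pick])
    db = np.concatenate([d[i] for i in pick])
    m, *_ = np.linalg.lstsq(Gb, db, rcond=None)
    models.append(np.clip(m, 0.0, None))   # crude non-negativity

m_avg = np.mean(models, axis=0)
print("recovered slip peak at subfault", int(np.argmax(m_avg)))
```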
Estimates by bootstrap interval for time series forecasts obtained by theta model
Directory of Open Access Journals (Sweden)
Daniel Steffen
2017-03-01
Full Text Available In this work, an experimental computer program was developed in Matlab (version 7.1) for the univariate time-series forecasting method called Theta, together with an implementation of the computer-intensive resampling technique known as the bootstrap, to estimate a confidence interval for the point forecast obtained by this method. To solve this problem, an algorithm was built that uses Monte Carlo simulation to obtain the interval estimate for the forecasts. The Theta model presented in this work was very efficient in the M3 competition of Makridakis, where 3003 series were tested. It is based on the concept of modifying the local curvature of the time series through a coefficient theta (Θ). In its simplest approach, the time series is decomposed into two theta lines representing long-term and short-term components. The prediction is made by combining the forecasts obtained by fitting the lines produced by the theta decomposition. The MAPE errors obtained for the estimates confirm the favorable results of the method in the M3 competition, making it a good alternative for time-series forecasting.
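A minimal sketch of a simplified Theta forecast with a residual-bootstrap interval. The decomposition and smoothing details below are a simplified reading of the method (a linear-trend theta-zero line plus simple exponential smoothing of the theta-two line), and the series is simulated:

```python
import numpy as np

rng = np.random.default_rng(4)

def theta_forecast(y, h=1, alpha=0.3):
    """Simplified Theta method: average the extrapolated linear trend
    (theta = 0 line) with an SES forecast of the theta = 2 line."""
    t = np.arange(y.size)
    b, a = np.polyfit(t, y, 1)                 # theta = 0 line: a + b*t
    trend_fc = a + b * (y.size + np.arange(h))
    theta2 = 2 * y - (a + b * t)               # theta = 2 line
    level = theta2[0]
    for v in theta2[1:]:                       # simple exponential smoothing
        level = alpha * v + (1 - alpha) * level
    return 0.5 * (trend_fc + level)

# A toy trending series with noise.
t = np.arange(80)
y = 10 + 0.5 * t + rng.normal(scale=2.0, size=t.size)

point = theta_forecast(y, h=1)[0]

# Residual bootstrap: collect one-step forecast residuals, rebuild pseudo-series
# from resampled residuals, and refit to obtain an interval for the forecast.
fitted = np.array([theta_forecast(y[:i], h=1)[0] for i in range(20, y.size)])
resid = y[20:] - fitted
B = 500
boot = [theta_forecast(np.concatenate(
            [y[:20], fitted + rng.choice(resid, resid.size)]), h=1)[0]
        for _ in range(B)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"point forecast {point:.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
```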
Bootstrapping de-shadowing and self-calibration for scanning electron microscope photometric stereo
International Nuclear Information System (INIS)
Miyamoto, Atsushi; Chen, Deshan; Kaneko, Shun’ichi
2014-01-01
In this paper, we present a novel approach that addresses the blind reconstruction problem in scanning electron microscope (SEM) photometric stereo. Using only two observed images that suffer from shadowing effects, our method automatically calibrates the parameter and resolves shadowing errors for estimating an accurate three-dimensional (3D) shape and underlying shadowless images. We introduce a novel shadowing compensation model using image intensities for both cases of presence and absence of shadowing. With this model, the proposed de-shadowing algorithm iteratively compensates for image intensities and modifies the corresponding 3D surface. Besides de-shadowing, we introduce a practically useful self-calibration criterion by enforcing a good reconstruction. We show that incorrect parameters will engender significant distortions of 3D reconstructions in shadowed regions during the de-shadowing procedure. This motivated us to design the self-calibration criterion by utilizing shadowing to pursue the proper parameter that produces the best reconstruction with least distortions. As a result, we develop a bootstrapping approach for simultaneous de-shadowing and self-calibration in SEM photometric stereo. Extensive experiments on real image data demonstrate the effectiveness of our method. (paper)
Brandic, Ivona; Music, Dejan; Dustdar, Schahram
Nowadays, novel computing paradigms such as Cloud Computing are gaining more and more importance. In the case of Cloud Computing, users pay for the usage of computing power provided as a service. Beforehand, they can negotiate specific functional and non-functional requirements relevant for the application execution. However, providing computing power as a service poses various research challenges. On the one hand, dynamic, versatile, and adaptable services are required, which can cope with system failures and environmental changes. On the other hand, human interaction with the system should be minimized. In this chapter we present the first results in establishing adaptable, versatile, and dynamic services considering negotiation bootstrapping and service mediation, achieved in the context of the Foundations of Self-Governing ICT Infrastructures (FoSII) project. We discuss novel meta-negotiation and SLA mapping solutions for Cloud services, bridging the gap between current QoS models and Cloud middleware and representing important prerequisites for the establishment of autonomic Cloud services.
Design as bootstrapping. On the evolution of ICT networks in health care.
Hanseth, O; Aanestad, M
2003-01-01
This paper argues that, in addressing the major challenges related to telemedicine as networks enabling huge improvements in health services, we need to move beyond complexity and focus instead on the very nature of such networks. The results of this paper are based on an interpretive analysis of three case studies involving telemedicine: broadband networks in minimally invasive surgery, EDI infrastructures, and telemedicine in ambulances. The well-known concept of "critical mass" focuses on the number of users as a significant factor in network growth. We argue, however, that we should consider not only the size of the network but also the heterogeneity of its elements. In order to discuss heterogeneity along several dimensions, we find Granovetter's and Schelling's models of diversity in individual preferences helpful. In addition to the heterogeneity of the individual users, we discuss heterogeneity related to use areas and situations, to technologies, etc. The interdependencies and possible conflicts between these dimensions are discussed, and we suggest "bootstrapping" as a concept to guide the navigation/exploitation in/of these dimensions.
Directory of Open Access Journals (Sweden)
Campbell Michael J
2004-12-01
Full Text Available Health-Related Quality of Life (HRQoL) measures are becoming increasingly used in clinical trials as primary outcome measures. Investigators are now asking statisticians for advice on how to analyse studies that have used HRQoL outcomes. HRQoL outcomes, like the SF-36, are usually measured on an ordinal scale. However, most investigators assume that there exists an underlying continuous latent variable that measures HRQoL, and that the actual measured outcomes (the ordered categories) reflect contiguous intervals along this continuum. The ordinal scaling of HRQoL measures means they tend to generate data that have discrete, bounded and skewed distributions. Thus, standard methods of analysis such as the t-test and linear regression that assume Normality and constant variance may not be appropriate. For this reason, conventional statistical advice would suggest that non-parametric methods be used to analyse HRQoL data. The bootstrap is one such computer-intensive non-parametric method for analysing data. We used the bootstrap for hypothesis testing and the estimation of standard errors and confidence intervals for parameters, in four datasets (which illustrate the different aspects of study design). We then compared and contrasted the bootstrap with standard methods of analysing HRQoL outcomes. The standard methods included t-tests, linear regression, summary measures and General Linear Models. Overall, in the datasets we studied, using the SF-36 outcome, bootstrap methods produce results similar to conventional statistical methods. This is likely because the t-test and linear regression are robust to the violations of assumptions that HRQoL data are likely to cause (i.e. non-Normality). While particular to our datasets, these findings are likely to generalise to other HRQoL outcomes, which have discrete, bounded and skewed distributions. Future research with other HRQoL outcome measures, interventions and populations, is required to
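A minimal sketch of the bootstrap alternative to the t-test for skewed, bounded outcome scores; the two trial arms below are simulated, not SF-36 data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated SF-36-like scores for two trial arms: discrete, bounded at 100,
# and skewed (values and sample sizes are illustrative only).
a = np.minimum(100, np.round(rng.gamma(8, 9, size=70)))
b = np.minimum(100, np.round(rng.gamma(8, 10, size=70)))

# Bootstrap the difference in means: resample each arm independently,
# with replacement, and recompute the difference each time.
B = 5000
diffs = np.array([
    rng.choice(b, b.size).mean() - rng.choice(a, a.size).mean()
    for _ in range(B)
])
lo, hi = np.percentile(diffs, [2.5, 97.5])
observed = b.mean() - a.mean()
print(f"difference {observed:.2f}, bootstrap 95% CI ({lo:.2f}, {hi:.2f})")
# If the CI excludes 0, a two-sided test at the 5% level rejects equality,
# without assuming Normality or constant variance.
```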
Collins, Jon W; Heyward Hull, J; Dumond, Julie B
2017-12-01
Sparse tissue sampling with intensive plasma sampling creates a unique data analysis problem in determining drug exposure in clinically relevant tissues. Tissue exposure may govern drug efficacy, as many drugs exert their actions in tissues. We compared tissue area-under-the-curve (AUC) generated from bootstrapped noncompartmental analysis (NCA) methods and compartmental nonlinear mixed effect (NLME) modeling. A model of observed data after single-dose tenofovir disoproxil fumarate was used to simulate plasma and tissue concentrations for two destructive tissue sampling schemes. Two groups of 100 data sets with densely-sampled plasma and one tissue sample per individual were created. The bootstrapped NCA (SAS 9.3) used a trapezoidal method to calculate geometric mean tissue AUC per dataset. For NLME, individual post hoc estimates of tissue AUC were determined, and the geometric mean from each dataset calculated. Median normalized prediction error (NPE) and absolute normalized prediction error (ANPE) were calculated for each method from the true values of the modeled concentrations. Both methods produced similar tissue AUC estimates close to true values. Although the NLME-generated AUC estimates had larger NPEs, it had smaller ANPEs. Overall, NLME NPEs showed AUC under-prediction but improved precision and fewer outliers. The bootstrapped NCA method produced more accurate estimates but with some NPEs > 100%. In general, NLME is preferred, as it accommodates less intensive tissue sampling with reasonable results, and provides simulation capabilities for optimizing tissue distribution. However, if the main goal is an accurate AUC for the studied scenario, and relatively intense tissue sampling is feasible, the NCA bootstrap method is a reasonable, and potentially less time-intensive solution.
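The bootstrapped-NCA idea, resampling subjects within each destructive-sampling time point and recomputing a trapezoidal AUC, can be sketched as follows (the concentration data are simulated, not tenofovir measurements):

```python
import numpy as np

rng = np.random.default_rng(6)

# One tissue concentration per subject (destructive sampling): each subject
# contributes a single time point, so the AUC is assembled across subjects.
times = np.repeat([1.0, 2.0, 4.0, 8.0, 24.0], 6)                # hours
conc = 50 * np.exp(-0.15 * times) * rng.lognormal(0, 0.3, times.size)

def nca_auc(t, c):
    """Trapezoidal AUC after averaging concentrations at each time point."""
    tu = np.unique(t)
    cu = np.array([c[t == x].mean() for x in tu])
    return float(np.sum(0.5 * (cu[1:] + cu[:-1]) * np.diff(tu)))

# Bootstrap: resample subjects within each time point, recompute the AUC,
# and summarize with the geometric mean as in the NCA approach.
B = 1000
aucs = []
for _ in range(B):
    idx = np.concatenate([rng.choice(np.where(times == x)[0], 6)
                          for x in np.unique(times)])
    aucs.append(nca_auc(times[idx], conc[idx]))
geo_mean = float(np.exp(np.mean(np.log(aucs))))
print(f"bootstrap geometric-mean tissue AUC: {geo_mean:.1f}")
```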
DJANSENA, Alradix; 田中, 宏明; 工藤, 亮
2015-01-01
CFRP has been used in aircraft structures for decades. Although CFRP is light, its lamination is its main weakness. We have developed a new method to increase the probability of detecting delamination in carbon fiber reinforced plastic (CFRP) by narrowing the confidence interval of the changes in natural frequency. The changes in the natural frequency in delaminated CFRP are tiny compared with measurement errors. We use the bootstrap method, a statistical technique that increases the estimation ac...
Directory of Open Access Journals (Sweden)
Enrico Zio
2008-01-01
Full Text Available In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to the order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in a RBMK-1500 nuclear reactor.
Palazón-Bru, Antonio; Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion of the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentiles method, we found wide variability in the prevalence of the different phenotypes (3-100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35-3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease.
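The percentile-bootstrap confidence interval for a phenotype prevalence can be sketched as follows; the 36 indicator values are simulated stand-ins, not the published case reports:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical phenotype indicators for 36 reported cases (1 = present);
# the real data come from published 15q24 microdeletion case reports.
phenotype = (rng.random(36) < 0.6).astype(int)

# Percentile bootstrap: resample the 36 cases with replacement and record
# the prevalence each time.
B = 5000
prev = np.array([rng.choice(phenotype, phenotype.size).mean()
                 for _ in range(B)])
lo, hi = np.percentile(prev, [2.5, 97.5])
print(f"prevalence {phenotype.mean():.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```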
Gudicha, Dereje W; Schmittmann, Verena D; Tekle, Fetene B; Vermunt, Jeroen K
2016-01-01
The latent Markov (LM) model is a popular method for identifying distinct unobserved states and transitions between these states over time in longitudinally observed responses. The bootstrap likelihood-ratio (BLR) test yields the most rigorous test for determining the number of latent states, yet little is known about power analysis for this test. Power could be computed as the proportion of the bootstrap p values (PBP) for which the null hypothesis is rejected. This requires performing the full bootstrap procedure for a large number of samples generated from the model under the alternative hypothesis, which is computationally infeasible in most situations. This article presents a computationally feasible shortcut method for power computation for the BLR test. The shortcut method involves the following simple steps: (1) obtaining the parameters of the model under the null hypothesis, (2) constructing the empirical distributions of the likelihood ratio under the null and alternative hypotheses via Monte Carlo simulations, and (3) using these empirical distributions to compute the power. We evaluate the performance of the shortcut method by comparing it to the PBP method and, moreover, show how the shortcut method can be used for sample-size determination.
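The shortcut power method, building empirical likelihood-ratio distributions under the null and alternative by Monte Carlo and reading the power off them, can be sketched with a simple Gaussian mean test standing in for the latent Markov model:

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative setup: H0: mu = 0 versus H1: mu = 0.5, with sigma = 1 known
# and n = 50 (a stand-in for the latent Markov model comparison).
n, mu1, B = 50, 0.5, 4000

def lr_stat(x, mu0=0.0):
    """-2 log likelihood ratio for H0: mu = mu0 against the MLE mean."""
    return n * (x.mean() - mu0) ** 2

# Step 2 of the shortcut: empirical LR distributions under H0 and H1 via
# Monte Carlo simulation, with no nested bootstrap required.
lr_h0 = np.array([lr_stat(rng.normal(0.0, 1, n)) for _ in range(B)])
lr_h1 = np.array([lr_stat(rng.normal(mu1, 1, n)) for _ in range(B)])

# Step 3: power = P(LR under H1 exceeds the H0 critical value).
crit = np.quantile(lr_h0, 0.95)
power = float(np.mean(lr_h1 > crit))
print(f"estimated power: {power:.2f}")
```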
Chen, Meng; Liao, Congyu; Chen, Song; Ding, Qiuping; Zhu, Darong; Liu, Hui; Yan, Xu; Zhong, Jianhui
2017-06-01
The aim of this work is to quantify individual and regional differences in the relative concentration of gamma-aminobutyric acid (GABA) in human brain with in vivo magnetic resonance spectroscopy. Spectral editing Mescher-Garwood point resolved spectroscopy (MEGA-PRESS) sequence and GABA analysis toolkit (Gannet) were used to detect and quantify GABA in anterior cingulate cortex (ACC) and occipital cortex (OCC) of healthy volunteers. Residual bootstrap, a model-based statistical analysis technique, was applied to resample the fitting residuals of GABA from the Gaussian fitting model (referred to as GABA + thereafter) in both individual and group data of ACC and OCC. The inter-subject coefficient of variation (CV) of GABA + in OCC (20.66 %) and ACC (12.55 %) with residual bootstrap was lower than that of a standard Gaussian model analysis (21.58 % and 16.73 % for OCC and ACC, respectively). The intra-subject uncertainty and CV of OCC were lower than that of ACC in both analyses. The residual bootstrap analysis thus provides a more robust uncertainty estimation of individual and group GABA + detection in different brain regions, which may be useful in our understanding of GABA biochemistry in brain and its use for the diagnosis of related neuropsychiatric diseases.
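A residual-bootstrap sketch on a toy spectral fit: resampled fitting residuals are added back to the fitted curve and the model is refit, yielding a coefficient of variation for the estimated amplitude. The Gaussian line shape and noise level are illustrative assumptions, not MEGA-PRESS data:

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy spectral peak: a Gaussian line on a flat baseline, standing in for a
# GABA+ resonance (amplitude and axis are illustrative only).
x = np.linspace(-1, 1, 200)
g = np.exp(-x**2 / 0.05)
y = 3.0 * g + rng.normal(scale=0.3, size=x.size)

# Linear-in-amplitude model: y = amp * g(x) + baseline.
A = np.column_stack([g, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
fit = A @ coef
resid = y - fit

# Residual bootstrap: add resampled residuals to the fitted curve and refit,
# estimating amplitude uncertainty without assuming a noise distribution.
B = 1000
amps = []
for _ in range(B):
    yb = fit + rng.choice(resid, resid.size, replace=True)
    cb, *_ = np.linalg.lstsq(A, yb, rcond=None)
    amps.append(cb[0])
amps = np.array(amps)
cv = 100 * amps.std(ddof=1) / amps.mean()
print(f"amplitude {coef[0]:.2f}, bootstrap CV {cv:.1f}%")
```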
Lukin, Vladimir V.; Abramov, Sergey K.; Vozel, Benoit; Chehdi, Kacem
2005-10-01
Multichannel (multispectral) remote sensing (MRS) is widely used for various applications nowadays. However, original images are commonly corrupted by noise and other distortions. This prevents reliable retrieval of useful information from remote sensing data. Because of this, image pre-filtering and/or reconstruction are typical stages of multichannel image processing, and the majority of modern efficient methods for image pre-processing require the availability of a priori information concerning the noise type and its statistical characteristics. Thus, there is a great need for automatic blind methods for determining the noise type and its characteristics. However, almost all such methods fail to perform appropriately well if an image under consideration contains a large percentage of texture regions, details and edges. In this paper we demonstrate that by applying the bootstrap it is possible to obtain rather accurate estimates of the noise variance that can be used either as final or as preliminary ones. Different quantiles (order statistics) are used as initial estimates of the mode location for the distribution of local noise-variance estimates, and then the bootstrap is applied for their joint analysis. To further improve the accuracy of the noise variance estimates, it is proposed, under a certain condition, to apply a myriad operation with tunable parameter k set in accordance with the preliminary estimate obtained by the bootstrap. Numerical simulation results confirm the applicability of the proposed approach and produce data allowing evaluation of the method's accuracy.
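The quantile-plus-bootstrap noise-variance estimation can be sketched as follows; the image, block size, and quantile choice are illustrative assumptions rather than the paper's exact settings, and the myriad refinement step is omitted:

```python
import numpy as np

rng = np.random.default_rng(10)

# A toy "image": smooth texture plus additive Gaussian noise of known sigma.
sigma_true = 5.0
base = np.outer(np.sin(np.linspace(0, 3, 64)),
                np.cos(np.linspace(0, 3, 64))) * 20
img = base + rng.normal(scale=sigma_true, size=base.shape)

# Local variance estimates in 8x8 blocks; blocks containing texture or edges
# are biased upward, so a low-order statistic of the collection is a more
# robust estimate of the mode than the mean.
est = []
for i in range(0, 64, 8):
    for j in range(0, 64, 8):
        est.append(img[i:i+8, j:j+8].var(ddof=1))
est = np.array(est)

# Bootstrap the chosen order statistic (here the 0.2-quantile) to assess
# the variability of the preliminary noise-variance estimate.
B = 2000
q_boot = np.array([np.quantile(rng.choice(est, est.size), 0.2)
                   for _ in range(B)])
var_hat = float(np.median(q_boot))
print(f"noise variance estimate: {var_hat:.1f} (true {sigma_true**2:.1f})")
```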
Li, Hao; Dong, Siping
2015-01-01
China has long been stuck in applying traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we have introduced the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and tried to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making. © The Author(s) 2015.
A Note on Strongly Dense Matrices
Czech Academy of Sciences Publication Activity Database
Fiedler, Miroslav; Hall, F.J.
2015-01-01
Roč. 1, č. 4 (2015), s. 721-730 ISSN 2199-675X Institutional support: RVO:67985807 Keywords : strongly dense matrix * Boolean matrix * nonnegative matrix * idempotent matrix * intrinsic product * generalized complementary basic matrix Subject RIV: BA - General Mathematics
Directory of Open Access Journals (Sweden)
Prescott C. Ensign
2016-01-01
This case study chronicles the timeline of a new venture – Keenga Research. Keenga Research has a novel proposition that it is seeking to introduce to the market. The business concept is to ask entrepreneurs to review the venture capital (VC) firm that funded them. Reviews of VC firms would then be developed and marketed to those interested (funds and perhaps enterprises seeking funding). What makes this case unique is that Keenga Research was a lean start-up. Bootstrapping is a situation in which the entrepreneur chooses to fund the venture with his/her own personal resources. It involves self-funding (family and friends), tight monitoring of expenses, and maintaining control of ownership and management (Winborg & Landstrom, 2001; Perry, Chandler, Yao, & Wolff, 2011; Winborg, 2015). The lean start-up approach favors experimentation over elaborate planning, customer feedback over intuition, and iterative design over traditional big upfront research and development. This case study requires the reader to consider a number of the basic challenges facing all entrepreneurs and new ventures. Is the concept marketable? Can the concept be developed and brought to market in a timely manner? Will the product generate revenue? How? When? What are the commitments of the entrepreneurs? Have they considered the major challenges to be faced? Since this venture involved gathering and developing research information and then creating an online platform, Keenga Research faced significant concept-to-market challenges. The research method used in this case study is first-person participant observation and interviews. One of the authors was a team member, so the contextual details come from direct observation and first-hand knowledge. This method of research is often used in anthropology, sociology, and social psychology, where an investigator studies the group by sharing in its activities. The other author provided an objective and conceptual perspective for analyzing
Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec
2008-07-01
Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatial variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low-activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low-activation factors, and when no corrections were included in the reconstruction algorithm.
Chou, Chih-Chin; Robb, Jayci Lynn; Clay, Matthew Christopher; Chronister, Julie Ann
2013-01-01
In this study, 51 individuals from online substance abuse support groups were surveyed to investigate the mediating role of social support on the relationship between internalized stigma and coping. Regression and bootstrapping were conducted to perform mediation analysis. Findings suggest that social support mediates the negative impact of…
Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudhashree
2015-01-01
In 2004, the largest HIV prevention project (Avahan) conducted globally was implemented in India. Avahan was implemented by NGOs supported by state lead partners in order to provide HIV prevention services to high-risk population groups. In 2007, most of the NGOs reached full coverage. Using a panel data set of the NGOs that implemented Avahan, we investigate the level of technical efficiency as well as the drivers of technical inefficiency by using the double bootstrap procedure developed by Simar & Wilson (2007). Unlike the two-stage traditional method, this method allows valid inference in the presence of measurement error and serial correlation. We find that over the 4 years, Avahan NGOs could have reduced the level of inputs by 43% given the level of outputs reached. We find that efficiency of the project has increased over time. Results indicate that main drivers of inefficiency come from the characteristics of the state lead partner, the NGOs and the catchment area. These organisational factors are important to explicitly consider and assess when designing and implementing HIV prevention programmes and in setting benchmarks in order to optimise the use and allocation of resources. C14, I1.
EDITORIAL: Strongly correlated electron systems Strongly correlated electron systems
Ronning, Filip; Batista, Cristian
2011-03-01
during SCES 2010. As we learned, past dogmas about strongly correlated materials and phenomena must be re-examined with an open and inquisitive mind. Invited speakers and respected leaders in the field were invited to contribute to this special issue and we have insisted that they present new data, ideas, or perspectives, as opposed to simply an overview of their past work. As with the conference, this special issue touches upon recent developments of strongly correlated electron systems in d-electron materials, such as Sr3Ru2O7, graphene, and the new Fe-based superconductors, but it is dominated by topics in f-electron compounds. Contributions reflect the growing appreciation for the influence of disorder and frustration, the need for organizing principles, as well as detailed investigations on particular materials of interest and, of course, new materials. As this special issue could not possibly capture the full breadth and depth that the conference had to offer, it is being published simultaneously with an issue of Journal of Physics: Conference Series containing 157 manuscripts in which all poster presenters at SCES 2010 were invited to contribute. Since this special issue grew out of the 2010 SCES conference, we take this opportunity to give thanks. This conference would not have been possible without the hard work of the SCES 2010 Program Committee, International and National Advisory Committees, Local Committee, and conference organizers, the New Mexico Consortium. We thank them as well as those organizations that generously provided financial support: ICAM-I2CAM, Quantum Design, Lakeshore, the National High Magnetic Field Laboratory and the Department of Energy National Laboratories at Argonne, Berkeley, Brookhaven, Los Alamos and Oak Ridge. Of course, we especially thank the participants for bringing new ideas and new results, without which SCES 2010 would not have been possible. Strongly correlated electron systems contents Spin-orbit coupling and k
Promoting Strong ISO 50001 Outcomes with Supportive National Infrastructure:
McKane, Aimee, T.; Siciliano, Graziella; de los Reyes, Pamela
2015-01-01
The ISO 50001 standard is a key mechanism for reducing greenhouse gas emissions and improving energy efficiency globally. An increasing number of companies are seeking certification, creating the need for personnel that are competent to conduct ISO 50001 certification audits. The growth of ISO 50001 is expected to accelerate as more companies integrate ISO 50001 into their corporate sustainability strategies and supplier requirements. Robust implementation of ISO 50001 represents an impo...
South Korea: strong infrastructure to support nation's needs
International Nuclear Information System (INIS)
Hayes, David.
1995-01-01
A brief report is given on the development of the natural gas market in South Korea. The country is increasingly turning to imported LNG due to the phasing out of dirtier fuels by stricter planning regulations. Topics covered include gas terminals, gas-fired power stations and gas distribution systems. (UK)
Quantum electrodynamics of strong fields
International Nuclear Information System (INIS)
Greiner, W.
1983-01-01
Quantum Electrodynamics of Strong Fields provides a broad survey of the theoretical and experimental work accomplished, presenting papers by a group of international researchers who have made significant contributions to this developing area. Exploring the quantum theory of strong fields, the volume focuses on the phase transition to a charged vacuum in strong electric fields. The contributors also discuss such related topics as QED at short distances, precision tests of QED, nonperturbative QCD and confinement, pion condensation, and strong gravitational fields. In addition, the volume features a historical paper on the roots of quantum field theory in the history of quantum physics by noted researcher Friedrich Hund.
Verifying interpretive criteria for bioaerosol data using (bootstrap) Monte Carlo techniques.
Spicer, R Christopher; Gangloff, Harry
2008-02-01
A number of interpretive descriptors have been proposed for bioaerosol data due to the lack of health-based numerical standards, but very few have been verified as to their ability to describe a suspect indoor environment. Culturable and nonculturable (spore trap) sampling, analyzed using the bootstrap version of Monte Carlo simulation (BMC), at several sites during 2003-2006 served as a source of indoor and outdoor data to test various criteria with regard to their variability in characterizing an indoor or outdoor environment. The purpose was to gain some insight into the reliability of some of the interpretive criteria in use, as well as to demonstrate the utility of BMC methods as a generalized technique for validation of various interpretive criteria for bioaerosols. The ratio of nonphylloplane (NP) fungi (total of Aspergillus and Penicillium) to phylloplane (P) fungi (total of Cladosporium, Alternaria, and Epicoccum), or NP/P, is a descriptor that has been used to identify "dominance" of nonphylloplane fungi (NP/P > 1.0), assumed to be indicative of a problematic indoor environment. However, BMC analysis of spore trap and culturable bioaerosol data using the NP/P ratio identified frequent dominance by nonphylloplane fungi in outdoor air. Similarly, the NP/P descriptor indicated dominance of nonphylloplane fungi in buildings with visible mold growth and/or known water intrusion with a frequency often in the range of 0.5. Fixed numerical criteria for spore trap data of 900 and 1300 spores/m³ for total spores and 750 Aspergillus/Penicillium spores/m³ exhibited similar variability, as did ratios of nonphylloplane to total fungi, phylloplane to total fungi, and indoor/outdoor ratios for total fungal spores. Analysis of bioaerosol data by BMC indicates that numerical levels or descriptors based on dominance of certain fungi are unreliable as criteria for characterizing a given environment. The utility of BMC analysis lies in its generalized application to test mathematically
International Nuclear Information System (INIS)
Wesseh, Presley K.; Zoumara, Babette
2012-01-01
This contribution investigates the causal interdependence between energy consumption and economic growth in Liberia and proposes the application of a bootstrap methodology. To better reflect causality, employment is incorporated as an additional variable. The study demonstrates evidence of distinct bidirectional Granger causality between energy consumption and economic growth. Additionally, the results show that employment in Liberia Granger-causes economic growth, irrespective of the short run or long run. Evidence from a Monte Carlo experiment reveals that the asymptotic Granger causality test suffers from a size distortion problem for Liberian data, suggesting that the bootstrap technique employed in this study is more appropriate. Given the empirical results, the implications are that energy expansion policies, such as energy subsidies or low energy tariffs, would be necessary to cope with the demand exerted as a result of economic growth in Liberia. Furthermore, the contribution of employment generation to Liberia's economy may be partly determined by adequate energy supply. Therefore, it seems fully justified that a quick shift towards energy production based on clean energy sources may significantly slow down economic growth in Liberia. Hence, the government's target to implement a long-term strategy to make Liberia a carbon-neutral country, and eventually less carbon dependent by 2050, is understandable. - Highlights: ► Causality between energy consumption and economic growth in Liberia investigated. ► There is bidirectional causality between energy consumption and economic growth. ► Energy expansion policies are necessary to cope with demand from economic growth. ► Asymptotic Granger causality test suffers size distortion problem for Liberian data. ► The bootstrap methodology employed in our study is more appropriate.
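A minimal sketch of a residual-based bootstrap Granger causality test of the kind such studies use in place of the asymptotic test. The data-generating process, lag order (one), and F statistic below are illustrative assumptions, not the authors' specification: the restricted model is fitted under the null, its residuals are resampled to build pseudo-series in which lagged x has no true effect, and the observed statistic is compared to the bootstrap distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series in which x Granger-causes y at lag 1.
n = 300
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)

def f_stat(y, x):
    """F statistic for adding lagged x to an AR(1) model of y."""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]
    Zr = np.column_stack([np.ones(yt.size), y1])        # restricted (no lagged x)
    Zu = np.column_stack([np.ones(yt.size), y1, x1])    # unrestricted
    rss = lambda Z: float(np.sum((yt - Z @ np.linalg.lstsq(Z, yt, rcond=None)[0])**2))
    rss_r, rss_u = rss(Zr), rss(Zu)
    return (rss_r - rss_u) / (rss_u / (yt.size - Zu.shape[1]))

obs = f_stat(y, x)

# Bootstrap under the null: rebuild y from the restricted AR(1) fit with
# resampled residuals, so that lagged x has no true effect.
yt, y1 = y[1:], y[:-1]
Zr = np.column_stack([np.ones(yt.size), y1])
b = np.linalg.lstsq(Zr, yt, rcond=None)[0]
res = yt - Zr @ b
B, count = 499, 0
for _ in range(B):
    e = rng.choice(res, size=n - 1, replace=True)
    yb = np.zeros(n)
    yb[0] = y[0]
    for t in range(1, n):
        yb[t] = b[0] + b[1] * yb[t - 1] + e[t - 1]
    if f_stat(yb, x) >= obs:
        count += 1
p_boot = (count + 1) / (B + 1)
print(f"observed F = {obs:.1f}, bootstrap p = {p_boot:.3f}")
```

The bootstrap p-value is read off the simulated null distribution rather than an F table, which is what protects the test's size when the asymptotic approximation is poor.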
Energy Technology Data Exchange (ETDEWEB)
Hager, Robert, E-mail: rhager@pppl.gov; Chang, C. S., E-mail: cschang@pppl.gov [Princeton Plasma Physics Laboratory, P.O. Box 451, Princeton, New Jersey 08543 (United States)
2016-04-15
As a follow-up on the drift-kinetic study of the non-local bootstrap current in the steep edge pedestal of tokamak plasma by Koh et al. [Phys. Plasmas 19, 072505 (2012)], a gyrokinetic neoclassical study is performed with gyrokinetic ions and drift-kinetic electrons. Besides the gyrokinetic improvement of ion physics from the drift-kinetic treatment, a fully non-linear Fokker-Planck collision operator—that conserves mass, momentum, and energy—is used instead of Koh et al.'s linearized collision operator in consideration of the possibility that the ion distribution function is non-Maxwellian in the steep pedestal. An inaccuracy in Koh et al.'s result is found in the steep edge pedestal that originated from a small error in the collisional momentum conservation. The present study concludes that (1) the bootstrap current in the steep edge pedestal is generally smaller than what has been predicted from the small banana-width (local) approximation [e.g., Sauter et al., Phys. Plasmas 6, 2834 (1999) and Belli et al., Plasma Phys. Controlled Fusion 50, 095010 (2008)], (2) the plasma flow evaluated from the local approximation can significantly deviate from the non-local results, and (3) the bootstrap current in the edge pedestal, where the passing particle region is small, can be dominantly carried by the trapped particles in a broad trapped boundary layer. A new analytic formula based on numerous gyrokinetic simulations using various magnetic equilibria and plasma profiles with self-consistent Grad-Shafranov solutions is constructed.
International Nuclear Information System (INIS)
Narayan, Paresh Kumar; Prasad, Arti
2008-01-01
The goal of this paper is to examine any causal effects between electricity consumption and real GDP for 30 OECD countries. We use a bootstrapped causality testing approach and unravel evidence in favour of electricity consumption causing real GDP in Australia, Iceland, Italy, the Slovak Republic, the Czech Republic, Korea, Portugal, and the UK. The implication is that electricity conservation policies will negatively impact real GDP in these countries. However, for the rest of the 22 countries our findings suggest that electricity conservation policies will not affect real GDP.
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, Robert
of the estimator now depends on nuisance parameters derived both from the weak dependence and heteroskedasticity present in the shocks. We then investigate classical methods of inference based on the Wald, likelihood ratio and Lagrange multiplier tests for linear hypotheses on either or both of the long and short memory parameters of the model. The limiting null distributions of these test statistics are shown to be non-pivotal under heteroskedasticity, while that of a robust Wald statistic (based around a sandwich estimator of the variance) is pivotal. We show that wild bootstrap implementations of the tests…
Energy Technology Data Exchange (ETDEWEB)
Pelaez, Jose R
1998-12-14
We present a brief pedagogical introduction to the Effective Electroweak Chiral Lagrangians, which provide a model independent description of the WW interactions in the strong regime. When it is complemented with some unitarization or a dispersive approach, this formalism allows the study of the general strong scenario expected at the LHC, including resonances.
International Nuclear Information System (INIS)
DeSantis, G.N.
1995-01-01
The calculation determines the integrity of the safety latch that will hold the strong-back to the pump during lifting. The safety latch will be welded to the strong-back and will latch to a 1.5-in. dia cantilever rod welded to the pump baseplate. The static and dynamic analysis shows that the safety latch will hold the strong-back to the pump if the friction clamps fail and the pump becomes free from the strong-back. Thus, the safety latch will meet the requirements of the Lifting and Rigging Manual for under-the-hook lifting for static loading; it can withstand shock loads from the strong-back falling 0.25 inch.
Energy Technology Data Exchange (ETDEWEB)
DeSantis, G.N.
1995-03-06
The calculation determines the integrity of the safety latch that will hold the strong-back to the pump during lifting. The safety latch will be welded to the strong-back and will latch to a 1.5-in. dia cantilever rod welded to the pump baseplate. The static and dynamic analysis shows that the safety latch will hold the strong-back to the pump if the friction clamps fail and the pump becomes free from the strong-back. Thus, the safety latch will meet the requirements of the Lifting and Rigging Manual for under-the-hook lifting for static loading; it can withstand shock loads from the strong-back falling 0.25 inch.
Strong-Q-sequences and small d
Czech Academy of Sciences Publication Activity Database
Chodounský, David
2012-01-01
Roč. 159, č. 3 (2012), s. 2942-2946 ISSN 0166-8641. [Prague Symposium on General Topology and its Relations to Modern Analysis and Algebra /11./. Prague, 07.08.2011-12.08.2011] Institutional support: RVO:67985840 Keywords : Katowice problem * strong-Q-sequence * dominating number Subject RIV: BA - General Mathematics Impact factor: 0.562, year: 2012 http://www.sciencedirect.com/science/article/pii/S0166864112002222
Titanium: light, strong, and white
Woodruff, Laurel; Bedinger, George
2013-01-01
Titanium (Ti) is a strong silver-gray metal that is highly resistant to corrosion and is chemically inert. It is as strong as steel but 45 percent lighter, and it is twice as strong as aluminum but only 60 percent heavier. Titanium dioxide (TiO2) has a very high refractive index, which means that it has high light-scattering ability. As a result, TiO2 imparts whiteness, opacity, and brightness to many products. ...Because of the unique physical properties of titanium metal and the whiteness provided by TiO2, titanium is now used widely in modern industrial societies.
Angrisano, Antonio; Maratea, Antonio; Gaglione, Salvatore
2018-01-01
In the absence of obstacles, a GPS device is generally able to provide continuous and accurate estimates of position, while in urban scenarios buildings can generate multipath and echo-only phenomena that severely affect the continuity and accuracy of the provided estimates. Receiver autonomous integrity monitoring (RAIM) techniques are able to reduce the negative consequences of large blunders in urban scenarios, but require both good redundancy and low contamination to be effective. In this paper a resampling strategy based on the bootstrap is proposed as an alternative to RAIM, in order to estimate position accurately in cases of low redundancy and multiple blunders: starting from the pseudorange measurement model, at each epoch the available measurements are bootstrapped, that is, randomly sampled with replacement, and the generated a posteriori empirical distribution is exploited to derive the final position. Compared to the standard bootstrap, in this paper the sampling probabilities are not uniform but vary according to an indicator of measurement quality. The proposed method has been compared with two different RAIM techniques on a data set collected in critical conditions, resulting in a clear improvement on all considered figures of merit.
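The non-uniform resampling idea can be sketched on a toy linearized measurement model (a stand-in for actual pseudorange geometry; the quality indicator and weights here are invented, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)

# Linearized measurement model z = H @ p + noise (a stand-in for pseudoranges).
p_true = np.array([10.0, -5.0])
H = rng.normal(size=(12, 2))
z = H @ p_true + rng.normal(0, 0.5, 12)
z[0] += 30.0                                # one large blunder (multipath-like)

# Quality indicator: down-weight the suspect measurement. In practice this could
# come from SNR, elevation angle, or residual screening; here it is invented.
quality = np.ones(12)
quality[0] = 0.05
probs = quality / quality.sum()

# Weighted bootstrap: resample measurement indices with non-uniform probabilities,
# solve least squares per resample, and summarize the empirical distribution.
estimates = []
for _ in range(500):
    idx = rng.choice(12, size=12, replace=True, p=probs)
    if np.linalg.matrix_rank(H[idx]) < 2:   # skip degenerate resamples
        continue
    estimates.append(np.linalg.lstsq(H[idx], z[idx], rcond=None)[0])
p_hat = np.median(np.array(estimates), axis=0)
print("estimated position:", p_hat)
```

Because low-quality measurements rarely enter a resample, the blunder has little influence on the a posteriori empirical distribution, and the median position stays near the truth.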
Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D
2015-03-01
Calculating a confidence interval is a common procedure in data analysis and is readily obtained for normally distributed populations with the familiar x̄ ± t·s/√n formula. However, when working with non-normally distributed data, determining the confidence interval is not as obvious, and for this type of data there are fewer references in the literature, which are much less accessible. We describe, in simple language, the percentile and the bias-corrected and accelerated variations of the bootstrap method for calculating confidence intervals. This method can be applied to a wide variety of parameters (mean, median, slope of a calibration curve, etc.) and is appropriate for normal and non-normal data sets. As a worked example, the confidence interval around the median concentration of cocaine in femoral blood is calculated using bootstrap techniques. The median of the non-toxic concentrations was 46.7 ng/mL with a 95% confidence interval of 23.9-85.8 ng/mL in the non-normally distributed set of 45 postmortem cases. This method should be used to produce more statistically sound and accurate confidence intervals for non-normally distributed populations, such as reference values of therapeutic and toxic drug concentrations, as well as in situations of concentration values truncated near the limit of quantification or the cutoff of a method. © The Author 2014. Published by Oxford University Press. All rights reserved.
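The percentile and BCa intervals described above can be computed in a few lines. The sketch below uses synthetic log-normal "concentration" data rather than the paper's cocaine dataset, with the standard bias (z0) and acceleration (a) corrections for the BCa variant:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(4)

# Synthetic skewed "concentration" data (log-normal, as drug levels often are).
data = rng.lognormal(mean=3.8, sigma=0.6, size=45)
theta = float(np.median(data))

B = 2000
boot = np.sort([float(np.median(rng.choice(data, data.size, replace=True)))
                for _ in range(B)])

# Percentile interval: the empirical 2.5% and 97.5% quantiles of the bootstrap medians.
pct = (float(np.quantile(boot, 0.025)), float(np.quantile(boot, 0.975)))

# BCa interval: adjust the percentile endpoints for bias (z0) and
# acceleration (a, estimated by a jackknife of the median).
frac_below = min(max(float(np.mean(boot < theta)), 1.0 / B), 1.0 - 1.0 / B)
z0 = NormalDist().inv_cdf(frac_below)
jack = np.array([np.median(np.delete(data, i)) for i in range(data.size)])
d = jack.mean() - jack
a = float((d**3).sum() / (6.0 * (d**2).sum()**1.5))

def bca_endpoint(alpha):
    z = NormalDist().inv_cdf(alpha)
    adj = NormalDist().cdf(z0 + (z0 + z) / (1.0 - a * (z0 + z)))
    return float(np.quantile(boot, adj))

bca = (bca_endpoint(0.025), bca_endpoint(0.975))
print(f"median {theta:.1f} ng/mL, percentile CI {pct}, BCa CI {bca}")
```

Swapping `np.median` for any other statistic (mean, calibration slope, etc.) gives the corresponding interval without changing the rest of the code.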
Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; Manna, Piero; Terribile, Fabio
2013-04-01
Digital soil mapping procedures are widely used to build two-dimensional continuous maps of several pedological attributes. Our work addressed a regression kriging (RK) technique and a bootstrapped artificial neural network approach in order to evaluate and compare (i) the accuracy of prediction, (ii) the suitability for inclusion in automatic engines (e.g. to constitute web processing services), and (iii) the time cost needed for calibrating models and for making predictions. Regression kriging is perhaps the most widely used geostatistical technique in the digital soil mapping literature. Here we applied EBLUP regression kriging, as it is deemed by pedometricians to be the most statistically sound RK flavor. An unusual multi-parametric and nonlinear machine learning approach was also employed, called BAGAP (Bootstrap aggregating Artificial neural networks with Genetic Algorithms and Principal component regression). BAGAP combines a selected set of weighted neural nets having specified characteristics to yield an ensemble response. The purpose of applying these two particular models is to ascertain whether and by how much a more cumbersome machine learning method is promising in making more accurate/precise predictions. Being aware of the difficulty of handling objects based on EBLUP-RK as well as BAGAP when they are embedded in environmental applications, we explore their suitability for being wrapped within Web Processing Services. Two further aspects are considered for an exhaustive evaluation and comparison: automaticity and computation time with/without high-performance computing leverage.
Góes-Neto, Aristóteles; Diniz, Marcelo V C; Carvalho, Daniel S; Bomfim, Gilberto C; Duarte, Angelo A; Brzozowski, Jerzy A; Petit Lobão, Thierry C; Pinho, Suani T R; El-Hani, Charbel N; Andrade, Roberto F S
2018-01-01
Complex networks have been successfully applied to the characterization and modeling of complex systems in several distinct areas of Biological Sciences. Nevertheless, their utilization in phylogenetic analysis still needs to be widely tested, using different molecular data sets and taxonomic groups, and, also, by comparing complex networks approach to current methods in phylogenetic analysis. In this work, we compare all the four main methods of phylogenetic analysis (distance, maximum parsimony, maximum likelihood, and Bayesian) with a complex networks method that has been used to provide a phylogenetic classification based on a large number of protein sequences as those related to the chitin metabolic pathway and ATP-synthase subunits. In order to perform a close comparison to these methods, we selected Basidiomycota fungi as the taxonomic group and used a high-quality, manually curated and characterized database of chitin synthase sequences. This enzymatic protein plays a key role in the synthesis of one of the exclusive features of the fungal cell wall: the presence of chitin. The communities (modules) detected by the complex network method corresponded exactly to the groups retrieved by the phylogenetic inference methods. Additionally, we propose a bootstrap method for the complex network approach. The statistical results we have obtained with this method were also close to those obtained using traditional bootstrap methods.
Lin, Jyh-Jiuan; Chang, Ching-Hui; Pal, Nabendu
2015-01-01
To test the mutual independence of two qualitative variables (or attributes), it is common practice to use the Chi-square tests (Pearson's as well as the likelihood ratio test) based on data in the form of a contingency table. However, it should be noted that these popular Chi-square tests are asymptotic in nature and are useful only when the cell frequencies are "not too small." In this article, we explore the accuracy of the Chi-square tests through an extensive simulation study and then propose their bootstrap versions, which appear to work better than the asymptotic Chi-square tests. The bootstrap tests are useful even for small cell frequencies, as they maintain the nominal level quite accurately. Also, the proposed bootstrap tests are more convenient than Fisher's exact test, which is often criticized for being too conservative. Finally, all test methods are applied to a few real-life datasets for demonstration purposes.
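A minimal sketch of one bootstrap variant of Pearson's Chi-square test (a parametric bootstrap under the independence null; the article's exact resampling scheme may differ, and the 2x3 table is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# A small-cell 2x3 contingency table where the asymptotic test is dubious.
table = np.array([[3, 1, 4],
                  [2, 6, 1]])
n = int(table.sum())

def chi2_stat(t):
    """Pearson Chi-square statistic; guards rare resampled tables with empty margins."""
    expected = np.outer(t.sum(axis=1), t.sum(axis=0)) / t.sum()
    mask = expected > 0
    return float(((t - expected)[mask]**2 / expected[mask]).sum())

obs = chi2_stat(table)

# Bootstrap under independence: resample tables from the product of the
# estimated marginals and recompute the statistic each time.
p_null = np.outer(table.sum(axis=1), table.sum(axis=0)).ravel() / n**2
B = 2000
count = sum(chi2_stat(rng.multinomial(n, p_null).reshape(table.shape)) >= obs
            for _ in range(B))
p_value = (count + 1) / (B + 1)
print(f"chi2 = {obs:.2f}, bootstrap p = {p_value:.3f}")
```

Because the p-value comes from the simulated finite-sample null distribution rather than the asymptotic Chi-square, the nominal level is maintained even with small cell counts.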
Baudry, A-S; Lelorain, S; Mahieuxe, M; Christophe, V
2018-01-01
The aim of this study was to test the effect of intrapersonal and interpersonal emotional competence on cancer patients' supportive care needs, as mediated by anxiety and depression symptoms. Cross-sectional design: 137 cancer patients (42% breast or ovarian cancer, 58% gastrointestinal cancer) in 4 French hospitals completed the Profile of Emotional Competence (PEC), the Hospital Anxiety and Depression Scale (HADS), and the Supportive Care Needs Survey Short Form (SCNS-SF). Bootstrap methods with PROCESS Macro were used to test multiple mediation models. Emotional competence presented a direct or indirect beneficial effect on the satisfaction of supportive care needs, anxiety and depression symptoms. As expected, anxiety and depression symptoms also had strong positive correlations with unmet needs. All multiple mediation models were significant, except for physical needs: intrapersonal and interpersonal emotional competence impacted anxiety and depression symptoms, which in turn impacted psychological, sexual, care/support, and information needs. These innovative results show the important effect of patients' emotional competence on the satisfaction of their supportive care needs, as mediated by anxiety and depression. Consequently, patients with high emotional competence may require less psychosocial input from medical clinicians. Thus, emotional competence may be integrated into health models and psychosocial interventions to improve patient adjustment. Further investigation is, however, needed to determine which specific emotional competences are most beneficial and at what point of the cancer pathway.
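The PROCESS-style percentile-bootstrap mediation analysis can be illustrated on synthetic data (variable names and effect sizes below are invented; the actual study tested multiple mediators across several outcome domains):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic single-mediator model: X (emotional competence) -> M (anxiety) -> Y (unmet needs).
# Effect sizes are invented for illustration only.
n = 137
X = rng.normal(size=n)
M = -0.5 * X + rng.normal(scale=0.8, size=n)       # higher competence, lower anxiety
Y = 0.6 * M + 0.1 * X + rng.normal(scale=0.8, size=n)

def indirect(X, M, Y):
    """a*b indirect effect from two OLS fits: X -> M, then M -> Y controlling for X."""
    a = np.polyfit(X, M, 1)[0]
    Z = np.column_stack([np.ones(X.size), M, X])
    b = np.linalg.lstsq(Z, Y, rcond=None)[0][1]
    return float(a * b)

obs = indirect(X, M, Y)

# Percentile bootstrap of the indirect effect: resample cases with replacement
# and recompute a*b each time, then read off the 95% interval.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)
    boot.append(indirect(X[idx], M[idx], Y[idx]))
ci = np.quantile(boot, [0.025, 0.975])
print(f"indirect effect {obs:.2f}, 95% bootstrap CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

A confidence interval excluding zero is the usual evidence for mediation; bootstrapping avoids the normality assumption that the product a*b rarely satisfies.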
Dosne, Anne-Gaëlle; Niebecker, Ronald; Karlsson, Mats O
2016-12-01
Knowledge of the uncertainty in model parameters is essential for decision-making in drug development. Contrary to other aspects of nonlinear mixed effects models (NLMEM), scrutiny of the assumptions around parameter uncertainty is low, and no diagnostic exists to judge whether the estimated uncertainty is appropriate. This work aims to introduce a diagnostic capable of assessing the appropriateness of a given parameter uncertainty distribution. The new diagnostic was applied to case bootstrap examples in order to investigate for which dataset sizes the case bootstrap is appropriate for NLMEM. The proposed diagnostic is a plot comparing the distribution of differences in objective function values (dOFV) of the proposed uncertainty distribution to a theoretical Chi-square distribution with degrees of freedom equal to the number of estimated model parameters. The uncertainty distribution was deemed appropriate if its dOFV distribution was overlaid with or below the theoretical distribution. The diagnostic was applied to the bootstrap of two real data and two simulated data examples, featuring pharmacokinetic and pharmacodynamic models and datasets of 20-200 individuals with between 2 and 5 observations on average per individual. In the real data examples, the diagnostic indicated that the case bootstrap was unsuitable for NLMEM analyses with around 70 individuals. A measure of parameter-specific "effective" sample size was proposed as a potentially better indicator of bootstrap adequacy than overall sample size. In the simulation examples, bootstrap confidence intervals were shown to underestimate inter-individual variability at low sample sizes. The proposed diagnostic proved a relevant tool for assessing the appropriateness of a given parameter uncertainty distribution and as such should be routinely used.
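Under a quadratic approximation of the objective function, the dOFV diagnostic reduces to comparing dOFV values of parameter vectors drawn from the proposed uncertainty distribution against a Chi-square reference. The following is a schematic of that idea only (not the authors' NLMEM workflow; the "information" matrix and distributions are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Quadratic approximation of the objective function around the estimate:
# dOFV(theta) = theta^T F theta, where F is the curvature (Fisher information).
p = 4
A = rng.normal(size=(p, p))
F = A @ A.T + p * np.eye(p)        # a positive-definite "information" matrix

def dofv_sample(cov, n=5000):
    """dOFV values for parameter vectors drawn from a proposed uncertainty distribution."""
    L = np.linalg.cholesky(cov)
    thetas = rng.normal(size=(n, p)) @ L.T
    return np.einsum('ij,jk,ik->i', thetas, F, thetas)

# Chi-square reference with p degrees of freedom (simulated to stay numpy-only).
chi2_ref = (rng.normal(size=(5000, p))**2).sum(axis=1)

ok = dofv_sample(np.linalg.inv(F))         # well-calibrated uncertainty: matches chi2
wide = dofv_sample(3 * np.linalg.inv(F))   # overstated uncertainty: dOFV shifted right
print(f"medians: calibrated {np.median(ok):.2f}, "
      f"overstated {np.median(wide):.2f}, chi2({p}) {np.median(chi2_ref):.2f}")
```

When the proposed covariance matches the curvature, the dOFV distribution overlays the Chi-square reference; an inflated covariance pushes it above, which is exactly what the diagnostic plot is designed to expose.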
Bootstrap in discrete choice models: an application to the contingent valuation method
Ledesma Goyzueta, Luis Manuel
2016-01-01
Demonstrates the inclusion of the bootstrap method in the process of estimating the willingness to pay (DAP) for a given environmental good and/or service, under the binary-format contingent valuation approach. The main contribution is the inclusion of an additional criterion in the bootstrap resampling process: samples are selected at random so that they contain balanced values of the binary dependent variable. For illustrative purposes, the database of three studi...
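One reading of the balanced-resampling criterion is a bootstrap stratified on the binary response: draw with replacement separately from the "yes" and "no" answers so that every replicate keeps the same balance in the dependent variable. A minimal sketch with hypothetical contingent-valuation data; the function and variable names are mine, not the author's:

```python
import numpy as np

rng = np.random.default_rng(2)

def balanced_bootstrap(y, X, n_boot=500):
    """Yield bootstrap replicates stratified on the binary response y,
    so each replicate preserves the original yes/no balance."""
    idx_yes = np.flatnonzero(y == 1)
    idx_no = np.flatnonzero(y == 0)
    for _ in range(n_boot):
        take = np.concatenate([
            rng.choice(idx_yes, size=idx_yes.size, replace=True),
            rng.choice(idx_no, size=idx_no.size, replace=True),
        ])
        yield y[take], X[take]

# Toy data: y = 1 if the respondent accepts the offered bid.
n = 200
bid = rng.uniform(5, 50, n)
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(10 - 0.3 * bid)))).astype(int)

props = [yb.mean() for yb, _ in balanced_bootstrap(y, bid.reshape(-1, 1))]
print(min(props), max(props), y.mean())  # identical by construction
```

In the full procedure each replicate would feed a logit (or probit) fit from which a WTP estimate is derived, giving a bootstrap distribution of the willingness to pay.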
Energy Technology Data Exchange (ETDEWEB)
Marshall, P.
2005-01-03
Basic considerations of lens detection and identification indicate that a wide-field survey of the type planned for weak lensing and Type Ia SNe with SNAP is close to optimal for the optical detection of strong lenses. Such a "piggy-back" survey might be expected, even pessimistically, to provide a catalogue of a few thousand new strong lenses, with the numbers dominated by systems of faint blue galaxies lensed by foreground ellipticals. After sketching out our strategy for detecting and measuring these galaxy lenses using the SNAP images, we discuss some of the scientific applications of such a large sample of gravitational lenses; in particular, we comment on the partition of information between lens structure, the source population properties, and cosmology. Understanding this partitioning is key to assessing strong-lens cosmography's value as a cosmological probe.
International Nuclear Information System (INIS)
Aoki, Ken-ichi
1988-01-01
The existence of a strong coupling phase in QED has been suggested by solutions of the Schwinger-Dyson equation and by Monte Carlo simulations of lattice QED. In this article we recapitulate the previous arguments and formulate the problem in the modern framework of renormalization theory, Wilsonian renormalization. This scheme of renormalization gives the best understanding of the basic structure of a field theory, especially when it has a multi-phase structure. We resolve some misleading arguments in the previous literature. We then set up a strategy to attack the strong coupling phase, if any, and describe a trial: a coupled Schwinger-Dyson equation. A possible picture of the strong coupling phase of QED is presented. (author)
Strong Decomposition of Random Variables
DEFF Research Database (Denmark)
Hoffmann-Jørgensen, Jørgen; Kagan, Abram M.; Pitt, Loren D.
2007-01-01
A random variable X is strongly decomposable if X=Y+Z, where Y=Φ(X) and Z=X-Φ(X) are independent non-degenerate random variables (called the components). It is shown that at least one of the components is singular, and we derive a necessary and sufficient condition for strong decomposability...
Strong interaction at finite temperature
Indian Academy of Sciences (India)
We review two methods discussed in the literature to determine the effective parameters of strongly interacting particles as they move through a heat bath. The first is the general method of chiral perturbation theory, which may be readily applied to this problem. The other is the method of thermal QCD sum rules ...
Strong-strong beam-beam simulation on parallel computer
Energy Technology Data Exchange (ETDEWEB)
Qiang, Ji
2004-08-02
The beam-beam interaction puts a strong limit on the luminosity of the high energy storage ring colliders. At the interaction points, the electromagnetic fields generated by one beam focus or defocus the opposite beam. This can cause beam blowup and a reduction of luminosity. An accurate simulation of the beam-beam interaction is needed to help optimize the luminosity in high energy colliders.
Strong Josephson Coupling in Planar Graphene Junctions
Park, Jinho; Lee, Gil-Ho; Lee, Jae Hyeong; Takane, Yositake; Imura, Ken-Ichiro; Taniguchi, Takashi; Watanabe, Kenji; Lee, Hu-Jong
A recent breakthrough in graphene processing, encapsulation between hexagonal boron nitride layers (the BGB structure), allows ballistic carrier transport to be realized in graphene. Ballistic Josephson coupling has since been studied in edge-contacted BGB structures with two closely spaced superconducting electrodes. Here, we report strong Josephson coupling in planar graphene junctions in the truly short and ballistic regime. Our devices showed high transmission probability, and the junction critical current (IC) oscillated as the gate voltage was swept, in step with the normal-conductance (Fabry-Perot) oscillations, providing direct evidence for the ballistic nature of the junction pair current. We also observed a convex-upward decrease of the critical current with increasing temperature, a canonical property of short-junction Josephson coupling. By fitting these curves to theoretical models, we demonstrate strong Josephson coupling in our devices, which is also supported by the exceptionally large value of ICRN (~2Δ/e, where RN is the normal resistance).
Hanson, Sonya M.; Ekins, Sean; Chodera, John D.
2015-12-01
All experimental assay data contain error, but the magnitude, type, and primary origin of this error are often not obvious. Here, we describe a simple set of assay modeling techniques, based on the bootstrap principle, that allow sources of error and bias to be simulated and propagated into assay results. We demonstrate how deceptively simple operations, such as the creation of a dilution series with a robotic liquid handler, can significantly amplify imprecision and even contribute substantially to bias. To illustrate these techniques, we review an example of how the choice of dispensing technology can impact assay measurements, and show how large contributions to discrepancies between assays can be easily understood and potentially corrected for. These simple modeling techniques, illustrated with an accompanying IPython notebook, can help modelers understand the expected error and bias in experimental datasets, and even help experimentalists design assays to meet accuracy and imprecision goals more effectively.
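The error-propagation idea can be sketched as a Monte Carlo simulation of a serial dilution: each transfer carries its own random imprecision and a systematic bias, and both compound with each step. The per-step CV and bias below are illustrative assumptions, not measured instrument specifications:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_dilution_series(n_steps=8, cv=0.02, bias=0.99, n_sim=10000):
    """Simulate final concentrations of a nominal 1:2 serial dilution in
    which each transferred volume has relative imprecision `cv` and a
    multiplicative systematic `bias` (e.g. under-delivery by a liquid
    handler). Returns one simulated concentration per Monte Carlo run."""
    conc = np.ones(n_sim)
    for _ in range(n_steps):
        transfer = bias * (1 + cv * rng.standard_normal(n_sim))
        conc = conc * transfer / 2.0   # nominal 1:2 dilution each step
    return conc

conc = simulate_dilution_series()
nominal = 0.5 ** 8
cv_out = conc.std() / conc.mean()
print(f"nominal {nominal:.5f}, simulated mean {conc.mean():.5f}, CV {cv_out:.3f}")
# A 2% per-step CV compounds to roughly sqrt(8) * 2% ≈ 5.7% at the last
# well, and the 1% per-step under-delivery accumulates to ~8% total bias.
```

Propagating assumed per-operation errors this way, rather than quoting a single instrument CV, is what lets both imprecision and bias be seen in the final assay result.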
Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A; Peddada, Shyamal D
2018-01-01
Motivation: Gene-expression data obtained from high-throughput technologies are subject to various sources of noise, and accordingly the raw data are pre-processed before being formally analyzed. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems such as the cell cycle, the circadian clock, etc., the choice of normalization method may substantially impact the determination of a gene as rhythmic. Thus the rhythmicity of a gene can be purely an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to identify truly rhythmic genes that are robust to the choice of normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene-expression data, in this paper we illustrate it using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology: for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. This suggests that the proposed measure is robust to the choice of normalization method, and consequently that the rhythmicity of a gene is not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used to simulate data for genes participating in an oscillatory system from a reference dataset.
Availability: User-friendly code implemented in the R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html.
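The general shape of a rhythmicity measure with a bootstrap null can be sketched as follows: score each gene by the variance explained by a sinusoidal (cosinor) fit at the known period, then compare that score against scores from bootstrap resamples that destroy the temporal ordering. This is a generic illustration, not the authors' R implementation at the URL above; the measure and names here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def cosinor_r2(expr, t, period=24.0):
    """Fraction of expression variance explained by a sinusoid of the
    given period (a simple rhythmicity score)."""
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    fit = X @ np.linalg.lstsq(X, expr, rcond=None)[0]
    return 1 - np.var(expr - fit) / np.var(expr)

def bootstrap_pvalue(expr, t, n_boot=1000):
    """Bootstrap null: resampling expression values with replacement
    breaks any rhythm while keeping the marginal distribution."""
    observed = cosinor_r2(expr, t)
    null = [cosinor_r2(rng.choice(expr, expr.size, replace=True), t)
            for _ in range(n_boot)]
    return (np.sum(np.array(null) >= observed) + 1) / (n_boot + 1)

t = np.arange(0, 48, 2.0)  # two days of 2-hourly samples
rhythmic = np.sin(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(t.size)
flat = rng.standard_normal(t.size)

print(bootstrap_pvalue(rhythmic, t), bootstrap_pvalue(flat, t))
```

Because the null is built by resampling each gene's own values, the score's calibration does not depend on how the array was normalized, which is the kind of robustness the paper reports.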
Noma, Hisashi; Nagashima, Kengo; Maruo, Kazushi; Gosho, Masahiko; Furukawa, Toshi A
2018-03-30
In network meta-analyses that synthesize direct and indirect comparison evidence concerning multiple treatments, multivariate random effects models have been routinely used for addressing between-studies heterogeneities. Although their standard inference methods depend on large-sample approximations (e.g., restricted maximum likelihood estimation) in the number of trials synthesized, the numbers of trials are often moderate or small. In these situations, standard estimators cannot be expected to behave in accordance with asymptotic theory; in particular, confidence intervals cannot be assumed to exhibit their nominal coverage probabilities (nor can the type I error probabilities of the corresponding tests be maintained). This invalidity issue may seriously influence the overall conclusions of network meta-analyses. In this article, we develop several improved inference methods for network meta-analyses to resolve these problems. We first introduce 2 efficient likelihood-based inference methods, the likelihood ratio test-based and efficient score test-based methods, in a general framework of network meta-analysis. Then, to improve the small-sample inferences, we develop improved higher-order asymptotic methods using Bartlett-type corrections and bootstrap adjustment methods. The proposed methods adopt Monte Carlo approaches using parametric bootstraps to effectively circumvent complicated analytical calculations of case-by-case analyses and to permit flexible application to various statistical models of network meta-analysis. These methods can also be straightforwardly applied to multivariate meta-regression analyses and to tests for the evaluation of inconsistency. In numerical evaluations via simulations, the proposed methods generally performed well compared with the ordinary restricted maximum likelihood-based inference method. Applications to 2 network meta-analysis datasets are provided. Copyright © 2017 John Wiley & Sons, Ltd.
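The parametric-bootstrap adjustment can be sketched in the simpler univariate random-effects setting: instead of comparing the likelihood-ratio statistic for the pooled effect with its asymptotic chi-square(1) reference, the critical value is recalibrated from datasets simulated under the fitted null model. This is a generic illustration under stated assumptions (normal random effects, known within-study variances); the function names are mine, and the paper's methods cover the full multivariate network case:

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)

def neg_loglik(mu, tau2, y, s2):
    """Negative log likelihood (up to a constant) of a normal
    random-effects model with heterogeneity variance tau2."""
    v = tau2 + s2
    return 0.5 * np.sum(np.log(v) + (y - mu) ** 2 / v)

def fit(y, s2, mu=None):
    """ML fit over tau2, profiling mu unless it is held fixed (null fit)."""
    def nll(tau2):
        m = mu if mu is not None else np.average(y, weights=1 / (tau2 + s2))
        return neg_loglik(m, tau2, y, s2)
    res = minimize_scalar(nll, bounds=(0.0, 10.0), method='bounded')
    return res.x, res.fun

# Small meta-analysis: 6 studies, true pooled effect mu = 0, tau2 = 0.05.
k = 6
s2 = rng.uniform(0.02, 0.1, k)             # within-study variances
y = rng.normal(0.0, np.sqrt(0.05 + s2))    # observed study effects

tau2_0, nll0 = fit(y, s2, mu=0.0)          # null fit (mu fixed at 0)
_, nll1 = fit(y, s2)                       # unrestricted fit
lr_obs = 2.0 * (nll0 - nll1)

# Parametric bootstrap: simulate under the fitted null, recompute the
# LR statistic, and read off a calibrated 5% critical value.
lr_boot = []
for _ in range(400):
    yb = rng.normal(0.0, np.sqrt(tau2_0 + s2))
    _, b0 = fit(yb, s2, mu=0.0)
    _, b1 = fit(yb, s2)
    lr_boot.append(2.0 * (b0 - b1))

crit_boot = np.quantile(lr_boot, 0.95)
crit_chi2 = stats.chi2.ppf(0.95, df=1)
print(crit_boot, crit_chi2)  # bootstrap-calibrated vs asymptotic 3.84
```

Testing lr_obs against crit_boot rather than crit_chi2 is what restores the type I error probability when the number of trials is small.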