Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense...... in the literature by proposing a bootstrap sequential algorithm which we demonstrate delivers consistent cointegration rank estimation for general I(1) processes. Finite sample Monte Carlo simulations show the proposed procedure performs well in practice....
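The selection logic of such a sequential rank-determination procedure can be sketched in a few lines. This is an illustrative helper, not code from the paper; the p-values would come from the bootstrap PLR tests the abstract describes:

```python
def sequential_rank(pvalues, alpha=0.05):
    """Johansen-style sequential selection: test H(r) for r = 0, 1, ...
    in turn and return the first rank whose (bootstrap) p-value exceeds
    alpha; if every rank is rejected, return the full rank p."""
    for r, p in enumerate(pvalues):
        if p > alpha:
            return r
    return len(pvalues)
```

For example, if the rank-0 test rejects (p = 0.001) but the rank-1 test does not (p = 0.20), the estimated co-integration rank is 1.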
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio [PLR] co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates...... of the underlying VAR model which obtain under the reduced rank null hypothesis. They propose methods based on an i.i.d. bootstrap re-sampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate...... the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap re-sampling scheme, when time-varying behaviour is present in either the conditional or unconditional variance of the innovations. We show that the bootstrap PLR tests are asymptotically correctly sized and...
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, G.; Rahbek, Anders; Taylor, A.M.R.
2014-01-01
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates...... of the underlying vector autoregressive (VAR) model which obtain under the reduced rank null hypothesis. They propose methods based on an independent and identically distributed (i.i.d.) bootstrap resampling scheme and establish the validity of their proposed bootstrap procedures in the context of a co-integrated VAR model with i.i.d. innovations. In this paper we investigate the properties of their bootstrap procedures, together with analogous procedures based on a wild bootstrap resampling scheme, when time-varying behavior is present in either the conditional or unconditional variance of the innovations. We...
Varouchakis, Emmanouil; Hristopulos, Dionissios
2015-04-01
Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
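The idea can be sketched as follows; the association statistic and function names here are illustrative assumptions, not the authors' Python toolkit:

```python
import numpy as np

def association_number(a, b, window):
    """Count events of series `a` that have at least one event of
    series `b` within +/- window (a simple association statistic)."""
    b = np.sort(np.asarray(b, dtype=float))
    count = 0
    for t in np.asarray(a, dtype=float):
        i = np.searchsorted(b, t)
        nearest = min(
            abs(t - b[i - 1]) if i > 0 else np.inf,
            abs(b[i] - t) if i < len(b) else np.inf,
        )
        count += nearest <= window
    return count

def bootstrap_distribution(a, b, window, n_boot=1000, seed=0):
    """Bootstrap the association number by resampling series `a` with
    replacement; the spread of the resulting distribution gives the
    significance bounds with minimal distributional assumptions."""
    rng = np.random.default_rng(seed)
    a = np.asarray(a, dtype=float)
    return np.array([
        association_number(rng.choice(a, size=len(a)), b, window)
        for _ in range(n_boot)
    ])
```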
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
power-law avalanches, while the continuous transition is characterized by truncated avalanches in a related sequential bootstrap process. We explain this behaviour on the basis of an analytical and numerical study of the avalanche distributions on ...
Bootstrap, Wild Bootstrap and Generalized Bootstrap
Mammen, Enno
1995-01-01
Some modifications and generalizations of the bootstrap procedure have been proposed. In this note we will consider the wild bootstrap and the generalized bootstrap and we will give two arguments why it makes sense to use these modifications instead of the original bootstrap. The first argument is that there exist examples where the generalized and wild bootstrap work, but where the original bootstrap fails and breaks down. The second argument will be based on higher order considerations. We will show...
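As a concrete illustration of the wild bootstrap idea in a simple regression setting, here is a sketch using the common Rademacher-weight choice (an assumption for illustration; the note itself treats the methods more generally):

```python
import numpy as np

def wild_bootstrap_slopes(x, y, n_boot=999, seed=0):
    """Wild bootstrap of the OLS slope: keep the design fixed and
    multiply each residual by an independent Rademacher sign, so each
    resampled error preserves that observation's own variance."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    fitted = X @ beta
    resid = y - fitted
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        v = rng.choice([-1.0, 1.0], size=len(y))   # Rademacher weights
        y_star = fitted + resid * v                # wild bootstrap sample
        slopes[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]
    return beta[1], slopes
```

The spread of `slopes` around the point estimate is then used for inference.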
Sequential injection spectrophotometric determination of V(V) in ...
African Journals Online (AJOL)
Sequential injection spectrophotometric determination of V(V) in environmental polluted waters. ES Silva, PCAG Pinto, JLFC Lima, MLMFS Saraiva. Abstract. A fast and robust sequential injection analysis (SIA) methodology for routine determination of V(V) in environmental polluted waters is presented. The determination ...
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
CISA: combined NMR resonance connectivity information determination and sequential assignment.
Wan, Xiang; Lin, Guohui
2007-01-01
A nearly complete sequential resonance assignment is a key factor leading to successful protein structure determination via NMR spectroscopy. Assuming the availability of a set of NMR spectral peak lists, most of the existing assignment algorithms first use the differences between chemical shift values for common nuclei across multiple spectra to provide the evidence that some pairs of peaks should be assigned to sequentially adjacent amino acid residues in the target protein. They then use these connectivities as constraints to produce a sequential assignment. At various levels of success, these algorithms typically generate a large number of potential connectivity constraints, a number that grows exponentially as the quality of the spectral data decreases. A key observation used in our sequential assignment program, CISA, is that chemical shift residual signature information can be used to improve the connectivity determination, and thus to dramatically decrease the number of predicted connectivity constraints. Fewer connectivity constraints lead to fewer ambiguities in the sequential assignment. Extensive simulation studies on several large test datasets demonstrated that CISA is efficient and effective compared to the three most recently proposed sequential resonance assignment programs, RANDOM, PACES, and MARS.
Sequential determination of important ecotoxic radionuclides in nuclear waste samples
International Nuclear Information System (INIS)
Bilohuscin, J.
2016-01-01
In the dissertation thesis we focused on the development and optimization of a sequential determination method for the radionuclides ⁹³Zr, ⁹⁴Nb, ⁹⁹Tc and ¹²⁶Sn, employing the extraction chromatography sorbents TEVA® Resin and Anion Exchange Resin, supplied by Eichrom Industries. Prior to testing the sequential separation of these radionuclides from radioactive waste samples, a sequential procedure for separating ⁹⁰Sr, ²³⁹Pu and ²⁴¹Am from urine matrices was tried, using molecular recognition sorbents of the AnaLig® series and the extraction chromatography sorbent DGA® Resin. In these experiments, four different sorbents were used in sequence, including the PreFilter Resin sorbent, which removes interfering organic materials present in raw urine. After positive results were obtained with this sequential procedure, experiments followed on ¹²⁶Sn separation using the TEVA® Resin and Anion Exchange Resin sorbents. Radiochemical recoveries obtained from samples of radioactive evaporator concentrates and sludge showed high separation efficiency, while the ¹²⁶Sn activities were below the minimum detectable activities (MDA). The activity of ¹²⁶Sn was determined after ingrowth of the daughter nuclide ¹²⁶ᵐSb on an HPGe gamma detector, with minimal contamination by gamma-interfering radionuclides and decontamination factors (D_f) higher than 1400 for ⁶⁰Co and 47000 for ¹³⁷Cs. Based on these experiments and the results of the separation procedures, a complete method for the sequential separation of ⁹³Zr, ⁹⁴Nb, ⁹⁹Tc and ¹²⁶Sn was proposed, including optimization steps similar to those used in the previous parts of the dissertation work. Application of the sequential separation method with the TEVA® Resin and Anion Exchange Resin sorbents to real samples of radioactive wastes provided satisfactory results and an economical, time-saving, efficient method. (author)
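The ingrowth measurement mentioned in the abstract rests on the standard decay relation for a daughter nuclide growing in from a much longer-lived parent; a generic sketch (the formula is textbook physics, not the author's specific procedure):

```python
import numpy as np

def daughter_ingrowth_fraction(t, daughter_half_life):
    """Fraction of secular equilibrium reached by a short-lived daughter
    (e.g. 126mSb growing in from 126Sn) after ingrowth time t:
    1 - exp(-lambda * t), with lambda = ln(2) / half-life.
    `t` and `daughter_half_life` must share the same time unit."""
    lam = np.log(2.0) / daughter_half_life
    return 1.0 - np.exp(-lam * t)
```

After roughly seven daughter half-lives the daughter activity is within about 1% of the parent's, which is why counting is scheduled after an ingrowth period.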
Bootstrapping heteroskedastic regression models: wild bootstrap vs. pairs bootstrap
Emmanuel Flachaire
2005-01-01
In regression models, appropriate bootstrap methods for inference robust to heteroskedasticity of unknown form are the wild bootstrap and the pairs bootstrap. The finite sample performance of a heteroskedastic-robust test is investigated with Monte Carlo experiments. The simulation results suggest that one specific version of the wild bootstrap outperforms the other versions of the wild bootstrap and of the pairs bootstrap. It is the only one for which the bootstrap test gives always better r...
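For contrast with the wild bootstrap, the pairs bootstrap mentioned above can be sketched as follows (an illustrative implementation, not the paper's code):

```python
import numpy as np

def pairs_bootstrap_slopes(x, y, n_boot=999, seed=0):
    """Pairs bootstrap of the OLS slope: resample (x_i, y_i) pairs with
    replacement and refit each time; each pair keeps its own error
    variance, so the method is robust to heteroskedasticity."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)      # resample row indices
        Xb = np.column_stack([np.ones(n), x[idx]])
        slopes[b] = np.linalg.lstsq(Xb, y[idx], rcond=None)[0][1]
    return slopes
```

Unlike the wild bootstrap, the design matrix itself is resampled, so the bootstrap samples do not condition on the observed covariate values.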
Energy Technology Data Exchange (ETDEWEB)
Castedo Echeverri, Alejandro [SISSA, Trieste (Italy); INFN, Trieste (Italy); Harling, Benedict von [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Serone, Marco [SISSA, Trieste (Italy); INFN, Trieste (Italy); ICTP, Trieste (Italy)
2016-06-15
We study the numerical bounds obtained using a conformal-bootstrap method in which different points in the plane of the conformal cross ratios z and z̄ are sampled. In contrast to the most commonly used method based on derivatives evaluated at the symmetric point z = z̄ = 1/2, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and O(n) vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the O(n) vector models, with n=2,3,4, which have not yet been computed using bootstrap techniques.
Pfiffner, H. J.
1969-01-01
Circuit can sample a number of transducers in sequence without drawing current from them. This bootstrap unloader uses a differential amplifier with one input connected to a circuit which is the equivalent of the circuit to be unloaded, and the other input delivering the proper unloading currents.
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
Wild bootstrap versus moment-oriented bootstrap
Sommerfeld, Volker
1997-01-01
We investigate the relative merits of a "moment-oriented" bootstrap method of Bunke (1997) in comparison with the classical wild bootstrap of Wu (1986) in nonparametric heteroscedastic regression situations. The "moment-oriented" bootstrap is a wild bootstrap based on local estimators of higher order error moments that are smoothed by kernel smoothers. In this paper we perform an asymptotic comparison of these two different bootstrap procedures. We show that the moment-oriented bootstrap is in ...
Desharnais, Brigitte; Camirand-Lemyre, Félix; Mireault, Pascal; Skinner, Cameron D
2015-03-01
Calculating the confidence interval is a common procedure in data analysis and is readily obtained from normally distributed populations with the familiar x̄ ± t·s/√n formula. However, when working with non-normally distributed data, determining the confidence interval is not as obvious. For this type of data, there are fewer references in the literature, and they are much less accessible. We describe, in simple language, the percentile and the bias-corrected and accelerated variations of the bootstrap method to calculate confidence intervals. This method can be applied to a wide variety of parameters (mean, median, slope of a calibration curve, etc.) and is appropriate for normal and non-normal data sets. As a worked example, the confidence interval around the median concentration of cocaine in femoral blood is calculated using bootstrap techniques. The median of the non-toxic concentrations was 46.7 ng/mL with a 95% confidence interval of 23.9-85.8 ng/mL in the non-normally distributed set of 45 postmortem cases. This method should be used to lead to more statistically sound and accurate confidence intervals for non-normally distributed populations, such as reference values of therapeutic and toxic drug concentrations, as well as situations of truncated concentration values near the limit of quantification or cutoff of a method.
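A minimal sketch of the percentile variant described in this abstract (the bias-corrected and accelerated refinement is omitted for brevity):

```python
import numpy as np

def percentile_ci(data, stat=np.median, alpha=0.05, n_boot=5000, seed=0):
    """Percentile bootstrap confidence interval: resample the data with
    replacement, recompute the statistic on each resample, and read the
    CI off the empirical quantiles of the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    boots = np.array([
        stat(rng.choice(data, size=len(data), replace=True))
        for _ in range(n_boot)
    ])
    lo, hi = np.quantile(boots, [alpha / 2.0, 1.0 - alpha / 2.0])
    return lo, hi
```

Passing a different `stat` (mean, a calibration-curve slope wrapped in a function, etc.) covers the other parameters the abstract mentions.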
Determination of isoxsuprine hydrochloride by sequential injection visible spectrophotometry.
Beyene, Negussie W; Van Staden, Jacobus F; Stefan, Raluca-Ioana; Aboul-Enein, Hassan Y
2005-01-01
An automated sequential injection (SI) spectrophotometric method for the determination of isoxsuprine hydrochloride is described. The method is based on the condensation of aminoantipyrine with phenols (isoxsuprine hydrochloride) in the presence of an alkaline oxidizing agent (potassium hexacyanoferrate) to yield a pink colored product, the absorbance of which is monitored at 507 nm. Chemical as well as physical SI parameters that affect the signal response have been optimized in order to get better sensitivity, a higher sampling rate and better reagent economy. Using the aforesaid optimized parameters, a linear relationship between the relative peak height and concentration was obtained in the range 1-60 mg l⁻¹. The detection limit (as the 3σ value) was 0.3 mg l⁻¹ and the precision was 1.4% and 1.6% at 5 and 10 mg l⁻¹, respectively. As compared to previous reports, a wide linear range, a low detection limit, and highly economical reagent consumption are the advantages of this automated method.
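The 3σ detection limit quoted above follows the usual analytical convention; a generic sketch (with assumed replicate blank readings, not the paper's data):

```python
import numpy as np

def detection_limit_3sigma(blank_signals, calibration_slope):
    """Detection limit by the 3-sigma convention: three times the sample
    standard deviation of replicate blank signals, divided by the slope
    of the calibration line (signal per concentration unit), which
    converts the signal threshold into a concentration."""
    return 3.0 * np.std(blank_signals, ddof=1) / calibration_slope
```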
Beran, Rudolf
1994-01-01
This essay is organized around the theoretical and computational problem of constructing bootstrap confidence sets, with forays into related topics. The seven section headings are: Introduction; The Bootstrap World; Bootstrap Confidence Sets; Computing Bootstrap Confidence Sets; Quality of Bootstrap Confidence Sets; Iterated and Two-step Bootstrap; Further Resources.
Velocities of antarctic outlet glaciers determined from sequential Landsat images
MacDonald, Thomas R.; Ferrigno, Jane G.; Williams, Richard S.; Lucchitta, Baerbel K.
1989-01-01
Approximately 91.0 percent of the volume of present-day glacier ice on Earth is in Antarctica; Greenland contains about another 8.3 percent of the volume. Thus, together, these two great ice sheets account for an estimated 99.3 percent of the total. Long-term changes in the volume of glacier ice on our planet are the result of global climate change. Because of the relationship of global ice volume to sea level (± 330 cubic kilometers of glacier ice equals ± 1 millimeter of sea level), changes in the mass balance of the antarctic ice sheet are of particular importance. Whether the mass balance of the east and west antarctic ice sheets is positive or negative is not known. Estimates of mass input by total annual precipitation for the continent have been made from scattered meteorological observations (Swithinbank 1985). The magnitude of annual ablation of the ice sheet from calving of outlet glaciers and ice shelves is also not well known. Although the velocities of outlet glaciers can be determined from field measurements during the austral summer, the technique is costly, does not cover a complete annual cycle, and has been applied to just a few glaciers. To increase the number of outlet glaciers in Antarctica for which velocities have been determined, and to provide additional data for understanding the dynamics of the antarctic ice sheets and their response to global climate change, sequential Landsat images of several outlet glaciers were measured.
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap. This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
Kozak, Joanna; Wójtowicz, Marzena; Gawenda, Nadzieja; Kościelniak, Paweł
2011-06-15
An automatic sequential injection system, combining monosegmented flow analysis, sequential injection analysis and sequential injection titration is proposed for acidity determination. The system enables controllable sample dilution and generation of standards of required concentration in a monosegmented sequential injection manner, sequential injection titration of the prepared solutions, data collecting, and handling. It has been tested on spectrophotometric determination of acetic, citric and phosphoric acids with sodium hydroxide used as a titrant and phenolphthalein or thymolphthalein (in the case of phosphoric acid determination) as indicators. Accuracy better than |4.4|% (RE) and repeatability better than 2.9% (RSD) have been obtained. It has been applied to the determination of total acidity in vinegars and various soft drinks. The system provides low sample (less than 0.3 mL) consumption. On average, analysis of a sample takes several minutes. Copyright © 2011 Elsevier B.V. All rights reserved.
Ultrafast approximation for phylogenetic bootstrap.
Minh, Bui Quang; Nguyen, Minh Anh Thi; von Haeseler, Arndt
2013-05-01
Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and the Shimodaira-Hasegawa-like approximate likelihood ratio test have been introduced to speed up the bootstrap. Here, we suggest an ultrafast bootstrap approximation approach (UFBoot) to compute the support of phylogenetic groups in maximum likelihood (ML) based trees. To achieve this, we combine the resampling estimated log-likelihood method with a simple but effective collection scheme of candidate trees. We also propose a stopping rule that assesses the convergence of branch support values to automatically determine when to stop collecting candidate trees. UFBoot achieves a median speed up of 3.1 (range: 0.66-33.3) to 10.2 (range: 1.32-41.4) compared with RAxML RBS for real DNA and amino acid alignments, respectively. Moreover, our extensive simulations show that UFBoot is robust against moderate model violations and the support values obtained appear to be relatively unbiased compared with the conservative standard bootstrap. This provides a more direct interpretation of the bootstrap support. We offer an efficient and easy-to-use software (available at http://www.cibiv.at/software/iqtree) to perform the UFBoot analysis with ML tree inference.
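The core resampling step shared by the standard phylogenetic bootstrap and the approximations discussed above is column (site) resampling of the alignment; a sketch (illustrative, not UFBoot's implementation):

```python
import numpy as np

def bootstrap_alignment(alignment, seed=0):
    """One bootstrap replicate of a multiple sequence alignment: sample
    site columns with replacement, keeping the number of sites fixed.
    A tree is then inferred from each replicate, and the frequency with
    which a clade appears across replicates is its bootstrap support."""
    rng = np.random.default_rng(seed)
    aln = np.asarray(alignment)                 # shape: (taxa, sites)
    cols = rng.integers(0, aln.shape[1], size=aln.shape[1])
    return aln[:, cols]
```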
DEFF Research Database (Denmark)
Liu, Xuezhu; Hansen, Elo Harald
1996-01-01
A sequential injection analysis system is described that incorporates a nylon tubular reactor containing immobilised glucose oxidase, allowing determination of D-glucose by means of subsequent luminol chemiluminescence detection of the hydrogen peroxide generated in the enzymatic reaction...
Analyzing large datasets with bootstrap penalization.
Fang, Kuangnan; Ma, Shuangge
2017-03-01
Data with a large p (number of covariates) and/or a large n (sample size) are now commonly encountered. For many problems, regularization especially penalization is adopted for estimation and variable selection. The straightforward application of penalization to large datasets demands a "big computer" with high computational power. To improve computational feasibility, we develop bootstrap penalization, which dissects a big penalized estimation into a set of small ones, which can be executed in a highly parallel manner and each only demands a "small computer". The proposed approach takes different strategies for data with different characteristics. For data with a large p but a small to moderate n, covariates are first clustered into relatively homogeneous blocks. The proposed approach consists of two sequential steps. In each step and for each bootstrap sample, we select blocks of covariates and run penalization. The results from multiple bootstrap samples are pooled to generate the final estimate. For data with a large n but a small to moderate p, we bootstrap a small number of subjects, apply penalized estimation, and then conduct a weighted average over multiple bootstrap samples. For data with a large p and a large n, the natural marriage of the previous two methods is applied. Numerical studies, including simulations and data analysis, show that the proposed approach has computational and numerical advantages over the straightforward application of penalization. An R package has been developed to implement the proposed methods.
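The large-n strategy described above can be sketched with a ridge penalty standing in for the penalization; the function name, the unweighted average, and the subsample-size rule are assumptions for illustration, not the paper's R package:

```python
import numpy as np

def bootstrap_ridge(X, y, lam=1.0, n_boot=50, m=None, seed=0):
    """Large-n bootstrap penalization sketch: repeatedly fit a penalized
    (here ridge) regression on small bootstrap subsamples of subjects,
    then average the coefficient estimates across subsamples."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = m or max(p + 1, n // 10)       # subsample size (assumed rule)
    coefs = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=m)
        Xb, yb = X[idx], y[idx]
        # closed-form ridge solution on the subsample
        coefs += np.linalg.solve(Xb.T @ Xb + lam * np.eye(p), Xb.T @ yb)
    return coefs / n_boot
```

Each subsample fit is independent, so the loop parallelizes trivially, which is the computational point of the approach.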
Comparison of ERBS orbit determination accuracy using batch least-squares and sequential methods
Oza, D. H.; Jones, T. L.; Fabien, S. M.; Mistretta, G. D.; Hart, R. C.; Doll, C. E.
1991-01-01
The Flight Dynamics Division (FDD) at NASA-Goddard commissioned a study to develop the Real Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination of spacecraft on a DOS-based personal computer (PC). An overview of RTOD/E capabilities is presented, along with the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E on a PC with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. RTOD/E was used to perform sequential orbit determination for the Earth Radiation Budget Satellite (ERBS), and GTDS was used to perform the batch least-squares orbit determination. The estimated ERBS ephemerides were obtained for the August 16 to 22, 1989, timeframe, during which intensive TDRSS tracking data for ERBS were available. Independent assessments were made to examine the consistency of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for ERBS; the solution differences were less than 40 meters after the filter had reached steady state.
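The batch-versus-sequential comparison can be illustrated in miniature: a recursive least-squares filter folded over the measurements one at a time converges to the batch least-squares solution. This is a toy sketch of the two estimation styles, not the GTDS or RTOD/E algorithms:

```python
import numpy as np

def batch_ls(H, z):
    """Batch least squares: solve using all measurements at once."""
    return np.linalg.lstsq(H, z, rcond=None)[0]

def sequential_ls(H, z):
    """Sequential (recursive) least squares: fold measurements in one at
    a time, as a filter would. With a diffuse prior and unit measurement
    noise it approaches the batch solution at steady state."""
    n = H.shape[1]
    x = np.zeros(n)
    P = np.eye(n) * 1e6                       # diffuse initial covariance
    for h, zi in zip(H, z):
        h = h.reshape(1, -1)
        K = P @ h.T / (h @ P @ h.T + 1.0)     # gain, unit meas. variance
        x = x + (K * (zi - h @ x)).ravel()    # state update
        P = (np.eye(n) - K @ h) @ P           # covariance update
    return x
```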
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger
new resampling method, the local fractional bootstrap, relies on simulating an auxiliary fractional Brownian motion that mimics the fine properties of high frequency differences of the Brownian semistationary process under the null hypothesis. We prove the first order validity of the bootstrap method...... to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data....
Noel, James D; Biswas, Pratim; Giammar, Daniel E
2007-07-01
Leaching of mercury from coal combustion byproducts is a concern because of the toxicity of mercury. Leachability of mercury can be assessed by using sequential extraction procedures. Sequential extraction procedures are commonly used to determine the speciation and mobility of trace metals in solid samples and are designed to differentiate among metals bound by different mechanisms and to different solid phases. This study evaluated the selectivity and effectiveness of a sequential extraction process used to determine mercury binding mechanisms to various materials. A six-step sequential extraction process was applied to laboratory-synthesized materials with known mercury concentrations and binding mechanisms. These materials were calcite, hematite, goethite, and titanium dioxide. Fly ash from a full-scale power plant was also investigated. The concentrations of mercury were measured using inductively coupled plasma (ICP) mass spectrometry, whereas the major elements were measured by ICP atomic emission spectrometry. The materials were characterized by X-ray powder diffraction and scanning electron microscopy with energy dispersive spectroscopy. The sequential extraction procedure provided information about the solid phases with which mercury was associated in the solid sample. The procedure effectively extracted mercury from the target phases. The procedure was generally selective in extracting mercury. However, some steps in the procedure extracted mercury from nontarget phases, and others resulted in mercury redistribution. Iron from hematite and goethite was only leached in the reducible and residual extraction steps. Some mercury associated with goethite was extracted in the ion exchangeable step, whereas mercury associated with hematite was extracted almost entirely in the residual step. Calcium in calcite and mercury associated with calcite were primarily removed in the acid-soluble extraction step. Titanium in titanium dioxide and mercury adsorbed onto
Sequential testing in a high stakes OSCE: Determining number of screening tests.
Currie, Graeme P; Sivasubramaniam, Selvaraj; Cleland, Jennifer
2016-07-01
The sequential objective structured clinical exam (OSCE) is a stand-alone variation of the traditional OSCE whereby all students sit a screening test. Those who pass this initial assessment undergo no further testing, while weakly performing students sit an additional (sequential) test to determine their overall pass/fail status. Our aim was to determine the outcomes of adopting a sequential OSCE approach using different numbers of screening stations and pass marks. We carried out a retrospective, observational study of anonymised databases of two cohorts of student outcomes from the final OSCE examination at the University of Aberdeen Medical School. Data were accessed for students (n = 388) who sat the exam in the years 2013-2014. We used Stata's simulate command to compare outcomes, in terms of sensitivity and specificity, across 5000 random selections of 6-14 OSCE stations using random selections of groups of 100 students (with different screening test pass marks) versus those obtained across 15 stations. Across 6-14 stations, the sensitivity was ≥87% in 2013 and ≥84% in 2014, while the specificity ranged from 60% to 100% in both years. Specificity generally increased as the number of screening stations increased (with concomitant narrowing of the 95% confidence interval), while sensitivity varied between 84 and 98%. Similar sensitivities and specificities were found with screening pass marks of +1, +2 and +3 standard errors of measurement (SEM). Eight stations as a screening test appeared to be a reasonable compromise in terms of high sensitivity (88-89%) and specificity (83-86%). This research extends the current sequential OSCE literature using a novel and robust approach to identify the "ideal" number of screening stations and pass mark. We discuss the educational and resource implications of our findings and make recommendations for the use of the sequential OSCE in medical education.
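The quantities being simulated can be written out directly. The definitions below are illustrative assumptions (mean station scores against fixed marks), not the authors' Stata code:

```python
import numpy as np

def screening_outcomes(scores, pass_mark, k, screen_mark):
    """Given per-student station scores (n_students x n_stations), treat
    failing the full exam (mean over all stations < pass_mark) as the
    condition of interest; the screen flags students whose mean over the
    first k stations falls below screen_mark. Returns (sensitivity,
    specificity) of the screen for detecting overall failure."""
    scores = np.asarray(scores, dtype=float)
    overall_fail = scores.mean(axis=1) < pass_mark
    flagged = scores[:, :k].mean(axis=1) < screen_mark
    sens = (flagged & overall_fail).sum() / max(overall_fail.sum(), 1)
    spec = (~flagged & ~overall_fail).sum() / max((~overall_fail).sum(), 1)
    return sens, spec
```

Repeating this over many random selections of k stations, as the study does, traces out how sensitivity and specificity change with the number of screening stations.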
Profile of sequential determinants in tissue polypeptide antigen BrCN:B fragment.
Chersi, A; Camera, M; Trinca, M L; Castelli, M
1989-02-15
A synthetic approach has been applied to determine the profile of sequential determinants of one immunodominant region of Tissue Polypeptide Antigen (TPA). Five overlapping peptides, covering 30 of the 32 amino acid residues of this fragment, were chemically synthesized, and their antibody-binding activities for rabbit anti-TPA antibodies were determined by enzyme-linked immunosorbent assays. Anti-TPA reacted with two overlapping fragments at the COOH-terminal end of the fragment, but not with peptides that include Arg 15, considered essential for the antigenicity of the whole fragment. This might suggest that this critical residue is involved in the formation of a complex conformational determinant.
Energy Technology Data Exchange (ETDEWEB)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group]
2017-02-15
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ (13)/(24) for the central charge of such models. A theory that saturates this bound is not known yet.
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-10-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ 13/24 for the central charge of such models, which we argue cannot be saturated by an interacting SCFT.
International Nuclear Information System (INIS)
Cornagliotto, Martina; Lemos, Madalena; Schomerus, Volker
2017-02-01
Applications of the bootstrap program to superconformal field theories promise unique new insights into their landscape and could even lead to the discovery of new models. Most existing results of the superconformal bootstrap were obtained from correlation functions of very special fields in short (BPS) representations of the superconformal algebra. Our main goal is to initiate a superconformal bootstrap for long multiplets, one that exploits all constraints from superprimaries and their descendants. To this end, we work out the Casimir equations for four-point correlators of long multiplets of the two-dimensional global N=2 superconformal algebra. After constructing the full set of conformal blocks we discuss two different applications. The first one concerns two-dimensional (2,0) theories. The numerical bootstrap analysis we perform serves a twofold purpose, as a feasibility study of our long multiplet bootstrap and also as an exploration of (2,0) theories. A second line of applications is directed towards four-dimensional N=3 SCFTs. In this context, our results imply a new bound c ≥ (13)/(24) for the central charge of such models. A theory that saturates this bound is not known yet.
Sequential filter design for precision orbit determination and physical constant refinement
Curkendall, D. W.; Leondes, C. T.
1974-01-01
Earth-based spacecraft tracking data have historically been processed with classical least squares filtering techniques both for navigation purposes and for physical constant determination. The small, stochastic nongravitational forces acting on the spacecraft are described to motivate the use of sequential estimation as an alternative to the least squares fitting procedures. The stochastic forces are investigated both in terms of their effect on the tracking data and their influence on estimation accuracy. A flexible sequential filter design which leaves the existing trajectory, variational equations, data observable and partial computations undisturbed is described. A detailed filter design is presented that meets the precision demands and flexibility requirements of deep space navigation and of scientific problems.
van Staden, J F; Mashamba, Mulalo G; Stefan, Raluca I
2002-09-01
An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid is proposed. A solution of 0.1 mol L⁻¹ sodium chloride is used as carrier. Titration is achieved by aspirating acetic acid samples between two strong base-zone volumes into a holding coil and by channelling the stack of well-defined zones with flow reversal through a reaction coil to a potentiometric sensor where the peak widths were measured. A linear relationship between peak width and logarithm of the acid concentration was obtained in the range 1-9 g/100 mL. Vinegar samples were analysed without any sample pre-treatment. The method has a relative standard deviation of 0.4% with a sample frequency of 28 samples per hour. The results revealed good agreement between the proposed sequential injection and an automated batch titration method.
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2013-08-16
We implement the conformal bootstrap for N=4 superconformal field theories in four dimensions. The consistency of the four-point function of the stress-energy tensor multiplet imposes significant upper bounds for the scaling dimensions of unprotected local operators as functions of the central charge of the theory. At the threshold of exclusion, a particular operator spectrum appears to be singled out by the bootstrap constraints. We conjecture that this extremal spectrum is that of N=4 supersymmetric Yang-Mills theory at an S-duality invariant value of the complexified gauge coupling.
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
by presenting an analytic and numerical study of the problem on a Bethe lattice. The Bethe lattice does not capture all the complexities of bootstrap dynamics on periodic lattices but it does provide useful insight into what makes the avalanche distributions different in the two cases. The following presentation is self- ...
Durrieu, G; Ciffroy, P; Garnier, J-M
2006-11-01
The objective of the study was to provide global probability density functions (PDFs) representing the uncertainty of distribution coefficients (Kds) in freshwater for radioisotopes of Co, Cs, Sr and I. A comprehensive database containing Kd values referenced in 61 articles was first built, and quality scores were assigned to each data point according to various criteria (e.g. presentation of data, contact times, pH, solid-to-liquid ratio, expert judgement). A weighted bootstrapping procedure was then set up in order to build PDFs, in such a way that more importance is given to the most relevant data points (i.e. those corresponding to typical natural environments). However, it was also assessed that the relevance and the robustness of the PDFs determined by our procedure depended on the number of Kd values in the database. Owing to the large database, conditional PDFs were also proposed, for site studies where some parametric information is known (e.g. pH, contact time between radionuclides and particles, solid-to-liquid ratio). Such conditional PDFs reduce the uncertainty on the Kd values. These global and conditional PDFs are useful for end-users of dose models because the uncertainty and sensitivity of Kd values are taken into account.
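The weighted bootstrapping step can be illustrated directly: resample Kd values with selection probabilities proportional to their quality scores, then summarise the resampled statistic. All numbers below are invented for illustration; the study's actual database and scoring criteria differ:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Kd values (L/kg) with quality scores (higher = more relevant);
# the study's real database holds values collected from 61 articles.
kd = np.array([120.0, 350.0, 80.0, 500.0, 210.0, 95.0, 410.0, 160.0])
quality = np.array([3.0, 5.0, 1.0, 4.0, 2.0, 1.0, 5.0, 3.0])
weights = quality / quality.sum()

# Weighted bootstrap: resample Kd with probability proportional to quality,
# then use the resampled means as an empirical distribution for Kd.
n_boot = 10_000
samples = rng.choice(kd, size=(n_boot, kd.size), replace=True, p=weights)
boot_means = samples.mean(axis=1)

lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The empirical distribution of `boot_means` (or of the raw weighted resamples) plays the role of the global PDF; restricting the database to rows matching known site conditions before resampling gives the conditional variant.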
The bootstrap and Bayesian bootstrap method in assessing bioequivalence
International Nuclear Information System (INIS)
Wan Jianping; Zhang Kongsheng; Chen Hui
2009-01-01
Parametric methods for assessing individual bioequivalence (IBE) typically rest on the assumption that the PK responses are normally distributed; the bootstrap offers a nonparametric alternative. In 2001, the United States Food and Drug Administration (FDA) proposed a draft guidance. The purpose of this article is to evaluate the IBE between test drug and reference drug by the bootstrap and Bayesian bootstrap methods. We study the power of the bootstrap test procedures and of the parametric test procedures in FDA (2001). We find that the Bayesian bootstrap method performs best.
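The contrast between the two resampling schemes can be sketched as follows: the ordinary bootstrap draws observations with replacement, while the Bayesian bootstrap draws Dirichlet(1, ..., 1) weights over the fixed observations. The data here are simulated for illustration, not the FDA's or the authors':

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=30)   # hypothetical PK responses

def ordinary_bootstrap_means(x, n_boot, rng):
    """Classical bootstrap: resample observations with replacement."""
    idx = rng.integers(0, x.size, size=(n_boot, x.size))
    return x[idx].mean(axis=1)

def bayesian_bootstrap_means(x, n_boot, rng):
    """Bayesian bootstrap: one Dirichlet(1, ..., 1) weight vector per
    replicate; the weighted means are posterior draws for the mean."""
    w = rng.dirichlet(np.ones(x.size), size=n_boot)
    return w @ x

ob = ordinary_bootstrap_means(x, 5000, rng)
bb = bayesian_bootstrap_means(x, 5000, rng)
```

For moderate samples the two replicate distributions are very close; the Bayesian version is smoother because no observation ever receives exactly zero weight.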
DEFF Research Database (Denmark)
Liu, Xuezhu; Hansen, Elo Harald
1996-01-01
A sequential injection analysis system is described that incorporates a nylon tubular reactor containing immobilised glucose oxidase, allowing determination of D-glucose by means of subsequent luminol chemiluminescence detection of the hydrogen peroxide generated in the enzymatic reaction....... The operating parameters were optimised by fractional factorial screening and response surface modelling. The linear range of D-glucose determination was 30-600 µM, with a detection limit of 15 µM using a photodiode detector. The sampling frequency was 54 h⁻¹. Lower LOD (0.5 µM D-glucose) could...... be reached by using a PMT as the detector. Fermentation broth samples were analysed and good recoveries were obtained....
Khongpet, Wanpen; Pencharee, Somkid; Puangpila, Chanida; Kradtap Hartwell, Supaporn; Lapanantnoppakhun, Somchai; Jakmunee, Jaroon
2018-01-15
A microfluidic hydrodynamic sequential injection (μHSI) spectrophotometric system was designed and fabricated. The system was built by laser engraving a manifold pattern on an acrylic block and sealing it with another flat acrylic plate to form a microfluidic channel platform. The platform was incorporated with small solenoid valves to obtain a portable setup for programmable control of the liquid flow into the channel according to the HSI principle. The system was demonstrated for the determination of phosphate using a molybdenum blue method. Ascorbic acid, standard or sample, and acidic molybdate solutions were sequentially aspirated to fill the channel, forming a stacked zone before flowing to the detector. Under the optimum conditions, a linear calibration graph in the range of 0.1-6 mg P L⁻¹ was obtained. The detection limit was 0.1 mg L⁻¹. The system is compact (5.0 mm thick, 80 mm wide × 140 mm long), durable, portable and cost-effective, and consumes only small amounts of chemicals (83 μL each of molybdate and ascorbic acid, 133 μL of sample solution and 1.7 mL of water carrier per run). It was applied for the determination of phosphate content in extracted soil samples. The percent recoveries of the analysis were in the range of 91.2-107.3. The results agreed well with those of the batch spectrophotometric method. Copyright © 2017 Elsevier B.V. All rights reserved.
Using the bootstrap in a multivariate data problem: An example
International Nuclear Information System (INIS)
Glosup, J.G.; Axelrod, M.C.
1995-01-01
The use of the bootstrap in the multivariate version of the paired t-test is considered and demonstrated through an example. The problem of interest involves comparing two different techniques for measuring the chemical constituents of a sample item. The bootstrap is used to form an empirical significance level for Hotelling's one-sample T-squared statistic. The bootstrap was selected to determine empirical significance levels because the implicit assumption of multivariate normality in the classic Hotelling's one-sample test might not hold. The results of both the classic and bootstrap tests are presented and contrasted.
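The bootstrap calibration of Hotelling's T² can be sketched by resampling the paired differences after centring them at zero, so that the null hypothesis holds in the bootstrap world. The data below are hypothetical; the report's actual measurements are not reproduced:

```python
import numpy as np

def t_squared(d):
    """Hotelling's one-sample T^2 statistic for H0: mean(d) = 0."""
    n = d.shape[0]
    m = d.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(d, rowvar=False))
    return float(n * m @ s_inv @ m)

def bootstrap_pvalue(d, n_boot=2000, seed=0):
    """Empirical significance level: resample rows of the centred
    differences, so the zero-mean null holds in every resample."""
    rng = np.random.default_rng(seed)
    t_obs = t_squared(d)
    d0 = d - d.mean(axis=0)
    n = d.shape[0]
    exceed = sum(t_squared(d0[rng.integers(0, n, n)]) >= t_obs
                 for _ in range(n_boot))
    return (exceed + 1) / (n_boot + 1)

# Paired differences between two measurement techniques (hypothetical data)
rng = np.random.default_rng(1)
d = rng.normal(0.0, 1.0, size=(25, 3))
```

Because the null distribution is built from the data themselves, no multivariate-normality assumption is needed, which is exactly the motivation given in the abstract.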
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
transition. The first-order transition encountered in bootstrap problems often has a mixed character in the sense that the discontinuous drop in magnetization is .... P_n = p Σ_{k=0}^{z-1} C(z-1, k) [P_{n+1}]^k [1 - P_{n+1}]^{z-1-k} p_{k+1}, where p_{k+1} = 1 if k+1 ≥ m and p_{k+1} = 0 if k+1 < m. (1) The rationale for the above equation is as follows.
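The recursion in equation (1) can be iterated numerically to find the fixed point of the occupation probability; a minimal sketch, with the coordination number z and threshold m chosen purely for illustration:

```python
from math import comb

def fixed_point(p, z=4, m=2, tol=1e-12, max_iter=100_000):
    """Iterate Eq. (1), P <- p * sum_{k >= m-1} C(z-1, k) P^k (1-P)^(z-1-k),
    from P = 1 down to its stable fixed point (illustrative z, m values).
    The indicator p_{k+1} restricts the sum to k+1 >= m, i.e. k >= m-1."""
    P = 1.0
    for _ in range(max_iter):
        new = p * sum(comb(z - 1, k) * P**k * (1.0 - P)**(z - 1 - k)
                      for k in range(m - 1, z))
        if abs(new - P) < tol:
            return new
        P = new
    return P

# Sweeping p shows the change between a vanishing and a finite fixed point.
low, high = fixed_point(0.2), fixed_point(0.9)
```

Scanning `p` on a fine grid and plotting the fixed point against it exposes the transition that the Bethe-lattice analysis studies.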
Determination of uranium and plutonium in plutonium based fuels by sequential amperometric titration
International Nuclear Information System (INIS)
Nair, P.R.; Lohithakshan, K.V.; Xavier, M.; Marathe, S.G.; Jain, H.C.
1988-01-01
A method is described for the sequential determination of uranium and plutonium in plutonium bearing fuel materials. Uranium and plutonium are reduced to U(IV) and Pu(III) with titanous chloride and then titrated with dichromate to two end points which are detected amperometrically using two polarized platinum electrodes. Uranium-plutonium solutions of known concentrations containing plutonium in the proportions of 4, 30, 50, and 70% were analyzed with precisions better than 0.3%, maintaining the amounts of plutonium per aliquot in the range of 2-10 mg. No significant bias could be detected. Several samples of (U,Pu)O₂ and (U,Pu)C were analyzed by this procedure. The effects of iron, fluoride, oxalic acid and mellitic acid on the method were also studied. (author) 5 refs.; 6 tabs
Bootstrapping language acquisition.
Abend, Omri; Kwiatkowski, Tom; Smith, Nathaniel J; Goldwater, Sharon; Steedman, Mark
2017-07-01
The semantic bootstrapping hypothesis proposes that children acquire their native language through exposure to sentences of the language paired with structured representations of their meaning, whose component substructures can be associated with words and syntactic structures used to express these concepts. The child's task is then to learn a language-specific grammar and lexicon based on (probably contextually ambiguous, possibly somewhat noisy) pairs of sentences and their meaning representations (logical forms). Starting from these assumptions, we develop a Bayesian probabilistic account of semantically bootstrapped first-language acquisition in the child, based on techniques from computational parsing and interpretation of unrestricted text. Our learner jointly models (a) word learning: the mapping between components of the given sentential meaning and lexical words (or phrases) of the language, and (b) syntax learning: the projection of lexical elements onto sentences by universal construction-free syntactic rules. Using an incremental learning algorithm, we apply the model to a dataset of real syntactically complex child-directed utterances and (pseudo) logical forms, the latter including contextually plausible but irrelevant distractors. Taking the Eve section of the CHILDES corpus as input, the model simulates several well-documented phenomena from the developmental literature. In particular, the model exhibits syntactic bootstrapping effects (in which previously learned constructions facilitate the learning of novel words), sudden jumps in learning without explicit parameter setting, acceleration of word-learning (the "vocabulary spurt"), an initial bias favoring the learning of nouns over verbs, and one-shot learning of words and their meanings. The learner thus demonstrates how statistical learning over structured representations can provide a unified account for these seemingly disparate phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
Sequential injection redox or acid-base titration for determination of ascorbic acid or acetic acid.
Lenghor, Narong; Jakmunee, Jaroon; Vilen, Michael; Sara, Rolf; Christian, Gary D; Grudpan, Kate
2002-12-06
Two sequential injection titration systems with spectrophotometric detection have been developed. The first system, for determination of ascorbic acid, was based on the redox reaction between ascorbic acid and permanganate in an acidic medium, which leads to a decrease in color intensity of permanganate, monitored at 525 nm. A linear dependence of peak area on ascorbic acid concentration up to 1200 mg L⁻¹ was achieved. The relative standard deviation for 11 replicate determinations of 400 mg L⁻¹ ascorbic acid was 2.9%. The second system, for acetic acid determination, was based on acid-base titration of acetic acid with sodium hydroxide using phenolphthalein as an indicator. The decrease in color intensity of the indicator was proportional to the acid content. A linear calibration graph in the range of 2-8% w/v of acetic acid with a relative standard deviation of 4.8% (5.0% w/v acetic acid, n=11) was obtained. Sample throughputs of 60 h⁻¹ were achieved for both systems. The systems were successfully applied for the assays of ascorbic acid in vitamin C tablets and acetic acid content in vinegars, respectively.
Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.
Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil
2015-02-01
A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L⁻¹ of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L⁻¹ for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations. Copyright © 2014 Elsevier B.V. All rights reserved.
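The quoted detection limit follows the common 3σ convention, LOD = 3 × (standard deviation of the blank) / (calibration slope). A sketch with invented calibration numbers, not the paper's data:

```python
import statistics

# Hypothetical calibration data: absorbance vs Al(III) concentration (mg/L)
conc = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
absorb = [0.052, 0.101, 0.149, 0.203, 0.251, 0.298]
blanks = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0017,
          0.0020, 0.0023, 0.0018, 0.0021]          # n = 10 blank readings

# Least-squares slope of the calibration line.
cx, cy = statistics.mean(conc), statistics.mean(absorb)
slope = (sum((x - cx) * (y - cy) for x, y in zip(conc, absorb))
         / sum((x - cx) ** 2 for x in conc))

# LOD = 3 * (SD of blank) / slope, expressed in the concentration unit (mg/L).
lod = 3 * statistics.stdev(blanks) / slope
```

With these made-up numbers the LOD lands in the low µg/L range, the same order as the 4 µg/L reported above.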
Directory of Open Access Journals (Sweden)
Orawon Chailapakul
2008-03-01
Full Text Available A gas diffusion sequential injection system with amperometric detection using a boron-doped diamond electrode was developed for the determination of sulfite. A gas diffusion unit (GDU) was used to prevent interference from sample matrices in the electrochemical measurement. The sample was mixed with an acid solution to generate gaseous sulfur dioxide prior to its passage through the donor channel of the GDU. The sulfur dioxide diffused through the PTFE hydrophobic membrane into a carrier solution of 0.1 M phosphate buffer (pH 8)/0.1% sodium dodecyl sulfate in the acceptor channel of the GDU and was converted back to sulfite. The sulfite was then carried to the electrochemical flow cell and detected directly by amperometry using the boron-doped diamond electrode at 0.95 V (versus Ag/AgCl). Sodium dodecyl sulfate was added to the carrier solution to prevent electrode fouling. This method was applicable in the concentration range of 0.2-20 mg SO₃²⁻/L and a detection limit (S/N = 3) of 0.05 mg SO₃²⁻/L was achieved. The method was successfully applied to the determination of sulfite in wines and the analytical results agreed well with those obtained by iodimetric titration. The relative standard deviations for the analysis of sulfite in wines were in the range of 1.0-4.1%. The sampling frequency was 65 h⁻¹.
International Nuclear Information System (INIS)
Alarfaj, N.A.; Aly, F.A.; Tamimi, A.A.
2013-01-01
A sequential injection analysis (SIA) with chemiluminescence detection has been proposed for the determination of the antibiotic gemifloxacin mesylate (GFX). The developed method is based on the enhancement effect of silver nanoparticles (Ag NPs) on the chemiluminescence (CL) signal of the luminol-potassium ferricyanide reaction in alkaline medium. The introduction of gemifloxacin into this system produced a significant decrease in the CL intensity in the presence of Ag NPs. The optimum conditions for CL emission were investigated. A linear relationship between the decrease in CL intensity and concentration was obtained in the range 0.01-1000 ng mL⁻¹ (r = 0.9997), with a detection limit of 2.0 pg mL⁻¹ and a quantification limit of 0.01 pg mL⁻¹. The relative standard deviation was 1.3%. The proposed method was employed for the determination of gemifloxacin in the bulk drug, in its pharmaceutical dosage forms and in biological fluids such as human serum and urine. The interference of some common additive compounds such as glucose, lactose, starch, talc and magnesium stearate was investigated, and no interference was found from these excipients. The obtained SIA results were statistically compared with those obtained from a reported method and did not show any significant difference at the 95% confidence level. (author)
Explorations in Statistics: the Bootstrap
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
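The bootstrap's empirical approach is easy to demonstrate: resample the data with replacement, recompute the statistic each time, and take the spread of the replicates as its standard error. A generic sketch with made-up data, not the article's own example:

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_boot=5000, seed=0):
    """Bootstrap standard error: resample with replacement, recompute the
    statistic, and take the standard deviation of the replicates."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    return statistics.stdev(reps)

data = [72, 69, 75, 71, 68, 74, 70, 73, 76, 67]   # hypothetical sample
se_boot = bootstrap_se(data)
se_theory = statistics.stdev(data) / len(data) ** 0.5   # s / sqrt(n)
```

For the sample mean the bootstrap answer tracks the textbook formula s/√n; the payoff is that the same few lines work for statistics with no closed-form standard error.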
Ultrafast Approximation for Phylogenetic Bootstrap
Bui Quang Minh; Nguyen, Thi; von Haeseler, Arndt
Nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and
The wild tapered block bootstrap
DEFF Research Database (Denmark)
Hounyo, Ulrich
In this paper, a new resampling procedure, called the wild tapered block bootstrap, is introduced as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent heterogeneous data. The method consists in tapering each overlapping block......-based method in terms of asymptotic accuracy of variance estimation and distribution approximation. For stationary time series, the asymptotic validity, and the favorable bias properties of the new bootstrap method are shown in two important cases: smooth functions of means, and M-estimators. The first......-order asymptotic validity of the tapered block bootstrap as well as the wild tapered block bootstrap approximation to the actual distribution of the sample mean is also established when data are assumed to satisfy a near epoch dependent condition. The consistency of the bootstrap variance estimator for the sample......
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Steier, Peter
2013-01-01
An automated analytical method implemented in a novel dual-column tandem sequential injection (SI) system was developed for simultaneous determination of ²³⁶U, ²³⁷Np, ²³⁹Pu, and ²⁴⁰Pu in seawater samples. A combination of TEVA and UTEVA extraction chromatography was exploited to separate and purify...
Michel, H; Levent, D; Barci, V; Barci-Funel, G; Hurel, C
2008-02-15
A new sequential method for the determination of both natural (U, Th) and anthropogenic (Sr, Cs, Pu, Am) radionuclides has been developed for application to soil and sediment samples. The procedure was optimised using a reference sediment (IAEA-368) and reference soils (IAEA-375 and IAEA-326). Reference materials were first digested using acids (leaching), 'total' acids on a hot plate, and acids in a microwave, in order to compare the different digestion techniques. Then, the separation and purification were performed by anion exchange resin and selective extraction chromatography: transuranic (TRU) and strontium (SR) resins. Natural and anthropogenic alpha radionuclides were separated by uranium and tetravalent actinide (UTEVA) resin, considering different acid elution media. Finally, alpha and gamma semiconductor spectrometers and a liquid scintillation spectrometer were used to measure radionuclide activities. The results obtained for strontium-90, cesium-137, thorium-232, uranium-238, plutonium-239+240 and americium-241 isotopes by the proposed method for the reference materials provided excellent agreement with the recommended values and good chemical recoveries. Plutonium isotopes in alpha spectrometry planchet deposits could also be analysed by ICP-MS.
Bootstrapping quarks and gluons
Energy Technology Data Exchange (ETDEWEB)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrapping quarks and gluons
International Nuclear Information System (INIS)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity +-1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Full Text Available Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks Q. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons G exist as longitudinal modes V_L of the weak bosons and would couple to Q with Yukawa coupling λ_Q. With m_Q ≳ 700 GeV from LHC, the strong λ_Q ≳ 4 could lead to deeply bound QQ̄ states. We postulate that the leading "collapsed state," the color-singlet (heavy isotriplet) pseudoscalar QQ̄ meson π_1, is G itself, and a gap equation without Higgs is constructed. Dynamical symmetry breaking is effected via strong λ_Q, generating m_Q while self-consistently justifying treating G as massless in the loop, hence "bootstrap." Solving such a gap equation, we find that m_Q should be several TeV, or λ_Q ≳ 4π, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find analogy with the π-N system, by which we conjecture the possible annihilation phenomena of QQ̄ → nV_L with high multiplicity, the search of which might be aided by Yukawa-bound QQ̄ resonances.
Development of a Sequential Injection Analysis System for the Determination of Saccharin
Directory of Open Access Journals (Sweden)
Budi Wibowotomo
2017-12-01
Full Text Available Saccharin is a powerfully sweet nonnutritive sweetener that has been approved for food-processing applications within the range of 100-1200 mg/kg. A simple, rapid, and cost-effective sequential injection analysis (SIA) technique was developed to determine the saccharin level. This method is based on the reaction of saccharin with p-chloranil in an ethanol medium with a hydrogen peroxide (H₂O₂) acceleration, and the resultant violet-red compound was detected using a UV-Vis spectrophotometer at λmax = 420 nm. To ascertain the optimal conditions for the SIA system, several parameters were investigated, including buffer flow rate and volume, p-chloranil concentration, and reactant volumes (saccharin, p-chloranil, and H₂O₂). The optimum setup of the SIA system was achieved with a buffer flow rate, buffer volume, and draw-up time of 1.2 mL/min, 2900 µL, and ~145 s, respectively. The optimal p-chloranil concentration is 30 mM, and the best reactant volumes, presented in an ordered sequence, are as follows: 30 µL of H₂O₂, 450 µL of saccharin, and 150 µL of p-chloranil. The optimized SIA configuration produced a good linear calibration curve with a correlation coefficient (R² = 0.9812) in the concentration range of 20-140 mg/L and with a detection limit of 19.69 mg/L. Analytical applications in different food categories also showed acceptable recovery values in the range of 93.1-111.5%. This simple and rapid SIA system offers great feasibility for the saccharin quality control in food-product processing.
Development of a Sequential Injection Analysis System for the Determination of Saccharin.
Wibowotomo, Budi; Eun, Jong-Bang; Rhee, Jong Il
2017-12-12
Saccharin is a powerfully sweet nonnutritive sweetener that has been approved for food-processing applications within the range of 100-1200 mg/kg. A simple, rapid, and cost-effective sequential injection analysis (SIA) technique was developed to determine the saccharin level. This method is based on the reaction of saccharin with p-chloranil in an ethanol medium with a hydrogen peroxide (H₂O₂) acceleration, and the resultant violet-red compound was detected using a UV-Vis spectrophotometer at λ max = 420 nm. To ascertain the optimal conditions for the SIA system, several parameters were investigated, including buffer flow rate and volume, p-chloranil concentration, and reactant volumes (saccharin, p-chloranil, and H₂O₂). The optimum setup of the SIA system was achieved with a buffer flow rate, buffer volume, and draw-up time of 1.2 mL/min, 2900 µL, and ~145 s, respectively. The optimal p-chloranil concentration is 30 mM, and the best reactant volumes, presented in an ordered sequence, are as follows: 30 µL of H₂O₂, 450 µL of saccharin, and 150 µL of p-chloranil. The optimized SIA configuration produced a good linear calibration curve with a correlation coefficient (R² = 0.9812) in the concentration range of 20-140 mg/L and with a detection limit of 19.69 mg/L. Analytical applications in different food categories also showed acceptable recovery values in the range of 93.1-111.5%. This simple and rapid SIA system offers great feasibility for the saccharin quality control in food-product processing.
Can bootstrapping explain concept learning?
Beck, Jacob
2017-01-01
Susan Carey's account of Quinean bootstrapping has been heavily criticized. While it purports to explain how important new concepts are learned, many commentators complain that it is unclear just what bootstrapping is supposed to be or how it is supposed to work. Others allege that bootstrapping falls prey to the circularity challenge: it cannot explain how new concepts are learned without presupposing that learners already have those very concepts. Drawing on discussions of concept learning from the philosophical literature, this article develops a detailed interpretation of bootstrapping that can answer the circularity challenge. The key to this interpretation is the recognition of computational constraints, both internal and external to the mind, which can endow empty symbols with new conceptual roles and thus new contents. Copyright © 2016 Elsevier B.V. All rights reserved.
Bootstrapping pronunciation dictionaries: practical issues
CSIR Research Space (South Africa)
Davel, MH
2005-09-01
Full Text Available entries, increasing the size of the dictionary in an incremental fashion. 2.1. The bootstrapping process. The bootstrapping system is initialised with a large word list (containing no pronunciation information), or with a pre-existing pronunciation.... The rule set is extracted in a straightforward fashion: for every letter (grapheme), a default phoneme is derived as the phoneme to which the letter is most likely to map. 'Exceptional' cases (words for which the expected phoneme is not correct)...
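The default-rule extraction described in this excerpt (each grapheme maps to its most frequent phoneme; words the rule mispredicts are flagged as exceptional) can be illustrated as follows. The aligned grapheme-phoneme pairs are toy data, and the alignment step itself is assumed to have been done upstream:

```python
from collections import Counter, defaultdict

# Toy aligned training data: (grapheme, phoneme) pairs from already-verified
# dictionary entries; the grapheme-to-phoneme alignment is assumed given.
aligned_pairs = [
    ("c", "k"), ("a", "ae"), ("t", "t"),   # "cat"
    ("c", "k"), ("o", "o"), ("t", "t"),    # "cot"
    ("c", "s"), ("i", "i"), ("t", "t"),    # soft "c": the exceptional case
    ("c", "k"), ("a", "ae"), ("b", "b"),   # "cab"
]

# Count how often each grapheme maps to each phoneme
counts = defaultdict(Counter)
for grapheme, phoneme in aligned_pairs:
    counts[grapheme][phoneme] += 1

# Default rule: each letter maps to its most frequent phoneme
default_rule = {g: c.most_common(1)[0][0] for g, c in counts.items()}

# Pairs where the default rule fails are flagged as exceptional
exceptions = [(g, p) for g, p in aligned_pairs if default_rule[g] != p]
```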
Heptagons from the Steinmann cluster bootstrap
Energy Technology Data Exchange (ETDEWEB)
Dixon, Lance J.; McLeod, Andrew J. [Stanford Univ., CA (United States). SLAC National Accelerator Lab.; Drummond, James [Southampton Univ. (United Kingdom). School of Physics and Astronomy; Harrington, Thomas; Spradlin, Marcus [Brown Univ., Providence, RI (United States). Dept. of Physics; Papathanasiou, Georgios [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Stanford Univ., CA (United States). SLAC National Accelerator Lab.
2016-12-15
We reformulate the heptagon cluster bootstrap to take advantage of the Steinmann relations, which require certain double discontinuities of any amplitude to vanish. These constraints vastly reduce the number of functions needed to bootstrap seven-point amplitudes in planar N=4 supersymmetric Yang-Mills theory, making higher-loop contributions to these amplitudes more computationally accessible. In particular, dual superconformal symmetry and well-defined collinear limits suffice to determine uniquely the symbols of the three-loop NMHV and four-loop MHV seven-point amplitudes. We also show that at three loops, relaxing the dual superconformal (anti Q) relations and imposing dihedral symmetry (and for NMHV the absence of spurious poles) leaves only a single ambiguity in the heptagon amplitudes. These results point to a strong tension between the collinear properties of the amplitudes and the Steinmann relations.
Energy Technology Data Exchange (ETDEWEB)
Urbanski, P.; Kowalska, E.
1997-12-31
The principle of the bootstrap methodology applied for the assessment of parameters and prediction ability of linear regression models was presented. Application of this method was shown on the example of calibration of the radioisotope sulphuric acid concentration gauge. The bootstrap method allows one to determine not only the numerical values of the regression coefficients, but also to investigate their distributions. (author). 11 refs, 12 figs, 3 tabs.
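The idea of using the bootstrap to obtain not just point values but whole distributions of regression coefficients can be sketched with a pairs (case-resampling) bootstrap on hypothetical calibration data; the gauge readings below are invented:

```python
import random
import statistics

random.seed(1)

# Hypothetical calibration points: gauge reading x vs. response y
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8, 12.1, 14.2, 15.9]

def fit_slope(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((a - mx) ** 2 for a in xs)
    if sxx == 0:
        return None  # degenerate resample (all x identical)
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sxx

# Pairs bootstrap: resample observations with replacement and refit the slope
boot_slopes = []
for _ in range(1000):
    idx = [random.randrange(len(x)) for _ in x]
    s = fit_slope([x[i] for i in idx], [y[i] for i in idx])
    if s is not None:
        boot_slopes.append(s)

# The sorted replicates approximate the slope's sampling distribution;
# percentiles give a 95% confidence interval
boot_slopes.sort()
ci_low = boot_slopes[int(0.025 * len(boot_slopes))]
ci_high = boot_slopes[int(0.975 * len(boot_slopes)) - 1]
```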
Probabilistic tractography using Lasso bootstrap.
Ye, Chuyang; Prince, Jerry L
2017-01-01
Diffusion magnetic resonance imaging (dMRI) can be used for noninvasive imaging of white matter tracts. Using fiber tracking, which propagates fiber streamlines according to fiber orientations (FOs) computed from dMRI, white matter tracts can be reconstructed for investigation of brain diseases and the brain connectome. Because of image noise, probabilistic tractography has been proposed to characterize uncertainties in FO estimation. Bootstrap provides a nonparametric approach to the estimation of FO uncertainties, and residual bootstrap has been used for developing probabilistic tractography. However, recently developed models have incorporated sparsity regularization to reduce the required number of gradient directions to resolve crossing FOs, and the residual bootstrap used in previous methods is not applicable to these models. In this work, we propose a probabilistic tractography algorithm named Lasso bootstrap tractography (LBT) for the models that incorporate sparsity. Using a fixed tensor basis and a sparsity assumption, diffusion signals are modeled using a Lasso formulation. With the residuals from the Lasso model, a distribution of diffusion signals is obtained according to a modified Lasso bootstrap strategy. FOs are then estimated from the synthesized diffusion signals by an algorithm that improves FO estimation by enforcing spatial consistency of FOs. Finally, streamline fiber tracking is performed with the computed FOs. The LBT algorithm was evaluated on simulated and real dMRI data both qualitatively and quantitatively. Results demonstrate that LBT outperforms state-of-the-art algorithms. Copyright © 2016 Elsevier B.V. All rights reserved.
Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudhashree
2015-01-01
In 2004, the largest HIV prevention project conducted globally (Avahan) was implemented in India. Avahan was implemented by NGOs supported by state lead partners in order to provide HIV prevention services to high-risk population groups. In 2007, most of the NGOs reached full coverage. Using a panel data set of the NGOs that implemented Avahan, we investigate the level of technical efficiency as well as the drivers of technical inefficiency by using the double bootstrap procedure developed by Simar & Wilson (2007). Unlike the traditional two-stage method, this method allows valid inference in the presence of measurement error and serial correlation. We find that over the 4 years, Avahan NGOs could have reduced the level of inputs by 43% given the level of outputs reached. We find that the efficiency of the project has increased over time. Results indicate that the main drivers of inefficiency come from the characteristics of the state lead partner, the NGOs and the catchment area. These organisational factors are important to explicitly consider and assess when designing and implementing HIV prevention programmes and in setting benchmarks in order to optimise the use and allocation of resources. JEL classification: C14, I1.
Bootstrap inference when using multiple imputation.
Schomaker, Michael; Heumann, Christian
2018-04-16
Many modern estimators require bootstrapping to calculate confidence intervals because either no analytic standard error is available or the distribution of the parameter of interest is nonsymmetric. It remains however unclear how to obtain valid bootstrap inference when dealing with multiple imputation to address missing data. We present 4 methods that are intuitively appealing, easy to implement, and combine bootstrap estimation with multiple imputation. We show that 3 of the 4 approaches yield valid inference, but that the performance of the methods varies with respect to the number of imputed data sets and the extent of missingness. Simulation studies reveal the behavior of our approaches in finite samples. A topical analysis from HIV treatment research, which determines the optimal timing of antiretroviral treatment initiation in young children, demonstrates the practical implications of the 4 methods in a sophisticated and realistic setting. This analysis suffers from missing data and uses the g-formula for inference, a method for which no standard errors are available. Copyright © 2018 John Wiley & Sons, Ltd.
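One intuitively appealing way to combine the two techniques, resampling first and then imputing inside each bootstrap replicate, can be sketched as follows. This is an illustrative "bootstrap, then impute" scheme with a deliberately crude random-draw imputation model; it is not claimed to be one of the four methods the paper evaluates:

```python
import random
import statistics

random.seed(7)

# Toy data with missingness: None marks a missing value
data = [4.2, None, 5.1, 3.8, None, 4.9, 5.4, 4.0, 4.6, None,
        5.0, 4.4, 3.9, 5.2, 4.7, None, 4.1, 4.8, 5.3, 4.5]

def impute_once(sample):
    """Crude stochastic imputation: draw each missing value at random from
    the observed values (a stand-in for a proper imputation model)."""
    observed = [v for v in sample if v is not None]
    return [v if v is not None else random.choice(observed) for v in sample]

# Bootstrap, then impute: resample first, impute M times inside each replicate,
# and pool the M point estimates (the simple part of Rubin's rules)
M, B = 5, 500
boot_means = []
for _ in range(B):
    resample = [random.choice(data) for _ in data]
    if all(v is None for v in resample):
        continue  # skip pathological resamples with nothing observed
    pooled = statistics.mean(
        statistics.mean(impute_once(resample)) for _ in range(M))
    boot_means.append(pooled)

# Percentile bootstrap confidence interval for the mean
boot_means.sort()
ci = (boot_means[int(0.025 * len(boot_means))],
      boot_means[int(0.975 * len(boot_means)) - 1])
```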
Double-bootstrap methods that use a single double-bootstrap simulation
Chang, Jinyuan; Hall, Peter
2014-01-01
We show that, when the double bootstrap is used to improve performance of bootstrap methods for bias correction, techniques based on using a single double-bootstrap sample for each single-bootstrap sample can be particularly effective. In particular, they produce third-order accuracy for much less computational expense than is required by conventional double-bootstrap methods. However, this improved level of performance is not available for the single double-bootstrap methods that have been s...
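A sketch of the idea, assuming the usual iterated-bootstrap bias-correction formula (corrected estimate = 3·theta_hat minus 3 times the outer-bootstrap mean plus the inner-bootstrap mean), with the inner average replaced by a single double-bootstrap draw per outer sample. The biased plug-in variance is a standard toy target for bias correction:

```python
import random
import statistics

random.seed(3)

def var_biased(xs):
    """Plug-in variance estimator: divides by n, so it is biased downward."""
    m = statistics.mean(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

x = [random.gauss(0, 1) for _ in range(15)]
theta = var_biased(x)

B = 2000
outer, inner = [], []
for _ in range(B):
    xb = [random.choice(x) for _ in x]      # single-bootstrap sample
    outer.append(var_biased(xb))
    xbb = [random.choice(xb) for _ in xb]   # ONE double-bootstrap sample
    inner.append(var_biased(xbb))

m1, m2 = statistics.mean(outer), statistics.mean(inner)
theta_bc1 = 2 * theta - m1            # single-bootstrap bias correction
theta_bc2 = 3 * theta - 3 * m1 + m2   # iterated (double-bootstrap) correction
```

With one inner draw per outer sample, the inner mean m2 is still an unbiased Monte Carlo estimate of the full double-bootstrap average, which is what makes the computational saving possible.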
International Nuclear Information System (INIS)
Aguado, J.L.; Bolivar, J.P.; San-Miguel, E.G.; Garcia-Tenorio, R.
2003-01-01
A radiochemical sequential extraction procedure has been developed in our laboratory to determine ²²⁶Ra and ²³⁴,²³⁸U by alpha spectrometry in environmental samples. This method has been validated for both radionuclides by comparing, in selected samples, the values obtained through its application with the results obtained by applying alternative procedures. Recoveries obtained, counting periods applied and background levels found in the alpha spectra give suitable detection limits to allow the Ra and U determination in operational forms defined in riverbed contaminated sediments. Results obtained in these speciation studies show that ²²⁶Ra and ²³⁴,²³⁸U contamination tend to be associated with precipitated forms of the sediments. (author)
Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Grassi, Viviane; Zagatto, Elias A G
2006-06-14
A sequential injection system for the automatic determination of glycerol in wine and beer was developed. The method is based on the rate of formation of NADH from the reaction of glycerol and NAD+ catalyzed by the enzyme glycerol dehydrogenase in solution. The determination of glycerol was performed between 0.3 and 3.0 mmol L(-1) (0.028 and 0.276 g L(-1)), and good repeatability (RSD) was attained; waste production was 2.12 mL per assay. Results obtained for samples were in agreement with those obtained with the batch enzymatic method.
Bootstrap data methodology for sequential hybrid model building
Volponi, Allan J. (Inventor); Brotherton, Thomas (Inventor)
2007-01-01
A method for modeling engine operation comprising the steps of:
1. collecting a first plurality of sensory data;
2. partitioning a flight envelope into a plurality of sub-regions;
3. assigning the first plurality of sensory data into the plurality of sub-regions;
4. generating an empirical model of at least one of the plurality of sub-regions;
5. generating a statistical summary model for at least one of the plurality of sub-regions;
6. collecting an additional plurality of sensory data;
7. partitioning the additional plurality of sensory data into the plurality of sub-regions;
8. generating a plurality of pseudo-data using the empirical model; and
9. concatenating the plurality of pseudo-data and the additional plurality of sensory data to generate an updated empirical model and an updated statistical summary model for at least one of the plurality of sub-regions.
On using the bootstrap for multiple comparisons.
Westfall, Peter H
2011-11-01
There are many ways to bootstrap data for multiple comparisons procedures. Methods described here include (i) bootstrap (parametric and nonparametric) as a generalization of classical normal-based MaxT methods, (ii) bootstrap as an approximation to exact permutation methods, (iii) bootstrap as a generator of realistic null data sets, and (iv) bootstrap as a generator of realistic non-null data sets. Resampling of MinP versus MaxT is discussed, and the use of the bootstrap for closed testing is also presented. Applications to biopharmaceutical statistics are given.
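The bootstrap MaxT idea, resampling centered data and comparing each observed statistic against the distribution of the maximum statistic across all hypotheses, can be sketched on toy one-sample t-tests. The data and group labels below are invented:

```python
import math
import random
import statistics

random.seed(11)

# Toy three-group data, each tested against zero mean (hypothetical values)
groups = {
    "g1": [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, 0.1, -0.3],  # null-ish
    "g2": [1.1, 0.8, 1.3, 0.9, 1.2, 1.0, 0.7, 1.4],     # clear effect
    "g3": [0.2, 0.1, -0.1, 0.4, 0.0, 0.3, -0.2, 0.1],   # null-ish
}

def t_stat(xs):
    sd = statistics.stdev(xs)
    if sd == 0:
        return 0.0  # degenerate resample
    return statistics.mean(xs) / (sd / math.sqrt(len(xs)))

obs = {k: abs(t_stat(v)) for k, v in groups.items()}

# Center each group so resampling happens under the complete null
centered = {k: [v - statistics.mean(vs) for v in vs]
            for k, vs in groups.items()}

# Bootstrap distribution of the maximum |t| across all hypotheses
B = 2000
max_ts = []
for _ in range(B):
    max_ts.append(max(abs(t_stat([random.choice(vs) for _ in vs]))
                      for vs in centered.values()))

# Single-step MaxT adjusted p-values
p_adj = {k: (1 + sum(m >= obs[k] for m in max_ts)) / (B + 1) for k in groups}
```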
Lee, Jin-Ho; Kim, Dong-Jin; Ahn, Byung-Koo
2015-06-01
The objectives of this study were to investigate the distribution of thallium in soils collected near suspected areas such as cement plants, active and closed mines, and smelters and to examine the extraction of thallium in the soils using 19 single chemical and sequential chemical extraction procedures. Thallium concentrations in soils near cement plants were distributed between 1.20 and 12.91 mg kg(-1). However, soils near mines and smelters contained relatively low thallium concentrations ranging from 0.18 to 1.09 mg kg(-1). Thallium extractability with 19 single chemical extractants from selected soils near cement plants ranged from 0.10% to 8.20% of the total thallium concentration. In particular, 1.0 M NH4Cl, 1.0 M (NH4)2SO4, and 1.0 M CH3COONH4 extracted more thallium than other extractants. Sequential fractionation results of thallium from different soils such as industrially and artificially contaminated soils varied with the soil properties, especially soil pH and the duration of thallium contamination.
Thongchai, Wisanu; Liawruangrath, Boonsom; Liawruangrath, Saisunee
2010-04-15
The development of sequential injection analysis with a lab-at-valve (LAV) semi-automated system for on-line liquid-liquid extraction is demonstrated for the spectrophotometric determination of solasodine in the fruits of various Solanum species. The main purpose is the semi-automated extractive determination of solasodine using methyl orange as the colorimetric reagent. After optimization of the system, sample, reagent and organic solvent were sequentially aspirated into an extraction coil connected to the center of a selection valve, where extraction took place by flow reversal. The aqueous and organic phases were separated in a lab-at-valve unit attached to one of the ports of the selection valve. The absorption of the ion-pair solasodine-methyl orange complex in the organic phase was measured spectrophotometrically at 420 nm. The method performances, including reproducibility, linearity, sensitivity and accuracy, were also evaluated. The proposed method is simple, reproducible and accurate. It was successfully applied to the determination of solasodine in Solanum aculeatissimum Jacq., Solanum violaceum Ortega., Solanum melongena Linn. and Solanum indicum Linn. fruits in the Solanaceae family. Results obtained were in good agreement with those obtained by a batchwise spectrophotometric method. It is also suitable and useful for the determination of solasodine in other medicinal plants. (c) 2009 Elsevier B.V. All rights reserved.
Beta limits of a completely bootstrapped tokamak
International Nuclear Information System (INIS)
Weening, R.H.; Bondeson, A.
1992-03-01
A beta limit is given for a completely bootstrapped tokamak. The beta limit is sensitive to the achievable Troyon factor and depends directly upon the strength of the tokamak bootstrap effect. (author) 16 refs
Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei
2016-02-01
After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium were performed separately in the cold reactor. Sample transfers between the reactors were carried out by a pneumatic air operation by switching with three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry. Copyright © 2015 Elsevier B.V. All rights reserved.
del Río, Vanessa; Larrechi, M Soledad; Callao, M Pilar
2010-06-15
A new concept of flow titration is proposed and demonstrated for the determination of total acidity in plant oils and biodiesel. We use sequential injection analysis (SIA) with a diode array spectrophotometric detector linked to chemometric tools such as multivariate curve resolution-alternating least squares (MCR-ALS). This system is based on the evolution of the basic species of an acid-base indicator, alizarine, when it comes into contact with a sample that contains free fatty acids. The gradual pH change in the reactor coil due to diffusion and reaction phenomena allows the sequential appearance of both species of the indicator in the detector coil, recording a data matrix for each sample. The SIA-MCR-ALS method helps to reduce the amounts of sample, reagents and time consumed. Each determination consumes 0.413 mL of sample, 0.250 mL of indicator and 3 mL of carrier (ethanol) and generates 3.333 mL of waste. The frequency of analysis is high (12 samples h(-1), including all steps, i.e., cleaning, preparing and analysing). The reagents are of common laboratory use and it is not necessary to use reagents of perfectly known concentration. The method was applied to determine acidity in plant oil and biodiesel samples. Results obtained by the proposed method compare well with those obtained by the official European Community method, which is time consuming and uses large amounts of organic solvents.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interests were nonnormal Likert-type and binary items.…
A bootstrap approach to bump hunting
Silverman, B. W.
1982-01-01
An important question in cluster analysis and pattern recognition is the determination of the number of clusters into which a given population should be divided. Frequently, particularly when certain specific clustering methods are being used, the number of clusters is taken to be equal to the number of modes, or local maxima, in the probability density function underlying the given data set. The use of kernel density estimates in mode estimation is discussed. The test statistic to be used is defined and a bootstrap technique for assessing significance is given. An illustrative application is followed by an examination of the asymptotic behavior of the test statistic.
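A compact sketch of this mode-testing recipe: find the smallest ("critical") bandwidth at which the kernel density estimate becomes unimodal, then use a smoothed bootstrap at that bandwidth to assess significance. For brevity the variance-rescaling step of Silverman's full procedure is omitted, the bandwidth search is a coarse grid rather than a bisection, and the data are an invented bimodal sample:

```python
import math
import random

random.seed(5)

def kde(x, data, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    norm = len(data) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / norm

def count_modes(data, h, lo=-4.0, hi=8.0, steps=200):
    """Count local maxima of the KDE on a grid."""
    xs = [lo + i * (hi - lo) / steps for i in range(steps + 1)]
    ys = [kde(x, data, h) for x in xs]
    return sum(1 for i in range(1, steps)
               if ys[i] > ys[i - 1] and ys[i] > ys[i + 1])

# Clearly bimodal toy sample: two Gaussian clusters
data = [random.gauss(0, 0.5) for _ in range(40)] + \
       [random.gauss(4, 0.5) for _ in range(40)]

# Critical bandwidth: smallest h (on a coarse grid) giving a unimodal KDE
h = 0.05
while count_modes(data, h) > 1:
    h += 0.05
h_crit = h

# Smoothed bootstrap at h_crit: resample the data and add kernel-scale noise;
# the fraction of resamples that are still multimodal estimates the p-value
B = 100
more = sum(1 for _ in range(B)
           if count_modes([random.choice(data) + random.gauss(0, h_crit)
                           for _ in data], h_crit) > 1)
p_value = more / B  # small value is evidence against unimodality
```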
Locality, bulk equations of motion and the conformal bootstrap
Energy Technology Data Exchange (ETDEWEB)
Kabat, Daniel [Department of Physics and Astronomy, Lehman College, City University of New York,250 Bedford Park Blvd. W, Bronx NY 10468 (United States); Lifschytz, Gilad [Department of Mathematics, Faculty of Natural Science, University of Haifa,199 Aba Khoushy Ave., Haifa 31905 (Israel)
2016-10-18
We develop an approach to construct local bulk operators in a CFT to order 1/N². Since 4-point functions are not fixed by conformal invariance we use the OPE to categorize possible forms for a bulk operator. Using previous results on 3-point functions we construct a local bulk operator in each OPE channel. We then impose the condition that the bulk operators constructed in different channels agree, and hence give rise to a well-defined bulk operator. We refer to this condition as the “bulk bootstrap.” We argue and explicitly show in some examples that the bulk bootstrap leads to some of the same results as the regular conformal bootstrap. In fact the bulk bootstrap provides an easier way to determine some CFT data, since it does not require knowing the form of the conformal blocks. This analysis clarifies previous results on the relation between bulk locality and the bootstrap for theories with a 1/N expansion, and it identifies a simple and direct way in which OPE coefficients and anomalous dimensions determine the bulk equations of motion to order 1/N².
Bootstrapping N=3 superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Lemos, Madalena; Liendo, Pedro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Meneghelli, Carlo [Stony Brook Univ., Stony Brook, NY (United States). Simons Center for Geometry and Physics; Mitev, Vladimir [Mainz Univ. (Germany). PRISMA Cluster of Excellence
2016-12-15
We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.
Mobile first design : using Bootstrap
Bhusal, Bipin
2017-01-01
The aim of this project was to design and build a website for a company based in Australia. The business offers remedial massage therapy to its clients. It is a small business which works on the basis of calls and message reservation. The business currently has a temporary website designed with Wix, a cloud-based web development platform. The new website was built with responsive design using Bootstrap. This website was intended for the customers using mobile internet browsers. This design is...
Lenehan, Claire E.; Lewis, Simon W.
2002-01-01
LabVIEW®-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10-10 to 5 × 10-6 M) with a line of best fit of y=1.05x+8.9164 (R2 =0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10-8 M). The limit of detection (3σ) was determined as 5 × 10-11 M morphine. PMID:18924729
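The abstract quotes the calibration explicitly as log10(signal/mV) = 1.05·log10([morphine]/M) + 8.9164, so recovering a concentration from a measured chemiluminescence signal is just an inversion of that line. A minimal sketch using the quoted coefficients:

```python
import math

# Calibration from the abstract: log10(signal, mV) vs. log10([morphine], M)
SLOPE, INTERCEPT = 1.05, 8.9164

def signal(conc_m):
    """Predicted chemiluminescence signal (mV) for a morphine concentration (M)."""
    return 10 ** (SLOPE * math.log10(conc_m) + INTERCEPT)

def morphine_conc(signal_mv):
    """Invert the log-log calibration: concentration in mol/L."""
    return 10 ** ((math.log10(signal_mv) - INTERCEPT) / SLOPE)

# Round-trip check at the 5e-8 M standard used for the precision study
s = signal(5e-8)
```

This is purely a consistency exercise on the published fit; the predicted signal values are not measured data.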
Santos, J; Mendiola, J A; Oliveira, M B P P; Ibáñez, E; Herrero, M
2012-10-26
The simultaneous analysis of fat- and water-soluble vitamins from foods is a difficult task considering the wide range of chemical structures involved. In this work, a new procedure based on a sequential extraction and analysis of both types of vitamins is presented. The procedure couples several simple extraction steps to LC-MS/MS and LC-DAD in order to quantify the free vitamins contents in fresh-cut vegetables before and after a 10-day storage period. The developed method allows the correct quantification of vitamins C, B(1), B(2), B(3), B(5), B(6), B(9), E and provitamin A in ready-to-eat green leafy vegetable products including green lettuce, ruby red lettuce, watercress, swiss chard, lamb's lettuce, spearmint, spinach, wild rocket, pea leaves, mizuna, garden cress and red mustard. Using this optimized methodology, low LOQs were attained for the analyzed vitamins in less than 100 min, including extraction and vitamin analysis using 2 optimized procedures; good repeatability and linearity was achieved for all vitamins studied, while recoveries ranged from 83% to 105%. The most abundant free vitamins found in leafy vegetable products were vitamin C, provitamin A and vitamin E. The sample richest in vitamin C and provitamin A was pea leaves (154 mg/g fresh weight and 14.4 mg/100g fresh weight, respectively), whereas lamb's lettuce was the vegetable with the highest content of vitamin E (3.1 mg/100 g fresh weight). Generally, some losses of vitamins were detected after storage, although the behavior of each vitamin varied strongly among samples. Copyright © 2012 Elsevier B.V. All rights reserved.
Unbiased bootstrap error estimation for linear discriminant analysis.
Vu, Thang; Sima, Chao; Braga-Neto, Ulisses M; Dougherty, Edward R
2014-12-01
Convex bootstrap error estimation is a popular tool for classifier error estimation in gene expression studies. A basic question is how to determine the weight for the convex combination between the basic bootstrap estimator and the resubstitution estimator such that the resulting estimator is unbiased at finite sample sizes. The well-known 0.632 bootstrap error estimator uses asymptotic arguments to propose a fixed 0.632 weight, whereas the more recent 0.632+ bootstrap error estimator attempts to set the weight adaptively. In this paper, we study the finite sample problem in the case of linear discriminant analysis under Gaussian populations. We derive exact expressions for the weight that guarantee unbiasedness of the convex bootstrap error estimator in the univariate and multivariate cases, without making asymptotic simplifications. Using exact computation in the univariate case and an accurate approximation in the multivariate case, we obtain the required weight and show that it can deviate significantly from the constant 0.632 weight, depending on the sample size and Bayes error for the problem. The methodology is illustrated by application on data from a well-known cancer classification study.
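The convex-combination structure at issue, a weight between the optimistic resubstitution error and the pessimistic leave-one-out ("zero") bootstrap error, can be sketched with a deliberately simple threshold classifier on invented 1-D data. The 0.632/0.368 weights below are the classic fixed choice that the paper replaces with an exact, sample-size-dependent weight:

```python
import random
import statistics

random.seed(2)

# Invented 1-D two-class data: class 0 centered at 0, class 1 at 1.5
data = [(random.gauss(0.0, 1.0), 0) for _ in range(25)] + \
       [(random.gauss(1.5, 1.0), 1) for _ in range(25)]

def train_threshold(sample):
    """Midpoint-of-class-means classifier (a toy stand-in for LDA in 1-D)."""
    m0 = statistics.mean(x for x, y in sample if y == 0)
    m1 = statistics.mean(x for x, y in sample if y == 1)
    t = (m0 + m1) / 2
    return (lambda z: int(z > t)) if m1 > m0 else (lambda z: int(z < t))

def error(clf, sample):
    return sum(clf(x) != y for x, y in sample) / len(sample)

# Resubstitution error: train and test on the same data (optimistic)
resub = error(train_threshold(data), data)

# Leave-one-out bootstrap error: average error on out-of-bag points
B = 200
oob_errors = []
for _ in range(B):
    idx = [random.randrange(len(data)) for _ in data]
    clf = train_threshold([data[i] for i in idx])
    oob = [data[i] for i in range(len(data)) if i not in set(idx)]
    if oob:
        oob_errors.append(error(clf, oob))
e0 = statistics.mean(oob_errors)

# Fixed-weight 0.632 convex combination
err_632 = 0.368 * resub + 0.632 * e0
```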
Sequential determination of U and Th isotopes, 226Ra, 228Ra, 210Pb, and 210Po in mushroom
International Nuclear Information System (INIS)
Rosa, Mychelle M.L.; Taddei, Maria Helena T.; Dias, Fabiana F.; Maihara, Vera A.
2009-01-01
For this study, mushroom samples were collected in Brazil at the Sao Paulo Metropolitan Region and at the Pocos de Caldas Plateau (PC; a region of elevated natural radioactivity, which houses the first Brazilian uranium mine). This paper discusses a sequential methodology to determine natural series radionuclides in mushrooms, such as uranium (²³⁸U and ²³⁴U) and thorium (²³²Th, ²³⁰Th, and ²²⁸Th) isotopes, radium-226, radium-228, as well as lead-210 and polonium-210, using Alpha Spectrometry, Gamma Spectrometry, and Total Alpha and Beta Counting. The method involves total sample dissolution in a closed system in order to avoid loss of polonium and employment of specific chromatographic resins for radionuclide purification. A subsequent interpretation of the results can provide information on pollutants present in mushrooms and infer possible contamination in the areas sampled, as well as allow an association of measured concentrations with radioactive anomalies in the Plateau. (author)
Davletbaeva, Polina; Chocholouš, Petr; Bulatov, Andrey; Šatínský, Dalibor; Solich, Petr
2017-09-05
Sequential Injection Chromatography (SIC) evolved from fast and automated non-separation Sequential Injection Analysis (SIA) into a chromatographic separation method for multi-element analysis. However, the speed of the measurement (sample throughput) is significantly reduced by the chromatography. In this paper, a sub-1-min separation using a medium polar cyano monolithic column (5 mm × 4.6 mm) resulted in a fast and green separation with sample throughput comparable with non-separation flow methods. The separation of three synthetic water-soluble dyes (sunset yellow FCF, carmoisine and green S) was performed in a gradient elution mode (0.02% ammonium acetate, pH 6.7 - water) with a flow rate of 3.0 mL min⁻¹, corresponding to a sample throughput of 30 h⁻¹. Spectrophotometric detection wavelengths were set to 480, 516 and 630 nm with a 10 Hz data collection rate. The performance of the separation was described and discussed (peak capacities 3.48-7.67, peak symmetries 1.72-1.84 and resolutions 1.42-1.88). The method was characterized by validation parameters: LODs of 0.15-0.35 mg L⁻¹, LOQs of 0.50-1.25 mg L⁻¹, calibration ranges 0.50-150.00 mg L⁻¹ (r>0.998) and repeatability at 10.0 mg L⁻¹ of RSD≤0.98% (n=6). The method was used for determination of the dyes in a "forest berries" colored pharmaceutical cough-cold formulation. The sample matrix (pharmaceuticals and excipients) did not interfere with the vis determination because of no retention in the separation column and its colorless nature. The results proved the concept of a fast and green chromatography approach using a very short medium polar monolithic column in SIC. Copyright © 2017 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Montesinos, J.L.; Poch, M.; Alonso, J.; Valle, M. del; Lima, J.L.F.C.
1991-01-01
Results obtained in the evaluation of a mathematical model for flow-injection sandwich systems, in which the sample is inserted between two different carrier solutions with the aid of an eight-way injection valve, are presented. The model considers the system as a tubular reactor with axially dispersed plug flow, where chemical kinetics have been included. Only a minimum of parameters are needed for the model: the dispersion coefficients, which are obtained from experimental correlations for the different hydrodynamic conditions, and the rate constants, which are determined by univariate optimization. The behaviour of the model, when applied to the case of one analyte and one reagent, was evaluated for the enzymatic determinations of glucose and glycerol. The case of two analytes determined with two reagents was also studied in the combined determination of glucose and glycerol. Experimental profiles were compared with the simulated peaks and showed good agreement. (author). 20 refs.; 11 figs.; 2 tabs
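The transport picture underlying the model, a tubular reactor with axially dispersed plug flow, can be sketched with an explicit finite-difference scheme (upwind advection plus central-difference dispersion). Parameter values are arbitrary illustrative choices, and the chemical kinetics term is left out:

```python
# 1-D advection-dispersion of an injected sample plug, explicit finite differences
N, dx = 200, 1.0      # grid cells and spacing
u, D = 1.0, 0.5       # linear flow velocity and axial dispersion coefficient
dt, steps = 0.2, 300  # stable: u*dt/dx = 0.2, D*dt/dx**2 = 0.1

c = [0.0] * N
for i in range(20, 30):  # initial rectangular sample plug
    c[i] = 1.0

for _ in range(steps):
    new = c[:]
    for i in range(1, N - 1):
        adv = -u * (c[i] - c[i - 1]) / dx                    # upwind advection
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2  # central dispersion
        new[i] = c[i] + dt * (adv + disp)
    c = new

# The plug moves downstream at velocity u while dispersion lowers and widens it
peak = max(range(N), key=lambda i: c[i])
```

Adding a first-order reaction term (e.g. a rate constant times local concentrations, as in the paper's model) would be one extra term inside the update loop.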
Energy Technology Data Exchange (ETDEWEB)
Montesinos, J.L.; Poch, M. (Universitat Autonoma de Barcelona (Spain). Departament de Quimica, Unitat d' Enginyera Quimica); Alonso, J.; Valle, M. del (Universitat Autonoma de Barcelona (Spain). Departament de Quimica, Unitat de Quimica Analitica); Lima, J.L.F.C. (Universidade de Porto (Portugal). Faculdade de Farmacia, Departamento de Quimica Fisica)
1991-11-20
Zárate, N; Irazu, E; Araújo, A N; Montenegro, M C B S M; Pérez-Olmos, R
2008-06-01
Two methods for the determination of potassium nitrate in mouthwashes, used against dentin hypersensitivity, have been simultaneously implemented in a sequential injection analysis (SIA) system. In addition to in-line dilution of the samples, the equipment simultaneously detected potassium and nitrate using two tubular potentiometric detectors, each selective to one of the ions, providing a real-time assessment of the quality of the results. Both determinations were shown to be precise and accurate, and the obtained results do not differ statistically from those furnished by the AES and HPLC reference methods.
Bootstrap Approach to Comparison of Alternative Methods of ...
African Journals Online (AJOL)
A bootstrap simulation approach was used to generate values for endogenous variables of a simultaneous equation model popularly known as Keynesian Model of Income Determination. Three sample sizes 20, 30 and 40 each replicated 10, 20 and 30 times were considered. Four different estimation techniques: Ordinary ...
Directory of Open Access Journals (Sweden)
Hany W. Darwish
2013-01-01
Full Text Available A new, simple and specific spectrophotometric method was developed and validated in accordance with ICH guidelines for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in their ternary mixture. In this method three techniques were used, namely, direct spectrophotometry, ratio subtraction, and isoabsorptive point. Amlodipine (AML) was first determined by direct spectrophotometry and then ratio subtraction was applied to remove the AML spectrum from the mixture spectrum. Hydrochlorothiazide (HCT) could then be determined directly without interference from Valsartan (VAL), which could be determined using the isoabsorptive point theory. The calibration curve is linear over the concentration ranges of 4–32, 4–44 and 6–20 μg/mL for AML, VAL and HCT, respectively. This method was tested by analyzing synthetic mixtures of the above drugs and was successfully applied to commercial pharmaceutical preparations of the drugs, where the standard deviation is <2 in the assay of raw materials and tablets. The method was validated according to the ICH guidelines, and accuracy, precision, repeatability and robustness were found to be within the acceptable limits.
DEFF Research Database (Denmark)
Hansen, Elo Harald
Determination of low or trace-level amounts of metals by electrothermal atomic absorption spectrometry (ETAAS) often requires the use of suitable preconcentration and/or separation procedures in order to attain the necessary sensitivity and selectivity. Such schemes are advantageously executed...... by superior performance and versatility. In fact, two approaches are conceivable: The analyte-loaded ion-exchange beads might either be transported directly into the graphite tube where they are pyrolized and the measurand is atomized and quantified; or the loaded beads can be eluted and the eluate forwarded...
More N=4 superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C.
2017-08-01
In this long-overdue second installment, we continue to develop the conformal bootstrap program for N=4 superconformal field theories (SCFTs) in four dimensions via an analysis of the correlation function of four stress-tensor supermultiplets. We review analytic results for this correlator and make contact with the SCFT/chiral algebra correspondence of Beem et al. [Commun. Math. Phys. 336, 1359 (2015), 10.1007/s00220-014-2272-x]. We demonstrate that the constraints of unitarity and crossing symmetry require the central charge c to be greater than or equal to 3/4 in any interacting N=4 SCFT. We apply numerical bootstrap methods to derive upper bounds on scaling dimensions and operator product expansion coefficients for several low-lying, unprotected operators as a function of the central charge. We interpret our bounds in the context of N=4 super Yang-Mills theories, formulating a series of conjectures regarding the embedding of the conformal manifold—parametrized by the complexified gauge coupling—into the space of scaling dimensions and operator product expansion coefficients. Our conjectures assign a distinguished role to points on the conformal manifold that are self-dual under a subgroup of the S-duality group. This paper contains a more detailed exposition of a number of results previously reported in Beem et al. [Phys. Rev. Lett. 111, 071601 (2013), 10.1103/PhysRevLett.111.071601] in addition to new results.
Greenblatt, Richard L.; And Others
1992-01-01
The diagnostic accuracy of nonadjusted and bootstrapped diagnosis was compared using a sample of 1,455 psychiatric patients who completed the Millon Clinical Multiaxial Inventory. The usefulness of bootstrapping depended on the criteria for accuracy. Conditions under which bootstrapping might increase diagnostic accuracy are detailed. (SLD)
Modified bootstrap consistency rates for U-quantiles
JANSSEN, Paul; SWANEPOEL, Jan; VERAVERBEKE, Noel
2001-01-01
We show that, compared to the classical bootstrap, the modified bootstrap provides faster consistency rates for the bootstrap distribution of U-quantiles. This shows that the modified bootstrap is useful, not only in cases where the classical bootstrap fails, but also in situations where it is valid.
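The modified bootstrap referred to here is the m-out-of-n scheme: resamples of size m < n rather than n. A minimal standard-library sketch, using the Hodges-Lehmann estimator (median of pairwise means) as an example U-quantile and m = n^(3/4) as an arbitrary illustrative choice of resample size (the paper's actual rate conditions are not reproduced here):

```python
import random
import statistics

def hodges_lehmann(x):
    """Example U-quantile: the median of all pairwise means (Hodges-Lehmann)."""
    pairs = [(x[i] + x[j]) / 2 for i in range(len(x)) for j in range(i, len(x))]
    return statistics.median(pairs)

def bootstrap_dist(x, stat, b=200, m=None, seed=0):
    """Bootstrap distribution of `stat`; m < n gives the modified
    (m-out-of-n) bootstrap, m = n recovers the classical bootstrap."""
    rng = random.Random(seed)
    n = len(x)
    m = m or n
    return [stat([x[rng.randrange(n)] for _ in range(m)]) for _ in range(b)]

random.seed(1)
x = [random.gauss(0, 1) for _ in range(60)]
classical = bootstrap_dist(x, hodges_lehmann, m=len(x))
modified = bootstrap_dist(x, hodges_lehmann, m=int(len(x) ** 0.75))  # m = n^(3/4)
```

Both calls return a list of resampled U-quantile values whose empirical distribution approximates the sampling distribution of the estimator; the paper's result concerns how fast each approximation converges.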
Paseková, H; Sales, M G; Montenegro, M C; Araújo, A N; Polásek, M
2001-03-01
This paper deals with the development of an automated procedure for formulation assays and dissolution tests based on a sequential injection analysis (SIA) system involving an ion-selective electrode as sensing device. Construction of a tubular salicylate (Sal) selective electrode suitable for potentiometric determination of acetylsalicylic acid (Asa) in pharmaceutical formulations is described. The flow-through electrode is formed by a PVC membrane containing 29.2% (w/w) PVC, 5.8% (w/w) tetraoctylammonium salicylate (ionic sensor), 58.5% (w/w) o-nitrophenyloctylether (plasticizer) and 6.5% (w/w) p-tert-octylphenol (a stabilising additive which increases electrode selectivity). The calibration range is 0.05–10 mM Sal, the limit of detection (LOD) is 0.05 mM Sal, and the slope is 56.0 mV per decade at 22 °C. The R.S.D. is 0.20% (15 readings) when determining 2.5 mM Sal in standard solution. The electrode is used for sensing Asa after its on-line chemical hydrolysis to Sal in a SIA system. The sampling rate is 6 h⁻¹, but for the dissolution tests the frequency is increased to 20 h⁻¹. The SIA set-up is employed for the assay of Asa in plain tablets, composed tablets and effervescent tablets and for performing dissolution tests of normal and sustained-release tablets. Results obtained by this technique compare well with those required by the US Pharmacopoeia XXIV.
Speciation of 210Po and 210Pb in air particulates determined by sequential extraction
International Nuclear Information System (INIS)
Al-Masri, M.S.; Al-Karfan, K.; Khalili, H.; Hassan, M.
2006-01-01
Speciation of 210Po and 210Pb in air particulates at two Syrian phosphate sites with different climate conditions has been studied. The sites are the mines and Tartous port at the Mediterranean Sea. Air filters were collected from September 2000 until February 2002 and extracted chemically using different selective fluids in an attempt to identify the different forms of these two radionuclides. The results have shown that the inorganic and insoluble 210Po and 210Pb portion (attached to silica and soluble in mineral acids) was high at both sites, reaching maximum values of 94% and 77% at the mine site and the Tartous port site, respectively. In addition, only 24% of 210Pb in air particulates was found to be associated with organic materials, probably produced by the incomplete burning of vehicle fuel and similar activities. Moreover, the 210Po/210Pb activity ratio in air particulates was elevated in all samples at both sites and varied between 3.85 in November 2000 at the Tartous port site and 20 in April 2001 at the mine area. These activity ratios were also higher than the natural levels. The 210Po/210Pb activity ratio was also determined in each portion resulting from the selective extraction and found to be high in most samples. The sources of the 210Po excess in these portions are discussed. Soil suspension, which is common in the dry climate dominant in the area, sea water spray and heating of phosphate ores were considered; polonium is more volatile than lead compounds even at moderate temperatures. Furthermore, variations in the chemical forms of 210Po and 210Pb during the year were also investigated. The results of this study can also be utilized for dose assessment for phosphate industry workers
Bootstrapping the O(N) Archipelago
Kos, Filip; Simmons-Duffin, David; Vichi, Alessandro
2015-01-01
We study 3d CFTs with an $O(N)$ global symmetry using the conformal bootstrap for a system of mixed correlators. Specifically, we consider all nonvanishing scalar four-point functions containing the lowest dimension $O(N)$ vector $\\phi_i$ and the lowest dimension $O(N)$ singlet $s$, assumed to be the only relevant operators in their symmetry representations. The constraints of crossing symmetry and unitarity for these four-point functions force the scaling dimensions $(\\Delta_\\phi, \\Delta_s)$ to lie inside small islands. We also make rigorous determinations of current two-point functions in the $O(2)$ and $O(3)$ models, with applications to transport in condensed matter systems.
EBW-Bootstrap Current Synergy in the National Spherical Torus Experiment (NSTX)
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
The currents driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code, to determine the degree of synergy between them. A target β = 40% NSTX plasma is examined. A simple bootstrap model in the CQL3D Fokker-Planck code is used in these studies: the transiting electron distributions are connected in velocity space at the trapped-passing boundary to trapped-electron distributions which are displaced radially by a half banana width outwards/inwards for the co-/counter-passing regions. This model agrees well with standard bootstrap current calculations over the outer 60% of the plasma radius. Relatively little net synergistic bootstrap current is obtained for EBW power up to 4 MW. Locally, the bootstrap current density increases in proportion to the increased plasma pressure, and this effect can significantly affect the radial profile of the driven current.
Coefficient Alpha Bootstrap Confidence Interval under Nonnormality
Padilla, Miguel A.; Divers, Jasmin; Newton, Matthew
2012-01-01
Three different bootstrap methods for estimating confidence intervals (CIs) for coefficient alpha were investigated. In addition, the bootstrap methods were compared with the most promising coefficient alpha CI estimation methods reported in the literature. The CI methods were assessed through a Monte Carlo simulation utilizing conditions…
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
2012-01-01
The efficient bootstrap methodology is developed for overidentified moment conditions models with weakly dependent observation. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics, the J-statistic for overidentifying
How to Bootstrap Anonymous Communication
DEFF Research Database (Denmark)
Jakobsen, Sune K.; Orlandi, Claudio
2015-01-01
formal study in this direction. To solve this problem, we introduce the concept of anonymous steganography: think of a leaker Lea who wants to leak a large document to Joe the journalist. Using anonymous steganography Lea can embed this document in innocent-looking communication on some popular website...... (such as cat videos on YouTube or funny memes on 9GAG). Then Lea provides Joe with a short key k which, when applied to the entire website, recovers the document while hiding the identity of Lea among the large number of users of the website. Our contributions include: - Introducing and formally defining...... anonymous steganography, - A construction showing that anonymous steganography is possible (which uses recent results in circuit obfuscation), - A lower bound on the number of bits which are needed to bootstrap anonymous communication....
Inverse bootstrapping conformal field theories
Li, Wenliang
2018-01-01
We propose a novel approach to study conformal field theories (CFTs) in general dimensions. In the conformal bootstrap program, one usually searches for consistent CFT data that satisfy crossing symmetry. In the new method, we reverse the logic and interpret manifestly crossing-symmetric functions as generating functions of conformal data. Physical CFTs can be obtained by scanning the space of crossing-symmetric functions. By truncating the fusion rules, we are able to concentrate on the low-lying operators and derive some approximate relations for their conformal data. It turns out that the free scalar theory, the 2d minimal model CFTs, the ϕ⁴ Wilson-Fisher CFT, the Lee-Yang CFTs and the Ising CFTs are consistent with the universal relations from the minimal fusion rule ϕ₁ × ϕ₁ = I + ϕ₂ + T, where ϕ₁, ϕ₂ are scalar operators, I is the identity operator and T is the stress tensor.
Bootstrapping SCFTs with Four Supercharges
Bobev, Nikolay; Mazac, Dalimil; Paulos, Miguel F
2015-01-01
We study the constraints imposed by superconformal symmetry, crossing symmetry, and unitarity for theories with four supercharges in spacetime dimension $2\\leq d\\leq 4$. We show how superconformal algebras with four Poincar\\'{e} supercharges can be treated in a formalism applicable to any, in principle continuous, value of $d$ and use this to construct the superconformal blocks for any $d\\leq 4$. We then use numerical bootstrap techniques to derive upper bounds on the conformal dimension of the first unprotected operator appearing in the OPE of a chiral and an anti-chiral superconformal primary. We obtain an intriguing structure of three distinct kinks. We argue that one of the kinks smoothly interpolates between the $d=2$, $\\mathcal N=(2,2)$ minimal model with central charge $c=1$ and the theory of a free chiral multiplet in $d=4$, passing through the critical Wess-Zumino model with cubic superpotential in intermediate dimensions.
Bootstrapping SCFTs with four supercharges
International Nuclear Information System (INIS)
Bobev, Nikolay; El-Showk, Sheer; Mazáč, Dalimil; Paulos, Miguel F.
2015-01-01
We study the constraints imposed by superconformal symmetry, crossing symmetry, and unitarity for theories with four supercharges in spacetime dimension 2≤d≤4. We show how superconformal algebras with four Poincaré supercharges can be treated in a formalism applicable to any, in principle continuous, value of d and use this to construct the superconformal blocks for any d≤4. We then use numerical bootstrap techniques to derive upper bounds on the conformal dimension of the first unprotected operator appearing in the OPE of a chiral and an anti-chiral superconformal primary. We obtain an intriguing structure of three distinct kinks. We argue that one of the kinks smoothly interpolates between the d=2, N=(2,2) minimal model with central charge c=1 and the theory of a free chiral multiplet in d=4, passing through the critical Wess-Zumino model with cubic superpotential in intermediate dimensions.
UFBoot2: Improving the Ultrafast Bootstrap Approximation.
Hoang, Diep Thi; Chernomor, Olga; von Haeseler, Arndt; Minh, Bui Quang; Vinh, Le Sy
2018-02-01
The standard bootstrap (SBS), despite being computationally intensive, is widely used in maximum likelihood phylogenetic analyses. We recently proposed the ultrafast bootstrap approximation (UFBoot) to reduce computing time while achieving more unbiased branch supports than SBS under mild model violations. UFBoot has been steadily adopted as an efficient alternative to SBS and other bootstrap approaches. Here, we present UFBoot2, which substantially accelerates UFBoot and reduces the risk of overestimating branch supports due to polytomies or severe model violations. Additionally, UFBoot2 provides suitable bootstrap resampling strategies for phylogenomic data. UFBoot2 is 778 times (median) faster than SBS and 8.4 times (median) faster than RAxML rapid bootstrap on tested data sets. UFBoot2 is implemented in the IQ-TREE software package version 1.6 and freely available at http://www.iqtree.org. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
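The resampling step underlying both SBS and UFBoot is the same nonparametric bootstrap of alignment columns: draw sites with replacement to form pseudo-alignments, then re-infer a tree from each. A toy sketch of the column-resampling step only (tree inference is omitted, and the tiny alignment is invented for illustration):

```python
import random

def bootstrap_alignments(alignment, b=100, seed=0):
    """Resample alignment columns with replacement, producing `b`
    pseudo-alignments of the same dimensions as the original."""
    rng = random.Random(seed)
    ncol = len(alignment[0])
    out = []
    for _ in range(b):
        cols = [rng.randrange(ncol) for _ in range(ncol)]  # sites, with replacement
        out.append(["".join(seq[c] for c in cols) for seq in alignment])
    return out

# Invented three-taxon alignment with eight sites
aln = ["ACGTACGT", "ACGAACGT", "TCGTACGA"]
reps = bootstrap_alignments(aln)
```

In a real analysis each element of `reps` would be passed to a tree-inference program, and the branch support is the fraction of replicate trees containing that branch; UFBoot's contribution is making that inference step cheap.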
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Roos, Per
2009-01-01
This article presents an automated method for the rapid determination of 239Pu and 240Pu in various environmental samples. The analytical method involves the in-line separation of Pu isotopes using extraction chromatography (TEVA) implemented in a sequential injection (SI) network followed...... of the in-line extraction chromatographic run was...
Directory of Open Access Journals (Sweden)
Mohtasham MOHAMMADI
2014-03-01
Full Text Available An experiment was conducted to evaluate 295 wheat genotypes in an Alpha-Lattice design with two replications. The arithmetic mean and standard deviation of grain yield were 2706 and 950 kg/ha, respectively. The correlation coefficients indicated that grain yield had significant positive associations with plant height, spike length, early growth vigor and agronomic score, whereas there were negative correlations between grain yield and days to physiological maturity and canopy temperature before and during anthesis. Path analysis indicated that agronomic score and plant height had high positive direct effects on grain yield, while canopy temperature before and during anthesis and days to maturity had negative direct effects on grain yield. Sequential path analysis showed that the traits serving as criterion variables for high grain yield were agronomic score, plant height, canopy temperature, spike length, chlorophyll content and early growth vigor, which were determined as first-, second- and third-order variables and had strong effects on grain yield via one or more paths. More importantly, as canopy temperature, agronomic score and early growth vigor can be evaluated quickly and easily, these traits may be used for the evaluation of large populations.
van Staden, J F; Taljaard, R E
2004-12-15
Sequential injection analysis is still dominated by single-component analysis. In the proposed robust, economical and instrumentally simple system, seven different metal ions are determined simultaneously using thin-film sequential injection extraction (SIE) with multivariate calibration and multiwavelength detection. Dithizone, in ethanol, is used as extractant, and the metal dithizonate spectra are generated by a diode-array spectrophotometer between 300 and 700 nm. The SI thin-film extraction works with water-miscible ethanol because the hydrophobic interaction of ethanol with the Teflon wall creates a thin film. A sample frequency of 27 samples per hour was obtained with a sample carry-over of less than 1%. The results of the proposed sequential injection extraction system compare favourably with the results obtained by using standard atomic absorption spectrometry (AAS) methods on conventional extraction samples.
Better Confidence Intervals: The Double Bootstrap with No Pivot
David Letson; B.D. McCullough
1998-01-01
The double bootstrap is an important advance in confidence interval generation because it converges faster than the already popular single bootstrap. Yet the usual double bootstrap requires a stable pivot that is not always available, e.g., when estimating flexibilities or substitution elasticities. A recently developed double bootstrap does not require a pivot. A Monte Carlo analysis with the Waugh data finds the double bootstrap achieves nominal coverage whereas the single bootstrap does no...
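The idea behind the double bootstrap is to use an inner resampling level to calibrate the coverage of the outer interval, with no pivot required. A simplified standard-library sketch of one common calibration scheme, where a grid of adjusted nominal levels is searched for the one whose inner percentile intervals actually cover the point estimate at the target rate (the grid, replication counts, and test data are illustrative assumptions, not the cited construction):

```python
import random
import statistics

def percentile_ci(data, stat, level, b, rng):
    """Plain percentile bootstrap interval at the given nominal level."""
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)]) for _ in range(b))
    a = (1 - level) / 2
    return reps[int(b * a)], reps[min(b - 1, int(b * (1 - a)))]

def double_bootstrap_ci(data, stat, level=0.90, b_outer=100, b_inner=60, seed=0):
    """Calibrate the nominal level by an inner bootstrap: pick the adjusted
    level whose estimated coverage of the point estimate is closest to `level`."""
    rng = random.Random(seed)
    theta = stat(data)
    n = len(data)
    grid = [0.80, 0.85, 0.90, 0.95, 0.99]  # candidate adjusted levels
    coverage = {g: 0 for g in grid}
    for _ in range(b_outer):
        boot = [data[rng.randrange(n)] for _ in range(n)]
        for g in grid:
            lo, hi = percentile_ci(boot, stat, g, b_inner, rng)
            coverage[g] += lo <= theta <= hi
    best = min(grid, key=lambda g: abs(coverage[g] / b_outer - level))
    return percentile_ci(data, stat, best, 500, rng)

random.seed(2)
x = [random.expovariate(1.0) for _ in range(40)]  # skewed toy sample
lo, hi = double_bootstrap_ci(x, statistics.mean)
```

The calibration step is what buys the faster convergence: the single-level percentile interval is reported at whatever adjusted level empirically delivers the target coverage.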
Austin, Peter C
2008-10-01
Researchers have proposed using bootstrap resampling in conjunction with automated variable selection methods to identify predictors of an outcome and to develop parsimonious regression models. Using this method, multiple bootstrap samples are drawn from the original data set. Traditional backward variable elimination is used in each bootstrap sample, and the proportion of bootstrap samples in which each candidate variable is identified as an independent predictor of the outcome is determined. The performance of this method for identifying predictor variables has not been examined. Monte Carlo simulation methods were used to determine the ability of bootstrap model selection methods to correctly identify predictors of an outcome when those variables that are selected for inclusion in at least 50% of the bootstrap samples are included in the final regression model. We compared the performance of the bootstrap model selection method with that of conventional backward variable elimination. Bootstrap model selection tended to result in an approximately equal proportion of selected models being equal to the true regression model compared with the use of conventional backward variable elimination. Bootstrap model selection performed comparably to backward variable elimination for identifying the true predictors of a binary outcome.
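The procedure described above can be sketched end to end with the standard library alone. This illustration uses ordinary least squares with a |t| >= 2 retention rule as the backward-elimination criterion and the 50% inclusion threshold from the abstract; the data, threshold, and replication counts are illustrative assumptions (the cited study uses a binary outcome and significance-based elimination):

```python
import random

def solve(A, b):
    """Solve the linear system A x = b by Gauss-Jordan elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [a - f * b2 for a, b2 in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def t_stats(X, y, cols):
    """Absolute OLS t-statistics for the predictors in `cols` (no intercept)."""
    n, k = len(X), len(cols)
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in cols] for a in cols]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in cols]
    beta = solve(XtX, Xty)
    resid = [y[i] - sum(beta[j] * X[i][cols[j]] for j in range(k)) for i in range(n)]
    s2 = sum(r * r for r in resid) / (n - k)
    # Diagonal of (X'X)^-1 obtained column by column via unit right-hand sides
    var = [s2 * solve(XtX, [1.0 if a == j else 0.0 for a in range(k)])[j]
           for j in range(k)]
    return [abs(beta[j]) / var[j] ** 0.5 for j in range(k)]

def backward_eliminate(X, y, threshold=2.0):
    """Drop the predictor with the smallest |t| until all exceed the threshold."""
    cols = list(range(len(X[0])))
    while cols:
        t = t_stats(X, y, cols)
        worst = min(range(len(cols)), key=lambda j: t[j])
        if t[worst] >= threshold:
            break
        cols.pop(worst)
    return set(cols)

def bootstrap_selection(X, y, b=100, keep_frac=0.5, seed=0):
    """Keep variables selected in at least `keep_frac` of bootstrap samples."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    counts = [0] * p
    for _ in range(b):
        idx = [rng.randrange(n) for _ in range(n)]
        for j in backward_eliminate([X[i] for i in idx], [y[i] for i in idx]):
            counts[j] += 1
    return [j for j in range(p) if counts[j] / b >= keep_frac]

rng = random.Random(1)
n = 150
X = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(n)]
y = [2.0 * row[0] - 1.5 * row[1] + rng.gauss(0, 1) for row in X]  # x2, x3 are noise
selected = bootstrap_selection(X, y)
```

With strong true effects the two informative predictors are selected in essentially every bootstrap sample, while the noise predictors rarely clear the 50% inclusion threshold.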
Definition of total bootstrap current in tokamaks
International Nuclear Information System (INIS)
Ross, D.W.
1995-01-01
Alternative definitions of the total bootstrap current are compared. An analogous comparison is given for the ohmic and auxiliary currents. It is argued that different definitions than those usually employed lead to simpler analyses of tokamak operating scenarios
Mketo, Nomvano; Nomngongo, Philiswa N; Ngila, J Catherine
2018-05-15
A rapid three-step sequential extraction method was developed under microwave radiation, followed by inductively coupled plasma-optical emission spectroscopic (ICP-OES) and ion-chromatographic (IC) analysis, for the determination of sulphur forms in coal samples. The experimental conditions of the proposed microwave-assisted sequential extraction (MW-ASE) procedure were optimized using multivariate mathematical tools. Pareto charts generated from a 2³ full factorial design showed that extraction time has an insignificant effect on the extraction of sulphur species; therefore, all the sequential extraction steps were performed for 5 min. The optimum values according to the central composite designs and contour plots of the response surface methodology were 200 °C (microwave temperature) and 0.1 g (coal amount) for all the investigated extracting reagents (H2O, HCl and HNO3). When the optimum conditions of the proposed MW-ASE procedure were applied to coal CRMs, SARM 18 showed more organic sulphur (72%) and the other two coal CRMs (SARMs 19 and 20) were dominated by sulphide sulphur species (52-58%). The sum of the sulphur forms from the sequential extraction steps showed consistent agreement (95-96%) with the certified total sulphur values on the coal CRM certificates. This correlation, in addition to the good precision (1.7%) achieved by the proposed procedure, suggests that the sequential extraction method is reliable, accurate and reproducible. To guard against the destruction of pyritic and organic sulphur forms in extraction step 1, water was used instead of HCl. Additionally, the notorious acidic mixture (HCl/HNO3/HF) was replaced by a greener reagent (H2O2) in the last extraction step. Therefore, the proposed MW-ASE method can be applied in routine laboratories for the determination of sulphur forms in coal and related matrices. Copyright © 2018 Elsevier B.V. All rights reserved.
Wald, Abraham
2013-01-01
In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
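Wald's best-known sequential procedure is the sequential probability ratio test (SPRT), which embodies the idea above: sampling stops as soon as the accumulated log-likelihood ratio crosses either decision boundary, rather than at a predetermined sample size. A minimal sketch for a Bernoulli rate (the hypotheses, error rates, and simulated data are illustrative):

```python
import math
import random

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: rate = p0 vs H1: rate = p1 on a Bernoulli stream.
    Stops at the first crossing of Wald's approximate boundaries."""
    upper = math.log((1 - beta) / alpha)   # cross above -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(samples)

rng = random.Random(7)
data = [rng.random() < 0.5 for _ in range(10000)]  # true success rate 0.5
decision, n_used = sprt(data, p0=0.5, p1=0.7)
```

When the true rate matches one hypothesis, the test typically terminates after only a few dozen observations, which is the saving over fixed-sample-size designs that the abstract describes.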
Moving Block Bootstrap for Analyzing Longitudinal Data.
Ju, Hyunsu
In a longitudinal study subjects are followed over time. I focus on a case where the number of replications over time is large relative to the number of subjects in the study. I investigate the use of moving block bootstrap methods for analyzing such data. Asymptotic properties of the bootstrap methods in this setting are derived. The effectiveness of these resampling methods is also demonstrated through a simulation study.
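A minimal sketch of the moving block bootstrap for dependent data: overlapping blocks of fixed length are resampled with replacement and concatenated until a series of the original length is rebuilt, preserving short-range dependence within blocks. The block length and the AR(1)-style test series are illustrative choices, not taken from the paper:

```python
import random
import statistics

def moving_block_bootstrap(series, block_len, b=500, stat=statistics.mean, seed=0):
    """Resample overlapping blocks of length `block_len` and concatenate
    them until the original series length is reached; return `b` statistics."""
    rng = random.Random(seed)
    n = len(series)
    blocks = [series[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    for _ in range(b):
        resampled = []
        while len(resampled) < n:
            resampled.extend(rng.choice(blocks))
        out.append(stat(resampled[:n]))
    return out

# Illustrative AR(1)-like dependent series
rng = random.Random(3)
x, prev = [], 0.0
for _ in range(300):
    prev = 0.6 * prev + rng.gauss(0, 1)
    x.append(prev)
reps = moving_block_bootstrap(x, block_len=10)
se = statistics.stdev(reps)  # bootstrap standard error of the mean
```

Resampling individual observations instead of blocks would destroy the serial correlation and understate the standard error; block resampling is what makes the method valid for the longitudinal setting described above.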
Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection
Kumar, Sricharan; Srivistava, Ashok N.
2012-01-01
Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Roberto S. Flowers-Cano; Ruperto Ortiz-Gómez; Jesús Enrique León-Jiménez; Raúl López Rivera; Luis A. Perera Cruz
2018-01-01
Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distributi...
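Two of the interval types compared above, percentile (BP) and bias-corrected (BC), can be sketched with the standard library alone. The data and nominal level are illustrative, and the acceleration constant of BCA is omitted (BC applies only the median-bias correction z0):

```python
import random
import statistics
from statistics import NormalDist

def bp_and_bc_ci(data, stat, alpha=0.10, b=1000, seed=0):
    """Percentile (BP) and bias-corrected (BC) bootstrap confidence intervals."""
    rng = random.Random(seed)
    n = len(data)
    theta = stat(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)]) for _ in range(b))
    # BP: plain percentiles of the bootstrap distribution
    bp = (reps[int(b * alpha / 2)], reps[int(b * (1 - alpha / 2)) - 1])
    # BC: shift the percentile levels by the median-bias correction z0
    nd = NormalDist()
    p_less = sum(r < theta for r in reps) / b
    z0 = nd.inv_cdf(min(max(p_less, 1 / b), 1 - 1 / b))
    za = nd.inv_cdf(alpha / 2)
    lo_p = nd.cdf(2 * z0 + za)
    hi_p = nd.cdf(2 * z0 - za)
    bc = (reps[min(b - 1, int(b * lo_p))], reps[min(b - 1, int(b * hi_p))])
    return bp, bc

random.seed(5)
x = [random.lognormvariate(0, 0.5) for _ in range(50)]  # skewed toy sample
bp, bc = bp_and_bc_ci(x, statistics.mean)
```

For a symmetric, unbiased bootstrap distribution z0 is near zero and the two intervals coincide; for skewed data like this sample the BC limits shift toward the long tail.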
Al-Mudhafar, W. J.
2013-12-01
Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships used to estimate properties in non-cored intervals. It also helps to accurately identify the spatial facies distribution, so that an accurate reservoir model can be built for optimal future reservoir performance. In this paper, facies estimation has been carried out through multinomial logistic regression (MLR) with respect to the well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. Firstly, a robust sequential imputation algorithm has been used to impute the missing data. This algorithm starts from a complete subset of the dataset and estimates sequentially the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix. Then, the observation is added to the complete data matrix and the algorithm continues with the next observation with missing values. The MLR has been chosen to estimate the maximum likelihood and minimize the standard error for the nonlinear relationships between facies and the core and log data. The MLR is used to predict the probabilities of the different possible facies given each independent variable, by constructing a linear predictor function having a set of weights that are linearly combined with the independent variables using a dot product. A Beta distribution of facies has been considered as prior knowledge, and the resulting predicted probability (posterior) has been estimated from the MLR based on Bayes' theorem, which relates the predicted probability (posterior) to the conditional probability and the prior knowledge. To assess the statistical accuracy of the model, the bootstrap should be carried out to estimate extra-sample prediction error by randomly
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 70's, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information about strongly-interacting theories, such as the 3D Ising and O(N) vector models, and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world-record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
Soybean yield modeling using bootstrap methods for small samples
Energy Technology Data Exchange (ETDEWEB)
Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.
2016-11-01
One of the problems that occur when working with regression models concerns sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not require guessing or knowing the probability distribution that generated the original sample. In this work we used a small-sample dataset of soybean yield and physical and chemical soil properties to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct confidence intervals for the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
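The non-parametric case (pairs) bootstrap for confidence intervals of regression parameters can be sketched as follows. The data, coefficients, and replication count below are synthetic stand-ins for the soybean dataset, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy stand-in for the soybean data: yield vs. two soil properties, small n
n = 25
X = np.column_stack([np.ones(n), rng.normal(6, 1, n), rng.normal(30, 5, n)])
beta_true = np.array([1.0, 0.4, 0.05])
y = X @ beta_true + rng.normal(0, 0.3, n)

def ols(X, y):
    # least-squares coefficient estimates
    return np.linalg.lstsq(X, y, rcond=None)[0]

# case (pairs) bootstrap: resample whole observations, refit, collect betas
B = 2000
betas = np.empty((B, X.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, n)
    betas[b] = ols(X[idx], y[idx])

# percentile 95% confidence intervals for each coefficient
lo, hi = np.percentile(betas, [2.5, 97.5], axis=0)
for j, name in enumerate(["intercept", "soil prop 1", "soil prop 2"]):
    print(f"{name}: [{lo[j]:.3f}, {hi[j]:.3f}]")
```

Resampling whole observation rows (rather than residuals) requires no assumption about the error distribution, which is the point of the non-parametric version highlighted in the abstract.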
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM yielded successful kriging predictions in more simulations (9/10) than GBM (4/10). Predictions from SBM were closer to the original prediction generated without bootstrapping and had lower variance than those from GBM. SBM was also tested on IsoMAP datasets with different numbers of observation sites. We determined that predictions from datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
CME Velocity and Acceleration Error Estimates Using the Bootstrap Method
Michalek, Grzegorz; Gopalswamy, Nat; Yashiro, Seiji
2017-08-01
The bootstrap method is used to determine errors of basic attributes of coronal mass ejections (CMEs) visually identified in images obtained by the Solar and Heliospheric Observatory (SOHO) mission's Large Angle and Spectrometric Coronagraph (LASCO) instruments. The basic parameters of CMEs are stored, among others, in a database known as the SOHO/LASCO CME catalog and are widely employed for many research studies. The basic attributes of CMEs (e.g., velocity and acceleration) are obtained from manually generated height-time plots. The subjective nature of manual measurements introduces random errors that are difficult to quantify, and in many studies the impact of such measurement errors is overlooked. In this study we present a new possibility to estimate measurement errors in the basic attributes of CMEs. This approach is a computer-intensive method, commonly called the bootstrap method in the literature, because it requires repeating the original data analysis procedure several times using replicate datasets. We show that the bootstrap approach can be used to estimate the errors of the basic attributes of CMEs having moderately large numbers of height-time measurements. The velocity errors are small in the vast majority of cases and depend mostly on the number of height-time points measured for a particular event. In the case of acceleration, the errors are significant, and for more than half of all CMEs they are larger than the acceleration itself.
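A residual bootstrap of a quadratic height-time fit, in the spirit of the approach described above, might look like the sketch below. The height-time profile, noise level, and units are hypothetical, not taken from the SOHO/LASCO catalog.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic height-time points (hypothetical CME: h in km, t in s)
t = np.linspace(0, 3600, 12)                   # ~12 coronagraph frames
h = 2.0e6 + 450.0 * t + 0.5 * 5e-3 * t**2      # v = 450 km/s, a = 5 m/s^2
h_obs = h + rng.normal(0, 5e4, t.size)         # manual-measurement scatter

def vel_acc(t, h):
    c = np.polyfit(t, h, 2)                    # h ~ c2 t^2 + c1 t + c0
    return c[1], 2.0 * c[0]                    # velocity, acceleration

v_hat, a_hat = vel_acc(t, h_obs)
fitted = np.polyval(np.polyfit(t, h_obs, 2), t)
resid = h_obs - fitted

# residual bootstrap: re-fit replicate height-time profiles
B = 1000
vs, accs = np.empty(B), np.empty(B)
for b in range(B):
    h_star = fitted + rng.choice(resid, resid.size, replace=True)
    vs[b], accs[b] = vel_acc(t, h_star)

print(f"v = {v_hat:.0f} +/- {vs.std(ddof=1):.0f} km/s, "
      f"a = {a_hat * 1e3:.2f} +/- {accs.std(ddof=1) * 1e3:.2f} m/s^2")
```

With realistic scatter the bootstrap spread of the acceleration is typically comparable to the acceleration itself, echoing the abstract's finding that acceleration errors are often larger than the measured value.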
Generalised block bootstrap and its use in meteorology
Directory of Open Access Journals (Sweden)
L. Varga
2017-06-01
In an earlier paper, Rakonczai et al. (2014) emphasised the importance of investigating the effective sample size in the case of autocorrelated data. The simulations were based on the block bootstrap methodology. However, the discreteness of the usual block size did not allow for exact calculations. In this paper we propose a new generalisation of the block bootstrap methodology which allows any positive real number as the expected block size. We relate it to existing optimisation procedures and apply it to a temperature data set. Our other focus is on statistical tests, where the actual sample size quite often plays an important role, even in the case of relatively large samples. This is especially the case for copulas, which are used for investigating the dependencies among data sets. As in quite a few real applications the time dependence cannot be neglected, we investigated the effect of this phenomenon on the test statistic used. The critical value can be computed by the proposed new block bootstrap simulation, where the block size is determined by fitting a VAR model to the observations. The results are illustrated for models of the temperature data used.
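One classical way to obtain a non-integer expected block size is to draw geometrically distributed block lengths, as in the stationary bootstrap of Politis and Romano. A minimal sketch follows; the AR(1) "temperature" series and the expected block size of 7.5 are illustrative assumptions, not the authors' particular generalisation.

```python
import numpy as np

rng = np.random.default_rng(3)

def stationary_bootstrap(x, expected_block, rng=rng):
    """One resample with geometric block lengths (mean = expected_block).

    Any real expected_block > 1 is allowed, sidestepping the
    discreteness of fixed-length block bootstrap schemes.
    """
    n = len(x)
    p = 1.0 / expected_block
    out = np.empty(n)
    i = rng.integers(n)                 # random starting index
    for t in range(n):
        out[t] = x[i % n]               # wrap around circularly
        # with probability p, start a new block at a random position
        i = rng.integers(n) if rng.random() < p else i + 1
    return out

# AR(1) series with positive autocorrelation (hypothetical temperatures)
x = np.empty(500)
x[0] = 0.0
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + rng.normal()

# bootstrap distribution of the sample mean under dependence
boot_means = np.array([stationary_bootstrap(x, expected_block=7.5).mean()
                       for _ in range(500)])
print(f"bootstrap SE of the mean: {boot_means.std(ddof=1):.3f}")
```

Because blocks preserve short-range dependence, the resulting standard error is larger than the naive i.i.d. value, which is exactly the effective-sample-size correction the abstract discusses.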
Stock Price Simulation Using Bootstrap and Monte Carlo
Directory of Open Access Journals (Sweden)
Pažický Martin
2017-06-01
In this paper, an attempt is made to assess and compare bootstrap and Monte Carlo experiments for stock price simulation. Since the future evolution of the stock price is extremely important for investors, we attempt to find the best method to determine the future price of BNP Paribas' stock. The aim of the paper is to define the value of European and Asian options on BNP Paribas' stock at the maturity date. Four different simulation methods are employed. The first is a bootstrap experiment with a homoscedastic error term, the second is a block bootstrap experiment with a heteroscedastic error term, the third is a Monte Carlo simulation with a heteroscedastic error term, and the last is a Monte Carlo simulation with a homoscedastic error term. In the last method it is necessary to model the volatility using an econometric GARCH model. The main purpose of the paper is to compare the mentioned methods and select the most reliable. The difference between the classical European option and the exotic Asian option, based on the experiment results, is a further aim of this paper.
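The Monte Carlo variant with a homoscedastic (constant-volatility) error term can be sketched as a geometric Brownian motion simulation. The parameters below are illustrative assumptions and are not calibrated to BNP Paribas data.

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical risk-neutral parameters (not calibrated to real data)
S0, K, r, sigma, T = 50.0, 52.0, 0.01, 0.25, 1.0
steps, paths = 252, 20000
dt = T / steps

# GBM with constant sigma: log-increments are i.i.d. normal
z = rng.standard_normal((paths, steps))
log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                      axis=1)
S = S0 * np.exp(log_paths)            # simulated daily price paths

disc = np.exp(-r * T)
european = disc * np.maximum(S[:, -1] - K, 0).mean()      # payoff on S_T
asian = disc * np.maximum(S.mean(axis=1) - K, 0).mean()   # payoff on avg price

print(f"European call ~ {european:.2f}, Asian call ~ {asian:.2f}")
```

The Asian option comes out cheaper than the European one because averaging the path reduces the effective volatility of the payoff, which is the qualitative difference between the two contracts the paper examines.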
Altundag, Huseyin; Imamoglu, Mustafa; Doganci, Secil; Baysal, Erkan; Albayrak, Sinem; Tuzen, Mustafa
2013-01-01
Sequential selective extraction techniques are commonly used to fractionate the solid-phase forms of metals in soils. This procedure provides measurements of extractable metals from media, such as acetic acid (0.11 M), hydroxyl ammonium chloride (0.1 M), hydrogen peroxide (8.8 M) plus ammonium acetate (1 M), and aqua regia stages of the sequential extraction procedure. In this work, the extractable Pb, Cu, Mn, Sr, Ni, V, Fe, Zn, and Cr were evaluated in street dust samples from Sakarya, Turkey, between May and October 2009 using the three-step sequential extraction procedure described by the Community Bureau of Reference (BCR, now the Standards, Measurements, and Testing Programme) of the European Union. The sampling sites were divided into 10 categories; a total of 50 street dusts were analyzed. The determination of multielements in the samples was performed by inductively coupled plasma-optical emission spectrometry. Validation of the proposed method was performed using BCR 701 certified reference material. The results showed good agreement between the obtained and the certified values for the metals analyzed.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the area of bootstrap, jackknife and Monte Carlo tests. Addressing both the novice and the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in different disciplines such as biometry, epidemiology, computer science, economics and sociology, but also theoretical researchers, should consult the book to be informed on the state of the art in this area.
Early Stop Criterion from the Bootstrap Ensemble
DEFF Research Database (Denmark)
Hansen, Lars Kai; Larsen, Jan; Fog, Torben L.
1997-01-01
This paper addresses the problem of generalization error estimation in neural networks. A new early stop criterion based on a Bootstrap estimate of the generalization error is suggested. The estimate does not require the network to be trained to the minimum of the cost function, as required...... by other methods based on asymptotic theory. Moreover, in contrast to methods based on cross-validation which require data left out for testing, and thus biasing the estimate, the Bootstrap technique does not have this disadvantage. The potential of the suggested technique is demonstrated on various time...
Bayesian inference and the parametric bootstrap
Efron, Bradley
2013-01-01
The parametric bootstrap can be used for the efficient computation of Bayes posterior distributions. Importance sampling formulas take on an easy form relating to the deviance in exponential families, and are particularly simple starting from Jeffreys invariant prior. Because of the i.i.d. nature of bootstrap sampling, familiar formulas describe the computational accuracy of the Bayes estimates. Besides computational methods, the theory provides a connection between Bayesian and frequentist analysis. Efficient algorithms for the frequentist accuracy of Bayesian inferences are developed and demonstrated in a model selection example. PMID:23843930
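In the conjugate normal case the connection between the parametric bootstrap and Bayes is easy to see: with a flat prior, the bootstrap distribution of the MLE coincides with the posterior. A minimal sketch on simulated data follows; it omits the importance-sampling reweighting needed for general priors, which the paper develops.

```python
import numpy as np

rng = np.random.default_rng(5)

# data from N(mu, 1); with a flat prior the posterior of mu is N(xbar, 1/n)
n, mu_true = 40, 2.0
x = rng.normal(mu_true, 1.0, n)
xbar = x.mean()

# parametric bootstrap: resample from the fitted model, recompute the MLE
B = 5000
boot_mles = np.array([rng.normal(xbar, 1.0, n).mean() for _ in range(B)])

# the bootstrap spread of the MLE should match the posterior sd 1/sqrt(n)
post_sd = 1.0 / np.sqrt(n)
print(f"bootstrap sd {boot_mles.std(ddof=1):.4f} vs posterior sd {post_sd:.4f}")
```

For a non-flat prior, the paper's recipe keeps these same bootstrap replications and reweights them by importance-sampling ratios involving the prior, so no new simulation is needed.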
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker...... (1989). In many cases validity of bootstrap-based inference procedures is found to depend crucially on whether the bandwidth sequence satisfies a particular (asymptotic linearity) condition. An exception to this rule occurs for inference procedures involving a studentized estimator employing a "robust...
Bootstrap percolation: a renormalisation group approach
International Nuclear Information System (INIS)
Branco, N.S.; Santos, Raimundo R. dos; Queiroz, S.L.A. de.
1984-02-01
In bootstrap percolation, sites are occupied at random with probability p, but each site is considered active only if at least m of its neighbours are also active. Within an approximate position-space renormalization group framework on a square lattice we obtain the behaviour of the critical concentration p_c and of the critical exponents ν and β for m = 0 (ordinary percolation), 1, 2 and 3. We find that the bootstrap percolation problem can be cast into different universality classes, characterized by the values of m. (author)
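The culling rule can be simulated directly on a finite square lattice. A minimal sketch with periodic boundaries follows; the lattice size and occupation probability are illustrative, and this direct simulation is distinct from the paper's renormalisation group treatment.

```python
import numpy as np

rng = np.random.default_rng(6)

def bootstrap_percolation(p, m, L=64, rng=rng):
    """Occupy sites with probability p, then repeatedly cull sites with
    fewer than m active neighbours; return the surviving fraction."""
    active = rng.random((L, L)) < p
    while True:
        # count active nearest neighbours with periodic boundaries
        nbrs = (np.roll(active, 1, 0).astype(int) + np.roll(active, -1, 0)
                + np.roll(active, 1, 1) + np.roll(active, -1, 1))
        new = active & (nbrs >= m)
        if (new == active).all():      # culling has converged
            return active.mean()
        active = new

# m = 0 is ordinary site percolation (nothing is culled);
# larger m thins the surviving active set
for m in (0, 1, 2, 3):
    print(f"m={m}: surviving fraction {bootstrap_percolation(0.7, m):.3f}")
```

Because the active set can only shrink, the culling loop always terminates, and the surviving density decreases with m at fixed p, consistent with the m-dependent universality classes discussed in the abstract.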
Energy Technology Data Exchange (ETDEWEB)
Gratch, J. [Univ. of Southern California, Marina del Rey, CA (United States)
1996-12-31
This article advocates a new model for inductive learning. Called sequential induction, it helps bridge classical fixed-sample learning techniques (which are efficient but difficult to formally characterize) and worst-case approaches (which provide strong statistical guarantees but are too inefficient for practical use). Learning proceeds as a sequence of decisions which are informed by training data. By analyzing induction at the level of these decisions, and by utilizing only enough data to make each decision, sequential induction provides statistical guarantees but with substantially less data than worst-case methods require. The sequential induction model is also useful as a method for determining a sufficient sample size for inductive learning and, as such, is relevant to learning problems where the preponderance of data or the cost of gathering data precludes the use of traditional methods.
In vivo precision of bootstrap algorithms applied to diffusion tensor imaging data.
Vorburger, Robert S; Reischauer, Carolin; Dikaiou, Katerina; Boesiger, Peter
2012-10-01
To determine the precision for in vivo applications of model-based and non-model-based bootstrap algorithms for estimating the measurement uncertainty of diffusion parameters derived from diffusion tensor imaging data. Four different bootstrap methods were applied to diffusion datasets acquired during 10 repeated imaging sessions. Measurement uncertainty was derived in eight manually selected regions of interest and in the entire brain white matter and gray matter. The precision of the bootstrap methods was analyzed using coefficients of variation and intra-class correlation coefficients. Comprehensive simulations were performed to validate the results. All bootstrap algorithms showed similar precision, which varied slightly depending on the selected region of interest. The averaged coefficient of variation in the selected regions of interest was 13.81%, 12.35%, and 17.93% with respect to the apparent diffusion coefficient, the fractional anisotropy value, and the cone of uncertainty, respectively. The repeated measurements showed a very high similarity, with intra-class correlation coefficients larger than 0.96. The simulations confirmed most of the in vivo findings. All investigated bootstrap methods perform with a similar, high precision in deriving the measurement uncertainty of diffusion parameters. Thus, the time-efficient model-based bootstrap approaches should be the method of choice in clinical practice. Copyright © 2012 Wiley Periodicals, Inc.
DEFF Research Database (Denmark)
Buanuam, Janya; Miró, Manuel; Hansen, Elo Harald
2006-01-01
associations for phosphorus, that is, exchangeable, Al- and Fe-bound and Ca-bound fractions, were elucidated by accommodation in the flow manifold of the 3 steps of the Hieltjes-Lijklema (HL) scheme involving the use of 1.0 M NH4Cl, 0.1 M NaOH and 0.5 M HCl, respectively, as sequential leaching reagents....... The precise timing and versatility of SI for tailoring various operational extraction modes were utilised for investigating the extractability and extent of phosphorus re-distribution for variable partitioning times. Automatic spectrophotometric determination of soluble reactive phosphorus in soil extracts......
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Roos, Per
2010-01-01
This paper reports an automated analytical method for rapid and simultaneous determination of plutonium isotopes (239Pu and 240Pu) and neptunium (237Np) in environmental samples. An extraction chromatographic column packed with TrisKem TEVA® resin was incorporated in a sequential injection (SI...... procedures were investigated and compared for the adjustment of oxidation states of plutonium and neptunium to Pu(IV) and Np(IV), respectively. A two-step protocol using sulfite and concentrated nitric acid as redox reagents was proven to be the most effective method. The analytical results for both...
DEFF Research Database (Denmark)
Hansen, Elo Harald; Miró, Manuel; Petersen, Roongrat
In recent years sequential injection (SI) analysis and Lab-on-Valve (LOV) approaches have proven themselves as powerful and versatile front ends to implement suitable pre-treatment procedures (separation and pre-concentration) for the assay of low concentrations of metals, as amply reflected...... in the substantial number of papers that have emerged in the scientific literature. These novel generations of flow injection analysis have demonstrated themselves as attractive substitutes for labour-intensive, manual sample pre-treatment and solution handling methods prior to analyte detection by atomic absorption....../emission spectrometry (FAAS, ETAAS, ICP-AES, ICP-MS). The lecture will initially give an overview of various automated alternatives for facilitating appropriate SI/LOV-separation and pre-concentration schemes of metals in liquid samples, including examples of solid-phase extraction with permanent columns or renewable...
DEFF Research Database (Denmark)
Hansen, Elo Harald
In recent years sequential injection (SI) analysis and Lab-on-Valve (LOV) approaches have proven themselves as powerful and versatile front ends to implement suitable pre-treatment procedures (separation and pre-concentration) for the assay of low concentrations of metals, as amply reflected...... in the increasing number of papers appearing in the scientific literature. These novel generations of flow injection analysis have demonstrated themselves as attractive substitutes for labour-intensive, manual sample pre-treatment and solution handling methods prior to analyte detection by atomic absorption..../emission spectrometric devices such as FAAS, ETAAS, AFS, ICP-AES, and ICP-MS. While earlier separation and pre-concentration schemes of metals in liquid samples primarily have been centred on the use of solvent extraction, or precipitate/(co)-precipitate collection in incorporated knotted reactors, or permanent columns...
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
A Bootstrap Procedure of Propensity Score Estimation
Bai, Haiyan
2013-01-01
Propensity score estimation plays a fundamental role in propensity score matching for reducing group selection bias in observational data. To increase the accuracy of propensity score estimation, the author developed a bootstrap propensity score. The commonly used propensity score matching methods: nearest neighbor matching, caliper matching, and…
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Bootstrapping Kernel-Based Semiparametric Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael
by accommodating a non-negligible bias. A noteworthy feature of the assumptions under which the result is obtained is that reliance on a commonly employed stochastic equicontinuity condition is avoided. The second main result shows that the bootstrap provides an automatic method of correcting for the bias even...... when it is non-negligible....
Climate time series analysis classical statistical and bootstrap methods
Mudelsee, Manfred
2010-01-01
This book presents bootstrap resampling as a computationally intensive method able to meet the challenges posed by the complexities of analysing climate data. It shows how the bootstrap performs reliably in the most important statistical estimation techniques.
Efficient generation of pronunciation dictionaries: human factors during bootstrapping
CSIR Research Space (South Africa)
Davel, MH
2004-10-01
Full Text Available Bootstrapping techniques have significant potential for the efficient generation of linguistic resources such as electronic pronunciation dictionaries. The authors describe a system and an approach to bootstrapping for the development...
Directory of Open Access Journals (Sweden)
Pimkwan Chantarateepra
2012-01-01
The use of fully automated online solid-phase extraction (SPE) coupled with sequential injection analysis, high-performance liquid chromatography (HPLC), and electrochemical detection (EC) for the separation and determination of sulfonamides has been developed. A homemade microcolumn SPE system coupled with sequential injection analysis (SIA) was used to automate the sample cleanup and extraction of sulfonamides. The optimal flow rate of sample loading and elution was found to be 10 μL/s, and the optimal elution zone time was 20-24 s. Under the optimal conditions, a linear relationship between peak area and sulfonamide concentration was obtained in the range of 0.01-8.0 μg mL−1. Detection limits for seven sulfonamides were between 1.2 ng mL−1 and 11.2 ng mL−1. The proposed method has been applied to the determination of sulfonamides in shrimp. Recoveries in the range of 84-107% and relative standard deviations (RSDs) below 6.5% for intra-day and 13% for inter-day precision were obtained at three spiking concentration levels. The results showed that the present method is simple, rapid, accurate and highly sensitive for the determination of sulfonamides.
Energy Technology Data Exchange (ETDEWEB)
Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)
2011-09-02
Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with dual analytical line was developed and applied in the comparison of two different detection systems viz; a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
International Nuclear Information System (INIS)
2014-01-01
Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of such analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. In this publication, a combined procedure for the sequential determination of 210Po, 210Pb, 226Ra, Th and U radioisotopes in phosphogypsum is described. The method is based on the dissolution of small amounts of phosphogypsum by microwave digestion, followed by sequential separation of 210Po, 210Pb, Th and U radioisotopes by selective extraction chromatography using Sr, TEVA and UTEVA resins. Radium-226 is separated from interfering elements using Ba(Ra)SO4 co-precipitation. Lead-210 is determined by liquid scintillation counting. The alpha source of 210Po is prepared by autodeposition on a silver plate. The alpha sources of Th and U are prepared by electrodeposition on a stainless steel plate. A comprehensive methodology for the calculation of results, including the quantification of measurement uncertainty, was also developed. The procedure is introduced as a recommended procedure and validated in terms of trueness, repeatability and reproducibility in accordance with ISO guidelines
International Nuclear Information System (INIS)
Mesquita, Raquel B.R.; Ferreira, M. Teresa S.O.B.; Toth, Ildiko V.; Bordalo, Adriano A.; McKelvie, Ian D.; Rangel, Antonio O.S.S.
2011-01-01
Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with dual analytical line was developed and applied in the comparison of two different detection systems viz; a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
Bootstrapping Relational Affordances of Object Pairs using Transfer
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert
2018-01-01
leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new...
Higher Order Bootstrap likelihood | Ogbonmwam | Journal of the ...
African Journals Online (AJOL)
In this work, higher order optimal window width is used to generate bootstrap kernel density likelihood. A simulated study is conducted to compare the distributions of the higher order bootstrap likelihoods with the exact (empirical) bootstrap likelihood. Our results indicate that the optimal window width of orders 2 and 4 ...
El-Zahry, Marwa R; Refaat, Ibrahim H; Mohamed, Horria A; Lendl, Bernhard
2016-07-01
In this contribution, the utility of sequential injection analysis in combination with surface-enhanced Raman spectroscopy (SERS) as a detection technique was investigated for simultaneous determination of aspirin and vitamin C in their pharmaceutical dosage forms and in spiked urine samples. The silver substrate was synthesized in situ by a laser-induced photochemical procedure. By focusing the laser on a flow cell with a continuous 1 mL/min flow of a mixture of 0.5 mM silver nitrate and 5 mM sodium citrate, an active silver spot on the inner wall of the flow cell was prepared in a few seconds. The whole setup is fully computer controlled using ATLAS software to combine the two techniques. The system allows sequential determination of aspirin concentrations ranging from 100 to 500 ng/mL and vitamin C concentrations between 10 and 110 ng/mL, with good precision (relative standard deviations, RSDs, of 0.85% and 1.7%, respectively). A statistical comparison of these results with those of reported procedures using t- and F-values indicated good accuracy and precision. The detection limits were 32 and 3 ng/mL for aspirin and vitamin C, respectively.
International Nuclear Information System (INIS)
Pena, Francisco; Lavilla, Isela; Bendicho, Carlos
2008-01-01
Single-drop microextraction (SDME) and sequential injection analysis have been hyphenated for ultratrace metal determination by Electrothermal-Atomic Absorption Spectrometry (ETAAS). The novel method was targeted on extraction of the Cr(VI)-APDC chelate and encompasses the potential of SDME as a miniaturized and virtually solvent-free preconcentration technique, the ability of sequential injection analysis to handle samples and the versatility of furnace autosamplers for introducing microliter samples in ETAAS. The variables influencing the microextraction of Cr(VI) onto an organic solvent drop, i.e., type of organic solvent, microextraction time, stirring rate of the sample solution, drop volume, immersion depth of the drop, salting-out effect, temperature of the sample, concentration of the complexing agent and pH of the sample solution were fully investigated. For a 5 and 20 min microextraction time, the preconcentration factors were 20 and 70, respectively. The detection limit was 0.02 μg/L of Cr(VI) and the repeatability expressed as relative standard deviation was 7%. The SDME-SIA-ETAAS technique was validated against BCR CRM 544 (lyophilized solution) and applied to ultrasensitive determination of Cr(VI) in natural waters
The use of the bootstrap in the analysis of case-control studies with missing data
DEFF Research Database (Denmark)
Siersma, Volkert Dirk; Johansen, Christoffer
2004-01-01
nonparametric bootstrap, bootstrap confidence intervals, missing values, multiple imputation, matched case-control study
Chocholous, Petr; Holík, Pavel; Satínský, Dalibor; Solich, Petr
2007-04-30
A novel, fast simultaneous determination of triamcinolone acetonide (TCA) and salicylic acid (SA) in topical pharmaceutical formulations by sequential injection chromatography (SIC), as an alternative to classical high-performance liquid chromatography (HPLC), has been developed. A recently introduced Onyx™ monolithic C18 column (50 mm × 4.6 mm, Phenomenex®) with a 5 mm monolithic precolumn was used for the first time to create a sequential injection chromatography system based on a FIAlab® 3000 with a six-port selection valve and a 5.0 mL syringe pump. The mobile phase was acetonitrile/water (35:65, v/v), pH 3.3 adjusted with acetic acid, at a flow rate of 0.9 mL min⁻¹. UV detection, provided by a fibre-optic DAD detector, was set at 240 nm. Propylparaben was chosen as a suitable internal standard (IS). Only a simple pre-adjustment of the topical solution sample (dilution with mobile phase) is required, so the analysis is not unnecessarily prolonged. The method showed good linearity over a wide range (correlation coefficient >0.999); system precision (relative standard deviation, R.S.D.) in the range 0.45-1.95% at three different concentration levels; detection limits (3σ) of 1.00 μg mL⁻¹ (salicylic acid), 0.66 μg mL⁻¹ (triamcinolone acetonide) and 0.33 μg mL⁻¹ (propylparaben); and recovery from the pharmaceutical preparations in the range 97.50-98.94%. The chromatographic resolution between the compound peaks was more than 4.5 and the analysis time was 5.1 min under the optimal conditions. The advantages of sequential injection chromatography over classical HPLC are discussed, showing that SIC can be a method of choice in many cases.
Bootstrapping phylogenies inferred from rearrangement data
Directory of Open Access Journals (Sweden)
Lin Yu
2012-08-01
Background: Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well-established probabilistic models. Results: We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Conclusions: Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its
Bootstrapping phylogenies inferred from rearrangement data.
Lin, Yu; Rajan, Vaibhav; Moret, Bernard M. E.
2012-08-29
Large-scale sequencing of genomes has enabled the inference of phylogenies based on the evolution of genomic architecture, under such events as rearrangements, duplications, and losses. Many evolutionary models and associated algorithms have been designed over the last few years and have found use in comparative genomics and phylogenetic inference. However, the assessment of phylogenies built from such data has not been properly addressed to date. The standard method used in sequence-based phylogenetic inference is the bootstrap, but it relies on a large number of homologous characters that can be resampled; yet in the case of rearrangements, the entire genome is a single character. Alternatives such as the jackknife suffer from the same problem, while likelihood tests cannot be applied in the absence of well established probabilistic models. We present a new approach to the assessment of distance-based phylogenetic inference from whole-genome data; our approach combines features of the jackknife and the bootstrap and remains nonparametric. For each feature of our method, we give an equivalent feature in the sequence-based framework; we also present the results of extensive experimental testing, in both sequence-based and genome-based frameworks. Through the feature-by-feature comparison and the experimental results, we show that our bootstrapping approach is on par with the classic phylogenetic bootstrap used in sequence-based reconstruction, and we establish the clear superiority of the classic bootstrap for sequence data and of our corresponding new approach for rearrangement data over proposed variants. Finally, we test our approach on a small dataset of mammalian genomes, verifying that the support values match current thinking about the respective branches. Our method is the first to provide a standard of assessment to match that of the classic phylogenetic bootstrap for aligned sequences. Its support values follow a similar scale and its receiver
Bootstrapping a change-point Cox model for survival data
Xu, Gongjun; Sen, Bodhisattva; Ying, Zhiliang
2014-01-01
This paper investigates the (in)consistency of various bootstrap methods for making inference on a change-point in time in the Cox model with right-censored survival data. A criterion is established for the consistency of any bootstrap method. It is shown that the usual nonparametric bootstrap is inconsistent for the maximum partial likelihood estimation of the change-point. A new model-based bootstrap approach is proposed and its consistency established. Simulation studies are carried out to assess the performance of various bootstrap schemes. PMID: 25400719
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
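The resampling scheme described in the abstract, drawing whole clusters rather than individual observations, can be sketched in a few lines. This is a toy illustration with a simple mean statistic, not the GEE machinery of the paper; the data and function names are invented for the example:

```python
import random
from statistics import mean

def cluster_bootstrap(clusters, stat, n_boot=1000, seed=0):
    """Resample whole clusters with replacement and recompute `stat`.

    Resampling clusters (rather than individual observations) preserves
    the within-cluster dependence, as the abstract describes.
    """
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resampled = [rng.choice(clusters) for _ in clusters]
        pooled = [x for cluster in resampled for x in cluster]
        reps.append(stat(pooled))
    return sorted(reps)

# Toy clustered data: five subjects, each contributing correlated repeats.
clusters = [[1.0, 1.2, 0.9], [2.1, 2.0], [0.5, 0.7, 0.6], [1.5], [1.8, 1.9]]
reps = cluster_bootstrap(clusters, mean)
ci = (reps[int(0.025 * len(reps))], reps[int(0.975 * len(reps))])
print(ci)  # a 95% percentile interval for the overall mean
```

The same skeleton extends to regression estimates: replace `stat` with a routine that refits the model to each resampled set of clusters.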
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tensor...
Bootstrap for the case-cohort design.
Huang, Yijian
2014-06-01
The case-cohort design facilitates economical investigation of risk factors in a large survival study, with covariate data collected only from the cases and a simple random subset of the full cohort. Methods that accommodate the design have been developed for various semiparametric models, but most inference procedures are based on asymptotic distribution theory. Such inference can be cumbersome to derive and implement, and does not permit confidence band construction. While bootstrap is an obvious alternative, how to resample is unclear because of complications from the two-stage sampling design. We establish an equivalent sampling scheme, and propose a novel and versatile nonparametric bootstrap for robust inference with an appealingly simple single-stage resampling. Theoretical justification and numerical assessment are provided for a number of procedures under the proportional hazards model.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
Chocholouš, Petr; Gil, Renato; Acebal, Carolina C; Kubala, Viktor; Šatínský, Dalibor; Solich, Petr
2017-03-01
A recently presented new type of "multilayered" organic-inorganic hybrid silica particle packed column, YMC-Triart C18 (50 mm × 4.6 mm, 5 μm), was used for the development of a sequential injection chromatography method for determination of five azo dyes (Sudan I, Sudan II, Sudan III, Sudan orange G, and para red) in selected food seasonings. The use of a novel sorbent brings attractive features, reduced backpressure, and broader chemical stability together with high separation performance, which are discussed and compared with that of three types of columns typically used in medium-pressure flow chromatography techniques (classic monolithic, narrow monolithic, and core-shell particle columns). The separation was performed in gradient elution mode created by the zone mixing of two mobile phases (acetonitrile/water 90:10, 1.5 mL + acetonitrile/water 100:0, 2.3 mL) at a flow rate of 0.60 mL/min and time of analysis <9.5 min. The spectrophotometric detection wavelengths were set to 400, 480, and 500 nm. The high performance of the developed method with the multilayered particle column was well documented and the results indicate a broad capability of sequential injection chromatography. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
International Nuclear Information System (INIS)
Anthemidis, Aristidis N.; Adam, Ibrahim S.I.
2009-01-01
A novel automatic sequential injection (SI) single-drop micro-extraction (SDME) system is proposed as a versatile approach for on-line metal preconcentration and/or separation. Coupled to electrothermal atomic absorption spectrometry (ETAAS), the potential of this SI scheme is demonstrated for trace cadmium determination in water samples. A non-charged complex of cadmium with ammonium diethyldithiophosphate (DDPA) was produced and extracted on-line into a 60 μL micro-drop of di-isobutyl ketone (DIBK). The extraction was performed in a newly designed flow-through extraction cell coupled to a sequential injection manifold. As the Cd(II)-DDPA complex flowed continuously around the micro-droplet, the analyte was extracted into the solvent micro-drop. All the critical parameters were optimized, offering good performance characteristics and high preconcentration ratios. For a 600 s micro-extraction time, the enhancement factor was 10 and the sampling frequency was 6 h⁻¹. The detection limit was 0.01 μg L⁻¹ and the precision (RSD at 0.1 μg L⁻¹ of cadmium) was 3.9%. The proposed method was evaluated by analyzing certified reference material.
Sartini, Raquel P; Oliveira, Cláudio C
2002-06-01
A novel strategy for exploiting ion exchange in sequential injection systems is proposed. The procedure is based on the selection of a defined volume of a resin suspension, which is introduced and packed in the analytical path, establishing a resin mini-column in the system. The passage of a selected sample volume through the resin mini-column leads to the retention of the analyte, while the sample matrix is discarded. The analyte is eluted during the passage of the eluent/reagent through the packed beads, and the analytical signal (absorbance) is monitored in the liquid phase. The beads are then aspirated back to the holding coil and directed to a recovery flask linked to the selection valve; the system is then ready to begin a new cycle. With the proposed strategy, the main characteristics of the sequential injection system are retained, since no new artifact is added to the manifold and system reconfiguration is not required. The feasibility of the approach is demonstrated by the determination of phytic acid in food samples. For this specific application, AG1-X8 was selected as the ion exchanger, and a solution containing Cl- and Fe(III)-salicylate complex was used as both eluent and spectrophotometric reagent.
A Bootstrap Test for Conditional Symmetry
Liangjun Su; Sainan Jin
2005-01-01
This paper proposes a simple consistent nonparametric test of conditional symmetry based on the principle of characteristic functions. The test statistic is shown to be asymptotically normal under the null hypothesis of conditional symmetry and consistent against any conditionally asymmetric distributions. We also study the power against local alternatives, propose a bootstrap version of the test, and conduct a small Monte Carlo simulation to evaluate the finite-sample performance of the test.
Estimasi Regresi Wavelet Thresholding Dengan Metode Bootstrap [Wavelet Thresholding Regression Estimation with the Bootstrap Method]
Suparti, Suparti; Mustofa, Achmad; Rusgiyono, Agus
2007-01-01
A wavelet is a function with certain characteristics: for example, it oscillates about zero, is localized in the time and frequency domains, and constructs orthogonal bases in the L2(R) space. One application of wavelets is the estimation of nonparametric regression functions. There are two kinds of wavelet estimators, i.e., linear and non-linear wavelet estimators. The non-linear wavelet estimator is called a thresholding wavelet estimator. The application of the bootstrap method...
Gómez-Nieto, Beatriz; Gismera, Ma Jesús; Sevilla, Ma Teresa; Procopio, Jesús R
2015-01-07
The fast sequential multi-element determination of 11 elements present at different concentration levels in environmental samples and drinking waters has been investigated using high-resolution continuum source flame atomic absorption spectrometry. The main lines for Cu (324.754 nm), Zn (213.857 nm), Cd (228.802 nm), Ni (232.003 nm) and Pb (217.001 nm), main and secondary absorption lines for Mn (279.482 and 279.827 nm), Fe (248.327, 248.514 and 302.064 nm) and Ca (422.673 and 239.856 nm), secondary lines with different sensitivities for Na (589.592 and 330.237 nm) and K (769.897 and 404.414 nm), and a secondary line for Mg (202.582 nm) have been chosen to perform the analysis. A flow injection system has been used for sample introduction, so sample consumption has been reduced to less than 1 mL per element, measured in triplicate. Furthermore, the use of multiplets for Fe and the side pixel registration approach for Mg have been studied in order to reduce sensitivity and extend the linear working range. The figures of merit have been calculated and the proposed method was applied to determine these elements in a pine needles reference material (SRM 1575a), drinking and natural waters, and soil extracts. Recoveries of analytes added at different concentration levels to water samples and soil extracts were within the 88-115% interval. In this way, the fast sequential multi-element determination of major and minor elements can be carried out, in triplicate, with successful results, without requiring additional dilutions of samples or several different strategies for sample preparation, using about 8-9 mL of sample. Copyright © 2014 Elsevier B.V. All rights reserved.
H. Bunschoten; M. Gore (Milind); I.J.Th.M. Claassen (Ivo); F.G.C.M. Uytdehaag (Fons); B. Dietzschold; W.H. Wunner; A.D.M.E. Osterhaus (Albert)
1989-01-01
Two new monoclonal antibodies (MAbs) derived from mice immunized with the Pitman-Moore (PM) strain of rabies virus were used to identify and characterize two unique antigenic determinants on the rabies virus glycoprotein. One of the determinants, which defined an additional antigenic
Electron Bernstein wave-bootstrap current synergy in the National Spherical Torus Experiment
International Nuclear Information System (INIS)
Harvey, R.W.; Taylor, G.
2005-01-01
Current driven by electron Bernstein waves (EBW) and by the electron bootstrap effect are calculated separately and concurrently with a kinetic code to determine the degree of synergy between them. A target β=40% NSTX [M. Ono, S. Kaye, M. Peng et al., Proceedings of the 17th IAEA Fusion Energy Conference, edited by M. Spak (IAEA, Vienna, Austria, 1999), Vol. 3, p. 1135] plasma is examined. A simple bootstrap model in the collisional-quasilinear CQL3D Fokker-Planck code (National Technical Information Service document No. DE93002962) is used in these studies: the transiting electron distributions are connected in velocity space at the trapped-passing boundary to trapped-electron distributions that are displaced radially by a half-banana-width outwards/inwards for the co-passing/counter-passing regions. This model agrees well with standard bootstrap current calculations over the outer 60% of the plasma radius. A relatively small synergistic net bootstrap current is obtained for EBW power up to 4 MW. Locally, the bootstrap current density increases in proportion to the increased plasma pressure, and this effect can significantly affect the radial profile of the driven current.
Rao, B V; Gopinath, R
1989-08-01
A simple potentiometric method is presented for the successive determination of iron(III) and cobalt(II) by complexometric titration of the iron(III) with EDTA at pH 2 and 40°C, followed by redox titration of the cobalt(II) complex with 1,10-phenanthroline or 2,2'-bipyridyl at pH 4-5 and 40°C, with gold(III). There is no interference in either determination from common metal ions other than copper(II), which severely affects the cobalt determination but can be removed by electrolysis. The method has been successfully applied to the determination of iron and cobalt in Kovar and Alnico magnet alloys.
Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.
Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G
2016-01-01
Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
Chu, Ning; Fan, Shihua
2009-12-01
A new analytical method was developed for the simultaneous kinetic spectrophotometric determination of a quaternary carbamate pesticide mixture consisting of carbofuran, propoxur, metolcarb and fenobucarb using sequential injection analysis (SIA). The procedure was based on the different kinetic properties of the analytes reacting with the reagent in the flow system in the non-stopped-flow mode, in which their hydrolysis products coupled with diazotized p-nitroaniline in an alkaline medium to form the corresponding colored complexes. The absorbance data from the SIA peak time profile were recorded at 510 nm and resolved by back-propagation artificial neural network (BP-ANN) algorithms for multivariate quantitative analysis. The experimental variables and main network parameters were optimized, and each of the pesticides could be determined in the concentration range of 0.5-10.0 μg mL⁻¹ at a sampling frequency of 18 h⁻¹. The proposed method was compared to other spectrophotometric methods for the simultaneous determination of mixtures of carbamate pesticides, proved to be adequately reliable, and was successfully applied to the simultaneous determination of the four pesticide residues in water and fruit samples, obtaining satisfactory results in recovery studies (84.7-116.0%).
Energy Technology Data Exchange (ETDEWEB)
Beni, Valerio; Newton, Hazel V.; Arrigan, Damien W.M.; Hill, Martin; Lane, William A.; Mathewson, Alan
2004-01-30
The development of mercury-free electroanalytical systems for in-field analysis of pollutants requires a foundation on the electrochemical behaviour of the chosen electrode material in the target sample matrices. In this work, the behaviour of gold working electrodes in the media employed in the BCR sequential extraction protocol, for the fractionation of metals in solid environmental matrices, is reported. All three of the BCR sequential extraction media are redox active, on the basis of acidity and oxygen content as well as the inherent reducing or oxidising nature of some of the reagents employed: 0.11 M acetic acid, 0.1 M hydroxylammonium chloride (adjusted to pH 2) and 1 M ammonium acetate (adjusted to pH 2) with added trace hydrogen peroxide. The available potential ranges together with the demonstrated detection of target metals in these media are presented. Stripping voltammetry of copper or lead in the BCR extract media solutions reveals a multi-peak behaviour due to the stripping of both bulk metal and underpotential metal deposits. A procedure based on underpotential deposition-stripping voltammetry (UPD-SV) was evaluated for application to the determination of copper in 0.11 M acetic acid soil extracts. A preliminary screening step, in which different deposition times are applied to the sample, enables a deposition time commensurate with UPD-SV to be selected so that no bulk deposition or stripping occurs, thus simplifying the shape and features of the resulting voltammograms. Choice of a suitable deposition time is then followed by standard-addition calibration. The method was validated by the analysis of a number of BCR 0.11 M acetic acid soil extracts. Good agreement was obtained between the UPD-SV method and the atomic spectroscopic results.
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
Roberto S. Flowers-Cano
2018-02-01
Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared the coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distribution function was fit to the random samples that were generated. In other cases, a distribution function different from the mother distribution was fit to the samples. When the fitted distribution had three parameters and was the same as the mother distribution, the intervals constructed with the four techniques had acceptable coverage. However, the bootstrap techniques failed in several of the cases in which the fitted distribution had two parameters.
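The kind of experiment the abstract describes can be sketched for the simplest of the four techniques, the percentile bootstrap (BP): draw Monte Carlo samples from a known "mother" distribution, build a BP interval for the mean of each sample, and count how often the interval covers the true value. This is a pure-Python toy under assumed settings (exponential mother distribution, small sample sizes), not the authors' hydrological setup:

```python
import random
from statistics import mean

def percentile_ci(data, stat, alpha=0.05, n_boot=400, rng=None):
    """Percentile bootstrap (BP) interval for `stat` of `data`."""
    rng = rng or random.Random(0)
    reps = sorted(
        stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

# Monte Carlo coverage check: draw samples from an exponential "mother"
# distribution (true mean 1.0) and count how often the BP interval covers it.
rng = random.Random(42)
trials, hits = 100, 0
for _ in range(trials):
    sample = [rng.expovariate(1.0) for _ in range(30)]
    lo, hi = percentile_ci(sample, mean, rng=rng)
    hits += (lo <= 1.0 <= hi)
coverage = hits / trials
print(coverage)  # typically somewhat below the nominal 0.95 for skewed data
```

Swapping in a fitted two- or three-parameter distribution for the resampling step would turn this nonparametric sketch into the parametric-bootstrap comparison the paper studies.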
Towards bootstrapping QED{sub 3}
Energy Technology Data Exchange (ETDEWEB)
Chester, Shai M.; Pufu, Silviu S. [Joseph Henry Laboratories, Princeton University,Princeton, NJ 08544 (United States)
2016-08-02
We initiate the conformal bootstrap study of Quantum Electrodynamics in 2+1 space-time dimensions (QED{sub 3}) with N flavors of charged fermions by focusing on the 4-point function of four monopole operators with the lowest unit of topological charge. We obtain upper bounds on the scaling dimension of the doubly-charged monopole operator, with and without assuming other gaps in the operator spectrum. Intriguingly, we find a (gap-dependent) kink in these bounds that comes reasonably close to the large N extrapolation of the scaling dimensions of the singly-charged and doubly-charged monopole operators down to N=4 and N=6.
A Tauberian theorem for the conformal bootstrap
Qiao, Jiaxin; Rychkov, Slava
2017-12-01
For expansions in one-dimensional conformal blocks, we provide a rigorous link between the asymptotics of the spectral density of exchanged primaries and the leading singularity in the crossed channel. Our result has a direct application to systems of SL(2, ℝ)-invariant correlators (also known as 1d CFTs). It also puts on solid ground a part of the lightcone bootstrap analysis of the spectrum of operators of high spin and bounded twist in CFTs in d > 2. In addition, a similar argument controls the spectral density asymptotics in large N gauge theories.
General bootstrap equations in 4D CFTs
Cuomo, Gabriel Francisco; Karateev, Denis; Kravchuk, Petr
2018-01-01
We provide a framework for generic 4D conformal bootstrap computations. It is based on the unification of two independent approaches, the covariant (embedding) formalism and the non-covariant (conformal frame) formalism. We construct their main ingredients (tensor structures and differential operators) and establish a precise connection between them. We supplement the discussion by additional details like classification of tensor structures of n-point functions, normalization of 2-point functions and seed conformal blocks, Casimir differential operators and treatment of conserved operators and permutation symmetries. Finally, we implement our framework in a Mathematica package and make it freely available.
DEFF Research Database (Denmark)
Long, Xiangbao; Chomchoei, Roongrat; Hansen, Elo Harald
2004-01-01
with an external packed column and in a sequential injection lab-on-valve (SI-LOV) system. Employed for the determination of cadmium(II), complexed with diethyldithiophosphate (DDPA), and detection by electrothermal atomic absorption spectrometry (ETAAS), its performance was compared to that of a previously used...
Evidence of Bootstrap Financing among Small Start-Up Firms
Howard E. Van Auken; Lynn Neeley
1996-01-01
This study examines the use of bootstrap financing for a sample of 78 firms in a Midwestern state. The results show that traditional sources of capital accounted for 65% of the firms' start-up capital and 35% of the start-up capital was obtained from bootstrap sources. A Chi-squared analysis indicates a significant difference between the percentage of (1) sole proprietorships versus other firms and (2) construction/manufacturing versus other types of firms using bootstrap financing as compared...
International Nuclear Information System (INIS)
2014-01-01
Since 2004, IAEA activities related to the terrestrial environment have aimed at the development of a set of procedures to determine radionuclides in environmental samples. Reliable, comparable and ‘fit for purpose’ results are an essential requirement for any decision based on analytical measurements. For the analyst, tested and validated analytical procedures are extremely important tools for the production of analytical data. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available for reference to both the analyst and the customer. This publication describes a combined procedure for the sequential determination of 90Sr, 241Am and Pu radioisotopes in environmental samples. The method is based on the chemical separation of strontium, americium and plutonium using ion exchange chromatography, extraction chromatography and precipitation, followed by alpha spectrometric and liquid scintillation counting detection. The method was tested and validated in terms of repeatability and trueness in accordance with International Organization for Standardization (ISO) guidelines using reference materials and proficiency test samples. Reproducibility tests were performed later at the IAEA Terrestrial Environment Laboratory. The calculations of the massic activity, uncertainty budget, decision threshold and detection limit are also described in this publication. The procedure is introduced for the determination of 90Sr, 241Am and Pu radioisotopes in environmental samples such as soil, sediment, air filter and vegetation samples. It is expected to be of general use to a wide range of laboratories, including the Analytical Laboratories for the Measurement of Environmental Radioactivity (ALMERA) network for routine environmental monitoring purposes
Confidence Intervals for the Mean: To Bootstrap or Not to Bootstrap
Calzada, Maria E.; Gardner, Holly
2011-01-01
The results of a simulation conducted by a research team involving undergraduate and high school students indicate that when data is symmetric the student's "t" confidence interval for a mean is superior to the studied non-parametric bootstrap confidence intervals. When data is skewed and for sample sizes n greater than or equal to 10,…
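The comparison the abstract reports can be illustrated on a single symmetric sample: construct the classic Student's t interval and a percentile bootstrap interval for the same mean. This is a toy sketch, not the study's simulation; the critical value 2.093 is the tabulated t quantile for 19 degrees of freedom, and the sample is assumed normal:

```python
import random
from statistics import mean, stdev

# One symmetric (normal) sample of size 20.
rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(20)]
m, s, n = mean(data), stdev(data), len(data)

# Classic Student's t interval; 2.093 is the tabulated t_{0.975}
# quantile with 19 degrees of freedom.
half = 2.093 * s / n ** 0.5
t_ci = (m - half, m + half)

# Percentile bootstrap interval for the same mean, for comparison.
reps = sorted(mean([rng.choice(data) for _ in data]) for _ in range(2000))
boot_ci = (reps[int(0.025 * 2000)], reps[int(0.975 * 2000) - 1])

print(t_ci)
print(boot_ci)
```

Repeating this over many simulated samples and tallying coverage, as the research team did, is what reveals the t interval's advantage for symmetric data.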
Effects of parameter estimation on maximum-likelihood bootstrap analysis.
Ripplinger, Jennifer; Abdo, Zaid; Sullivan, Jack
2010-08-01
Bipartition support in maximum-likelihood (ML) analysis is most commonly assessed using the nonparametric bootstrap. Although bootstrap replicates should theoretically be analyzed in the same manner as the original data, model selection is almost never conducted for bootstrap replicates, substitution-model parameters are often fixed to their maximum-likelihood estimates (MLEs) for the empirical data, and bootstrap replicates may be subjected to less rigorous heuristic search strategies than the original data set. Even though this approach may increase computational tractability, it may also lead to the recovery of suboptimal tree topologies and affect bootstrap values. However, since well-supported bipartitions are often recovered regardless of method, use of a less intensive bootstrap procedure may not significantly affect the results. In this study, we investigate the impact of parameter estimation (i.e., assessment of substitution-model parameters and tree topology) on ML bootstrap analysis. We find that while forgoing model selection and/or setting substitution-model parameters to their empirical MLEs may lead to significantly different bootstrap values, it probably would not change their biological interpretation. Similarly, even though the use of reduced search methods often results in significant differences among bootstrap values, only omitting branch swapping is likely to change any biological inferences drawn from the data. Copyright 2010 Elsevier Inc. All rights reserved.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 Million.
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods.
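The subspace trick described in the abstract can be sketched as follows; this is a minimal illustration on simulated data, not the authors' implementation. Because a bootstrap sample X[:, idx] equals U @ R[:, idx] for the thin SVD X = U @ diag(d) @ Vt, its singular values (and hence PCA eigenvalues) can be obtained from the small n × n coordinate matrix alone:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 40, 500                         # few subjects (n), many measurements (p)
X = rng.normal(size=(p, n))            # columns are subjects
X -= X.mean(axis=1, keepdims=True)     # center each measurement across subjects

# Thin SVD: X = U @ diag(d) @ Vt, with U (p x n) spanning the column space.
U, d, Vt = np.linalg.svd(X, full_matrices=False)
R = np.diag(d) @ Vt                    # n x n subject coordinates in that subspace

B = 200
top_eigvals = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)   # bootstrap the subjects
    Rb = R[:, idx]                     # resample inside the n-dim subspace only
    Rb = Rb - Rb.mean(axis=1, keepdims=True)
    # X[:, idx] = U @ R[:, idx] and U has orthonormal columns, so the singular
    # values of the bootstrap sample come from the small matrix Rb alone.
    s = np.linalg.svd(Rb, compute_uv=False)
    top_eigvals[b] = s[0] ** 2 / (n - 1)
    # If a full p-dim bootstrap PC were needed: U @ (left singular vector of Rb).

print(top_eigvals.std())               # bootstrap SE of the leading eigenvalue
```

Each bootstrap SVD here costs O(n³) instead of O(p n²), which is the source of the speedup the abstract reports.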
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Roos, Per
2013-01-01
Several experimental parameters affecting the analytical performance were investigated and compared, including sample preboiling operation, aging time, amount of coprecipitating reagent, reagent for pH adjustment, sedimentation time, and organic matter decomposition approach. The overall analytical results show that preboiling and aging are important for obtaining high chemical yields for both Pu and Np, which is possibly related to the aggregation and adsorption behavior of organic substances contained in urine. Although the optimal condition for Np and Pu simultaneous determination requires 5-day aging time, an immediate coprecipitation without preboiling and aging could also provide fairly satisfactory chemical yields for both Np and Pu (50-60%) with high sample throughput (4 h/sample). Within the developed method, (242)Pu was exploited as chemical yield tracer for both Pu and Np isotopes.
International Nuclear Information System (INIS)
Fiorino, J.A.; Jones, J.W.; Capar, S.G.
1976-01-01
Analysis of acid digests of foods for As, Se, Sb, and Te was semiautomated. Hydrides generated by controlled addition of base-stabilized NaBH4 solution to acid digests are transported directly into a shielded, hydrogen (nitrogen-diluted), entrained-air flame for atomic absorption spectrophotometric determination of the individual elements. The detection limits, based on 1 g of digested sample, are approximately 10 to 20 ng/g for all four elements. Measurement precision is 1 to 2 percent relative standard deviation for each element measured at 0.10 μg. A comparison is made of the results of analysis of lyophilized fish tissues for As and Se by instrumental neutron activation analysis (INAA), hydride generation with atomic absorption spectrometry, fluorometry, and spectrophotometry. NBS standard reference materials (orchard leaves and bovine liver) analyzed for As, Se, and Sb by this method show excellent agreement with certified values and with independent NAA values
Minimal sequential Hausdorff spaces
Directory of Open Access Journals (Sweden)
Bhamini M. P. Nayar
2004-01-01
Full Text Available A sequential space (X,T) is called minimal sequential if no sequential topology on X is strictly weaker than T. This paper begins the study of minimal sequential Hausdorff spaces. Characterizations of minimal sequential Hausdorff spaces are obtained using filter bases, sequences, and functions satisfying certain graph conditions. Relationships between this class of spaces and other classes of spaces, for example, minimal Hausdorff spaces, countably compact spaces, H-closed spaces, SQ-closed spaces, and subspaces of minimal sequential spaces, are investigated. While the property of being sequential is not (in general) preserved by products, some information is provided on the question of when the product of minimal sequential spaces is minimal sequential.
Giakisikli, Georgia; Anthemidis, Aristidis N
2013-06-15
A new automatic sequential injection (SI) system for on-line magnetic sorbent extraction coupled with electrothermal atomic absorption spectrometry (ETAAS) has been successfully developed for metal determination. In this work, we report effective on-line immobilization of magnetic silica particles in a microcolumn by the external force of two strong neodymium iron boron (NdFeB) magnets placed across it, avoiding the use of frits. Octadecylsilane-functionalized maghemite magnetic particles were used as the sorbent material. The potential of the system was demonstrated for trace cadmium determination in water samples. The method was based on on-line complex formation with diethyldithiocarbamate (DDTC), retention of Cd-DDTC on the surface of the magnetic particles and elution with isobutyl methyl ketone (IBMK). The formation mechanism of the magnetic solid phase packed column was studied, and all critical parameters (chemical, flow, graphite furnace) influencing the performance of the system were optimized, giving good analytical characteristics. For a 5 mL sample volume, a detection limit of 3 ng L(-1), a relative standard deviation of 3.9% at the 50 ng L(-1) level (n=11) and a linear range of 9-350 ng L(-1) were obtained. The column remained stable for more than 600 cycles, keeping the cost down in routine analysis. The proposed method was evaluated by analyzing certified reference materials and natural waters. Copyright © 2013 Elsevier B.V. All rights reserved.
Gómez-Nieto, Beatriz; Gismera, Mª Jesús; Sevilla, Mª Teresa; Procopio, Jesús R
2017-03-15
A simple method based on FAAS was developed for the sequential multi-element determination of Cu, Zn, Mn, Mg and Si in beverages and food supplements with successful results. The main absorption lines for Cu, Zn and Si and secondary lines for Mn and Mg were selected to carry out the measurements. The sample introduction was performed using a flow injection system. By measuring on the wings of the absorption lines, the upper limit of the linear range increased up to 110 mg L(-1) for Mg, 200 mg L(-1) for Si and 13 mg L(-1) for Zn. The determination of the five elements was carried out, in triplicate, without the need of additional sample dilutions and/or re-measurements, using less than 3.5 mL of sample to perform the complete analysis. The LODs were 0.008 mg L(-1) for Cu, 0.017 mg L(-1) for Zn, 0.011 mg L(-1) for Mn, 0.16 mg L(-1) for Si and 0.11 mg L(-1) for Mg. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bootstrap confidence intervals for the process capability index under half-logistic distribution
Wararit Panichkitkosolkul
2012-01-01
This study concerns the construction of bootstrap confidence intervals for the process capability index in the case of the half-logistic distribution. The bootstrap confidence intervals applied consist of the standard bootstrap confidence interval, the percentile bootstrap confidence interval and the bias-corrected percentile bootstrap confidence interval. Using Monte Carlo simulations, the estimated coverage probabilities and average widths of the bootstrap confidence intervals are compared, with results showing ...
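The three interval types named in the abstract (standard, percentile, bias-corrected percentile) can be sketched for a generic capability index Cp = (USL − LSL)/(6s). The half-logistic-specific estimator from the paper is not reproduced here; the data, specification limits and resample count below are illustrative assumptions.

```python
import random
import statistics
from statistics import NormalDist

def cp_hat(data, lsl, usl):
    # Generic process capability estimate: (USL - LSL) / (6 * s)
    return (usl - lsl) / (6 * statistics.stdev(data))

def bootstrap_cis(data, lsl, usl, reps=2000, alpha=0.05, seed=0):
    rng = random.Random(seed)
    theta = cp_hat(data, lsl, usl)
    boots = sorted(cp_hat(rng.choices(data, k=len(data)), lsl, usl)
                   for _ in range(reps))
    z = NormalDist().inv_cdf(1 - alpha / 2)
    # Standard bootstrap CI: estimate +/- z * bootstrap standard error
    se = statistics.stdev(boots)
    standard = (theta - z * se, theta + z * se)
    # Percentile bootstrap CI: empirical quantiles of the bootstrap estimates
    percentile = (boots[int(reps * alpha / 2)],
                  boots[int(reps * (1 - alpha / 2)) - 1])
    # Bias-corrected percentile CI: shift the quantile levels by z0
    prop = sum(b < theta for b in boots) / reps
    z0 = NormalDist().inv_cdf(min(max(prop, 1 / reps), 1 - 1 / reps))
    cdf = NormalDist().cdf
    bias_corrected = (boots[min(reps - 1, int(reps * cdf(2 * z0 - z)))],
                      boots[min(reps - 1, int(reps * cdf(2 * z0 + z)))])
    return standard, percentile, bias_corrected

gen = random.Random(1)
data = [gen.gauss(10.0, 1.0) for _ in range(30)]   # illustrative process data
print(bootstrap_cis(data, lsl=7.0, usl=13.0))
```

Comparing the three intervals' simulated coverage and width, as in the study, then reduces to repeating this over many generated samples.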
International Nuclear Information System (INIS)
Mitani, Constantina; Anthemidis, Aristidis N.
2013-01-01
Highlights: ► Drop-in-plug micro-extraction based on SI-LAV platform for metal preconcentration. ► Automatic liquid phase micro-extraction coupled with FAAS. ► Organic solvents with density higher than water are used. ► Lead determination in environmental water and urine samples. -- Abstract: A novel automatic on-line liquid phase micro-extraction method based on a drop-in-plug sequential injection lab-at-valve (LAV) platform is proposed for metal preconcentration and determination. A flow-through micro-extraction chamber mounted at the selection valve was adopted, without the need for sophisticated lab-on-valve components. Coupled to flame atomic absorption spectrometry (FAAS), the potential of this lab-at-valve scheme is demonstrated for trace lead determination in environmental and biological water samples. A hydrophobic complex of lead with ammonium pyrrolidine dithiocarbamate (APDC) was formed on-line and subsequently extracted into an 80 μL plug of chloroform. The extraction procedure was performed by forming micro-droplets of the aqueous phase within the plug of the extractant. All critical parameters that affect the efficiency of the system were studied and optimized. The proposed method offered good performance characteristics and high preconcentration ratios. For 10 mL sample consumption an enhancement factor of 125 was obtained. The detection limit was 1.8 μg L(-1) and the precision, expressed as relative standard deviation (RSD) at 50.0 μg L(-1) of lead, was 2.9%. The proposed method was evaluated by analyzing certified reference materials and applied to lead determination in natural waters and urine samples
Bootstrap support is not first-order correct.
Susko, Edward
2009-04-01
The appropriate interpretation of bootstrap support for splits and the question of what constitutes large bootstrap support have received considerable attention. One desirable interpretation, indeed the interpretation that was put forward when bootstrap support for splits was first introduced, is that 1-minus bootstrap support is a P value for the hypothesis that the split is not well resolved. As a P value, bootstrap support has been argued to be first-order correct. By obtaining the limiting distribution of bootstrap support for a split when maximum likelihood estimation is conducted, it is shown that bootstrap support is not first-order correct and insight is provided into the nature of the problem. Borrowing from earlier results, it is also shown that similar results hold when the neighbor-joining algorithm is used. Examples suggest that bootstrap support is generally conservative as a P value and give insight as to why this is usually the case. The analysis indicates that the problem is largely due to the unusual nature of tree space where boundary trees always have at least 2 neighbors.
Bootstrap Estimates of Standard Errors in Generalizability Theory
Tong, Ye; Brennan, Robert L.
2007-01-01
Estimating standard errors of estimated variance components has long been a challenging task in generalizability theory. Researchers have speculated about the potential applicability of the bootstrap for obtaining such estimates, but they have identified problems (especially bias) in using the bootstrap. Using Brennan's bias-correcting procedures…
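A naive person-level ("boot-p") bootstrap of a variance component in a persons × items design can be sketched as follows; note the abstract's caveat that such straightforward bootstraps are biased, and Brennan's bias-correcting procedures are not implemented in this sketch. The data, design sizes and ANOVA estimator below are illustrative assumptions.

```python
import random
import statistics

def person_variance_component(scores):
    # ANOVA estimator for a persons x items design:
    # sigma^2_person = (MS_person - MS_residual) / n_items
    n_p, n_i = len(scores), len(scores[0])
    person_means = [statistics.mean(row) for row in scores]
    grand = statistics.mean(person_means)
    ms_p = n_i * sum((m - grand) ** 2 for m in person_means) / (n_p - 1)
    ms_res = sum((x - m) ** 2
                 for row, m in zip(scores, person_means)
                 for x in row) / (n_p * (n_i - 1))
    return (ms_p - ms_res) / n_i

def boot_p_se(scores, reps=500, seed=0):
    # Naive "boot-p": resample whole persons (rows) with replacement and
    # take the standard deviation of the re-estimated variance component.
    rng = random.Random(seed)
    ests = [person_variance_component(rng.choices(scores, k=len(scores)))
            for _ in range(reps)]
    return statistics.stdev(ests)

gen = random.Random(1)
scores = []
for _ in range(30):                      # 30 persons, 4 items each
    pe = gen.gauss(0.0, 1.0)             # true person variance = 1.0
    scores.append([pe + gen.gauss(0.0, 0.5) for _ in range(4)])

print(person_variance_component(scores), boot_p_se(scores))
```

The bias problem the abstract refers to arises because resampling over only one facet (persons) distorts the expected mean squares; the cited bias-correcting procedures adjust for exactly this.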
Higher-order Gaussian kernel in bootstrap boosting algorithm ...
African Journals Online (AJOL)
The bootstrap boosting algorithm is a bias reduction scheme. The adoption of a higher-order Gaussian kernel in a bootstrap boosting algorithm in kernel density estimation was investigated. The algorithm used the higher-order Gaussian kernel instead of the regular fixed kernels. A comparison of the scheme with existing ...
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
DEFF Research Database (Denmark)
Long, Xiangbao; Hansen, Elo Harald; Miró, Manuel
2005-01-01
The analytical performance of an on-line sequential injection lab-on-valve (SI-LOV) system using chelating Sepharose beads as sorbent material for the determination of ultra trace levels of Cd(II), Pb(II) and Ni(II) by electrothermal atomic absorption spectrometry (ETAAS) is described and discussed...
International Nuclear Information System (INIS)
Costa Lauria, D. da.
1986-01-01
A sequential analytical method for the determination of U-238, U-234, Th-232, Th-230, Th-228, Ra-226 and Ra-228 in environmental samples is studied and applied to the analysis of mineral waters. Thorium isotopes are coprecipitated with lanthanum fluoride before counting in an alpha spectrometer; the uranium isotopes are determined by alpha spectrometry following extraction with TOPO onto a polymeric membrane. Radium-226 is determined with the radon emanation technique. (M.J.C.) [pt
Jiménez, María. S.; Velarte, Rosario; Castillo, Juan R.
2002-03-01
A simple, versatile and economical method of sequential injection analysis and inductively coupled plasma-mass spectrometry with matrix removal on two different ion-exchange resins for the determination of Al, As, Co, Cu, Mn, Mo, Ni, Pb and V is reported. The resins used, both of which contain the iminodiacetic acid functional group, were Chelex 100 and Metpac CC-1. With both resins, a matrix removal step (alkaline and alkaline-earth metals) with ammonium acetate is required before elution of the analytes with 2 M HNO3. The procedure was validated by analyzing CASS-3 seawater reference material and good agreement was found with the certified values. Precision (n=8) for the nine elements was in the range 0.8-4.9% for the Chelex 100 column and the recoveries ranged from 87.4 to 107.9%, except for Cu (78.7%) and Pb (74.9%), owing to the formation of hydroxides. For the Metpac CC-1 column, precision for the nine elements was in the range 1.2-7.1% and the recoveries between 91.7 and 109.3%, except for Al (127.2%), Co (118.5%) and Ni (127.5%), due to contamination problems.
Directory of Open Access Journals (Sweden)
Gardolinski Paulo C. F. C.
2002-01-01
Trace concentrations of Cd, Cu, Pb and Zn in four different sediment fractions extracted in sequence were determined by isotope dilution inductively coupled plasma mass spectrometry (ID-ICP-MS). The metals from each fraction were extracted following the sequential extraction procedure recommended by the Bureau Commun de Référence (BCR) of the Commission of the European Communities. As an alternative to external calibration, the elements were quantified by spiking the extracted solutions with 112Cd, 63Cu, 208Pb and 66Zn and application of isotope dilution. The proposed approach was applied to a sample collected from a lake and two standard reference materials, the NIST 2704 river sediment from the National Institute of Standards & Technology and the BCR-277 estuarine sediment. Detection limits, for each extracted solution, varied from 0.31 to 0.53 μg L(-1) for Cd, 0.92 to 2.9 μg L(-1) for Cu, 0.22 to 1.1 μg L(-1) for Pb and 1.3 to 7.6 μg L(-1) for Zn. The sum of the metal concentrations in the different fractions was compatible, at the 95% confidence level, with the amounts obtained by complete digestion of the samples and with the certified values of the standard reference materials.
International Nuclear Information System (INIS)
Michel, H.; Levent, D.; Barci, V.; Barci-Funel, G.; Hurel, C.
2008-01-01
A new sequential method for the determination of both natural (U, Th) and anthropogenic (Sr, Cs, Pu, Am) radionuclides has been developed for application to soil and sediment samples. The procedure was optimised using a reference sediment (IAEA-368) and reference soils (IAEA-375 and IAEA-326). Reference materials were first digested using acids (leaching), 'total' acids on a hot plate, and acids in a microwave in order to compare the different digestion techniques. Then, the separation and purification were made by anion exchange resin and selective extraction chromatography: Transuranic (TRU) and Strontium (SR) resins. Natural and anthropogenic alpha radionuclides were separated by Uranium and Tetravalent Actinide (UTEVA) resin, considering different acid elution media. Finally, alpha and gamma semiconductor spectrometers and a liquid scintillation spectrometer were used to measure radionuclide activities. The results obtained for strontium-90, cesium-137, thorium-232, uranium-238, plutonium-239+240 and americium-241 isotopes by the proposed method for the reference materials provided excellent agreement with the recommended values and good chemical recoveries. (authors)
The analytic bootstrap in fermionic CFTs
van Loon, Mark
2018-01-01
We apply the method of the large spin bootstrap to analyse fermionic conformal field theories with weakly broken higher spin symmetry. Through the study of correlators of composite operators, we find the anomalous dimensions and OPE coefficients in the Gross-Neveu model in d = 2 + ɛ dimensions and the Gross-Neveu-Yukawa model in d = 4 - ɛ dimensions, based only on crossing symmetry. Furthermore a non-trivial solution in the d = 2 + ɛ expansion is found for a fermionic theory in which the fundamental field is not part of the spectrum. The results are perturbative in ɛ and valid to all orders in the spin, reproducing known results for operator dimensions and providing some new results for operator dimensions and OPE coefficients.
Simplifying large spin bootstrap in Mellin space
Dey, Parijat; Ghosh, Kausik; Sinha, Aninda
2018-01-01
We set up the conventional conformal bootstrap equations in Mellin space and analyse the anomalous dimensions and OPE coefficients of large spin double trace operators. By decomposing the equations in terms of continuous Hahn polynomials, we derive explicit expressions as an asymptotic expansion in inverse conformal spin to any order, reproducing the contribution of any primary operator and its descendants in the crossed channel. The expressions are in terms of known mathematical functions and involve generalized Bernoulli (Nørlund) polynomials and the Mack polynomials and enable us to derive certain universal properties. Comparing with the recently introduced reformulated equations in terms of crossing symmetric tree level exchange Witten diagrams, we show that to leading order in anomalous dimension but to all orders in inverse conformal spin, the equations are the same as in the conventional formulation. At the next order, the polynomial ambiguity in the Witten diagram basis is needed for the equivalence and we derive the necessary constraints for the same.
Bootstrapping 3D fermions with global symmetries
Iliesiu, Luca; Kos, Filip; Poland, David; Pufu, Silviu S.; Simmons-Duffin, David
2018-01-01
We study the conformal bootstrap for 4-point functions of fermions 〈ψ_i ψ_j ψ_k ψ_ℓ〉 in parity-preserving 3d CFTs, where ψ_i transforms as a vector under an O(N) global symmetry. We compute bounds on scaling dimensions and central charges, finding features in our bounds that appear to coincide with the O(N) symmetric Gross-Neveu-Yukawa fixed points. Our computations are in perfect agreement with the 1/N expansion at large N and allow us to make nontrivial predictions at small N. For values of N for which the Gross-Neveu-Yukawa universality classes are relevant to condensed-matter systems, we compare our results to previous analytic and numerical results.
The ${\\mathcal N}=2$ superconformal bootstrap
Beem, Christopher; Liendo, Pedro; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
In this work we initiate the conformal bootstrap program for ${\\mathcal N}=2$ superconformal field theories in four dimensions. We promote an abstract operator-algebraic viewpoint in order to unify the description of Lagrangian and non-Lagrangian theories, and formulate various conjectures concerning the landscape of theories. We analyze in detail the four-point functions of flavor symmetry current multiplets and of ${\\mathcal N}=2$ chiral operators. For both correlation functions we review the solution of the superconformal Ward identities and describe their superconformal block decompositions. This provides the foundation for an extensive numerical analysis discussed in the second half of the paper. We find a large number of constraints for operator dimensions, OPE coefficients, and central charges that must hold for any ${\\mathcal N}=2$ superconformal field theory.
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles between two heavy states (BTZ black holes. We find that the operator algebra in this regime is (i universal and identical to that of Liouville CFT, and (ii takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
BOOTSTRAP-BASED INFERENCE FOR GROUPED DATA
Directory of Open Access Journals (Sweden)
Jorge Iván Vélez
2015-07-01
Grouped data refers to continuous variables that are partitioned into intervals, not necessarily of the same length, to facilitate their interpretation. Unlike in ungrouped data, estimating simple summary statistics such as the mean and mode, or more complex ones such as a percentile or the coefficient of variation, is a difficult endeavour in grouped data. When the probability distribution generating the data is unknown, inference in ungrouped data is carried out using parametric or nonparametric resampling methods. However, there are no equivalent methods in the case of grouped data. Here, a bootstrap-based procedure to estimate the parameters of an unknown distribution based on grouped data is proposed, described and illustrated.
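One plausible instantiation of such a procedure is sketched below: resample interval memberships with probabilities proportional to the observed counts, then smooth each draw uniformly within its interval before computing the statistic. The breakpoints, counts and uniform smoothing choice are illustrative assumptions, not the authors' exact algorithm.

```python
import random
import statistics

def grouped_bootstrap(breaks, counts, stat, reps=2000, seed=0):
    # breaks: interval endpoints [b0, b1, ..., bk]; counts: frequency per interval
    rng = random.Random(seed)
    n = sum(counts)
    intervals = list(range(len(counts)))
    out = []
    for _ in range(reps):
        # Resample interval memberships by frequency, then draw a value
        # uniformly within each chosen interval to "ungroup" the data.
        ks = rng.choices(intervals, weights=counts, k=n)
        resample = [rng.uniform(breaks[k], breaks[k + 1]) for k in ks]
        out.append(stat(resample))
    return sorted(out)

breaks = [0, 10, 20, 30, 50]            # intervals of unequal length
counts = [12, 25, 9, 4]                 # observed frequencies (n = 50)
means = grouped_bootstrap(breaks, counts, statistics.mean)
est = statistics.mean(means)
lo, hi = means[int(0.025 * len(means))], means[int(0.975 * len(means)) - 1]
print(round(est, 2), (round(lo, 2), round(hi, 2)))
```

Any statistic (mode, percentile, coefficient of variation) can be passed as `stat`, which is what makes a resampling approach attractive for grouped data.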
International Nuclear Information System (INIS)
Tue-Ngeun, Orawan; Sandford, Richard C.; Jakmunee, Jaroon; Grudpan, Kate; McKelvie, Ian D.; Worsfold, Paul J.
2005-01-01
An automated sequential injection (SI) method for the determination of dissolved inorganic carbon (DIC) and dissolved organic carbon (DOC) in freshwaters is presented. For DIC measurement, on-line sample acidification (sulphuric acid, pH 2) converted the DIC to CO2, which subsequently diffused through a PTFE membrane into a basic cresol red acceptor stream. The CO2 increased the concentration of the acidic form of the cresol red indicator, with a resultant decrease in absorbance at 570 nm being directly proportional to the DIC concentration. DIC + DOC was determined after on-line sample irradiation (15 W low power UV lamp) coupled with acid-peroxydisulfate digestion, with the subsequent detection of CO2 as described above. DOC was determined by subtraction of DIC from (DIC + DOC). Analytical figures of merit were linear ranges of 0.05-5.0 mg C L(-1) for both DIC and DIC + DOC, with typical R.S.D.s of less than 7% (at 0.05 mg C L(-1): 5.3% for DIC and 6.6% for DIC + DOC; at 4.0 mg C L(-1): 2.6% for DIC and 2.4% for DIC + DOC, n = 3) and an LOD (blank + 3 S.D.) of 0.05 mg C L(-1). Sample throughput for the automated system was 8 h(-1) for DIC and DOC with low reagent consumption (200 μL of acid/peroxydisulfate per DIC + DOC analysis). A range of model carbon compounds and Tamar River (Plymouth, UK) samples were analysed for DIC and DOC and the results showed good agreement with a high temperature catalytic oxidation (HTCO) reference method (t-test, P = 0.05)
Sviggum, Hans P; Arendt, Katherine W; Jacob, Adam K; Niesen, Adam D; Johnson, Rebecca L; Schroeder, Darrell R; Tien, Michael; Mantilla, Carlos B
2016-09-01
Intrathecal (IT) morphine is considered the "gold standard" for analgesia after cesarean delivery under spinal anesthesia, most commonly administered at a dose of 100 to 200 μg. There is less experience with IT hydromorphone for postcesarean analgesia and limited information on its optimal analgesic dose. We conducted this study to determine the dose of IT hydromorphone that provides effective analgesia in 90% of patients (ED90) undergoing elective cesarean delivery, and its potency ratio to IT morphine. In this dose-finding trial, 80 patients received spinal anesthesia for cesarean delivery. Participants were randomized to receive IT morphine or IT hydromorphone at a dose determined using up-down sequential allocation with a biased-coin design to determine the ED90. All patients received standardized multimodal analgesia postoperatively in addition to the IT opioid. An effective dose was defined as a numeric response score for pain of ≤3 (scale 0-10) 12 hours after spinal injection. The ED90 was 75 μg (95% confidence interval [CI], 46-93 μg) for IT hydromorphone and 150 μg (95% CI, 145-185 μg) for IT morphine. At these doses, the 95% CI for the percentage of patients with effective analgesia (numeric rating scale ≤3) was 64% to 100% for hydromorphone and 68% to 100% for morphine. Exploratory findings showed that the incidence of nausea and pruritus was not different among the most commonly used doses of IT hydromorphone (P = 0.44 and P = 0.74) or IT morphine (P = 0.67 and P = 0.38, respectively). When administering IT opioids at ED90 doses or higher, 100% (21/21) of IT hydromorphone and 95% (37/39) of IT morphine patients were satisfied with their analgesia. The ratio of IT morphine to IT hydromorphone for effective postcesarean analgesia is 2:1. Patient satisfaction was high with both medications.
Uncertainty estimation in diffusion MRI using the nonlocal bootstrap.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2014-08-01
In this paper, we propose a new bootstrap scheme, called the nonlocal bootstrap (NLB) for uncertainty estimation. In contrast to the residual bootstrap, which relies on a data model, or the repetition bootstrap, which requires repeated signal measurements, NLB is not restricted by the data structure imposed by a data model and obviates the need for time-consuming multiple acquisitions. NLB hinges on the observation that local imaging information recurs in an image. This self-similarity implies that imaging information coming from spatially distant (nonlocal) regions can be exploited for more effective estimation of statistics of interest. Evaluations using in silico data indicate that NLB produces distribution estimates that are in closer agreement with those generated using Monte Carlo simulations, compared with the conventional residual bootstrap. Evaluations using in vivo data demonstrate that NLB produces results that are in agreement with our knowledge on white matter architecture.
Establishing Keypoint Matches on Multimodal Images with Bootstrap Strategy and Global Information.
Li, Yong; Jin, Hongbin; Wu, Jiatao; Liu, Jie
2017-04-19
This paper proposes an algorithm for building keypoint matches on multimodal images by combining a bootstrap process and global information. The correct ratio of keypoint matches built with descriptors is typically very low on multimodal images of large spectral difference. To identify correct matches, global information is utilized for evaluating keypoint matches and a bootstrap technique is employed to reduce the computational cost. A keypoint match determines a transformation T and a similarity metric between the reference image and the test image transformed by T. The similarity metric encodes global information over the entire images, and hence a higher similarity indicates that the match brings more image content into alignment, implying it tends to be correct. Unfortunately, exhaustively evaluating triplets/quadruples of matches for an affine/projective transformation is computationally intractable when the number of keypoints is large. To reduce the computational cost, a bootstrap technique is employed that starts from single matches for a translation-and-rotation model and progresses to quadruples of matches for a projective model. The global information screens for "good" matches at each stage and the bootstrap strategy makes the screening process computationally feasible. Experimental results show that the proposed method can establish reliable keypoint matches on challenging multimodal images of strong multimodality.
Thanasarakhan, Wish; Kruanetr, Senee; Deming, Richard L; Liawruangrath, Boonsom; Wangkarn, Sunantha; Liawruangrath, Saisunee
2011-06-15
A sequential injection analysis (SIA) spectrophotometric method for determining tetracycline (TC), chlortetracycline (CTC) and oxytetracycline (OTC) in different sample matrices is described. The method was based on the reaction between tetracyclines and yttrium(III) in a weakly basic micellar medium, yielding light yellow complexes, which were monitored at 390, 392 and 395 nm, respectively. A cationic surfactant, cetyltrimethylammonium bromide (CTAB), was used to obtain the micellar system. The linear ranges of the calibration graphs were between 1.0 × 10(-5) and 4 × 10(-4) mol L(-1). The molar absorptivities were 5.24 × 10(5), 4.98 × 10(4) and 4.78 × 10(4) L mol(-1) cm(-1). The detection limits (3σ) were between 4.9 × 10(-6) and 7.8 × 10(-6) mol L(-1), whereas the limits of quantitation (10σ) were between 1.63 × 10(-5) and 2.60 × 10(-5) mol L(-1). The interday and intraday precisions within a week, expressed as relative standard deviations (R.S.D., n=11), were less than 4%. The method was rapid, with a sampling rate of over 60 samples h(-1) for the three drugs. The proposed method has been satisfactorily applied to the determination of tetracycline and its derivatives in pharmaceutical preparations, together with their residues in milk and honey samples collected in Chiang Mai Province. The accuracy was found to be high, as the Student's t-values were found to be less than the theoretical ones. The results compared favorably with those obtained by the conventional spectrophotometric method. Copyright © 2011 Elsevier B.V. All rights reserved.
Thorn, Graeme J; King, John R
2016-01-01
The Gram-positive bacterium Clostridium acetobutylicum is an anaerobic endospore-forming species which produces acetone, butanol and ethanol via the acetone-butanol (AB) fermentation process, leading to biofuels including butanol. In previous work we looked to estimate the parameters in an ordinary differential equation model of the glucose metabolism network using data from pH-controlled continuous culture experiments. Here we combine two approaches, namely the approximate Bayesian computation via an existing sequential Monte Carlo (ABC-SMC) method (to compute credible intervals for the parameters), and the profile likelihood estimation (PLE) (to improve the calculation of confidence intervals for the same parameters), the parameters in both cases being derived from experimental data from forward shift experiments. We also apply the ABC-SMC method to investigate which of the models introduced previously (one non-sporulation and four sporulation models) have the greatest strength of evidence. We find that the joint approximate posterior distribution of the parameters determines the same parameters as previously, including all of the basal and increased enzyme production rates and enzyme reaction activity parameters, as well as the Michaelis-Menten kinetic parameters for glucose ingestion, while other parameters are not as well-determined, particularly those connected with the internal metabolites acetyl-CoA, acetoacetyl-CoA and butyryl-CoA. We also find that the approximate posterior is strongly non-Gaussian, indicating that our previous assumption of elliptical contours of the distribution is not valid, which has the effect of reducing the numbers of pairs of parameters that are (linearly) correlated with each other. Calculations of confidence intervals using the PLE method back this up. Finally, we find that all five of our models are equally likely, given the data available at present. Copyright © 2015 Elsevier Inc. All rights reserved.
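The accept/reject step that ABC-SMC iterates (with a shrinking tolerance across generations) can be illustrated with plain rejection ABC on a toy model — a normal mean with known variance. The prior, tolerance, and summary statistic below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(2.0, 1.0, size=200)      # data generated with true mu = 2
s_obs = observed.mean()                        # summary statistic of the data

# one accept/reject sweep of the kind ABC-SMC iterates with shrinking tolerance:
# keep parameter draws whose simulated summary lands within eps of the observed one
eps = 0.1
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
accepted = [mu for mu in prior_draws
            if abs(rng.normal(mu, 1.0, size=200).mean() - s_obs) < eps]
posterior = np.array(accepted)                 # approximate posterior sample for mu
```

ABC-SMC improves on this by reusing the accepted particles as the proposal for the next, tighter-tolerance generation, which is what makes the approach feasible for the multi-parameter metabolic models described above.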
International Nuclear Information System (INIS)
Injang, Uthaitip; Noyrod, Peeyanun; Siangproh, Weena; Dungchai, Wijitar; Motomizu, Shoji; Chailapakul, Orawon
2010-01-01
A method for the simultaneous determination of Pb(II), Cd(II), and Zn(II) at low μg L⁻¹ concentration levels by sequential injection analysis-anodic stripping voltammetry (SIA-ASV) using screen-printed carbon nanotube electrodes (SPCNTE) was developed. A bismuth film was prepared by in situ plating of bismuth on the screen-printed carbon nanotube electrode. Operational parameters such as the ratio of carbon nanotubes to carbon ink, bismuth concentration, deposition time and flow rate during the preconcentration step were optimized. Under the optimal conditions, the linear ranges were found to be 2-100 μg L⁻¹ for Pb(II) and Cd(II), and 12-100 μg L⁻¹ for Zn(II). The limits of detection (S_bl/S = 3) were 0.2 μg L⁻¹ for Pb(II), 0.8 μg L⁻¹ for Cd(II) and 11 μg L⁻¹ for Zn(II). The measurement frequency was found to be 10-15 stripping cycles h⁻¹. The present method offers high sensitivity and high throughput for on-line monitoring of trace heavy metals. The practical utility of the method was also demonstrated by the determination of Pb(II), Cd(II), and Zn(II) in spiked herb samples. The results correlated well with ICP-AES data. We therefore propose a method that can be used for the automatic and sensitive evaluation of heavy metal contamination in herb products.
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general-purpose approach to statistical inference, the bootstrap has found wide application in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to inference based on asymptotic distribution theory. The purpose of this paper is to provide theoretical justification for the use of the bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
Phu, Jack; Bui, Bang V; Kalloniatis, Michael; Khuu, Sieu K
2018-03-01
The number of subjects needed to establish the normative limits for visual field (VF) testing is not known. Using bootstrap resampling, we determined whether the ground truth mean, distribution limits, and standard deviation (SD) could be approximated using different set size (x) levels, in order to provide guidance for the number of healthy subjects required to obtain robust VF normative data. We analyzed the 500 Humphrey Field Analyzer (HFA) SITA-Standard results of 116 healthy subjects and 100 HFA full threshold results of 100 psychophysically experienced healthy subjects. These VFs were resampled (bootstrapped) to determine the mean sensitivity, distribution limits (5th and 95th percentiles), and SD for different values of x and numbers of resamples. We also used the VF results of 122 glaucoma patients to determine the performance of ground truth and bootstrapped results in identifying and quantifying VF defects. An x of 150 (for SITA-Standard) and 60 (for full threshold) produced bootstrapped descriptive statistics that were no longer different from the original distribution limits and SD. Removing outliers produced similar results. Differences between original and bootstrapped limits in detecting glaucomatous defects were minimized at x = 250. Ground truth statistics of VF sensitivities could be approximated using set sizes that are significantly smaller than the original cohort. Outlier removal facilitates the use of Gaussian statistics and does not significantly affect the distribution limits. We provide guidance for choosing the cohort size for different levels of error when performing normative comparisons with glaucoma patients.
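The resampling scheme — drawing set-size-x bootstrap samples from the cohort and recomputing the distribution limits — can be sketched as follows. Synthetic sensitivities in dB stand in for the HFA data; the cohort size and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
cohort = rng.normal(30.0, 2.0, size=500)       # stand-in normative sensitivities (dB)

def bootstrapped_limits(x, n_boot=2000):
    """Mean 5th/95th percentile limits over bootstrap resamples of set size x."""
    resamples = cohort[rng.integers(0, cohort.size, size=(n_boot, x))]
    return (np.percentile(resamples, 5, axis=1).mean(),
            np.percentile(resamples, 95, axis=1).mean())

full_lo, full_hi = np.percentile(cohort, [5, 95])   # "ground truth" limits
lo150, hi150 = bootstrapped_limits(150)             # limits from set size x = 150
```

Sweeping x upward and stopping once the bootstrapped limits stabilize near the full-cohort values mirrors the study's procedure for choosing a minimal normative cohort size.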
Marill, Keith A; Chang, Yuchiao; Wong, Kim F; Friedman, Ari B
2017-08-01
Objectives: Assessing high-sensitivity tests for mortal illness is crucial in emergency and critical care medicine. Estimating the 95% confidence interval (CI) of the likelihood ratio (LR) can be challenging when sample sensitivity is 100%. We aimed to develop, compare, and automate a bootstrapping method to estimate the negative LR CI when sample sensitivity is 100%. Methods: The lowest population sensitivity that is most likely to yield sample sensitivity 100% is located using the binomial distribution. Random binomial samples generated using this population sensitivity are then used in the LR bootstrap. A free R program, "bootLR," automates the process. Extensive simulations were performed to determine how often the LR bootstrap and comparator method 95% CIs cover the true population negative LR value. Finally, the 95% CI was compared for theoretical sample sizes and sensitivities approaching and including 100% using: (1) a technique of individual extremes, (2) SAS software based on the technique of Gart and Nam, (3) the score CI (as implemented in StatXact, SAS, and the R PropCI package), and (4) the bootstrapping technique. Results: The bootstrapping approach demonstrates appropriate coverage of the nominal 95% CI over a spectrum of populations and sample sizes. Considering a study of sample size 200 with 100 patients with disease and specificity 60%, the lowest population sensitivity with median sample sensitivity 100% is 99.31%. When all 100 patients with disease test positive, the negative LR 95% CIs are: individual extremes technique (0, 0.073), StatXact (0, 0.064), SAS score method (0, 0.057), R PropCI (0, 0.062), and bootstrap (0, 0.048). Similar trends were observed for other sample sizes. Conclusions: When study samples demonstrate 100% sensitivity, available methods may yield inappropriately wide negative LR CIs. An alternative bootstrapping approach and an accompanying free open-source R package were developed to yield realistic estimates easily.
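The first step — locating the lowest population sensitivity whose median sample sensitivity is 100% — has a closed form, since the binomial median equals n exactly when P(X = n) = p^n ≥ 0.5. The sketch below is a simplified stand-in for what the bootLR package automates, not its actual code; the resampling scheme shown is an assumption:

```python
import numpy as np

n_dis, n_nodis, spec = 100, 100, 0.60          # study of the abstract's example
# lowest population sensitivity with median sample sensitivity 100%:
# P(X = n) = p**n >= 0.5  =>  p = 0.5**(1/n)
p_sens = 0.5 ** (1.0 / n_dis)                  # ~0.9931, matching the abstract

rng = np.random.default_rng(3)
neg_lr = []
for _ in range(5000):
    sens = rng.binomial(n_dis, p_sens) / n_dis
    sp = rng.binomial(n_nodis, spec) / n_nodis
    if sp > 0:                                 # guard against division by zero
        neg_lr.append((1.0 - sens) / sp)       # negative likelihood ratio
upper = np.percentile(neg_lr, 97.5)            # upper limit of the 95% CI
```

Because half the resamples reproduce 100% sample sensitivity, the lower CI limit is 0, and the bootstrap concentrates on a realistic upper limit instead of the wide bounds the exact methods give.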
MPBoot: fast phylogenetic maximum parsimony tree inference and bootstrap approximation.
Hoang, Diep Thi; Vinh, Le Sy; Flouri, Tomáš; Stamatakis, Alexandros; von Haeseler, Arndt; Minh, Bui Quang
2018-02-02
The nonparametric bootstrap is widely used to measure the branch support of phylogenetic trees. However, bootstrapping is computationally expensive and remains a bottleneck in phylogenetic analyses. Recently, an ultrafast bootstrap approximation (UFBoot) approach was proposed for maximum likelihood analyses. However, such an approach is still missing for maximum parsimony. To close this gap we present MPBoot, an adaptation and extension of UFBoot to compute branch supports under the maximum parsimony principle. MPBoot works for both uniform and non-uniform cost matrices. Our analyses on biological DNA and protein data showed that under uniform cost matrices, MPBoot runs on average 4.7 (DNA) to 7 times (protein data) (range: 1.2-20.7) faster than the standard parsimony bootstrap implemented in PAUP*, but 1.6 (DNA) to 4.1 times (protein data) slower than the standard bootstrap with a fast search routine in TNT (fast-TNT). However, for non-uniform cost matrices MPBoot is 5 (DNA) to 13 times (protein data) (range: 0.3-63.9) faster than fast-TNT. We note that MPBoot achieves better scores more frequently than PAUP* and fast-TNT. However, this effect is less pronounced if an intensive but slower search in TNT is invoked. Moreover, experiments on large-scale simulated data show that while both PAUP* and TNT bootstrap estimates are too conservative, MPBoot bootstrap estimates appear more unbiased. MPBoot provides an efficient alternative to the standard maximum parsimony bootstrap procedure. It shows favorable performance in terms of run time, the capability of finding a maximum parsimony tree, and high bootstrap accuracy on simulated as well as empirical data sets. MPBoot is easy to use, open-source and available at http://www.cibiv.at/software/mpboot .
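The underlying bootstrap idea — resample alignment sites with replacement and record how often each split wins — is independent of the search engine and can be sketched on a toy four-taxon case. The site counts and coding below are invented for illustration and are not from MPBoot:

```python
import numpy as np

rng = np.random.default_rng(7)
# parsimony-informative site patterns for four taxa, coded by the split they favor:
# 0 = AB|CD (25 sites), 1 = AC|BD (10 sites), 2 = AD|BC (5 sites)
sites = np.array([0] * 25 + [1] * 10 + [2] * 5)

def best_split(cols):
    """Split favored by the most sites, i.e. with the lowest parsimony score."""
    return int(np.bincount(cols, minlength=3).argmax())

# nonparametric bootstrap: resample sites with replacement, re-pick the best split
support = np.mean([best_split(rng.choice(sites, size=sites.size, replace=True)) == 0
                   for _ in range(1000)])
```

The expensive part in practice is that each replicate requires a tree search; approximations like UFBoot and MPBoot avoid re-running the full search per replicate.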
Šrámková, Ivana; Chocholouš, Petr; Sklenářová, Hana; Šatínský, Dalibor
2015-10-01
A novel approach for the automation of Micro-Extraction by Packed Sorbent (MEPS), a solid-phase extraction technique, is presented, enabling precise and repeatable liquid handling through the use of the sequential injection technique. The developed system was used for human urine sample clean-up and pre-concentration of betaxolol before its separation and determination. A commercial MEPS C-18 cartridge was integrated into an SIChrom™ system. The chromatographic separation was performed on a monolithic High Resolution C18 (50×4.6 mm) column which was coupled on-line with the micro-extraction unit using an additional selection valve. A mixture of acetonitrile and an aqueous solution of 0.5% triethylamine with acetic acid, pH adjusted to 4.5, in a 30:70 ratio was used as the mobile phase for elution of betaxolol from the MEPS cartridge directly onto the monolithic column, where the separation took place. Betaxolol was quantified by a fluorescence detector at wavelengths λ(ex)=220 nm and λ(em)=305 nm. A linear calibration range of 5-400 ng mL⁻¹, with a limit of detection of 1.5 ng mL⁻¹, a limit of quantification of 5 ng mL⁻¹ and correlation r=0.9998, was achieved for both the standard and urine matrix calibrations. The system recovery was 105±5%, 100±4% and 108±1% for three concentration levels of betaxolol (5, 20 and 200 ng mL⁻¹, respectively) in 10-fold diluted urine. Copyright © 2015 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Lemos Batista, Bruno; Lisboa Rodrigues, Jairo; Andrade Nunes, Juliana; Oliveira Souza, Vanessa Cristina de; Barbosa, Fernando
2009-01-01
Inductively coupled plasma mass spectrometry with quadrupole (q-ICP-MS) and dynamic reaction cell (DRC-ICP-MS) instrumentation was evaluated for the sequential determination of As, Cd, Co, Cr, Cu, Mn, Pb, Se, Tl, V and Zn in blood. The method requires as little as 100 μL of blood. Prior to analysis, samples (100 μL) were diluted 1:50 in a solution containing 0.01% (v/v) Triton X-100 and 0.5% (v/v) nitric acid. The use of the DRC was mandatory only for Cr, Cu, V and Zn; for the other elements the equipment may be operated in standard mode (q-ICP-MS). Ammonia was used as the reaction gas. Selection of the best ammonia gas flow rate and optimization of the quadrupole dynamic band-pass tuning parameter (RPq) were carried out using an ovine base blood for Cr and V and a synthetic matrix solution (SMS) for Zn and Cu, each diluted 1:50 and spiked to contain 1 μg L⁻¹ of each element. Method detection limits (3s) for 75As, 114Cd, 59Co, 51Cr, 63Cu, 55Mn, 208Pb, 82Se, 205Tl, 51V and 64Zn were 14.0, 3.0, 11.0, 7.0, 280, 9.0, 3.0, 264, 0.7, 6.0 and 800 ng L⁻¹, respectively. Method validation was accomplished by the analysis of blood reference materials produced by the L'Institut National de Sante Publique du Quebec (Canada).
Tayade, Amol B; Dhar, Priyanka; Kumar, Jatinder; Sharma, Manu; Chaurasia, Om P; Srivastava, Ravi B
2013-07-30
A rapid method was developed to determine both types of vitamins in Rhodiola imbricata root for the accurate quantification of the free vitamin forms. Rapid resolution liquid chromatography/tandem mass spectrometry (RRLC-MS/MS) with an electrospray ionization (ESI) source operating in multiple reaction monitoring (MRM) mode was optimized for the sequential analysis of nine water-soluble vitamins (B1, B2, two B3 vitamins, B5, B6, B7, B9, and B12) and six fat-soluble vitamins (A, E, D2, D3, K1, and K2). Both types of vitamins were separated by ion-suppression reversed-phase liquid chromatography with gradient elution within 30 min and detected in positive ion mode. Deviations in the intra- and inter-day precision were always below 0.6% for recoveries and 0.3% for retention time. Intra- and inter-day relative standard deviation (RSD) values of retention time for water- and fat-soluble vitamins ranged between 0.02-0.20% and 0.01-0.15%, respectively. The mean recoveries ranged between 88.95 and 107.07%. The sensitivity and specificity of this method allowed limits of detection (LOD) and limits of quantitation (LOQ) of the analytes at ppb levels. Linearity was achieved for fat- and water-soluble vitamins at 100-1000 ppb and 10-100 ppb, respectively. Vitamin B-complex and vitamin E were detected as the principal vitamins in the root of this adaptogen, which would be of great interest for developing novel foods from the Indian trans-Himalaya. Copyright © 2013 Elsevier B.V. All rights reserved.
Anthemidis, Aristidis N; Ioannou, Kallirroy-Ioanna G
2009-06-30
A simple, sensitive and powerful on-line sequential injection (SI) dispersive liquid-liquid microextraction (DLLME) system was developed as an alternative approach for on-line metal preconcentration and separation, using an extraction solvent at microlitre volumes. The potential of this novel scheme, coupled to flame atomic absorption spectrometry (FAAS), was demonstrated for trace copper and lead determination in water samples. The stream of methanol (disperser solvent) containing 2.0% (v/v) xylene (extraction solvent) and 0.3% (m/v) ammonium diethyldithiophosphate (chelating agent) was merged on-line with the stream of sample (aqueous phase), resulting in a cloudy mixture consisting of fine droplets of the extraction solvent dispersed throughout the aqueous phase. By this continuous process, metal chelate complexes were formed and extracted into the fine droplets of the extraction solvent. The hydrophobic droplets of the organic phase were retained in a microcolumn packed with PTFE turnings. A 300 μL portion of isobutyl methyl ketone was used for quantitative elution of the analytes, which were transported directly to the nebulizer of the FAAS. All the critical parameters of the system, such as the type of extraction solvent, the flow rates of disperser and sample, and the extraction time, as well as the chemical parameters, were studied. Under the optimum conditions the enhancement factors for copper and lead were 560 and 265, respectively. For copper, the detection limit and the precision (R.S.D.) were 0.04 μg L⁻¹ and 2.1% at 2.0 μg L⁻¹ Cu(II), respectively, while for lead they were 0.54 μg L⁻¹ and 1.9% at 30.0 μg L⁻¹ Pb(II), respectively. The developed method was evaluated by analyzing a certified reference material and applied successfully to the analysis of environmental water samples.
Horstkotte, Burkhard; Jarošová, Patrícia; Chocholouš, Petr; Sklenářová, Hana; Solich, Petr
2015-05-01
In this work, the applicability of Sequential Injection Chromatography for the determination of transition metals in water is evaluated for the separation of copper(II), zinc(II), and iron(II) cations. Separations were performed using a Dionex IonPAC™ guard column (50 mm × 2 mm i.d., 9 µm). The mobile phase composition and post-column reaction were optimized by the modified SIMPLEX method with a subsequent study of the concentration of each component. The mobile phase consisted of 2,6-pyridinedicarboxylic acid as the analyte-selective compound, sodium sulfate, and formic acid/sodium formate buffer. Post-column addition of 4-(2-pyridylazo)resorcinol was carried out for spectrophotometric detection of the analytes' complexes at 530 nm. Approaches to achieve higher robustness, baseline stability, and detection sensitivity by on-column stacking of the analytes and initial gradient implementation as well as air-cushion pressure damping for post-column reagent addition were studied. The method allowed the rapid separation of copper(II), zinc(II), and iron(II) within 6.5 min, including pump refilling and aspiration of sample and 1 mmol HNO3 for analyte stacking on the separation column. High sensitivity was achieved by applying an injection volume of up to 90 µL. A signal repeatability of <2% RSD of peak height was found. Analyte recovery evaluated by spiking of different natural water samples was well suited for routine analysis with sub-micromolar limits of detection. Copyright © 2015 Elsevier B.V. All rights reserved.
Bootstrapping Relational Affordances of Object Pairs Using Transfers
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert
2018-01-01
a tool to retrieve a desired object. We investigate how these relational affordances could be learned by a robot from its own action experience. A major challenge in this approach is to reduce the number of training samples needed to achieve accuracy, and hence we investigate an approach which can...... leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn random forest-based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach [direct bootstrapping (DB)], the state-space for a new...
DEFF Research Database (Denmark)
Hansen, Elo Harald; Miró, Manuel; Long, Xiangbao
2006-01-01
. In this context sequential injection (SI) and lab-on-valve (LOV) schemes have proven themselves as superb vehicles to act as front-end microanalytical methodologies, particularly when employing solid-phase extraction (SPE) procedures. In this communication, selected SPE-procedures in a bead-renewable fashion...... are presented as based on the exploitation of micro-sequential injection (μSI-LOV) using hydrophobic as well as hydrophilic bead materials. The examples given comprise the presentation of a universal approach for SPE-assays, front-end speciation of Cr(III) and Cr(VI) in a fully automated and enclosed set...
Conformal bootstrap in the Regge limit
Li, Daliang; Meltzer, David; Poland, David
2017-12-01
We analytically solve the conformal bootstrap equations in the Regge limit for large N conformal field theories. For theories with a parametrically large gap, the amplitude is dominated by spin-2 exchanges and we show how the crossing equations naturally lead to the construction of AdS exchange Witten diagrams. We also show how this is encoded in the anomalous dimensions of double-trace operators of large spin and large twist. We use the chaos bound to prove that the anomalous dimensions are negative. Extending these results to correlators containing two scalars and two conserved currents, we show how to reproduce the CEMZ constraint that the three-point function between two currents and one stress tensor only contains the structure given by Einstein-Maxwell theory in AdS, up to small corrections. Finally, we consider the case where operators of unbounded spin contribute to the Regge amplitude, whose net effect is captured by summing the leading Regge trajectory. We compute the resulting anomalous dimensions and corrections to OPE coefficients in the crossed channel and use the chaos bound to show that both are negative.
Quantum bootstrapping via compressed quantum Hamiltonian learning
International Nuclear Information System (INIS)
Wiebe, Nathan; Granade, Christopher; Cory, D G
2015-01-01
A major problem facing the development of quantum computers or large scale quantum simulators is that general methods for characterizing and controlling are intractable. We provide a new approach to this problem that uses small quantum simulators to efficiently characterize and learn control models for larger devices. Our protocol achieves this by using Bayesian inference in concert with Lieb–Robinson bounds and interactive quantum learning methods to achieve compressed simulations for characterization. We also show that the Lieb–Robinson velocity is epistemic for our protocol, meaning that information propagates at a rate that depends on the uncertainty in the system Hamiltonian. We illustrate the efficiency of our bootstrapping protocol by showing numerically that an 8 qubit Ising model simulator can be used to calibrate and control a 50 qubit Ising simulator while using only about 750 kilobits of experimental data. Finally, we provide upper bounds for the Fisher information that show that the number of experiments needed to characterize a system rapidly diverges as the duration of the experiments used in the characterization shrinks, which motivates the use of methods such as ours that do not require short evolution times. (fast track communication)
Prior Pronunciation Knowledge Bootstraps Word Learning
Directory of Open Access Journals (Sweden)
Khia Anne Johnson
2018-02-01
Learners often struggle with L2 sounds, yet little is known about the role of prior pronunciation knowledge and explicit articulatory training in language acquisition. This study asks whether existing pronunciation knowledge can bootstrap word learning, and whether short-term audiovisual articulatory training for tongue position, with and without a production component, has an effect on lexical retention. Participants were trained and tested on stimuli with perceptually salient segments that are challenging to produce. Results indicate that pronunciation knowledge plays an important role in word learning. While much about the extent and shape of this role remains unclear, this study sheds light on three main areas. First, prior pronunciation knowledge leads to increased accuracy in word learning, as all groups trended toward lower accuracy on pseudowords with two novel segments compared with those with one or none. Second, all training and control conditions followed similar patterns, with training neither aiding nor inhibiting retention; this is a noteworthy result, as previous work has found that the inclusion of production in training leads to decreased performance when testing for retention. Finally, higher production accuracy during practice led to higher retention after the word-learning task, indicating that individual differences and successful training are potentially important indicators of retention. This study provides support for the claim that pronunciation matters in L2 word learning.
Directory of Open Access Journals (Sweden)
José Antonio Castorina
2005-12-01
This paper examines Carey's theory of conceptual change. First, it describes the problem of conceptual reorganization in cognitive psychology and Carey's position. Second, it analyzes the epistemic conditions that children's "theories" must fulfil for conceptual restructuring to be possible, as well as the forms that restructuring takes. Third, it presents the findings of studies testing conceptual change among children's intuitive theories of biology. It then discusses the difficulties faced by other theories of conceptual change, in order to characterize bootstrapping as an alternative mechanism and establish its relevance for interpreting the results of the aforementioned studies. Finally, it evaluates the originality of the bootstrapping theory in the context of contemporary debates; in particular, it outlines a possible rapprochement with Piaget's dialectical theses.
Climate time series analysis classical statistical and bootstrap methods
Mudelsee, Manfred
2014-01-01
Written for climatologists and applied statisticians, this book explains the bootstrap algorithms (including novel adaptions) and methods for confidence interval construction. The accuracy of the algorithms is tested by means of Monte Carlo experiments.
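For serially correlated climate series, i.i.d. resampling understates uncertainty; block bootstrap variants of the kind the book covers preserve the autocorrelation. A minimal moving-block bootstrap sketch on a synthetic AR(1) series (the persistence, block length, and sample size are illustrative choices, not from the book):

```python
import numpy as np

rng = np.random.default_rng(6)
n, phi = 400, 0.6
x = np.empty(n)                                # AR(1) "climate" series with persistence
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def mbb_means(series, block_len, n_boot=2000):
    """Moving-block bootstrap distribution of the sample mean."""
    m = len(series)
    starts = rng.integers(0, m - block_len + 1, size=(n_boot, m // block_len))
    idx = starts[:, :, None] + np.arange(block_len)  # contiguous blocks of indices
    return series[idx].reshape(n_boot, -1).mean(axis=1)

se_iid = x[rng.integers(0, n, size=(2000, n))].mean(axis=1).std()  # too optimistic
se_mbb = mbb_means(x, block_len=20).std()      # honest, larger standard error
```

With phi = 0.6 the true standard error of the mean is roughly twice the i.i.d. value, and the block bootstrap recovers most of that inflation.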
'Bootstrap' Configuration for Multistage Pulse-Tube Coolers
Nguyen, Bich; Nguyen, Lauren
2008-01-01
A bootstrap configuration has been proposed for multistage pulse-tube coolers that, for instance, provide final-stage cooling to temperatures as low as 20 K. The bootstrap configuration supplants the conventional configuration, in which customarily the warm heat exchangers of all stages reject heat at ambient temperature. In the bootstrap configuration, the warm heat exchanger, the inertance tube, and the reservoir of each stage would be thermally anchored to the cold heat exchanger of the next warmer stage. The bootstrapped configuration is superior to the conventional setup, in some cases increasing the 20 K cooler's coefficient of performance two-fold over that of an otherwise equivalent conventional layout. The increased efficiency could translate into less power consumption, less cooler mass, and/or lower cost for a given amount of cooling.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
The main contribution of this paper is to propose a bootstrap method for inference on integrated volatility based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaging is done over all possible overlapping blocks of consecutive observations. The overlapping nature of the pre......-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995......)) is valid only when volatility is constant. The failure of the blocks of blocks bootstrap is due to the heterogeneity of the squared pre-averaged returns when volatility is stochastic. To preserve both the dependence and the heterogeneity of squared pre-averaged returns, we propose a novel procedure...
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results in which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap-based imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
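The contrast between categorizing model probabilities and imputing status from them can be seen in a small simulation. The beta distribution of probabilities and the 0.5 cutoff are illustrative assumptions, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50000
p_hat = rng.beta(0.5, 9.5, size=n)             # calibrated probabilities, mean ~0.05
status = rng.random(n) < p_hat                 # true disease status follows p_hat

prev_true = status.mean()
prev_cut = (p_hat > 0.5).mean()                # categorize at 0.5, then count: biased
# bootstrap-style imputation: repeatedly draw status from p_hat and average
prev_imp = np.mean([(rng.random(n) < p_hat).mean() for _ in range(100)])
```

Because most diseased patients sit at moderate probabilities below any cutoff, thresholding undercounts them severely, while averaging over imputed draws recovers the prevalence without bias.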
Bootstrap prediction and Bayesian prediction under misspecified models
Fushiki, Tadayoshi
2005-01-01
We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...
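Bootstrap (bagged) prediction in the sense above — averaging plug-in predictions over resamples, following Breiman's bagging recipe — can be sketched for a normal model. The data and grid are toy assumptions, not the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, size=30)              # small sample from the assumed model

def plug_in_density(sample, grid):
    """Plug-in predictive density: normal with estimated mean and SD."""
    m, s = sample.mean(), sample.std(ddof=1)
    return np.exp(-0.5 * ((grid - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

grid = np.linspace(-4.0, 4.0, 161)
plain = plug_in_density(x, grid)
# bootstrap (bagged) prediction: average plug-in densities over resamples
bagged = np.mean([plug_in_density(rng.choice(x, size=x.size, replace=True), grid)
                  for _ in range(500)], axis=0)
```

Averaging over resamples smears each plug-in density by the sampling variability of the parameter estimates, which is what makes the bootstrap prediction approximate the Bayesian predictive distribution when the model is true.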
Efficient generation of pronunciation dictionaries: machine learning factors during bootstrapping
CSIR Research Space (South Africa)
Davel, MH
2004-10-01
Several factors affect the efficiency of bootstrapping approaches to the generation of pronunciation dictionaries. We focus on factors related to the underlying rule-extraction algorithms, and demonstrate variants of the Dynamically Expanding Context algorithm which are beneficial...
Generalized bootstrap equations and possible implications for the NLO Odderon
Energy Technology Data Exchange (ETDEWEB)
Bartels, J. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Vacca, G.P. [INFN, Sezione di Bologna (Italy)
2013-07-15
We formulate and discuss generalized bootstrap equations in nonabelian gauge theories. They are shown to hold in the leading logarithmic approximation. Since their validity is related to the self-consistency of the Steinmann relations for inelastic production amplitudes, they can be expected to be valid also at NLO. Specializing to N=4 SYM, we show that the validity of these generalized bootstrap equations at NLO allows one to find the NLO Odderon solution with intercept exactly at one.
Bootstrap current calculations for TJ-II stellarator
Martinell, Julio J.; Camacho, Katia
2016-10-01
Bootstrap current in stellarators is usually very small since they operate solely with the magnetic confinement provided by the external currents. Since plasma pressure gradients are always present, the bootstrap current is always finite, but the magnetic design can be optimized to minimize it. In the heliac configuration there is no such optimization, and therefore it is important to estimate the actual bootstrap current generated by given pressure profiles. Here, we use the configuration of the TJ-II heliac to calculate the bootstrap current for various density regimes using the kinetic code DKES. We compute the monoenergetic transport coefficients D11 and D13 to find first the thermal ambipolar diffusion coefficients and the corresponding radial electric field, and then the respective bootstrap current. This is done using experimental density and electron and ion temperature profiles. In spite of the convergence problems of DKES at low collisionality, we can obtain bootstrap current values with acceptable uncertainties, without using Monte Carlo methods. The results are compared with axisymmetric neoclassical computations. The resulting rotational transform is used to obtain the location of rational surfaces and predict the transport barriers observed in the experiments. Funded by projects PAPIIT IN109115 and Conacyt 152905.
A Bootstrap Approach to Martian Manufacturing
Dorais, Gregory A.
2004-01-01
In-Situ Resource Utilization (ISRU) is an essential element of any affordable strategy for a sustained human presence on Mars. Ideally, Martian habitats would be extremely massive to allow plenty of room to comfortably live and work, as well as to protect the occupants from the environment. Transportation and power-generation systems would likewise be massive if mass were affordable. For our approach to ISRU, we use the industrialization of the U.S. as a metaphor. The 19th century started with small blacksmith shops and ended with massive steel mills, a transition accomplished primarily by blacksmiths increasing their production capacity and product size to create larger shops, which produced small mills, which in turn produced the large steel mills that industrialized the country. Most of the mass of a steel mill is comprised of steel in simple shapes, which are produced and repaired with few pieces of equipment, themselves mostly made of steel in basic shapes. Due to this simplicity, we expect that the 19th century manufacturing growth can be repeated on Mars in the 21st century using robots as the primary labor force. We suggest a "bootstrap" approach to manufacturing on Mars that uses a "seed" manufacturing system to process regolith into major structural components and spare parts. The regolith would be melted, foamed, and sintered as needed to fabricate parts using casting and solid freeform fabrication techniques. Complex components, such as electronics, would be brought from Earth and integrated as needed. These parts would be assembled to create additional manufacturing systems, which can be both more capable and of higher capacity. These subsequent manufacturing systems could refine vast amounts of raw materials to create large components, as well as assemble equipment, habitats, pressure vessels, cranes, pipelines, railways, trains, power generation stations, and other facilities needed to economically maintain a sustained human presence on Mars.
Deng, Nina; Allison, Jeroan J; Fang, Hua Julia; Ash, Arlene S; Ware, John E
2013-05-31
Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r ≤ 0.9).
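The RV and its percentile bootstrap interval can be sketched in a few lines. The data below are synthetic, and the stratified (within-group) resampling is one reasonable choice rather than necessarily the authors' exact scheme:

```python
import random
import statistics

def anova_f(groups):
    """One-way ANOVA F-statistic for a list of per-group score lists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

def bootstrap_rv_ci(comp, ref, b=500, alpha=0.05, seed=7):
    """Percentile bootstrap CI for RV = F(comparator) / F(reference).

    comp and ref are parallel lists of per-group score lists: patient i
    in group g has scores comp[g][i] and ref[g][i]. Patients are
    resampled with replacement within their clinical group."""
    rng = random.Random(seed)
    point = anova_f(comp) / anova_f(ref)
    reps = []
    for _ in range(b):
        c_rs, r_rs = [], []
        for cg, rg in zip(comp, ref):
            idx = [rng.randrange(len(cg)) for _ in cg]
            c_rs.append([cg[i] for i in idx])
            r_rs.append([rg[i] for i in idx])
        reps.append(anova_f(c_rs) / anova_f(r_rs))
    reps.sort()
    return point, (reps[int(b * alpha / 2)], reps[int(b * (1 - alpha / 2)) - 1])
```

An RV is judged significantly below 1 when the upper interval endpoint falls below 1.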
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed boundary tokamak plasmas. The total plasma current profile is supposed to have contributions of the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models for different viscosity coefficient calculations and distinct forms for the Coulomb collision operator is performed for a variety of plasma parameters of the small aspect ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters in the frame of the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages or possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogenous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location and quality. Based on our DEA results for variable returns to scale technology, the average technical efficiency score is 62%, and the mean scale efficiency is 88%, with nearly all units operating on the increasing returns to scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient, and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership, we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, and this reinforces our finding that Irish homes produce at increasing returns to scale. Also, notably, we find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
Assessment of bootstrap resampling performance for PET data.
Markiewicz, P J; Reader, A J; Matthews, J C
2015-01-07
Bootstrap resampling has been successfully used for estimation of statistical uncertainty of parameters such as tissue metabolism, blood flow or displacement fields for image registration. The performance of bootstrap resampling as applied to PET list-mode data of the human brain and dedicated phantoms is assessed in a novel and systematic way such that: (1) the assessment is carried out in two resampling stages: the 'real world' stage where multiple reference datasets of varying statistical level are generated and the 'bootstrap world' stage where corresponding bootstrap replicates are generated from the reference datasets. (2) All resampled datasets were reconstructed yielding images from which multiple voxel and regions of interest (ROI) values were extracted to form corresponding distributions between the two stages. (3) The difference between the distributions from both stages was quantified using the Jensen-Shannon divergence and the first four moments. It was found that the bootstrap distributions are consistently different to the real world distributions across the statistical levels. The difference was explained by a shift in the mean (up to 33% for voxels and 14% for ROIs) being proportional to the inverse square root of the statistical level (number of counts). Other moments were well replicated by the bootstrap although for very low statistical levels the estimation of the variance was poor. Therefore, the bootstrap method should be used with care when estimating systematic errors (bias) and variance when very low statistical levels are present such as in early time frames of dynamic acquisitions, when the underlying population may not be sufficiently represented.
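The Jensen-Shannon divergence used in step (3) to compare the 'real world' and 'bootstrap world' distributions is straightforward to compute on binned data. This is a generic sketch, not the authors' code:

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2, in bits) between two discrete
    distributions given as equal-length probability lists."""
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def histogram(samples, edges):
    """Normalised histogram of samples over bins defined by edges."""
    counts = [0] * (len(edges) - 1)
    for s in samples:
        for i in range(len(counts)):
            if edges[i] <= s < edges[i + 1]:
                counts[i] += 1
                break
    total = sum(counts) or 1
    return [c / total for c in counts]
```

Applied to binned voxel or ROI values from both stages, the divergence is zero for identical distributions and bounded above by 1 bit.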
Bootstrap-based Support of HGT Inferred by Maximum Parsimony
Directory of Open Access Journals (Sweden)
Nakhleh Luay
2010-05-01
Full Text Available Abstract Background Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
Bootstrap-based support of HGT inferred by maximum parsimony.
Park, Hyun Jung; Jin, Guohua; Nakhleh, Luay
2010-05-05
Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
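The support-estimation loop described here (resample alignment columns, re-infer, count) can be sketched generically. The inference function below is a toy stand-in for the parsimony-based network inference, which the paper implements in NEPAL:

```python
import random

def bootstrap_support(alignment, infer_events, b=100, seed=0):
    """Nonparametric bootstrap support for inferred events.

    alignment: list of equal-length sequences (rows = taxa).
    infer_events: callable mapping an alignment to a set of inferred
    events (here a stand-in for the parsimony-based network inference).
    Columns are resampled with replacement; the support of an event is
    the fraction of resamples in which it is re-inferred."""
    rng = random.Random(seed)
    ncols = len(alignment[0])
    counts = {}
    for _ in range(b):
        cols = [rng.randrange(ncols) for _ in range(ncols)]
        resampled = ["".join(row[c] for c in cols) for row in alignment]
        for event in infer_events(resampled):
            counts[event] = counts.get(event, 0) + 1
    return {e: c / b for e, c in counts.items()}
```

Events whose support falls below a chosen threshold would then be dropped from the inferred network.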
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and they require specialised statistical methods to properly account for their particular covariance structure. On the other hand, it is not unusual in practice that those data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainly. In doing so, it considers the constrained relationships between chemical concentrations originated from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
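A minimal version of the proposed scheme, with an imputation step inside the resampling loop, might look as follows. The uniform-below-detection-limit imputation is a deliberately simple stand-in for the model-based, log-ratio-aware imputation the paper recommends:

```python
import random
import statistics

def nondetect_bootstrap(values, dl, stat=statistics.mean, b=500, seed=3):
    """Bootstrap a statistic of concentrations containing nondetects.

    values: floats, with nondetects coded as None (below the detection
    limit dl). Each bootstrap resample imputes every nondetect with a
    fresh uniform draw on (0, dl), so that imputation uncertainty is
    propagated into the replicates. Returns a 95% percentile interval."""
    rng = random.Random(seed)
    reps = []
    for _ in range(b):
        sample = [rng.choice(values) for _ in values]
        imputed = [v if v is not None else rng.uniform(0.0, dl) for v in sample]
        reps.append(stat(imputed))
    reps.sort()
    return reps[int(0.025 * b)], reps[int(0.975 * b) - 1]
```

Because a new imputation is drawn in every replicate, the interval is wider than one obtained by imputing once and bootstrapping the completed data.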
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on the mathematical statistics method using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U, k = 2). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
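The Nordtest combination of within-laboratory reproducibility and bias, with a bootstrap step for the IQC component, can be sketched as follows. The inputs and the use of CV% throughout are illustrative assumptions, not the study's exact data layout:

```python
import math
import random
import statistics

def nordtest_mu(iqc_cv_values, eqa_biases, u_cref, b=1000, k=2, seed=5):
    """Nordtest-style expanded measurement uncertainty (%) with a
    bootstrap-smoothed within-lab component.

    u(Rw) is taken as the mean of bootstrap replicates of the IQC CV%;
    u(bias) combines the RMS of the EQA percentage biases with the
    reference-value uncertainty u_cref. Returns U = k * sqrt(u(Rw)^2 +
    u(bias)^2)."""
    rng = random.Random(seed)
    reps = []
    for _ in range(b):
        sample = [rng.choice(iqc_cv_values) for _ in iqc_cv_values]
        reps.append(statistics.mean(sample))
    u_rw = statistics.mean(reps)
    rms_bias = math.sqrt(sum(x * x for x in eqa_biases) / len(eqa_biases))
    u_bias = math.sqrt(rms_bias ** 2 + u_cref ** 2)
    return k * math.sqrt(u_rw ** 2 + u_bias ** 2)
```

With small IQC samples, the bootstrap distribution of u(Rw) also yields an interval for the MU estimate itself, which a single plug-in calculation cannot provide.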
Anderst, William J
2015-05-01
There is substantial inter-subject variability in intervertebral range of motion (ROM) in the cervical spine. This makes it difficult to define "normal" ROM, and to assess the effects of age, injury, and surgical procedures on spine kinematics. The objective of this study was to define normal intervertebral kinematics in the cervical spine during dynamic functional loading. Twenty-nine participants performed dynamic flexion/extension, axial rotation, and lateral bending while biplane radiographs were collected at 30 images/s. Vertebral motion was tracked with sub-millimeter accuracy using a validated volumetric model-based tracking process that matched subject-specific CT-based bone models to the radiographs. Gaussian point-by-point and bootstrap techniques were used to determine 90% prediction bands for the intervertebral kinematic curves at 1% intervals of each movement cycle. Cross validation was performed to estimate the true achieved coverage for each method. For a targeted coverage of 90%, the estimated true coverage using bootstrap prediction bands averaged 86±5%, while the estimated true coverage using Gaussian point-by-point intervals averaged 56±10% over all movements and all motion segments. Bootstrap prediction bands are recommended as the standard for evaluating full ROM cervical spine kinematic curves. The data presented here can be used to identify abnormal motion in patients presenting with neck pain, to drive computational models, and to assess the biofidelity of in vitro loading paradigms. Copyright © 2015 Elsevier Ltd. All rights reserved.
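A pointwise bootstrap prediction band over a family of curves can be sketched as below. This is a simplified stand-in for the paper's procedure, which targets 90% coverage and validates it by cross validation:

```python
import random

def bootstrap_prediction_band(curves, coverage=0.9, b=500, seed=2):
    """Pointwise bootstrap prediction band for a set of subject curves.

    curves: list of equal-length lists (one kinematic curve per
    subject). For each resample of subjects, pointwise lower/upper
    empirical quantiles are taken; the band is the average of those
    quantiles over resamples."""
    rng = random.Random(seed)
    n, t = len(curves), len(curves[0])
    lo_q, hi_q = (1 - coverage) / 2, 1 - (1 - coverage) / 2
    lo_sum, hi_sum = [0.0] * t, [0.0] * t
    for _ in range(b):
        sample = [rng.choice(curves) for _ in range(n)]
        for j in range(t):
            col = sorted(c[j] for c in sample)
            lo_sum[j] += col[int(lo_q * (n - 1))]
            hi_sum[j] += col[int(hi_q * (n - 1))]
    return [x / b for x in lo_sum], [x / b for x in hi_sum]
```

A patient's curve falling outside the band at some fraction of the movement cycle would flag potentially abnormal motion.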
Neural Network Computed Bootstrap Current for Real Time Control in DIII-D
Tema Biwole, Arsene; Smith, Sterling P.; Meneghini, Orso; Belli, Emily; Candy, Jeff
2017-10-01
In an effort to provide a fast and accurate calculation of the bootstrap current density for use as a constraint in real-time equilibrium reconstructions, we have developed a neural network (NN) non-linear regression of the NEO code calculated bootstrap current jBS. A new formulation for jBS in NEO allows for a determination of the coefficients on the density and temperature scale lengths. The new formulation reduces the number of inputs to the NN, and the number of output coefficients is 2 times the number of species (including electrons). The NN can reproduce the NEO and Sauter coefficients to a high degree of accuracy. The bootstrap current density calculated in NEO has been used as a constraint in an offline equilibrium reconstruction for comparison to the NN calculation. The computational time of this method (μs) makes it ideal for real-time calculation in DIII-D. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656, DE-FC02-06ER54873.
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
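The proposed test reduces to: compute a distance between the two summary histograms, then bootstrap from the pooled data to calibrate its significance. A minimal sketch with the Euclidean distance (the statistic the study found most suitable), applied to raw values rather than cloud-object footprints:

```python
import random

def euclidean_dist(p, q):
    """Euclidean distance between two normalised histograms."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def hist(samples, edges):
    """Normalised histogram of samples over bins defined by edges."""
    counts = [0] * (len(edges) - 1)
    for s in samples:
        for i in range(len(counts)):
            if edges[i] <= s < edges[i + 1]:
                counts[i] += 1
                break
    n = sum(counts) or 1
    return [c / n for c in counts]

def bootstrap_pvalue(x, y, edges, b=500, seed=4):
    """Bootstrap significance of the distance between two summary
    histograms: resample both groups from the pooled data (the null of
    a common parent distribution) and count how often the resampled
    distance reaches the observed one."""
    rng = random.Random(seed)
    observed = euclidean_dist(hist(x, edges), hist(y, edges))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(b):
        bx = [rng.choice(pooled) for _ in x]
        by = [rng.choice(pooled) for _ in y]
        if euclidean_dist(hist(bx, edges), hist(by, edges)) >= observed:
            hits += 1
    return (hits + 1) / (b + 1)
```

Swapping `euclidean_dist` for a Jeffries-Matusita or Kuiper distance changes only the test statistic, not the resampling scheme.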
Sequential charged particle reaction
International Nuclear Information System (INIS)
Hori, Jun-ichi; Ochiai, Kentaro; Sato, Satoshi; Yamauchi, Michinori; Nishitani, Takeo
2004-01-01
The effective cross sections for producing sequential reaction products in F82H, pure vanadium and LiF with 14.9-MeV neutrons were obtained and compared with estimated values. Since the sequential reactions depend on the behavior of the secondary charged particles, the effective cross sections depend on the target nuclei and the material composition. The effective cross sections were also estimated using the EAF libraries and compared with the experimental ones; there were large discrepancies between the estimated and experimental values. Additionally, we show the contribution of the sequential reactions to the induced activity and dose rate in the boundary region with water. The present study makes clear that the sequential reactions are of great importance in evaluating the dose rates around the surface of cooling pipes and the activated corrosion products. (author)
Bootstrap testing for cross-correlation under low firing activity.
González-Montoro, Aldana M; Cao, Ricardo; Espinosa, Nelson; Cudeiro, Javier; Mariño, Jorge
2015-06-01
A new cross-correlation synchrony index for neural activity is proposed. The index is based on the integration of the kernel estimation of the cross-correlation function. It is used to test for the dynamic synchronization levels of spontaneous neural activity under two induced brain states: sleep-like and awake-like. Two bootstrap resampling plans are proposed to approximate the distribution of the test statistics. The results of the first bootstrap method indicate that it is useful to discern significant differences in the synchronization dynamics of brain states characterized by a neural activity with low firing rate. The second bootstrap method is useful to unveil subtle differences in the synchronization levels of the awake-like state, depending on the activation pathway.
Point Set Denoising Using Bootstrap-Based Radial Basis Function.
Liew, Khang Jie; Ramli, Ahmad; Abd Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study.
Investigations of dipole localization accuracy in MEG using the bootstrap.
Darvas, F; Rautiainen, M; Pantazis, D; Baillet, S; Benali, H; Mosher, J C; Garnero, L; Leahy, R M
2005-04-01
We describe the use of the nonparametric bootstrap to investigate the accuracy of current dipole localization from magnetoencephalography (MEG) studies of event-related neural activity. The bootstrap is well suited to the analysis of event-related MEG data since the experiments are repeated tens or even hundreds of times and averaged to achieve acceptable signal-to-noise ratios (SNRs). The set of repetitions or epochs can be viewed as a set of independent realizations of the brain's response to the experiment. Bootstrap resamples can be generated by sampling with replacement from these epochs and averaging. In this study, we applied the bootstrap resampling technique to MEG data from somatotopic experimental and simulated data. Four fingers of the right and left hand of a healthy subject were electrically stimulated, and about 400 trials per stimulation were recorded and averaged in order to measure the somatotopic mapping of the fingers in the S1 area of the brain. Based on single-trial recordings for each finger we performed 5000 bootstrap resamples. We reconstructed dipoles from these resampled averages using the Recursively Applied and Projected (RAP)-MUSIC source localization algorithm. We also performed a simulation for two dipolar sources with overlapping time courses embedded in realistic background brain activity generated using the prestimulus segments of the somatotopic data. To find correspondences between multiple sources in each bootstrap, sample dipoles with similar time series and forward fields were assumed to represent the same source. These dipoles were then clustered by a Gaussian Mixture Model (GMM) clustering algorithm using their combined normalized time series and topographies as feature vectors. The mean and standard deviation of the dipole position and the dipole time series in each cluster were computed to provide estimates of the accuracy of the reconstructed source locations and time series.
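The epoch-resampling step is simple to sketch: draw epochs with replacement and average, once per bootstrap replicate. Single-channel toy data stand in for the 400-trial MEG recordings:

```python
import random

def bootstrap_epoch_averages(epochs, b=100, seed=6):
    """Resample single-trial epochs with replacement and average.

    epochs: list of equal-length lists (one recorded trial each; a
    single channel for simplicity). Returns b bootstrap-averaged
    responses, each playing the role of one resampled evoked field to
    which a source localizer such as RAP-MUSIC would then be applied."""
    rng = random.Random(seed)
    n, t = len(epochs), len(epochs[0])
    averages = []
    for _ in range(b):
        sample = [rng.choice(epochs) for _ in range(n)]
        averages.append([sum(e[j] for e in sample) / n for j in range(t)])
    return averages
```

The spread of dipole fits across these resampled averages estimates the localization uncertainty.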
Testing Process Factor Analysis Models Using the Parametric Bootstrap.
Zhang, Guangjian
2018-01-01
Process factor analysis (PFA) is a latent variable model for intensive longitudinal data. It combines P-technique factor analysis and time series analysis. A goodness-of-fit test for PFA is currently unavailable. In this paper, we propose a parametric bootstrap method for assessing model fit in PFA. We illustrate the test with an empirical data set in which 22 participants rated their affect every day over a period of 90 days. We also explore the Type I error and power of the parametric bootstrap test with simulated data.
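A parametric bootstrap goodness-of-fit test follows a generic recipe: fit the model, simulate replicate data sets from the fitted model, and compare the observed discrepancy against the simulated ones. The sketch below is model-agnostic; the usage exercises it with a plain normal model, not a PFA model:

```python
import random
import statistics

def parametric_bootstrap_gof(data, fit, simulate, discrepancy, b=200, seed=8):
    """Generic parametric bootstrap goodness-of-fit p-value.

    fit(data) -> parameter estimates; simulate(params, n, rng) -> a
    synthetic dataset of size n from the fitted model;
    discrepancy(data, params) -> a scalar that is large when the model
    fits poorly. A PFA application would plug in the fitted process
    factor model here."""
    rng = random.Random(seed)
    params = fit(data)
    observed = discrepancy(data, params)
    hits = 0
    for _ in range(b):
        sim = simulate(params, len(data), rng)
        if discrepancy(sim, fit(sim)) >= observed:
            hits += 1
    return (hits + 1) / (b + 1)
```

A small p-value means the observed discrepancy is larger than what the fitted model itself typically generates, i.e. evidence of misfit.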
A critique of astrophysical applications of Hagedorn's bootstrap
Nahm, W
1980-01-01
It has been shown that Hagedorn's bootstrap should not be applied to hadronic matter at densities large compared with nuclear densities. The correct predictions of the thermodynamical model do not use any relation between the mass of the fireballs and their size, whereas the astrophysical applications depend on the unreasonable assumption that the size is independent of the mass. The most spectacular prediction of the bootstrap, namely violent black hole explosions yielding 10^15 g in the last millisecond, is shown to be completely unfounded, even if such an assumption is made. (21 refs).
Bootstrapped efficiency measures of oil blocks in Angola
International Nuclear Information System (INIS)
Barros, C.P.; Assaf, A.
2009-01-01
This paper investigates the technical efficiency of Angolan oil blocks over the period 2002-2007. A double-bootstrap data envelopment analysis (DEA) model is adopted, comprising a DEA variable-returns-to-scale (VRS) model in the first stage, followed by a bootstrapped truncated regression in the second stage. Results showed that, on average, technical efficiency fluctuated over the period of study, but deep and ultradeep oil blocks generally maintained a consistent efficiency level. Policy implications are derived.
Directory of Open Access Journals (Sweden)
Kristina Novak Zelenika
2014-07-01
Full Text Available Geostatistical methods are very successfully used in modelling the Upper Miocene (Lower Pontian) Kloštar structure. Mapping of the two variables (porosity and thickness) and their joint examination at certain cut-off values gave insight into the location of the depositional channel, the transitional lithofacies, the direction of material transport, and the distribution of the variables within a representative Lower Pontian reservoir. It was possible to observe the direction of the turbidites and the role of the normal fault in the direction of detritus flow in the analyzed structure. The highest thicknesses were found at the intercalations between turbiditic sandstones and basinal pelitic marls. Sequential Indicator Simulations used porosity maps as the primary and thickness maps as the secondary (additional) data source (the paper is published in Croatian).
Sieve bootstrapping in the Lee-Carter model
Heinemann, A.
2013-01-01
This paper studies an alternative approach to construct confidence intervals for parameter estimates of the Lee-Carter model. First, the procedure of obtaining confidence intervals using regular nonparametric i.i.d. bootstrap is specified. Empirical evidence seems to invalidate this approach as it
Properties of bootstrap tests for N-of-1 studies.
Lin, Sharon X; Morrison, Leanne; Smith, Peter W F; Hargood, Charlie; Weal, Mark; Yardley, Lucy
2016-11-01
N-of-1 study designs involve the collection and analysis of repeated measures data from an individual not using an intervention and using an intervention. This study explores the use of semi-parametric and parametric bootstrap tests in the analysis of N-of-1 studies under a single time series framework in the presence of autocorrelation. When the Type I error rates of bootstrap tests are compared to Wald tests, our results show that the bootstrap tests have more desirable properties. We compare the results for normally distributed errors with those for contaminated normally distributed errors and find that, except when there is relatively large autocorrelation, there is little difference between the power of the parametric and semi-parametric bootstrap tests. We also experiment with two intervention designs: ABAB and AB, and show the ABAB design has more power. The results provide guidelines for designing N-of-1 studies, in the sense of how many observations and how many intervention changes are needed to achieve a certain level of power and which test should be performed. © 2016 The Authors British Journal of Mathematical and Statistical Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.
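A parametric bootstrap test for an AB (or ABAB) design under AR(1) errors can be sketched as follows. The estimation details (phase-mean residuals, a moment estimate of phi) are simple illustrative choices, not the paper's exact procedure:

```python
import random
import statistics

def ar1_series(n, phi, sigma, rng):
    """Simulate a zero-mean AR(1) error series of length n."""
    e = [rng.gauss(0, sigma)]
    for _ in range(n - 1):
        e.append(phi * e[-1] + rng.gauss(0, sigma))
    return e

def parametric_bootstrap_ab_test(y, phase, b=500, seed=11):
    """Parametric bootstrap test of an intervention effect in an AB or
    ABAB N-of-1 series, assuming AR(1) errors.

    y: observations; phase: 0/1 intervention labels. phi and sigma are
    estimated from residuals about the phase means; null series are
    then re-simulated as AR(1) noise around the grand mean."""
    a = [v for v, p in zip(y, phase) if p == 0]
    bv = [v for v, p in zip(y, phase) if p == 1]
    observed = abs(statistics.mean(bv) - statistics.mean(a))
    mu_a, mu_b, mu = statistics.mean(a), statistics.mean(bv), statistics.mean(y)
    resid = [v - (mu_b if p else mu_a) for v, p in zip(y, phase)]
    phi = sum(resid[i] * resid[i - 1] for i in range(1, len(resid))) / sum(r * r for r in resid)
    sigma = statistics.pstdev(resid) * (1 - phi ** 2) ** 0.5
    rng = random.Random(seed)
    hits = 0
    for _ in range(b):
        sim = [mu + e for e in ar1_series(len(y), phi, sigma, rng)]
        sa = [v for v, p in zip(sim, phase) if p == 0]
        sb = [v for v, p in zip(sim, phase) if p == 1]
        if abs(statistics.mean(sb) - statistics.mean(sa)) >= observed:
            hits += 1
    return (hits + 1) / (b + 1)
```

Varying the series length and the number of phase changes in `phase` gives a direct way to study the power trade-offs the abstract describes for AB versus ABAB designs.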
Finite-Size Effects for Some Bootstrap Percolation Models
Enter, A.C.D. van; Adler, Joan; Duarte, J.A.M.S.
The consequences of Schonmann's new proof that the critical threshold is unity for certain bootstrap percolation models are explored. It is shown that this proof provides an upper bound for the finite-size scaling in these systems. Comparison with data for one case demonstrates that this scaling
Bootstrap quantification of estimation uncertainties in network degree distributions.
Gel, Yulia R; Lyubchich, Vyacheslav; Ramirez Ramirez, L Leticia
2017-07-19
We propose a new method of nonparametric bootstrap to quantify estimation uncertainties in functions of the network degree distribution in large ultra-sparse networks. Both the network degree distribution and the network order are assumed to be unknown. The key idea is based on adapting the "blocking" argument, developed for bootstrapping of time series and re-tiling of spatial data, to random networks. We first sample a set of multiple ego networks of varying orders that form a patch, or a network-block analogue, and then resample the data within patches. To select an optimal patch size, we develop a new computationally efficient and data-driven cross-validation algorithm. The proposed fast patchwork bootstrap (FPB) methodology extends ideas developed for the case of the network mean degree to inference on a degree distribution. In addition, the FPB is substantially less computationally expensive, requires less information on the graph, and is free from nuisance parameters. In our simulation study, we show that the new bootstrap method outperforms competing approaches by providing sharper and better-calibrated confidence intervals for functions of a network degree distribution, including in the case of networks in an ultra-sparse regime. We illustrate the FPB in application to collaboration networks in statistics and computer science and to Wikipedia networks.
Automatic shape model building based on principal geodesic analysis bootstrapping
DEFF Research Database (Denmark)
Dam, Erik B; Fletcher, P Thomas; Pizer, Stephen M
2008-01-01
shape representation is deformed into the training shapes followed by computation of the shape mean and modes of shape variation. In the first iteration, a generic shape model is used as starting point - in the following iterations in the bootstrap method, the resulting mean and modes from the previous...
Metastability thresholds for anisotropic bootstrap percolation in three dimensions
Van Enter, A.C.D.; Fey, A.
2012-01-01
In this paper we analyze several anisotropic bootstrap percolation models in three dimensions. We present the order of magnitude for the metastability thresholds for a fairly general class of models. In our proofs, we use an adaptation of the technique of dimensional reduction. We find that the
Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference
Olea, R.A.; Pardo-Iguzquiza, E.
2011-01-01
The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first is a synthetic, isotropic, exhaustive sample following a normal distribution; the second is also synthetic but follows a non-Gaussian random field; and the third is an empirical sample consisting of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, the distributions of estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from the bootstrap. © 2010 International Association for Mathematical Geosciences.
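The LU-based resampling step this abstract refers to can be sketched as follows. This is a minimal illustration under assumed settings (an exponential covariance model on a 1-D transect with invented parameters), not the authors' implementation; NumPy's Cholesky factor plays the role of the LU decomposition for the symmetric covariance matrix, and multiplying it by white noise yields a resample with the prescribed spatial correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample locations on a 1-D transect and an assumed
# exponential covariance model C(h) = sill * exp(-3h / range).
coords = np.sort(rng.uniform(0, 100, size=60))
sill, vrange = 1.0, 30.0
h = np.abs(coords[:, None] - coords[None, :])        # lag (distance) matrix
C = sill * np.exp(-3.0 * h / vrange)                 # covariance matrix

# Decompose C = L L^T; small jitter on the diagonal for numerical stability.
L = np.linalg.cholesky(C + 1e-10 * np.eye(C.shape[0]))

def correlated_resample():
    """One spatially correlated resample: L times white noise."""
    return L @ rng.standard_normal(coords.size)

def semivariance(z, lag=10.0, tol=2.0):
    """Empirical semivariance at one lag, over pairs within a tolerance bin."""
    mask = np.triu(np.abs(h - lag) < tol, k=1)
    i, j = np.nonzero(mask)
    return 0.5 * np.mean((z[i] - z[j]) ** 2)

# Spread of the empirical semivariogram at one lag across resamples.
gammas = np.array([semivariance(correlated_resample()) for _ in range(500)])
print(f"semivariance at lag 10: mean {gammas.mean():.2f}, sd {gammas.std():.2f}")
```

The spread of `gammas` is exactly the kind of single-lag sampling variability the paper's confidence intervals quantify.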
A Bootstrap Cointegration Rank Test for Panels of VAR Models
DEFF Research Database (Denmark)
Callot, Laurent
functions of the individual Cointegrated VARs (CVAR) models. A bootstrap based procedure is used to compute empirical distributions of the trace test statistics for these individual models. From these empirical distributions two panel trace test statistics are constructed. The satisfying small sample...
Finite-size effects for anisotropic bootstrap percolation : Logarithmic corrections
van Enter, Aernout C. D.; Hulshof, Tim
In this note we analyse an anisotropic, two-dimensional bootstrap percolation model introduced by Gravner and Griffeath. We present upper and lower bounds on the finite-size effects. We discuss the similarities with the semi-oriented model introduced by Duarte.
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages...
Bootstrap confidence intervals for model-based surveys | Ouma ...
African Journals Online (AJOL)
To deal with the problem, Chambers and Dorfman (1994) suggested an alternative method based on the bootstrap methodology. Their method is meant for model-based surveys. It starts by assuming a simple linear regression model as a working model in which the ratio estimator is optimal for estimating the population total.
Bootstrap confidence intervals for three-way methods
Kiers, Henk A.L.
Results from exploratory three-way analysis techniques such as CANDECOMP/PARAFAC and Tucker3 analysis are usually presented without giving insight into uncertainties due to sampling. Here a bootstrap procedure is proposed that produces percentile intervals for all output parameters. Special
Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method
Directory of Open Access Journals (Sweden)
Yi-Ming Hu
2013-01-01
Full Text Available The hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series of the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example, about 40~50 years in China. Generally, samples of small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. Using the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on this sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the uncertainties of the estimation.
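The resampling scheme described above can be sketched directly. The example below is illustrative only: it assumes a Gumbel (EV1) distribution for annual maxima fitted by the method of moments, and invents a synthetic 45-year record; the paper's choice of distribution and estimator may differ.

```python
import numpy as np

rng = np.random.default_rng(42)
EULER = 0.5772156649  # Euler-Mascheroni constant

# Hypothetical 45-year annual-maximum flood record (m^3/s), drawn here from
# a Gumbel distribution with invented parameters.
true_loc, true_scale = 500.0, 120.0
u = rng.uniform(size=45)
sample = true_loc - true_scale * np.log(-np.log(u))

def gumbel_quantile(x, return_period=100):
    """Method-of-moments Gumbel fit, then the T-year design value."""
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = x.mean() - EULER * scale
    p = 1.0 - 1.0 / return_period           # non-exceedance probability
    return loc - scale * np.log(-np.log(p))

point = gumbel_quantile(sample)

# Bootstrap the sampling distribution of the 100-year design value.
B = 2000
boot = np.array([gumbel_quantile(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"100-year design value {point:.0f}, 95% interval [{lo:.0f}, {hi:.0f}]")
```

The width of the percentile interval is the quantified sampling uncertainty the paper describes; with only 45 observations it is typically substantial relative to the point estimate.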
Bootstrapping the energy flow in the beginning of life.
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in
Sidecoin: a snapshot mechanism for bootstrapping a blockchain
Krug, Joseph; Peterson, Jack
2015-01-01
Sidecoin is a mechanism that allows a snapshot to be taken of Bitcoin's blockchain. We compile a list of Bitcoin's unspent transaction outputs, then use these outputs and their corresponding balances to bootstrap a new blockchain. This allows the preservation of Bitcoin's economic state in the context of a new blockchain, which may provide new features and technical innovations.
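Conceptually, the snapshot step reduces to aggregating unspent outputs into per-address balances that seed the new chain's genesis state. A toy sketch with invented data follows; a real snapshot would parse Bitcoin's chainstate and handle output scripts, not bare address strings.

```python
from collections import defaultdict

# Invented, simplified unspent-output records for illustration only.
utxos = [
    {"address": "addr1", "value": 50},
    {"address": "addr2", "value": 25},
    {"address": "addr1", "value": 10},
]

# Aggregate balances per address to seed the new chain's genesis allocation,
# preserving the snapshotted economic state.
genesis_allocation = defaultdict(int)
for out in utxos:
    genesis_allocation[out["address"]] += out["value"]

print(dict(genesis_allocation))
```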
Adaptive Kernel In The Bootstrap Boosting Algorithm In KDE ...
African Journals Online (AJOL)
This paper proposes the use of adaptive kernel in a bootstrap boosting algorithm in kernel density estimation. The algorithm is a bias reduction scheme like other existing schemes but uses adaptive kernel instead of the regular fixed kernels. An empirical study for this scheme is conducted and the findings are comparatively ...
Assessing blood flow control through a bootstrap method
Simpson, D.M.; Panerai, R.B.; Ramos, E.G.; Lopes, J.M.A.; Villar Marinatto, M.N.; Nadal, J.; Evans, D.H.
2004-01-01
In order to assess blood flow control, the relationship between blood pressure and blood flow can be modeled by linear filters. We present a bootstrap method, which allows the statistical analysis of an index of blood flow control that is obtained from constrained system identification using an established set of pre-defined filters.
Integrable deformations of conformal theories and bootstrap trees
International Nuclear Information System (INIS)
Mussardo, G.
1991-01-01
I present recent results in the study of massive integrable quantum field theories in (1+1) dimensions considered as perturbed conformal minimal models. The on mass-shell properties of such theories, with a particular emphasis on the bootstrap principle, are investigated. (orig.)
Spurious 99% bootstrap and jackknife support for unsupported clades.
Simmons, Mark P; Freudenstein, John V
2011-10-01
Quantifying branch support using the bootstrap and/or jackknife is generally considered to be an essential component of rigorous parsimony and maximum likelihood phylogenetic analyses. Previous authors have described how application of the frequency-within-replicates approach to treating multiple equally optimal trees found in a given bootstrap pseudoreplicate can provide apparent support for otherwise unsupported clades. We demonstrate how a similar problem may occur when a non-representative subset of equally optimal trees is held per pseudoreplicate, which we term the undersampling-within-replicates artifact. We illustrate the frequency-within-replicates and undersampling-within-replicates bootstrap and jackknife artifacts using both contrived and empirical examples, demonstrate that the artifacts can occur in both parsimony and likelihood analyses, and show that the artifacts occur in outputs from multiple different phylogenetic-inference programs. Based on our results, we make the following five recommendations, which are particularly relevant to supermatrix analyses, but apply to all phylogenetic analyses. First, when two or more optimal trees are found in a given pseudoreplicate they should be summarized using the strict-consensus rather than the frequency-within-replicates approach. Second, jackknife resampling should be used rather than bootstrap resampling. Third, multiple tree searches holding multiple trees per search should be conducted in each pseudoreplicate rather than conducting only a single search and holding only a single tree. Fourth, branches with a minimum possible optimized length of zero should be collapsed within each tree search rather than collapsing branches only if their maximum possible optimized length is zero. Fifth, resampling values should be mapped onto the strict consensus of all optimal trees found rather than simply presenting the ≥ 50% bootstrap or jackknife tree or mapping the resampling values onto a single optimal tree.
Sequential stochastic optimization
Cairoli, Renzo
1996-01-01
Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-paramet
Plowshare sequential device test
Energy Technology Data Exchange (ETDEWEB)
Ballou, L. B.
1971-08-02
For over a year we have been advocating the development of a hardened or ruggedized version of Diamond which will be suitable for sequential detonation of multiple explosives in one emplacement hole. A Plowshare-sponsored device development test, named 'Yacht', is proposed for execution in Area 15 at the Nevada Test Site [NTS] in late September 1972. The test is designed to evaluate the ability of a ruggedized Diamond-type explosive assembly to withstand the effects of an adjacent nuclear detonation in the same emplacement hole and then be sequentially fired. The objectives and experimental plan for this concept are provided.
Spinella, Sarah
2011-01-01
Because result replicability is essential to science and difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
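A heuristic bootstrap example of the kind this abstract describes is easy to sketch: resample the data with replacement, recompute the statistic each time, and read the uncertainty directly off the replicates, without the distributional assumptions behind NHSST. The data below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sample: reaction times (ms) from a small study.
x = np.array([312, 287, 301, 350, 295, 333, 281, 322, 308, 290], dtype=float)

# Resample with replacement and recompute the mean for each replicate;
# the spread of the replicates estimates the sampling variability.
B = 5000
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(B)])

ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"mean {x.mean():.1f}, 95% bootstrap CI [{ci_low:.1f}, {ci_high:.1f}]")
```

Reporting the interval, rather than a binary reject/fail-to-reject decision, is the shift in emphasis the paper argues for.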
RANDOM QUADRATIC-FORMS AND THE BOOTSTRAP FOR U-STATISTICS
DEHLING, H; MIKOSCH, T
1994-01-01
We study the bootstrap distribution for U-statistics with special emphasis on the degenerate case. For the Efron bootstrap we give a short proof of the consistency using Mallows' metrics. We also study the i.i.d. weighted bootstrap [formula not reproduced in the source], where (X(i)) and (xi(i)) are two i.i.d. sequences,
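For the non-degenerate case, the Efron bootstrap of a U-statistic can be sketched as follows; the degenerate case emphasized in the paper requires a different scaling and is not captured by this naive resampling. The kernel here, the Gini mean difference, is just an illustrative choice with simulated data.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)

def u_stat(x, h):
    """U-statistic: average of a symmetric kernel h over all pairs."""
    vals = [h(x[i], x[j]) for i, j in combinations(range(len(x)), 2)]
    return float(np.mean(vals))

h = lambda a, b: abs(a - b)   # Gini mean difference kernel
x = rng.normal(size=40)       # simulated standard normal sample

theta_hat = u_stat(x, h)

# Efron bootstrap: resample observations i.i.d. with replacement and
# recompute the U-statistic on each resample.
B = 500
reps = np.array([u_stat(rng.choice(x, size=x.size, replace=True), h)
                 for _ in range(B)])
se = reps.std(ddof=1)
print(f"Gini mean difference {theta_hat:.3f}, bootstrap SE {se:.3f}")
```

For a standard normal population the kernel's expectation is 2/√π ≈ 1.13, so the point estimate should land near that value.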
International Nuclear Information System (INIS)
Wang, Guodong; He, Zhen; Xue, Li; Cui, Qingan; Lv, Shanshan; Zhou, Panpan
2017-01-01
Factors which significantly affect product reliability are of great interest to reliability practitioners. This paper proposes a bootstrap-based methodology for identifying significant factors when both the location and scale parameters of the smallest extreme value distribution vary over experimental factors. An industrial thermostat experiment is presented, analyzed, and discussed as an illustrative example. The analysis results show that 1) misspecifying a constant scale parameter may lead to misidentifying spurious effects; 2) the important factors identified by different bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping) are different; 3) the number of factors that significantly affect the 10th percentile lifetime is less than the number of important factors identified at the 63.21st percentile. - Highlights: • Product reliability is improved by design of experiments when both the location and scale parameters of the smallest extreme value distribution vary with experimental factors. • A bootstrap-based methodology is proposed to identify important factors which significantly affect the 100pth lifetime percentile. • Bootstrap confidence intervals associated with experimental factors are obtained using three bootstrap methods (i.e., percentile bootstrapping, bias-corrected percentile bootstrapping, and bias-corrected and accelerated percentile bootstrapping). • The important factors identified by different bootstrap methods are different. • The number of factors that significantly affect the 10th percentile is less than the number of important factors identified at the 63.21st percentile.
Sequential memory: Binding dynamics
Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail
2015-10-01
Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of various features of an event together and the maintaining of multimodality events in sequential order are the key components of any sequential memories—episodic, semantic, working, etc. We study a robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
Language specific bootstraps for UG categories
van Kampen, N.J.
2005-01-01
This paper argues that the universal categories N/V are not applied to content words before the grammatical markings for reference D(eterminers) and predication I(nflection) have been acquired (van Kampen, 1997, contra Pinker, 1984). Child grammar starts as proto-grammar with language-specific
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2002-01-01
A sequential injection (SI) on-line matrix removal and trace metal preconcentration procedure using a novel microcolumn packed with PTFE beads is described, and demonstrated for trace cadmium analysis with detection by electrothermal atomic absorption spectrometry (ETAAS). The analyte is initially complexed with diethyldithiophosphate (DDPA) and adsorbed onto the column, which is afterwards eluted with 50 μl of ethanol and subsequently introduced via air segmentation into the graphite tube for quantification. The ETAAS determination is synchronized with sample pre-treatment in the SI system. At a sampling frequency of …/h, quantitative adsorption of cadmium (99% retention efficiency) and an enrichment factor of 59.4 were obtained, as compared with only 46.7% and 28.0 using a knotted reactor of internal surface area similar to that of the packed column. The detection limits and precision (RSD, 0.1 μg/l Cd) are at the same levels, i...
A sequential tree approach for incremental sequential pattern mining
Indian Academy of Sciences (India)
Data mining; STISPM; sequential tree; incremental mining; backward tracking. Abstract. "Sequential pattern mining" is a prominent and significant method to explore the knowledge and innovation from the large database. Common sequential pattern mining algorithms handle static databases. Pragmatically, looking into the ...
A proof of fulfillment of the strong bootstrap condition
International Nuclear Information System (INIS)
Fadin, V.S.; Papa, A.
2002-01-01
It is shown that the kernel of the BFKL equation for the octet color state of two Reggeized gluons satisfies the strong bootstrap condition in the next-to-leading order. This condition is much more restrictive than the one obtained from the requirement of the Reggeized form for the elastic scattering amplitudes in the next-to-leading approximation. It is necessary, however, for self-consistency of the assumption of the Reggeized form of the production amplitudes in multi-Regge kinematics, which are used in the derivation of the BFKL equation. The fulfillment of the strong bootstrap condition for the kernel opens the way to a rigorous proof of the BFKL equation in the next-to-leading approximation. (author)
A bootstrap lunar base: Preliminary design review 2
1987-01-01
A bootstrap lunar base is the gateway to manned solar system exploration and requires new ideas and new designs on the cutting edge of technology. A preliminary design for a Bootstrap Lunar Base, the second provided by this contractor, is presented. An overview of the work completed is discussed as well as the technical, management, and cost strategies to complete the program requirements. The lunar base design stresses the transforming capabilities of its lander vehicles to aid in base construction. The design also emphasizes modularity and expandability in the base configuration to support the long-term goals of scientific research and profitable lunar resource exploitation. To successfully construct, develop, and inhabit a permanent lunar base, however, several technological advancements must first be realized. Some of these technological advancements are also discussed.
On Comparison of Stochastic Reserving Methods with Bootstrapping
Directory of Open Access Journals (Sweden)
Liivika Tee
2017-01-01
Full Text Available We consider the well-known stochastic reserve estimation methods on the basis of generalized linear models, such as the (over-dispersed) Poisson model, the gamma model and the log-normal model. For the likely variability of the claims reserve, the bootstrap method is considered. In the bootstrapping framework, we discuss the choice of residuals, namely the Pearson residuals, the deviance residuals and the Anscombe residuals. In addition, several possible residual adjustments are discussed and compared in a case study. We carry out a practical implementation and comparison of methods using real-life insurance data to estimate reserves and their prediction errors. We propose to consider proper scoring rules for model validation, and the assessments will be drawn from an extensive case study.
A Double Parametric Bootstrap Test for Topic Models
Seto, Skyler; Tan, Sarah; Hooker, Giles; Wells, Martin T.
2017-01-01
Non-negative matrix factorization (NMF) is a technique for finding latent representations of data. The method has been applied to corpora to construct topic models. However, NMF has likelihood assumptions which are often violated by real document corpora. We present a double parametric bootstrap test for evaluating the fit of an NMF-based topic model based on the duality of the KL divergence and Poisson maximum likelihood estimation. The test correctly identifies whether a topic model based o...
Higgs Critical Exponents and Conformal Bootstrap in Four Dimensions
DEFF Research Database (Denmark)
Antipin, Oleg; Mølgaard, Esben; Sannino, Francesco
2015-01-01
We investigate relevant properties of composite operators emerging in nonsupersymmetric, four-dimensional gauge-Yukawa theories with interacting conformal fixed points within a precise framework. The theories investigated in this work are structurally similar to the standard model of particle int...... bootstrap results are then compared to precise four dimensional conformal field theoretical results. To accomplish this, it was necessary to calculate explicitly the crossing symmetry relations for the global symmetry group SU($N$)$\\times$SU($N$)....
'Bootstrap' charging of surfaces composed of multiple materials
Stannard, P. R.; Katz, I.; Parks, D. E.
1981-01-01
The paper examines the charging of a checkerboard array of two materials, only one of which tends to acquire a negative potential alone, using the NASA Charging Analyzer Program (NASCAP). The influence of the charging material's field causes the otherwise 'non-charging' material to acquire a negative potential due to the suppression of its secondary emission ('bootstrap' charging). The NASCAP predictions for the equilibrium potential difference between the two materials are compared to results based on an analytical model.
TruSDN: Bootstrapping Trust in Cloud Network Infrastructure
Paladi, Nicolae; Gehrmann, Christian
2017-01-01
Software-Defined Networking (SDN) is a novel architectural model for cloud network infrastructure, improving resource utilization, scalability and administration. SDN deployments increasingly rely on virtual switches executing on commodity operating systems with large code bases, which are prime targets for adversaries attacking the network infrastructure. We describe and implement TruSDN, a framework for bootstrapping trust in SDN infrastructure using Intel Software Guard Extensions (SGX),...
Necessary Condition for Emergent Symmetry from the Conformal Bootstrap.
Nakayama, Yu; Ohtsuki, Tomoki
2016-09-23
We use the conformal bootstrap program to derive the necessary conditions for emergent symmetry enhancement from discrete symmetry (e.g., Z_{n}) to continuous symmetry [e.g., U(1)] under the renormalization group flow. In three dimensions, in order for Z_{2} symmetry to be enhanced to U(1) symmetry, the conformal bootstrap program predicts that the scaling dimension of the order parameter field at the infrared conformal fixed point must satisfy Δ_{1}>1.08. We also obtain the similar necessary conditions for Z_{3} symmetry with Δ_{1}>0.580 and Z_{4} symmetry with Δ_{1}>0.504 from the simultaneous conformal bootstrap analysis of multiple four-point functions. As applications, we show that our necessary conditions impose severe constraints on the nature of the chiral phase transition in QCD, the deconfinement criticality in Néel valence bond solid transitions, and anisotropic deformations in critical O(n) models. We prove that some fixed points proposed in the literature are unstable under the perturbation that cannot be forbidden by the discrete symmetry. In these situations, the second-order phase transition with enhanced symmetry cannot happen.
Bootstrap method of interior-branch test for phylogenetic trees.
Sitnikova, T
1996-04-01
Statistical properties of the bootstrap test of interior branch lengths of phylogenetic trees have been studied and compared with those of the standard interior-branch test in computer simulations. Examination of the properties of the tests under the null hypothesis showed that both tests for an interior branch of a predetermined topology are quite reliable when the distribution of the branch length estimate approaches a normal distribution. Unlike the standard interior-branch test, the bootstrap test appears to retain this property even when the substitution rate varies among sites. In this case, the distribution of the branch length estimate deviates from a normal distribution, and the standard interior-branch test gives conservative confidence probability values. A simple correction method was developed for both interior-branch tests to be applied for testing the reliability of tree topologies estimated from sequence data. This correction for the standard interior-branch test appears to be as effective as that obtained in our previous study, though it is much simpler. The bootstrap and standard interior-branch tests for estimated topologies become conservative as the number of sequence groups in a star-like tree increases.
Wang, Huai-Chun; Susko, Edward; Roger, Andrew J
2016-12-01
Assessing the robustness of an inferred phylogeny is an important element of phylogenetics. This is typically done with measures of stability at the internal branches and the variation of the positions of the leaf nodes. The bootstrap support for branches in maximum parsimony, distance and maximum likelihood estimation, or posterior probabilities in Bayesian inference, measure the uncertainty about a branch due to the sampling of the sites from genes or sampling genes from genomes. However, these measures do not reveal how taxon sampling affects branch support and the effects of taxon sampling on the estimated phylogeny. An internal branch in a phylogenetic tree can be viewed as a split that separates the taxa into two nonempty complementary subsets. We develop several split-specific measures of stability determined from bootstrap support for quartets. These include BPtaxon_split (average bootstrap percentage [BP] for all quartets involving a taxon within a split), BPsplit (BPtaxon_split averaged over taxa), BPtaxon (BPtaxon_split averaged over splits) and RBIC-taxon (average BP over all splits after removing a taxon). We also develop a pruned-tree distance metric. Application of our measures to empirical and simulated data illustrates that existing measures of overall stability can fail to detect taxa that are the primary source of a split-specific instability. Moreover, we show that the use of many reduced sets of quartets is important in being able to detect the influence of joint sets of taxa rather than individual taxa. These new measures are valuable diagnostic tools to guide taxon sampling in phylogenetic experimental design. Copyright © 2016 Elsevier Inc. All rights reserved.
Sequential measurements of conjugate observables
Energy Technology Data Exchange (ETDEWEB)
Carmeli, Claudio [Dipartimento di Fisica, Universita di Genova, Via Dodecaneso 33, 16146 Genova (Italy); Heinosaari, Teiko [Department of Physics and Astronomy, Turku Centre for Quantum Physics, University of Turku, 20014 Turku (Finland); Toigo, Alessandro, E-mail: claudio.carmeli@gmail.com, E-mail: teiko.heinosaari@utu.fi, E-mail: alessandro.toigo@polimi.it [Dipartimento di Matematica 'Francesco Brioschi', Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)
2011-07-15
We present a unified treatment of sequential measurements of two conjugate observables. Our approach is to derive a mathematical structure theorem for all the relevant covariant instruments. As a consequence of this result, we show that every Weyl-Heisenberg covariant observable can be implemented as a sequential measurement of two conjugate observables. This method is applicable both in finite- and infinite-dimensional Hilbert spaces, therefore covering sequential spin component measurements as well as position-momentum sequential measurements.
Bootstrapping the Three-Loop Hexagon
Energy Technology Data Exchange (ETDEWEB)
Dixon, Lance J.; /CERN /SLAC; Drummond, James M.; /CERN /Annecy, LAPTH; Henn, Johannes M.; /Humboldt U., Berlin /Santa Barbara, KITP
2011-11-08
We consider the hexagonal Wilson loop dual to the six-point MHV amplitude in planar N = 4 super Yang-Mills theory. We apply constraints from the operator product expansion in the near-collinear limit to the symbol of the remainder function at three loops. Using these constraints, and assuming a natural ansatz for the symbol's entries, we determine the symbol up to just two undetermined constants. In the multi-Regge limit, both constants drop out from the symbol, enabling us to make a non-trivial confirmation of the BFKL prediction for the leading-log approximation. This result provides a strong consistency check of both our ansatz for the symbol and the duality between Wilson loops and MHV amplitudes. Furthermore, we predict the form of the full three-loop remainder function in the multi-Regge limit, beyond the leading-log approximation, up to a few constants representing terms not detected by the symbol. Our results confirm an all-loop prediction for the real part of the remainder function in multi-Regge 3 → 3 scattering. In the multi-Regge limit, our result for the remainder function can be expressed entirely in terms of classical polylogarithms. For generic six-point kinematics other functions are required.
Forced Sequence Sequential Decoding
DEFF Research Database (Denmark)
Jensen, Ole Riis
is possible as low as Eb/No=0.6 dB, which is about 1.7 dB below the signal-to-noise ratio that marks the cut-off rate for the convolutional code. This is possible since the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability...
The Index of Biological Integrity and the bootstrap revisited: an example from Minnesota streams
Dolph, Christine L.; Sheshukov, Aleksey Y.; Chizinski, Christopher J.; Vondracek, Bruce C.; Wilson, Bruce
2010-01-01
Multimetric indices, such as the Index of Biological Integrity (IBI), are increasingly used by management agencies to determine whether surface water quality is impaired. However, important questions about the variability of these indices have not been thoroughly addressed in the scientific literature. In this study, we used a bootstrap approach to quantify variability associated with fish IBIs developed for streams in two Minnesota river basins. We further placed this variability into a management context by comparing it to impairment thresholds currently used in water quality determinations for Minnesota streams. We found that 95% confidence intervals ranged as high as 40 points for IBIs scored on a 0–100 point scale. However, on average, 90% of IBI scores calculated from bootstrap replicate samples for a given stream site yielded the same impairment status as the original IBI score. We suggest that sampling variability in IBI scores is related to both the number of fish and the number of rare taxa in a field collection. A comparison of the effects of different scoring methods on IBI variability indicates that a continuous scoring method may reduce the amount of bias in IBI scores.
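The bootstrap approach described above — resample the field collection of fish with replacement, rescore each replicate, and read off a percentile interval — can be sketched as follows. The catch data and the toy scoring function are entirely illustrative (a real IBI combines many metrics); only the resampling logic follows the study. Note how the single rare taxon makes replicate scores jump between two values, mirroring the paper's point that rare taxa drive sampling variability.

```python
import random

random.seed(1)

def ibi_score(catch):
    # Toy stand-in for an IBI metric: taxa richness scaled to a 0-100 scale.
    return min(100, 10 * len(set(catch)))

# Hypothetical field collection: three common taxa and one rare taxon.
catch = ["darter"] * 30 + ["shiner"] * 20 + ["sucker"] * 5 + ["madtom"] * 1

# Bootstrap: resample the fish collection with replacement, rescore each replicate.
scores = sorted(ibi_score(random.choices(catch, k=len(catch))) for _ in range(1000))
lo, hi = scores[24], scores[974]  # approximate 95% percentile interval
print(lo, hi)
```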
DEFF Research Database (Denmark)
Long, Xiangbao; Miró, Manuel; Hansen, Elo Harald
A new sample pretreatment approach is presented for selective and sensitive determination of trace metals via electrothermal atomic absorption spectrometry (ETAAS), based on the principle of bead injection (BI) with renewable reagent-loaded hydrophobic beads in a Sequential Injection-Lab-on-Valve (SI-LOV) mode. The methodology uses poly(styrene-divinylbenzene) beads containing pendant octadecyl moieties (C18-PS/DVB), which are pre-impregnated with a selective organic metal chelating agent prior to the automatic manipulation of the beads in the microbore conduits of the LOV unit. By adapting ... in selecting the most favourable elution mode in order to attain the highest sensitivity. Cr(VI) is used as model analyte to demonstrate the potential of the SI-BI-LOV scheme, in which 1,5-diphenylcarbazide (DPC) is pre-impregnated on the beads, the surface of which serves as the active microzone. It is shown ...
International Nuclear Information System (INIS)
Tekula-Buxbaum, P.
1981-01-01
An indirect atomic-absorption spectrophotometric method based on selective extraction of heteropolymolybdic acids has been developed for determination of small quantities of P and As in high-purity tungsten metal and tungsten compounds. The method is suitable for determination of 5-100 ppm of phosphorus and arsenic. The relative standard deviation is 38-5% for P and 31-3% for As, depending on the concentrations. (auth.)
Dobolyi, David G; Dodson, Chad S
2013-12-01
Confidence judgments for eyewitness identifications play an integral role in determining guilt during legal proceedings. Past research has shown that confidence in positive identifications is strongly associated with accuracy. Using a standard lineup recognition paradigm, we investigated accuracy using signal detection and ROC analyses, along with the tendency to choose a face with both simultaneous and sequential lineups. We replicated past findings of reduced rates of choosing with sequential as compared to simultaneous lineups, but notably found an accuracy advantage in favor of simultaneous lineups. Moreover, our analysis of the confidence-accuracy relationship revealed two key findings. First, we observed a sequential mistaken identification overconfidence effect: despite an overall reduction in false alarms, confidence for false alarms that did occur was higher with sequential lineups than with simultaneous lineups, with no differences in confidence for correct identifications. This sequential mistaken identification overconfidence effect is an expected byproduct of the use of a more conservative identification criterion with sequential than with simultaneous lineups. Second, we found a steady drop in confidence for mistaken identifications (i.e., foil identifications and false alarms) from the first to the last face in sequential lineups, whereas confidence in and accuracy of correct identifications remained relatively stable. Overall, we observed that sequential lineups are both less accurate and produce higher confidence false identifications than do simultaneous lineups. Given the increasing prominence of sequential lineups in our legal system, our data argue for increased scrutiny and possibly a wholesale reevaluation of this lineup format. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Lozano, J C; Tomé, F Vera; Rodríguez, P Blanco; Prieto, C
2010-01-01
A procedure for the determination of ²¹⁰Pb, and alpha-emitting radioisotopes of uranium, thorium, and radium from the same aliquot of a sample has been proposed. The key step consisted in the recovery of Pb(II) and Ra by precipitation of insoluble Pb(NO₃)₂, the uranium and thorium radioisotopes remaining in solution. Afterwards, the fractions were handled by specific, well-consolidated procedures. Lead-210 was determined by the LSC technique while the uranium, thorium, and radium radioisotopes were measured with silicon alpha-spectrometers. The procedure was applied to a reference sample and several environmental samples, obtaining satisfactory results. Copyright 2009 Elsevier Ltd. All rights reserved.
Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle
2013-01-01
A version of the nonparametric bootstrap that resamples entire subjects from the original data, called the case bootstrap, has been increasingly used for estimating the uncertainty of parameters in mixed-effects models. It is usually applied to obtain more robust estimates of the parameters and more realistic confidence intervals (CIs). Alternative bootstrap methods, such as the residual bootstrap and the parametric bootstrap, which resample both random effects and residuals, have been proposed to better take into account the hierarchical structure of multi-level and longitudinal data. However, few studies have compared these different approaches. In this study, we used simulation to evaluate the bootstrap methods proposed for linear mixed-effects models. We also compared the results obtained by maximum likelihood (ML) and restricted maximum likelihood (REML). Our simulation studies evidenced the good performance of the case bootstrap as well as the bootstraps of both random effects and residuals. On the other hand, the bootstrap methods that resample only the residuals, and the bootstraps combining case and residuals, performed poorly. REML and ML provided similar bootstrap estimates of uncertainty, but there was slightly more bias and a poorer coverage rate for variance parameters with ML in the sparse design. We applied the proposed methods to a real dataset from a study investigating the natural evolution of Parkinson's disease and were able to confirm that the methods provide plausible estimates of uncertainty. Given that most real-life datasets tend to exhibit heterogeneity in sampling schedules, the residual bootstraps would be expected to perform better than the case bootstrap. Copyright © 2013 John Wiley & Sons, Ltd.
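The defining feature of the case bootstrap is that whole subjects — each with all of their repeated observations — are resampled, which preserves the within-subject correlation that observation-level resampling would destroy. A minimal sketch, with made-up longitudinal data and the grand mean as a stand-in for a fitted model parameter:

```python
import random

random.seed(0)

# Toy longitudinal data: each subject carries all of its repeated measurements.
data = {
    "subj1": [5.1, 5.4, 5.9],
    "subj2": [4.8, 5.0, 5.2],
    "subj3": [6.0, 6.3, 6.1],
    "subj4": [5.5, 5.6, 5.8],
}

def case_bootstrap(data, n_boot=500):
    """Case bootstrap: resample whole subjects (not observations) with
    replacement, so the hierarchical structure of the data is preserved."""
    ids = list(data)
    stats = []
    for _ in range(n_boot):
        sample_ids = random.choices(ids, k=len(ids))
        obs = [x for sid in sample_ids for x in data[sid]]
        stats.append(sum(obs) / len(obs))  # re-estimate the statistic here
    return stats

stats = sorted(case_bootstrap(data))
ci = (stats[12], stats[487])  # approximate 95% percentile CI of the grand mean
print(ci)
```

In a real mixed-effects analysis, the statistic recomputed on each replicate would be the fitted model's parameter estimates rather than a simple mean.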
International Nuclear Information System (INIS)
Santos, Inês C.; Mesquita, Raquel B.R.; Rangel, António O.S.S.
2015-01-01
This work describes the development of a solid phase spectrophotometry method in a μSI-LOV system for cadmium, zinc, and copper determination in freshwaters. NTA (Nitrilotriacetic acid) beads with 60–160 μm diameter were packed in the flow cell of the LOV for a μSPE column of 1 cm length. The spectrophotometric determination is based on the colourimetric reaction between dithizone and the target metals, previously retained on NTA resin. The absorbance of the coloured product formed is measured, at 550 nm, on the surface of the NTA resin beads in a solid phase spectrophotometry approach. The developed method presented preconcentration factors in the range of 11–21 for the metal ions. A LOD of 0.23 μg L⁻¹ for cadmium, 2.39 μg L⁻¹ for zinc, and 0.11 μg L⁻¹ for copper and a sampling rate of 12, 13, and 15 h⁻¹ for cadmium, zinc, and copper were obtained, respectively. The proposed method was successfully applied to freshwater samples. - Highlights: • Multi-parametric determination of cadmium, zinc, and copper at the μg L⁻¹ level. • In-line metal ions preconcentration using NTA resin. • Minimization of matrix interferences by performing solid phase spectrometry in a SI-LOV platform. • Successful application to metal ions determination in freshwaters.
Bootstrap resampling approach to disaggregate analysis of road crashes in Hong Kong.
Pei, Xin; Sze, N N; Wong, S C; Yao, Danya
2016-10-01
Road safety affects health and development worldwide; thus, it is essential to examine the factors that influence crashes and injuries. As the relationships between crashes, crash severity, and possible risk factors can vary depending on the type of collision, we attempt to develop separate prediction models for different crash types (i.e., single- versus multi-vehicle crashes and slight injury versus killed and serious injury crashes). Taking advantage of the availability of crash and traffic data disaggregated by time and space, it is possible to identify the factors that may contribute to crash risks in Hong Kong, including traffic flow, road design, and weather conditions. To remove the effects of excess zeros on prediction performance in a highly disaggregated crash prediction model, a bootstrap resampling method is applied. The results indicate that more accurate and reliable parameter estimates, with reduced standard errors, can be obtained with the use of a bootstrap resampling method. Results revealed that factors including rainfall, geometric design, traffic control, and temporal variations all determined the crash risk and crash severity. This helps to shed light on the development of remedial engineering and traffic management and control measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Synthetic Aperture Sequential Beamforming
DEFF Research Database (Denmark)
Kortbek, Jacob; Jensen, Jørgen Arendt; Gammelmark, Kim Løkke
2008-01-01
A synthetic aperture focusing (SAF) technique denoted Synthetic Aperture Sequential Beamforming (SASB), suitable for 2D and 3D imaging, is presented. The technique differs from prior SAF approaches in that SAF is performed on pre-beamformed data rather than on channel data. The objective...... is stored. The second stage applies the focused image lines from the first stage as input data. The SASB method has been investigated using simulations in Field II and by off-line processing of data acquired with a commercial scanner. The performance of SASB with a static image object is compared with DRF...
DEFF Research Database (Denmark)
Long, Xiangbao; Miró, Manuel; Hansen, Elo Harald
2005-01-01
...favourable elution mode in order to attain the highest sensitivity. The potential of the SI-BI-LOV scheme is demonstrated by taking Cr(VI) as a model analyte, using a 1,5-diphenylcarbazide (DPC)-loaded bead column as the active microzone. As this reaction requires the use of high acidity, it is also shown that the bead material exhibits excellent chemical stability at low pH values. On-line pH sample adjustment prevents alteration of the original distribution of chromium species while assuring fast rates for the DPC-Cr(VI) reaction. The proposed procedure was successfully applied to the determination of trace levels of Cr(VI) in natural waters containing high levels of dissolved salts (such as seawater and hard tap water) without requiring any dilution step. Method validation was performed by determination of total chromium in an NIST standard reference material (NIST 1640, natural water) after Cr...
DEFF Research Database (Denmark)
Sørensen, P.; Jensen, E.S.
1991-01-01
A novel diffusion method was used for preparation of NH4+- and NO3--N samples from soil extracts for N-15 determination. Ammonium, and nitrate following reduction to ammonia, are allowed to diffuse to an acid-wetted glass filter enclosed in polytetrafluoroethylene tape. The method was evaluated w......-degrees-C). Owing to the presence of inorganic nitrogen impurities in the potassium chloride, the N-15 enrichments should be corrected for the blank nitrogen content....
Energy Technology Data Exchange (ETDEWEB)
Santos, Inês C. [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal); Mesquita, Raquel B.R. [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal); Laboratório de Hidrobiologia, Instituto de Ciências Biomédicas Abel Salazar (ICBAS), Universidade do Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal); Rangel, António O.S.S., E-mail: arangel@porto.ucp.pt [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal)
2015-09-03
This work describes the development of a solid phase spectrophotometry method in a μSI-LOV system for cadmium, zinc, and copper determination in freshwaters. NTA (Nitrilotriacetic acid) beads with 60–160 μm diameter were packed in the flow cell of the LOV for a μSPE column of 1 cm length. The spectrophotometric determination is based on the colourimetric reaction between dithizone and the target metals, previously retained on NTA resin. The absorbance of the coloured product formed is measured, at 550 nm, on the surface of the NTA resin beads in a solid phase spectrophotometry approach. The developed method presented preconcentration factors in the range of 11–21 for the metal ions. A LOD of 0.23 μg L⁻¹ for cadmium, 2.39 μg L⁻¹ for zinc, and 0.11 μg L⁻¹ for copper and a sampling rate of 12, 13, and 15 h⁻¹ for cadmium, zinc, and copper were obtained, respectively. The proposed method was successfully applied to freshwater samples. - Highlights: • Multi-parametric determination of cadmium, zinc, and copper at the μg L⁻¹ level. • In-line metal ions preconcentration using NTA resin. • Minimization of matrix interferences by performing solid phase spectrometry in a SI-LOV platform. • Successful application to metal ions determination in freshwaters.
Palazón-Bru, Antonio; Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of the cases of chromosome 15q24 microdeletion syndrome was published. However, this study did not include inferential statistics. The aims of the present study were to update the literature search and calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion of the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotype and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentiles method, we found wide variability in the prevalence of the different phenotypes (3-100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35-3.10 Mb]). In comparison with our work, which expanded the literature search by 45 months, there were differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease.
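The bootstrap percentiles method used above for the phenotype prevalences can be sketched in a few lines. The case counts below are hypothetical (the study pooled 36 published case reports; the 27/9 split here is invented for illustration), but the percentile-CI mechanics are standard.

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical phenotype indicators for 36 case reports (1 = phenotype present).
cases = [1] * 27 + [0] * 9  # observed prevalence 75%

def percentile_ci(data, stat=mean, n_boot=2000, alpha=0.05):
    """Bootstrap percentile-method CI: resample with replacement, recompute the
    statistic, and take the empirical alpha/2 and 1-alpha/2 quantiles."""
    boots = sorted(stat(random.choices(data, k=len(data))) for _ in range(n_boot))
    lo = boots[int(n_boot * alpha / 2)]
    hi = boots[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = percentile_ci(cases)
print(round(lo, 2), round(hi, 2))  # interval around the 75% observed prevalence
```

The same function, applied to the deletion sizes instead of the 0/1 indicators, would yield the CI reported for the mean interstitial deletion.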
Gudicha, Dereje W; Schmittmann, Verena D; Tekle, Fetene B; Vermunt, Jeroen K
2016-01-01
The latent Markov (LM) model is a popular method for identifying distinct unobserved states and transitions between these states over time in longitudinally observed responses. The bootstrap likelihood-ratio (BLR) test yields the most rigorous test for determining the number of latent states, yet little is known about power analysis for this test. Power could be computed as the proportion of the bootstrap p values (PBP) for which the null hypothesis is rejected. This requires performing the full bootstrap procedure for a large number of samples generated from the model under the alternative hypothesis, which is computationally infeasible in most situations. This article presents a computationally feasible shortcut method for power computation for the BLR test. The shortcut method involves the following simple steps: (1) obtaining the parameters of the model under the null hypothesis, (2) constructing the empirical distributions of the likelihood ratio under the null and alternative hypotheses via Monte Carlo simulations, and (3) using these empirical distributions to compute the power. We evaluate the performance of the shortcut method by comparing it to the PBP method and, moreover, show how the shortcut method can be used for sample-size determination.
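Steps (2) and (3) of the shortcut can be sketched directly: simulate likelihood-ratio statistics under the null and alternative, take the empirical critical value from the null distribution, and compute power as the rejection fraction under the alternative. The LR-generating functions below are toy stand-ins (a squared normal, not an actual latent Markov fit), so only the power-computation logic follows the article.

```python
import random

random.seed(7)

# Toy stand-ins for the model-based LR statistic: under H0 the LR is the
# square of a standard normal; under H1, the square of a shifted normal.
def lr_h0():
    return random.gauss(0, 1) ** 2

def lr_h1(shift=2.5):
    return random.gauss(shift, 1) ** 2

n_mc = 5000
h0 = sorted(lr_h0() for _ in range(n_mc))   # step 2: empirical null distribution
h1 = [lr_h1() for _ in range(n_mc)]         # step 2: empirical alternative

crit = h0[int(0.95 * n_mc)]                 # empirical 5%-level critical value
power = sum(x > crit for x in h1) / n_mc    # step 3: P(reject | H1)
print(round(crit, 2), round(power, 2))
```

The saving over the PBP method is that no nested bootstrap is run inside each alternative-hypothesis sample; one null and one alternative Monte Carlo set suffice.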
Weighted bootstrapping: a correction method for assessing the robustness of phylogenetic trees
Directory of Open Access Journals (Sweden)
Makarenkov Vladimir
2010-08-01
Background: Non-parametric bootstrapping is a widely-used statistical procedure for assessing the confidence of model parameters based on the empirical distribution of the observed data [1] and, as such, it has become a common method for assessing tree confidence in phylogenetics [2]. Traditional non-parametric bootstrapping does not weight each tree inferred from resampled (i.e., pseudo-replicated) sequences. Hence, the quality of these trees is not taken into account when computing bootstrap scores associated with the clades of the original phylogeny. As a consequence, trees with different bootstrap support, or those providing a different fit to the corresponding pseudo-replicated sequences (the fit quality can be expressed through the LS, ML, or parsimony score), traditionally contribute in the same way to the computation of the bootstrap support of the original phylogeny. Results: In this article, we discuss the idea of applying weighted bootstrapping to phylogenetic reconstruction by weighting each phylogeny inferred from resampled sequences. Tree weights can be based either on the least-squares (LS) tree estimate or on the average secondary bootstrap score (SBS) associated with each resampled tree. Secondary bootstrapping consists of estimating the bootstrap scores of the trees inferred from resampled data. The LS- and SBS-based bootstrapping procedures were designed to take into account the quality of each pseudo-replicated phylogeny in the final tree estimation. A simulation study was carried out to evaluate the performance of five weighting strategies: LS-based and SBS-based bootstrapping, LS-based and SBS-based bootstrapping with data normalization, and traditional unweighted bootstrapping. Conclusions: The simulations conducted with two real data sets and the five weighting strategies suggest that SBS-based bootstrapping with data normalization usually exhibits larger bootstrap scores and a higher robustness
Collins, Jon W; Heyward Hull, J; Dumond, Julie B
2017-12-01
Sparse tissue sampling with intensive plasma sampling creates a unique data analysis problem in determining drug exposure in clinically relevant tissues. Tissue exposure may govern drug efficacy, as many drugs exert their actions in tissues. We compared tissue area-under-the-curve (AUC) generated from bootstrapped noncompartmental analysis (NCA) methods and compartmental nonlinear mixed effect (NLME) modeling. A model of observed data after single-dose tenofovir disoproxil fumarate was used to simulate plasma and tissue concentrations for two destructive tissue sampling schemes. Two groups of 100 data sets with densely-sampled plasma and one tissue sample per individual were created. The bootstrapped NCA (SAS 9.3) used a trapezoidal method to calculate geometric mean tissue AUC per dataset. For NLME, individual post hoc estimates of tissue AUC were determined, and the geometric mean from each dataset calculated. Median normalized prediction error (NPE) and absolute normalized prediction error (ANPE) were calculated for each method from the true values of the modeled concentrations. Both methods produced similar tissue AUC estimates close to true values. Although the NLME-generated AUC estimates had larger NPEs, it had smaller ANPEs. Overall, NLME NPEs showed AUC under-prediction but improved precision and fewer outliers. The bootstrapped NCA method produced more accurate estimates but with some NPEs > 100%. In general, NLME is preferred, as it accommodates less intensive tissue sampling with reasonable results, and provides simulation capabilities for optimizing tissue distribution. However, if the main goal is an accurate AUC for the studied scenario, and relatively intense tissue sampling is feasible, the NCA bootstrap method is a reasonable, and potentially less time-intensive solution.
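The bootstrapped-NCA arm of the comparison can be sketched as follows: with destructive sampling, each time point's tissue concentrations come from different individuals, so individuals are resampled within each time point before the trapezoidal AUC is computed. The concentrations, times, and counts below are invented for illustration (they are not the tenofovir data); only the resample-then-integrate logic follows the method.

```python
import random

random.seed(3)

# Hypothetical destructive tissue sampling: at each time (h), concentrations
# from three different individuals.
times = [1, 4, 8, 24]
tissue = {1: [12.0, 10.5, 13.1], 4: [8.2, 9.0, 7.5],
          8: [5.1, 4.4, 5.8], 24: [1.2, 0.9, 1.5]}

def trapz_auc(t, c):
    """Linear trapezoidal AUC over the sampled interval."""
    return sum((c[i] + c[i + 1]) / 2 * (t[i + 1] - t[i]) for i in range(len(t) - 1))

def boot_auc():
    # Resample individuals within each time point, then average and integrate.
    c = []
    for t in times:
        samp = random.choices(tissue[t], k=len(tissue[t]))
        c.append(sum(samp) / len(samp))
    return trapz_auc(times, c)

aucs = sorted(boot_auc() for _ in range(1000))
print(round(aucs[500], 1))  # bootstrap median tissue AUC
```

The geometric mean of such replicate AUCs, per simulated dataset, is what the study compared against the NLME post hoc AUC estimates.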
Two novel applications of bootstrap currents: Snakes and jitter stabilization
Thyagaraja, A.; Haas, F. A.
1993-09-01
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of noninductive current are considered in this work. In the first, an earlier model of the nonlinearly saturated m=1 tearing mode is extended to include the stabilizing effect of a bootstrap current inside the island. This is used to explain several observed features of the so-called "snake" reported in the Joint European Torus (JET) [R. D. Gill, A. W. Edwards, D. Pasini, and A. Weller, Nucl. Fusion 32, 723 (1992)]. The second application involves an alternating current (ac) form of bootstrap current, produced by pressure-gradient fluctuations. It is suggested that a time-dependent (in the plasma frame), radio-frequency (rf) power source can be used to produce localized pressure fluctuations of suitable frequency and amplitude to implement the dynamic stabilization method for suppressing gross modes in tokamaks suggested in a recent paper [A. Thyagaraja, R. D. Hazeltine, and A. Y. Aydemir, Phys. Fluids B 4, 2733 (1992)]. This method works by "detuning" the resonant layer by rapid current/shear fluctuations. Estimates made for the power source requirements, both for small machines such as COMPASS and for larger machines like JET, suggest that the method could be practically feasible. This "jitter" (i.e., dynamic) stabilization method could provide a useful form of active instability control to avoid both gross/disruptive and fine-scale/transportive instabilities, which may set severe operating/safety constraints in the reactor regime. The results are also capable, in principle, of throwing considerable light on the local properties of current generation and diffusion in tokamaks, which may be enhanced by turbulence, as has been suggested recently by several researchers.
A Bootstrap Approach to an Affordable Exploration Program
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has a history of flight programs with high development and operational costs. Since Apollo, human exploration has had very constrained budgets, and they are expected to be constrained in the future. Due to their high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics costs. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of available resources. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon the initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows and becomes self-sustaining and eventually begins producing the energy, products, and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and
DEFF Research Database (Denmark)
Hounyo, Ulrich
We propose a bootstrap method for estimating the distribution (and functionals of it, such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method, suggested for the pre-averaged realized volatility estimator, ... covariance estimator. As an application of our results, we also consider the bootstrap for regression coefficients. We show that the wild blocks of blocks bootstrap, appropriately centered, is able to mimic both the dependence and heterogeneity of the scores, thus justifying the construction of bootstrap percentile...-studentized statistics, our results justify using the bootstrap to estimate the covariance matrix of a broad class of covolatility estimators. The bootstrap variance estimator is positive semi-definite by construction, an appealing feature that is not always shared by existing variance estimators of the integrated...
The S-matrix bootstrap II: two dimensional amplitudes
Paulos, Miguel F.; Penedones, Joao; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-01
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Check of the bootstrap conditions for the gluon Reggeization
International Nuclear Information System (INIS)
Papa, A.
2000-01-01
The property of gluon Reggeization plays an essential role in the derivation of the Balitsky-Fadin-Kuraev-Lipatov (BFKL) equation for cross sections at high energy √s in perturbative QCD. This property has been proved to all orders of perturbation theory in the leading logarithmic approximation, and it is assumed to be valid also in the next-to-leading logarithmic approximation, where it has been checked only to the first three orders of perturbation theory. From s-channel unitarity, however, very stringent 'bootstrap' conditions can be derived which, if fulfilled, leave no doubt that gluon Reggeization holds.
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research. Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and
Dimensional Reduction via Noncommutative Spacetime: Bootstrap and Holography
Li, Miao
2002-05-01
Unlike noncommutative space, when space and time are noncommutative, it seems necessary to modify the usual scheme of quantum mechanics. We propose in this paper a simple generalization of the time evolution equation in quantum mechanics to incorporate the feature of a noncommutative spacetime. This equation is much more constraining than the usual Schrödinger equation in that the spatial dimension noncommuting with time is effectively reduced to a point in low energy. We thus call the new evolution equation the spacetime bootstrap equation, the dimensional reduction called for by this evolution seems close to what is required by the holographic principle. We will discuss several examples to demonstrate this point.
Hernando, M D; Agüera, A; Fernández-Alba, A R; Piedra, L; Contreras, M
2001-01-01
A selective and sensitive chromatographic method is described for the determination of nine organochlorine and organophosphorus pesticides in vegetable samples by gas chromatography-mass spectrometry. The proposed method combines the use of positive and negative chemical ionisation and tandem mass spectrometric fragmentation, resulting in a significant increase in selectivity and allowing the simultaneous confirmation and quantification of trace levels of pesticides in complex vegetable matrices. Parameters relative to ionisation and fragmentation processes were optimised to obtain maximum sensitivity. Repeatability and reproducibility studies yielded relative standard deviations lower than 25% in all cases. Identification criteria, such as retention time and relative abundance of characteristic product ions, were also evaluated in order to guarantee the correct identification of the target compounds. The method was applied to real vegetable samples to demonstrate its use in routine analysis.
International Nuclear Information System (INIS)
Bersch, Beate; Rossy, Emmanuel; Coves, Jacques; Brutscher, Bernhard
2003-01-01
NMR experiments are presented which allow backbone resonance assignment, secondary structure identification, and in favorable cases also molecular fold topology determination from a series of two-dimensional 1H-15N HSQC-like spectra. The 1H-15N correlation peaks are frequency shifted by an amount ±ωX along the 15N dimension, where ωX is the Cα, Cβ, or Hα frequency of the same or the preceding residue. Because of the low dimensionality (2D) of the experiments, high-resolution spectra are obtained in a short overall experimental time. The whole series of seven experiments can be performed in typically less than one day. This approach significantly reduces experimental time when compared to the standard 3D-based methods. The methodology presented here is thus especially appealing in the context of high-throughput NMR studies of protein structure, dynamics or molecular interfaces.
DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations
National Aeronautics and Space Administration — DMSP SSM/I Daily and Monthly Polar Gridded Bootstrap Sea Ice Concentrations in polar stereographic projection currently include Defense Meteorological Satellite...
What Teachers Should Know About the Bootstrap: Resampling in the Undergraduate Statistics Curriculum
Hesterberg, Tim C.
2015-01-01
Bootstrapping has enormous potential in statistics education and practice, but there are subtle issues and ways to go wrong. For example, the common combination of nonparametric bootstrapping and bootstrap percentile confidence intervals is less accurate than using t-intervals for small samples, though more accurate for larger samples. My goals in this article are to provide a deeper understanding of bootstrap methods—how they work, when they work or not, and which methods work better—and to highlight pedagogical issues. Supplementary materials for this article are available online. [Received December 2014. Revised August 2015] PMID:27019512
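The comparison this abstract draws — bootstrap percentile intervals versus classical t-intervals on a small sample — can be illustrated with a short sketch. All names are our own, and the hard-coded t critical value (2.1448, the 0.975 quantile of t with df = 14) is our choice, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def percentile_ci(x, n_boot=10_000, alpha=0.05):
    # Resample with replacement, recompute the mean each time, and take
    # empirical quantiles of the bootstrap means.
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    means = x[idx].mean(axis=1)
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

def t_ci(x, t_crit=2.1448):
    # Classical t-interval; 2.1448 is the 0.975 quantile of t with df = 14.
    se = x.std(ddof=1) / np.sqrt(len(x))
    return x.mean() - t_crit * se, x.mean() + t_crit * se

x = rng.exponential(size=15)  # a small, skewed sample of n = 15
print("percentile CI:", percentile_ci(x))
print("t-interval:  ", t_ci(x))
```

On skewed samples this small, the two intervals differ noticeably, which is the pedagogical point the article makes.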
Assessing statistical reliability of phylogenetic trees via a speedy double bootstrap method.
Ren, Aizhen; Ishida, Takashi; Akiyama, Yutaka
2013-05-01
Evaluating the reliability of estimated phylogenetic trees is of critical importance in the field of molecular phylogenetics, and for other endeavors that depend on accurate phylogenetic reconstruction. The bootstrap method is a well-known computational approach to phylogenetic tree assessment, and more generally for assessing the reliability of statistical models. However, it is known to be biased under certain circumstances, calling into question the accuracy of the method. Several advanced bootstrap methods have been developed to achieve higher accuracy, one of which is the double bootstrap approach, but the computational burden of this method has precluded its application to practical problems of phylogenetic tree selection. We address this issue by proposing a simple method called the speedy double bootstrap, which circumvents the second-tier resampling step in the regular double bootstrap approach. We also develop an implementation of the regular double bootstrap for comparison with our speedy method. The speedy double bootstrap suffers no significant loss of accuracy compared with the regular double bootstrap, while performing calculations significantly more rapidly (at minimum around 371 times faster, based on analysis of mammalian mitochondrial amino acid sequences and 12S and 16S rRNA genes). Our method thus enables, for the first time, the practical application of the double bootstrap technique in the context of molecular phylogenetics. The approach can also be used more generally for model selection problems wherever the maximum likelihood criterion is used. Copyright © 2013 Elsevier Inc. All rights reserved.
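The computational burden the authors address comes from the nested resampling of the regular double bootstrap. A minimal, generic sketch of that nested structure, shown here for iterated bias correction of a plug-in variance estimate rather than the authors' tree-selection procedure, is:

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate(x):
    return x.var()          # plug-in variance, biased downward by sigma^2 / n

def boot_bias(x, n_boot):
    # First-tier bootstrap estimate of the estimator's bias.
    reps = np.array([estimate(rng.choice(x, size=len(x))) for _ in range(n_boot)])
    return reps.mean() - estimate(x)

def double_bootstrap_bias(x, b1=200, b2=50):
    # Second tier: bootstrap the bias estimate itself and apply the
    # standard iterated-bootstrap correction 2*b1 - b2. The cost is
    # b1 * b2 inner resamples -- the burden a "speedy" variant avoids
    # by circumventing the second-tier resampling step.
    first = boot_bias(x, b1)
    inner = [boot_bias(rng.choice(x, size=len(x)), b2) for _ in range(b1)]
    return 2 * first - np.mean(inner)

x = rng.normal(size=30)
print("corrected bias estimate:", double_bootstrap_bias(x))
```

The quadratic blow-up in resamples is what precluded practical phylogenetic use before the speedy variant.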
Optical Flow of Small Objects Using Wavelets, Bootstrap Methods, and Synthetic Discriminant Filters
National Research Council Canada - National Science Library
Hewer, Gary
1997-01-01
...) targets in highly cluttered and noisy environments. In this paper; we present a novel wavelet detection algorithm which incorporates adaptive CFAR detection statistics using the bootstrap method...
Energy confinement of tokamak plasma with consideration of bootstrap current effect
International Nuclear Information System (INIS)
Yuan Ying; Gao Qingdi
1992-01-01
Based on the ηi-mode induced anomalous transport model of Lee et al., the energy confinement of tokamak plasmas with auxiliary heating is investigated with consideration of the bootstrap current effect. The results indicate that the energy confinement time increases with plasma current and tokamak major radius, and decreases with heating power, toroidal field and minor radius. This is in reasonable agreement with the Kaye-Goldston empirical scaling law. The bootstrap current always leads to an improvement of energy confinement and a contraction of the inversion radius. When γ, the ratio of the bootstrap current to the total plasma current, is small, the contribution of the bootstrap current to the energy confinement time will be about γ/2.
Zhu, Yu-Min; Zhang, Hua; Fan, Shi-Suo; Wang, Si-Jia; Xia, Yi; Shao, Li-Ming; He, Pin-Jing
2014-07-15
Due to the heterogeneity of metal distribution, it is challenging to identify the speciation, source and fate of metals in solid samples at micro scales. To overcome these challenges single particles of air pollution control residues were detected in situ by synchrotron microprobe after each step of chemical extraction and analyzed by multivariate statistical analysis. Results showed that Pb, Cu and Zn co-existed as acid soluble fractions during chemical extraction, regardless of their individual distribution as chlorides or oxides in the raw particles. Besides the forms of Fe2O3, MnO2 and FeCr2O4, Fe, Mn, Cr and Ni were closely associated with each other, mainly as reducible fractions. In addition, the two groups of metals had interrelations with the Si-containing insoluble matrix. The binding could not be directly detected by micro-X-ray diffraction (μ-XRD) and XRD, suggesting their partial existence as amorphous forms or in the solid solution. The combined method on single particles can effectively determine metallic multi-associations and various extraction behaviors that could not be identified by XRD, μ-XRD or X-ray absorption spectroscopy. The results are useful for further source identification and migration tracing of heavy metals. Copyright © 2014 Elsevier B.V. All rights reserved.
Sequential auctions and price anomalies
Directory of Open Access Journals (Sweden)
Trifunović Dejan
2014-01-01
In sequential auctions objects are sold one by one in separate auctions. These sequential auctions might be organized as sequential first-price, second-price, or English auctions. We will derive equilibrium bidding strategies for these auctions. Theoretical models suggest that prices in sequential auctions with private values or with randomly assigned heterogeneous objects should have no trend. However, empirical research contradicts this result: prices exhibit a declining or increasing trend, known as the declining and increasing price anomalies. We will present a review of these empirical results, as well as different theoretical explanations for these anomalies.
Lukin, Vladimir V.; Abramov, Sergey K.; Vozel, Benoit; Chehdi, Kacem
2005-10-01
Multichannel (multispectral) remote sensing (MRS) is widely used for various applications nowadays. However, original images are commonly corrupted by noise and other distortions. This prevents reliable retrieval of useful information from remote sensing data. Because of this, image pre-filtering and/or reconstruction are typical stages of multichannel image processing, and the majority of modern efficient methods for image pre-processing require a priori information concerning the noise type and its statistical characteristics. Thus, there is a great need for automatic blind methods for determining the noise type and its characteristics. However, almost all such methods fail to perform appropriately well if an image under consideration contains a large percentage of texture regions, details and edges. In this paper we demonstrate that by applying the bootstrap it is possible to obtain rather accurate estimates of the noise variance that can be used either as final or as preliminary estimates. Different quantiles (order statistics) are used as initial estimates of the mode location for the distribution of local noise variance estimates, and the bootstrap is then applied for their joint analysis. To further improve the accuracy of the noise variance estimates, it is proposed, under a certain condition, to apply a myriad operation with a tunable parameter k set in accordance with the preliminary estimate obtained by the bootstrap. Numerical simulation results confirm the applicability of the proposed approach and provide data allowing evaluation of the method's accuracy.
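As a rough illustration of the general idea — take order statistics of local variance estimates, then bootstrap them to stabilize the mode estimate — the following simplified 1-D sketch uses entirely synthetic data and is not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1-D "scan line": smooth signal, one textured burst, Gaussian noise.
n = 4096
signal = np.sin(np.linspace(0, 8 * np.pi, n))
signal[2000:2304] += rng.normal(0, 3.0, 304)     # texture inflates local variance
noise_sd = 0.5
data = signal + rng.normal(0, noise_sd, n)

# Local variance estimates from first differences within short blocks;
# differencing removes the smooth signal and doubles the noise variance.
blocks = data.reshape(-1, 32)
local_var = np.var(np.diff(blocks, axis=1), axis=1) / 2

# Textured blocks bias the mean upward, so take a low-order statistic as
# the initial mode estimate and bootstrap it to judge its stability.
def low_quantile(v, q=0.25):
    return np.quantile(v, q)

boot = np.array([low_quantile(rng.choice(local_var, size=len(local_var)))
                 for _ in range(1000)])
print("noise variance estimate ~", np.median(boot), "(true:", noise_sd**2, ")")
```

The low quantile resists the upward pull of textured blocks, and the bootstrap spread indicates how much to trust the preliminary estimate.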
Interaction of bootstrap-current-driven magnetic islands
International Nuclear Information System (INIS)
Hegna, C.C.; Callen, J.D.
1991-10-01
The formation and interaction of fluctuating neoclassical pressure gradient driven magnetic islands is examined. The interaction of magnetic islands produces a stochastic region around the separatrices of the islands. This interaction causes the island pressure profile to be broadened, reducing the island bootstrap current and drive for the magnetic island. A model is presented that describes the magnetic topology as a bath of interacting magnetic islands with low to medium poloidal mode number (m ≈ 3-30). The islands grow by the bootstrap current effect and damp due to the flattening of the pressure profile near the island separatrix caused by the interaction of the magnetic islands. The effect of this sporadic growth and decay of the islands (''magnetic bubbling'') is not normally addressed in theories of plasma transport due to magnetic fluctuations. The nature of the transport differs from statistical approaches to magnetic turbulence since the radial step size of the plasma transport is now given by the characteristic island width. This model suggests that tokamak experiments have relatively short-lived, coherent, long wavelength magnetic oscillations present in the steep pressure-gradient regions of the plasma. 42 refs
Quantifying uncertainty on sediment loads using bootstrap confidence intervals
Slaets, Johanna I. F.; Piepho, Hans-Peter; Schmitter, Petra; Hilger, Thomas; Cadisch, Georg
2017-01-01
Load estimates are more informative than constituent concentrations alone, as they allow quantification of on- and off-site impacts of environmental processes concerning pollutants, nutrients and sediment, such as soil fertility loss, reservoir sedimentation and irrigation channel siltation. While statistical models used to predict constituent concentrations have been developed considerably over the last few years, measures of uncertainty on constituent loads are rarely reported. Loads are the product of two predictions, constituent concentration and discharge, integrated over a time period, which does not make it straightforward to produce a standard error or a confidence interval. In this paper, a linear mixed model is used to estimate sediment concentrations. A bootstrap method is then developed that accounts for the uncertainty in the concentration and discharge predictions, allows temporal correlation in the constituent data, and can be used when data transformations are required. The method was tested for a small watershed in Northwest Vietnam for the period 2010-2011. The results showed that confidence intervals were asymmetric, with the highest uncertainty in the upper limit: a load of 6262 Mg year-1 had a 95 % confidence interval of (4331, 12 267) in 2010, and a load of 5543 Mg year-1 had an interval of (3593, 8975) in 2011. Additionally, the approach demonstrated that direct estimates from the data were biased downwards compared to bootstrap median estimates. These results imply that constituent loads predicted from regression-type water quality models could frequently be underestimating sediment yields and their environmental impact.
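The central difficulty the abstract describes — a load is the product of two uncertain predictions integrated over time — can be sketched with a toy parametric bootstrap. All series, standard errors and unit choices below are invented for illustration and are not the paper's mixed-model machinery:

```python
import numpy as np

rng = np.random.default_rng(6)

days = np.arange(365)
q_hat = 5 + 3 * np.abs(np.sin(days / 20))   # predicted daily discharge, m3/s
c_hat = 50 + 10 * np.sin(days / 15)         # predicted concentration, mg/L
q_se, c_se = 0.5, 6.0                       # assumed prediction standard errors

def one_load():
    # Perturb both predictions, then integrate their product over the year.
    # m3/s * mg/L is numerically g/s; * 86 400 s/day summed over days
    # gives g/yr; / 1e6 converts to Mg/yr.
    q = q_hat + rng.normal(0, q_se, days.size)
    c = c_hat + rng.normal(0, c_se, days.size)
    return np.sum(q * c) * 86_400 / 1e6

loads = np.array([one_load() for _ in range(2000)])
lo, hi = np.quantile(loads, [0.025, 0.975])
print(f"load ~ {loads.mean():.0f} Mg/yr, 95% CI ({lo:.0f}, {hi:.0f})")
```

Because both factors are uncertain, the interval for the product cannot be read off either prediction's standard error alone, which is why a resampling approach is attractive here.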
A wild bootstrap approach for the Aalen-Johansen estimator.
Bluhmki, Tobias; Schmoor, Claudia; Dobler, Dennis; Pauly, Markus; Finke, Juergen; Schumacher, Martin; Beyersmann, Jan
2018-02-16
We suggest a wild bootstrap resampling technique for nonparametric inference on transition probabilities in a general time-inhomogeneous Markov multistate model. We first approximate the limiting distribution of the Nelson-Aalen estimator by repeatedly generating standard normal wild bootstrap variates, while the data is kept fixed. Next, a transformation using a functional delta method argument is applied. The approach is conceptually easier than direct resampling for the transition probabilities. It is used to investigate a non-standard time-to-event outcome, currently being alive without immunosuppressive treatment, with data from a recent study of prophylactic treatment in allogeneic transplanted leukemia patients. Due to non-monotonic outcome probabilities in time, neither standard survival nor competing risks techniques apply, which highlights the need for the present methodology. Finite sample performance of time-simultaneous confidence bands for the outcome probabilities is assessed in an extensive simulation study motivated by the clinical trial data. Example code is provided in the web-based Supplementary Materials. © 2018, The International Biometric Society.
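The resampling scheme described — keep the data fixed and repeatedly multiply the estimator's increments by standard normal variates — can be sketched for an uncensored Nelson-Aalen estimator. This is a simplified illustration, not the paper's full multistate implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Uncensored event times; Nelson-Aalen jumps are 1 / (number still at risk).
event_times = np.sort(rng.exponential(1.0, 100))
n = event_times.size
at_risk = np.arange(n, 0, -1)
increments = 1.0 / at_risk
nelson_aalen = np.cumsum(increments)

def wild_perturbation():
    # The data stay fixed; each increment is multiplied by an independent
    # standard normal multiplier, as in the wild bootstrap.
    g = rng.standard_normal(n)
    return np.cumsum(g * increments)

# 95% time-simultaneous band half-width from the sup of the perturbations.
sup_stats = [np.max(np.abs(wild_perturbation())) for _ in range(2000)]
half_width = np.quantile(sup_stats, 0.95)
print("uniform band: estimate +/-", round(float(half_width), 3))
```

Resampling multipliers rather than observations is what makes the approach conceptually easier than direct resampling of the transition probabilities.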
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in the last decades and it has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives and none of these measures are associated to individual statements, on which the interpretation is based. This paper presents an innovative approach of bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers understand better their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may add or subtract strength to particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design. PMID:26845694
Bootstrap equations for N=4 SYM with defects
Energy Technology Data Exchange (ETDEWEB)
Liendo, Pedro [IMIP, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Meneghelli, Carlo [Simons Center for Geometry and Physics, Stony Brook University,Stony Brook, NY 11794-3636 (United States)
2017-01-27
This paper focuses on the analysis of 4d N=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4d N=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4d N=4 superconformal theories with a line defect, 3d N=4 superconformal theories with no defect, and OSP(4*|4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics. As such, it can be considered unique in the world. It is a high-dimension hydrothermal system with a huge share of hydro plants. Such strong dependency on hydrological regimes implies uncertainties related to energetic planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulations of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. In this paper the problems in fitting these models with the current system are shown, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. This is followed by a proposal of a new approach to set both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained using the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, led to a better use of water resources in energy operation planning.
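A residual (model-based) bootstrap for an autoregressive coefficient, in the spirit of the confidence intervals used to set PAR(p) orders, can be sketched for a plain AR(1). The PBMOM details are not reproduced here; everything below is a generic illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate an AR(1) series (loosely, a single "period" of a PAR model).
n, phi_true = 400, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.standard_normal()

def fit_ar1(series):
    x, z = series[:-1], series[1:]
    return float(x @ z / (x @ x))      # least-squares AR(1) coefficient

phi_hat = fit_ar1(y)
resid = y[1:] - phi_hat * y[:-1]
resid -= resid.mean()                  # centre the residuals

def one_replicate():
    # Rebuild a series from resampled residuals, then refit.
    yb = np.zeros(n)
    eb = rng.choice(resid, size=n)
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + eb[t]
    return fit_ar1(yb)

boot = np.array([one_replicate() for _ in range(500)])
ci = np.quantile(boot, [0.025, 0.975])
print(f"phi_hat = {phi_hat:.3f}, 95% bootstrap CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

A coefficient whose bootstrap interval covers zero is a candidate for exclusion, which is how such intervals can yield more parsimonious model orders.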
Bootstrap equations for N=4 SYM with defects
International Nuclear Information System (INIS)
Liendo, Pedro; Meneghelli, Carlo
2017-01-01
This paper focuses on the analysis of 4d N=4 superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will obtain the Ward identities associated to two-point functions of (1/2)-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to 4d N=4 superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: 4d N=4 superconformal theories with a line defect, 3d N=4 superconformal theories with no defect, and OSP(4*|4) superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootstrap equations become polynomial constraints on the CFT data. We derive these truncated equations and initiate the study of their solutions.
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in the last decades and it has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives and none of these measures are associated to individual statements, on which the interpretation is based. This paper presents an innovative approach of bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers understand better their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may add or subtract strength to particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.
A sequential tree approach for incremental sequential pattern mining
Indian Academy of Sciences (India)
''Sequential pattern mining'' is a prominent and significant method to explore knowledge and innovation from large databases. Common sequential pattern mining algorithms handle static databases. Pragmatically, looking into the functional and actual execution, the database grows exponentially, thereby leading to ...
Sequential Methods and Their Applications
Mukhopadhyay, Nitis
2008-01-01
Illustrates the efficiency of sequential methodologies when dealing with contemporary statistical challenges in many areas. This book explores fixed sample size, sequential probability ratio, and nonparametric tests. It also presents multistage estimation methods for fixed-width confidence interval as well as minimum and bounded risk problems.
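One of the classical tools covered here, the sequential probability ratio test, can be sketched as follows. The thresholds use the standard Wald approximations; the hypotheses and error targets are our own example values:

```python
import math
import random

random.seed(5)

# Wald's SPRT for H0: p = 0.5 vs H1: p = 0.7 on Bernoulli observations,
# with target error rates alpha = beta = 0.05.
p0, p1, alpha, beta = 0.5, 0.7, 0.05, 0.05
upper = math.log((1 - beta) / alpha)   # crossing it -> accept H1
lower = math.log(beta / (1 - alpha))   # falling below -> accept H0

def sprt(stream):
    llr, n = 0.0, 0
    for x in stream:
        n += 1
        # Add the log-likelihood ratio of this observation.
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

decision, n_used = sprt(random.random() < 0.7 for _ in range(10_000))
print("decision:", decision, "after", n_used, "observations")
```

The sample size is itself random and typically much smaller than a fixed-sample test with the same error rates, which is the efficiency argument the book develops.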
The use of bootstrap resampling to assess the variability of Draize tissue scores.
Worth, A P; Cronin, M T
2001-01-01
The acute dermal and ocular effects of chemicals are generally assessed by performing the Draize skin and eye tests, respectively. Because the animal data obtained in these tests are also used for the development and validation of alternative methods for skin and eye irritation, it is important to assess the inherent variability of the animal data, since this variability places an upper limit on the predictive performance that can be expected of any alternative model. The statistical method of bootstrap resampling was used to estimate the variability arising from the use of different animals and time-points, and the estimates of variability were used to determine the maximal extent to which Draize test tissue scores can be predicted.
Economou, Anastasios; Voulgaropoulos, Anastasios
2003-01-01
The development of a dedicated automated sequential-injection analysis apparatus for anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (AdSV) is reported. The instrument comprised a peristaltic pump, a multiposition selector valve and a home-made potentiostat, and used a mercury-film electrode as the working electrode in a thin-layer electrochemical detector. Programming of the experimental sequence was performed in LabVIEW 5.1. The sequence of operations included formation of the mercury film, electrolytic or adsorptive accumulation of the analyte on the electrode surface, recording of the voltammetric current-potential response, and cleaning of the electrode. The stripping step was carried out by applying a square-wave (SW) potential-time excitation signal to the working electrode. The instrument allowed unattended operation since multiple-step sequences could be readily implemented through the purpose-built software. The utility of the analyser was tested for the determination of copper(II), cadmium(II), lead(II) and zinc(II) by SWASV and of nickel(II), cobalt(II) and uranium(VI) by SWAdSV.
Energy Technology Data Exchange (ETDEWEB)
Michel, H.; Levent, D.; Barci, V.; Barci-Funel, G.; Hurel, C. [Laboratoire de Radiochimie, Sciences Analytiques et Environnement (LRSAE), Universite de Nice Sophia-Antipolis 06108 Nice Cedex (France)
2008-07-01
A new sequential method for the determination of both natural (U, Th) and anthropogenic (Sr, Cs, Pu, Am) radionuclides has been developed for application to soil and sediment samples. The procedure was optimised using a reference sediment (IAEA-368) and reference soils (IAEA-375 and IAEA-326). Reference materials were first digested using acids (leaching), 'total' acids on a hot plate, and acids in a microwave, in order to compare the different digestion techniques. Then, the separation and purification were made by anion exchange resin and selective extraction chromatography: Transuranic (TRU) and Strontium (SR) resins. Natural and anthropogenic alpha radionuclides were separated by Uranium and Tetravalent Actinide (UTEVA) resin, considering different acid elution media. Finally, alpha and gamma semiconductor spectrometers and a liquid scintillation spectrometer were used to measure radionuclide activities. The results obtained for strontium-90, cesium-137, thorium-232, uranium-238, plutonium-239+240 and americium-241 isotopes by the proposed method for the reference materials provided excellent agreement with the recommended values and good chemical recoveries. (authors)
International Nuclear Information System (INIS)
Leon, Zacarias; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo
2010-01-01
2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL-1, respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.
Energy Technology Data Exchange (ETDEWEB)
Leon, Zacarias; Chisvert, Alberto; Balaguer, Angel [Departamento de Quimica Analitica, Facultad de Quimica, Universitat de Valencia, Doctor Moliner 50, 46100 Burjassot, Valencia (Spain); Salvador, Amparo, E-mail: amparo.salvador@uv.es [Departamento de Quimica Analitica, Facultad de Quimica, Universitat de Valencia, Doctor Moliner 50, 46100 Burjassot, Valencia (Spain)
2010-04-07
2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL-1, respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2002-01-01
An automated sequential injection (SI) on-line solvent extraction-back extraction separation/preconcentration procedure is described. Demonstrated for the assay of cadmium by electrothermal atomic absorption spectrometry (ETAAS), the analyte is initially complexed with ammonium pyrrolidinedithioc......
Do monoclonal antibodies recognize linear sequential determinants?
Camera, M; Muratti, E; Trinca, M L; Chersi, A
1988-01-01
A group of 19 anti-class II monoclonal antibodies produced in different laboratories were tested in ELISA for their ability to bind to a panel of synthetic peptides selected from HLA-DQ alpha and beta chains. None of the antibodies tested was found to react with the synthetic fragments, thus confirming the common finding that MoAbs generally fail to recognize fragments of the native antigen. The possibility that this result might be partly due to the procedure used for screening hybridoma supernatants is discussed.
Randomized Sequential Trial of Valproic Acid in Amyotrophic Lateral Sclerosis
Piepers, Sanne; Veldink, Jan H.; de Jong, Sonja W.; van der Tweel, Ingeborg; van der Pol, W.-Ludo; Uijtendaal, Esther V.; Schelhaas, H. Jurgen; Scheffer, Hans; de Visser, Marianne; de Jong, J. M. B. Vianney; Wokke, John H. J.; Groeneveld, Geert Jan; van den Berg, Leonard H.
2009-01-01
Objective: To determine whether valproic acid (VPA), a histone deacetylase inhibitor that showed antioxidative and antiapoptotic properties and reduced glutamate toxicity in preclinical studies, is safe and effective in amyotrophic lateral sclerosis (ALS) using a sequential trial design. Methods:
Early Astronomical Sequential Photography, 1873-1923
Bonifácio, Vitor
2011-11-01
In 1873 Jules Janssen conceived the first automatic sequential photographic apparatus to observe the eagerly anticipated 1874 transit of Venus. This device, the 'photographic revolver', is commonly considered today as the earliest cinema precursor. In the following years, in order to study the variability or the motion of celestial objects, several instruments, either manually or automatically actuated, were devised to obtain as many photographs as possible of astronomical events in a short time interval. In this paper we strive to identify from the available documents the attempts made between 1873 and 1923, and discuss the motivations behind them and the results obtained. During the time period studied astronomical sequential photography was employed to determine the time of the instants of contact in transits and occultations, and to study total solar eclipses. The technique was seldom used but apparently the modern film camera invention played no role on this situation. Astronomical sequential photographs were obtained both before and after 1895. We conclude that the development of astronomical sequential photography was constrained by the reduced number of subjects to which the technique could be applied.
Kim, Se-Kang
The effect of bootstrapping was studied by examining whether major profile patterns were replicated when sample sizes were reduced. Profile patterns estimated from the original sample (n=645) of the Wechsler Preschool and Primary Scale of Intelligence-Third Edition (WPPSI-III) Standardization Data were considered major profiles. For bootstrapping,…
Kaufmann, Esther; Wittmann, Werner W.
2016-01-01
The success of bootstrapping or replacing a human judge with a model (e.g., an equation) has been demonstrated in Paul Meehl’s (1954) seminal work and bolstered by the results of several meta-analyses. To date, however, analyses considering different types of meta-analyses as well as the potential dependence of bootstrapping success on the decision domain, the level of expertise of the human judge, and the criterion for what constitutes an accurate decision have been missing from the literature. In this study, we addressed these research gaps by conducting a meta-analysis of lens model studies. We compared the results of a traditional (bare-bones) meta-analysis with findings of a meta-analysis of the success of bootstrap models corrected for various methodological artifacts. In line with previous studies, we found that bootstrapping was more successful than human judgment. Furthermore, bootstrapping was more successful in studies with an objective decision criterion than in studies with subjective or test score criteria. We did not find clear evidence that the success of bootstrapping depended on the decision domain (e.g., education or medicine) or on the judge’s level of expertise (novice or expert). Correction of methodological artifacts increased the estimated success of bootstrapping, suggesting that previous analyses without artifact correction (i.e., traditional meta-analyses) may have underestimated the value of bootstrapping models. PMID:27327085
Internal validation of risk models in clustered data: a comparison of bootstrap schemes
Bouwmeester, W.; Moons, K.G.M.; Kappen, T.H.; van Klei, W.A.; Twisk, J.W.R.; Eijkemans, M.J.C.; Vergouwe, Y.
2013-01-01
Internal validity of a risk model can be studied efficiently with bootstrapping to assess possible optimism in model performance. Assumptions of the regular bootstrap are violated when the development data are clustered. We compared alternative resampling schemes in clustered data for the estimation
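The cluster-level resampling compared in such studies can be illustrated with a minimal sketch (our own toy example, not the authors' schemes; the data and function names are invented). Resampling whole clusters rather than individual observations keeps the within-cluster correlation intact in each bootstrap sample:

```python
import random

def cluster_bootstrap(data, n_boot=200, seed=42):
    """Resample whole clusters with replacement and compute the mean
    on each resample; `data` maps cluster id -> list of observations."""
    rng = random.Random(seed)
    ids = list(data)
    means = []
    for _ in range(n_boot):
        # Draw cluster ids with replacement, then pool their members.
        sample = [x for cid in rng.choices(ids, k=len(ids)) for x in data[cid]]
        means.append(sum(sample) / len(sample))
    return means

# Toy clustered data: three centres with different baseline levels.
data = {"A": [1.0, 1.2, 0.9], "B": [2.1, 2.3], "C": [0.4, 0.5, 0.6, 0.5]}
boot_means = cluster_bootstrap(data)
spread = max(boot_means) - min(boot_means)
```

The spread of `boot_means` reflects between-cluster variability, which an ordinary observation-level bootstrap would understate.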
On the consistency of bootstrap testing for a parameter on the boundary of the parameter space
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Heino Bohn; Rahbek, Anders
2017-01-01
It is well known that with a parameter on the boundary of the parameter space, such as in the classic cases of testing for a zero location parameter or no autoregressive conditional heteroskedasticity (ARCH) effects, the classic nonparametric bootstrap – based on unrestricted parameter estimates...... the standard and bootstrap Lagrange multiplier tests as well as the asymptotic quasi-likelihood ratio test....
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
A bootstrap method for estimating uncertainty of water quality trends
Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura
2015-01-01
Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS-estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
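The general idea of attaching bootstrap uncertainty to a trend estimate can be sketched as follows (a deliberately simplified residual bootstrap of a linear slope, not the WRTDS/WBT algorithm itself; data and parameters are invented):

```python
import random

def slope(xs, ys):
    """Ordinary least-squares slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def bootstrap_trend(xs, ys, n_boot=500, seed=1):
    """Residual bootstrap: refit the slope on fitted trend plus
    resampled residuals, returning the replicate slopes."""
    rng = random.Random(seed)
    b = slope(xs, ys)
    a = sum(ys) / len(ys) - b * (sum(xs) / len(xs))
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    reps = []
    for _ in range(n_boot):
        ystar = [a + b * x + rng.choice(resid) for x in xs]
        reps.append(slope(xs, ystar))
    return b, reps

# Synthetic annual concentrations with a small downward trend.
years = list(range(2000, 2015))
conc = [5.0 - 0.05 * (y - 2000) + 0.1 * ((y * 7) % 5 - 2) for y in years]
b_hat, reps = bootstrap_trend(years, conc)
reps.sort()
ci = (reps[12], reps[487])  # approximate 95% percentile interval
```

If the percentile interval excludes zero, the trend direction is reported with confidence; otherwise the direction is deemed uncertain.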
Integral equations of hadronic correlation functions: a functional-bootstrap approach
Manesis, E K
1974-01-01
A reasonable 'microscopic' foundation of the Feynman hadron-liquid analogy is offered, based on a class of models for hadron production. In an external field formalism, the equivalence (complementarity) of the exclusive and inclusive descriptions of hadronic reactions is specifically expressed in a functional-bootstrap form, and integral equations between inclusive and exclusive correlation functions are derived. Using the latest CERN-ISR data on the two-pion inclusive correlation function, and assuming rapidity translational invariance for the exclusive one, the simplest integral equation is solved in the 'central region' and an exclusive correlation length in rapidity predicted. An explanation is also offered for the unexpected similarity observed between pi /sup +/ pi /sup -/ and pi /sup -/ pi /sup -/ inclusive correlations. (31 refs).
Bootstrapping the QCD soft anomalous dimension
Almelid, Øyvind; Gardi, Einan; McLeod, Andrew; White, Chris D.
2017-09-18
The soft anomalous dimension governs the infrared singularities of scattering amplitudes to all orders in perturbative quantum field theory, and is a crucial ingredient in both formal and phenomenological applications of non-abelian gauge theories. It has recently been computed at three-loop order for massless partons by explicit evaluation of all relevant Feynman diagrams. In this paper, we show how the same result can be obtained, up to an overall numerical factor, using a bootstrap procedure. We first give a geometrical argument for the fact that the result can be expressed in terms of single-valued harmonic polylogarithms. We then use symmetry considerations as well as known properties of scattering amplitudes in collinear and high-energy (Regge) limits to constrain an ansatz of basis functions. This is a highly non-trivial cross-check of the result, and our methods pave the way for greatly simplified higher-order calculations.
Hepatobiliary sequential scintiscanning
International Nuclear Information System (INIS)
Eissner, D.
1985-01-01
The main criteria for interpreting hepatobiliary sequential scintiscanning (HBSS) data are the following: (1) In young infants - without previous parenteral feeding - a normal to slightly increased activity uptake in the liver, accompanied by a lack of activity excretion into the intestine (which requires a 24-hour scan for detection), is a clear indication of bile duct atresia. However, the same findings can be obtained in very young newborns (up to one week of age) in the case of hepatitis with defined cholestasis. (2) In the case of a comparably high activity uptake in the liver together with activity excretion into the intestine, which may be detectable in due time or after a delay (24-hour scan required), bile duct atresia can be excluded, the diagnosis being hepatitis. In general, hepatitis will cause stronger liver cell damage, which excludes the criterion of activity excretion into the intestine. Similar findings can be obtained in infants with bile duct atresia who previously received parenteral feeding. This is why the interpretation of HBSS data can only be carried out effectively in close cooperation with the pediatrician, and on the basis of profound knowledge of the overall clinical state of the infant.
Sequential Design of Experiments
Energy Technology Data Exchange (ETDEWEB)
Anderson-Cook, Christine Michaela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-06-30
A sequential design of experiments strategy is being developed and implemented that allows for adaptive learning based on incoming results as the experiment is being run. The plan is to incorporate these strategies for the NCCC and TCM experimental campaigns to be run in the coming months. This strategy for experimentation has the advantages of allowing new data collected during the experiment to inform future experimental runs based on their projected utility for a particular goal. For example, the current effort for the MEA capture system at NCCC plans to focus on maximally improving the quality of prediction of CO₂ capture efficiency as measured by the width of the confidence interval for the underlying response surface that is modeled as a function of 1) Flue Gas Flowrate [1000-3000] kg/hr; 2) CO₂ weight fraction [0.125-0.175]; 3) Lean solvent loading [0.1-0.3], and; 4) Lean solvent flowrate [3000-12000] kg/hr.
Adaptive sequential controller
Energy Technology Data Exchange (ETDEWEB)
El-Sharkawi, Mohamed A. (Renton, WA); Xing, Jian (Seattle, WA); Butler, Nicholas G. (Newberg, OR); Rodriguez, Alonso (Pasadena, CA)
1994-01-01
An adaptive sequential controller (50/50') for controlling a circuit breaker (52) or other switching device to substantially eliminate transients on a distribution line caused by closing and opening the circuit breaker. The device adaptively compensates for changes in the response time of the circuit breaker due to aging and environmental effects. A potential transformer (70) provides a reference signal corresponding to the zero crossing of the voltage waveform, and a phase shift comparator circuit (96) compares the reference signal to the time at which any transient was produced when the circuit breaker closed, producing a signal indicative of the adaptive adjustment that should be made. Similarly, in controlling the opening of the circuit breaker, a current transformer (88) provides a reference signal that is compared against the time at which any transient is detected when the circuit breaker last opened. An adaptive adjustment circuit (102) produces a compensation time that is appropriately modified to account for changes in the circuit breaker response, including the effect of ambient conditions and aging. When next opened or closed, the circuit breaker is activated at an appropriately compensated time, so that it closes when the voltage crosses zero and opens when the current crosses zero, minimizing any transients on the distribution line. Phase angle can be used to control the opening of the circuit breaker relative to the reference signal provided by the potential transformer.
Performance of Bootstrap MCEWMA: Study case of Sukuk Musyarakah data
Safiih, L. Muhamad; Hila, Z. Nurul
2014-07-01
Sukuk Musyarakah is one of several instruments of Islamic bond investment in Malaysia, in which a conventional bond is restructured to become a Syariah-compliant bond. Syariah compliance is based on the prohibition of any influence of usury, benefit or fixed return. Despite this prohibition, daily sukuk returns are not fixed, and in statistical terms the data on sukuk returns form a time series that is dependent and autocorrelated. Such data pose a crucial problem in both statistics and finance. Returns of sukuk can be viewed statistically through their volatility: high volatility describes dramatic price changes and marks the bond as risky. However, this problem has not received the serious attention among researchers that conventional bonds have. In this study, the MCEWMA chart from Statistical Process Control (SPC) is used to monitor autocorrelated data; its application to daily returns of securities investment data has gained widespread attention among statisticians. However, this chart has always been affected by inaccurate estimation, whether of the base model or its limits, producing large errors and a high probability of signalling an out-of-control process as a false alarm. To overcome this problem, a bootstrap approach is used in this study: it is hybridised with the MCEWMA base model to construct a new chart, the Bootstrap MCEWMA (BMCEWMA) chart. The hybrid model, BMCEWMA, is applied to daily returns of sukuk Musyarakah for Rantau Abang Capital Bhd. The BMCEWMA base model proved more effective than the original MCEWMA, with smaller estimation error, a shorter confidence interval and fewer false alarms. In other words, the hybrid chart reduces variability, as shown by the smaller error and false alarm rate. We conclude that the application of BMCEWMA is better than MCEWMA.
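The hybrid idea of placing bootstrap-based control limits on an EWMA-type statistic can be sketched roughly as follows (a toy illustration only; this is not the authors' MCEWMA implementation, and the residuals and parameters are invented):

```python
import random

def ewma(series, lam=0.2):
    """Exponentially weighted moving average statistic z_t."""
    z = [series[0]]
    for x in series[1:]:
        z.append(lam * x + (1 - lam) * z[-1])
    return z

def bootstrap_limits(residuals, lam=0.2, n_boot=300, alpha=0.01, seed=7):
    """Bootstrap control limits for an EWMA chart: resample the model
    residuals, rebuild the EWMA path, and take extreme quantiles of
    the pooled in-control statistic values."""
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        path = ewma([rng.choice(residuals) for _ in residuals], lam)
        stats.extend(path)
    stats.sort()
    lo = stats[int(alpha / 2 * len(stats))]
    hi = stats[int((1 - alpha / 2) * len(stats)) - 1]
    return lo, hi

# Hypothetical in-control residuals of a base time-series model.
resid = [0.03, -0.02, 0.01, -0.04, 0.05, 0.00, -0.01, 0.02, -0.03, 0.01]
lcl, ucl = bootstrap_limits(resid)
```

New EWMA values falling outside `[lcl, ucl]` would signal an out-of-control process; empirically derived limits like these are what reduce the false-alarm rate relative to limits from a misestimated parametric model.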
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
. The bootstrap could also offer confidence limits for contribution plots with acceptable fault diagnostic power. The performance of bootstrap-based and asymptotic confidence limits was compared in batch MSPC (Paper III). Real and simulated batch process datasets were used to build the limits for five PCA....... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. Bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed where a set of sensible choices (plus...... be used to detect outliers in the data since the outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q...
Bootstrapping realized volatility and realized beta under a local Gaussianity assumption
DEFF Research Database (Denmark)
Hounyo, Ulrich
The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap...... matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able...
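The local Gaussianity idea can be sketched: treat volatility as constant within small blocks of high-frequency returns and regenerate returns as Gaussian draws with each block's empirical scale (our own simplified rendering, not the authors' estimator; the return data are invented):

```python
import random

def realized_vol(returns):
    """Realized volatility: sum of squared high-frequency returns."""
    return sum(r * r for r in returns)

def local_gaussian_bootstrap(returns, block=5, n_boot=400, seed=2):
    """Within each local block, volatility is treated as constant, so
    bootstrap returns are drawn as N(0, local variance)."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        star = []
        for i in range(0, len(returns), block):
            blk = returns[i:i + block]
            sd = (sum(r * r for r in blk) / len(blk)) ** 0.5
            star.extend(rng.gauss(0, sd) for _ in blk)
        reps.append(realized_vol(star))
    return reps

rets = [0.01, -0.02, 0.015, 0.005, -0.01, 0.03, -0.025, 0.02, 0.01, -0.015]
rv = realized_vol(rets)
reps = local_gaussian_bootstrap(rets)
```

By construction each replicate has expectation equal to the observed realized volatility, and the spread of `reps` approximates its sampling distribution.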
DEFF Research Database (Denmark)
Hounyo, Ulrich; Varneskov, Rasmus T.
We provide a new resampling procedure - the local stable bootstrap - that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity...... index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H₀: β=2 against the alternative H₁: β<2...... bootstrap power variations, activity index...... estimator, and diffusion test for H₀. Moreover, the finite sample size and power properties of the proposed diffusion test are compared to those of benchmark tests using Monte Carlo simulations. Unlike existing procedures, our bootstrap test is correctly sized in general settings. Finally, we illustrate use...
The Finite Population Bootstrap - From the Maximum Likelihood to the Horvitz-Thompson Approach
Directory of Open Access Journals (Sweden)
Andreas Quatember
2014-06-01
The finite population bootstrap method is used as a computer-intensive alternative to estimate the sampling distribution of a sample statistic. The generation of a so-called "bootstrap population" is the necessary step between the original sample drawn and the resamples needed to mimic this distribution. The most important question for researchers to answer is how to create an adequate bootstrap population, which may serve as a close-to-reality basis for the resampling process. In this paper, a review of some approaches to answer this fundamental question is presented. Moreover, an approach based on the idea behind the Horvitz-Thompson estimator, allowing not only whole units in the bootstrap population but also parts of whole units, is proposed. In a simulation study, this method is compared with a more heuristic technique from the bootstrap literature.
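The basic pseudo-population construction can be sketched as follows (a simplified version under simple random sampling; the handling of the fractional remainder here is our own stand-in for the 'parts of whole units' idea, not the paper's method):

```python
import random

def bootstrap_population(sample, N):
    """Build a bootstrap pseudo-population by replicating each sampled
    unit N/n times (its Horvitz-Thompson weight under simple random
    sampling); the fractional remainder is filled by drawing units at
    random without replacement."""
    n = len(sample)
    reps, frac = divmod(N, n)
    pop = [x for x in sample for _ in range(reps)]
    pop += random.Random(0).sample(sample, frac)
    return pop

sample = [3, 5, 8, 2, 6]
pop = bootstrap_population(sample, 23)  # N = 23, n = 5: 4 copies each, 3 extras
```

Resamples of size n drawn from `pop` then mimic draws from the finite population of size N.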
Expected improvement in efficient global optimization through bootstrapped kriging
Kleijnen, Jack P.C.; van Beers, W.C.M.; van Nieuwenhuyse, I.
2012-01-01
This article uses a sequentialized experimental design to select simulation input combinations for global optimization, based on Kriging (also called Gaussian process or spatial correlation modeling); this Kriging is used to analyze the input/output data of the simulation model (computer code). This
Mining Frequent Max and Closed Sequential Patterns
Afshar, Ramin
2002-01-01
Although frequent sequential pattern mining plays an important role in many data mining tasks, it often generates a large number of sequential patterns, which reduces its efficiency and effectiveness. For many applications, mining all the frequent sequential patterns is not necessary, and mining frequent Max or Closed sequential patterns will provide the same amount of information. Compared to frequent sequential pattern mining, frequent Max or Closed sequential pattern mining g...
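The distinction between Max and Closed sequential patterns can be made concrete with a small sketch (our own illustration; the patterns and supports are hypothetical):

```python
def is_subseq(s, t):
    """True if s is a (not necessarily contiguous) subsequence of t."""
    it = iter(t)
    return all(x in it for x in s)

def maximal_and_closed(patterns):
    """Given frequent sequential patterns mapped to their supports,
    keep the maximal ones (no frequent super-sequence at all) and the
    closed ones (no super-sequence with the same support)."""
    maximal = {p for p in patterns
               if not any(p != q and is_subseq(p, q) for q in patterns)}
    closed = {p for p in patterns
              if not any(p != q and is_subseq(p, q)
                         and patterns[q] == patterns[p] for q in patterns)}
    return maximal, closed

# Hypothetical mined patterns with supports.
patterns = {("a",): 5, ("a", "b"): 3, ("a", "b", "c"): 3, ("d",): 2}
mx, cl = maximal_and_closed(patterns)
```

Here `("a", "b")` is neither maximal nor closed, since `("a", "b", "c")` is frequent with the same support; the closed set still preserves all support information, while the maximal set is smaller but lossy.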
Frentiu, Tiberiu; Ponta, Michaela; Hategan, Raluca
2013-03-01
The aim of this paper was the validation of a new analytical method based on high-resolution continuum source flame atomic absorption spectrometry for the fast-sequential determination of several hazardous/priority hazardous metals (Ag, Cd, Co, Cr, Cu, Ni, Pb and Zn) in soil after microwave assisted digestion in aqua regia. Determinations were performed on the ContrAA 300 (Analytik Jena) air-acetylene flame spectrometer equipped with a xenon short-arc lamp as a continuum radiation source for all elements, a double monochromator consisting of a prism pre-monochromator and an echelle grating monochromator, and a charge coupled device as detector. For validation, a method-performance study was conducted involving the establishment of the analytical performance of the new method (limits of detection and quantification, precision and accuracy). Moreover, the Bland and Altman statistical method was used in analyzing the agreement between the proposed assay and inductively coupled plasma optical emission spectrometry as a standardized method for multielemental determination in soil. The limits of detection in soil sample (3σ criterion) in the high-resolution continuum source flame atomic absorption spectrometry method were (mg/kg): 0.18 (Ag), 0.14 (Cd), 0.36 (Co), 0.25 (Cr), 0.09 (Cu), 1.0 (Ni), 1.4 (Pb) and 0.18 (Zn), close to those in inductively coupled plasma optical emission spectrometry: 0.12 (Ag), 0.05 (Cd), 0.15 (Co), 1.4 (Cr), 0.15 (Cu), 2.5 (Ni), 2.5 (Pb) and 0.04 (Zn). Accuracy was checked by analyzing 4 certified reference materials, and good agreement at the 95% confidence level was found for both methods, with recoveries in the range of 94-106% in atomic absorption and 97-103% in optical emission. Repeatability found by analyzing real soil samples was in the range 1.6-5.2% in atomic absorption, similar to that of 1.9-6.1% in optical emission spectrometry. The Bland and Altman method showed no statistically significant difference between the two spectrometric
Zakus, P; Arzola, C; Bittencourt, R; Downey, K; Ye, X Y; Carvalho, J C
2018-04-01
The optimum time interval for 10 ml boluses of bupivacaine 0.0625% + fentanyl 2 μg.ml⁻¹ as part of a programmed intermittent epidural bolus regimen has been found to be 40 min. This regimen was shown to be effective without the use of supplementary patient-controlled epidural analgesia boluses in 90% of women during the first stage of labour, although with a rate of sensory block to ice above T6 in 34% of women. We aimed to determine the optimum programmed intermittent epidural bolus volume at a 40 min interval to provide effective analgesia in 90% of women (EV90) during the first stage of labour, without the use of patient-controlled epidural analgesia. We performed a prospective double-blind dose-finding study using the biased coin up-and-down sequential allocation method in 40 women. The estimated EV90 was 11.0 (95%CI 10.0-11.7) ml with the isotonic regression method and 10.7 (95%CI 10.3-11.0) ml with the truncated Dixon and Mood method. Overall, 18 women had a sensory block above T6, and 37 women exhibited no motor block. No women required treatment for hypotension. In conclusion, it is not possible to reduce the programmed intermittent epidural bolus volume from 10 ml, used in our current regimen, without compromising the quality of analgesia. Using this regimen, a high proportion of women will develop a sensory block above T6. © 2017 The Association of Anaesthetists of Great Britain and Ireland.
Comparing bootstrap and posterior probability values in the four-taxon case.
Cummings, Michael P; Handley, Scott A; Myers, Daniel S; Reed, David L; Rokas, Antonis; Winka, Katarina
2003-08-01
Assessment of the reliability of a given phylogenetic hypothesis is an important step in phylogenetic analysis. Historically, the nonparametric bootstrap procedure has been the most frequently used method for assessing the support for specific phylogenetic relationships. The recent employment of Bayesian methods for phylogenetic inference problems has resulted in clade support being expressed in terms of posterior probabilities. We used simulated data and the four-taxon case to explore the relationship between nonparametric bootstrap values (as inferred by maximum likelihood) and posterior probabilities (as inferred by Bayesian analysis). The results suggest a complex association between the two measures. Three general regions of tree space can be identified: (1) the neutral zone, where differences between mean bootstrap and mean posterior probability values are not significant, (2) near the two-branch corner, and (3) deep in the two-branch corner. In the last two regions, significant differences occur between mean bootstrap and mean posterior probability values. Whether bootstrap or posterior probability values are higher depends on the data in support of alternative topologies. Examination of star topologies revealed that both bootstrap and posterior probability values differ significantly from theoretical expectations; in particular, there are more posterior probability values in the range 0.85-1 than expected by theory. Therefore, our results corroborate the findings of others that posterior probability values are excessively high. Our results also suggest that extrapolations from single topology branch-length studies are unlikely to provide any general conclusions regarding the relationship between bootstrap and posterior probability values.
JuliBootS: a hands-on guide to the conformal bootstrap
Paulos, Miguel F
2014-01-01
We introduce JuliBootS, a package for numerical conformal bootstrap computations coded in Julia. The centre-piece of JuliBootS is an implementation of Dantzig's simplex method capable of handling arbitrary-precision linear programming problems with continuous search spaces. Current supported features include conformal dimension bounds, OPE bounds, and bootstrap with or without global symmetries. The code is trivially parallelizable on one or multiple machines. We exemplify usage extensively with several real-world applications. In passing we give a pedagogical introduction to the numerical bootstrap methods.
A Smooth Bootstrap Procedure towards Deriving Confidence Intervals for the Relative Risk.
Wang, Dongliang; Hutson, Alan D
Given a pair of sample estimators of two independent proportions, bootstrap methods are a common strategy towards deriving the associated confidence interval for the relative risk. We develop a new smooth bootstrap procedure, which generates pseudo-samples from a continuous quantile function. Under a variety of settings, our simulation studies show that our method possesses a better or equal performance in comparison with asymptotic theory based and existing bootstrap methods, particularly for heavily unbalanced data in terms of coverage probability and power. We illustrate our procedure as applied to several published data sets.
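For contrast, a plain (non-smooth) percentile bootstrap interval for the relative risk can be sketched as follows (our own baseline illustration with invented counts; the paper's smooth variant instead draws pseudo-samples from a continuous quantile function):

```python
import random

def boot_rr_ci(x1, n1, x2, n2, n_boot=2000, alpha=0.05, seed=3):
    """Percentile bootstrap CI for the relative risk p1/p2 from two
    independent binomial samples (ordinary resampling)."""
    rng = random.Random(seed)
    rrs = []
    for _ in range(n_boot):
        # Resample each group's success count from Binomial(n, p_hat).
        s1 = sum(rng.random() < x1 / n1 for _ in range(n1))
        s2 = sum(rng.random() < x2 / n2 for _ in range(n2))
        if s1 > 0 and s2 > 0:  # discard degenerate resamples
            rrs.append((s1 / n1) / (s2 / n2))
    rrs.sort()
    k = len(rrs)
    return rrs[int(alpha / 2 * k)], rrs[int((1 - alpha / 2) * k) - 1]

lo, hi = boot_rr_ci(30, 100, 20, 100)  # observed relative risk = 1.5
```

The discreteness of the resampled counts is exactly what a smooth bootstrap avoids, and it is most damaging for heavily unbalanced data, where few distinct resample values exist.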
Validation of Nonparametric Two-Sample Bootstrap in ROC Analysis on Large Datasets.
Wu, Jin Chu; Martin, Alvin F; Kacker, Raghu N
The nonparametric two-sample bootstrap is applied to computing uncertainties of measures in ROC analysis on large datasets in areas such as biometrics, speaker recognition, etc., when the analytical method cannot be used. Its validation was studied by computing the SE of the area under ROC curve using the well-established analytical Mann-Whitney-statistic method and also using the bootstrap. The analytical result is unique. The bootstrap results are expressed as a probability distribution due to its stochastic nature. The comparisons were carried out using relative errors and hypothesis testing. They match very well. This validation provides a sound foundation for such computations.
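The validation setup can be sketched: compute the AUC via the Mann-Whitney statistic and its SE via the two-sample bootstrap, resampling positives and negatives independently (a toy version with invented scores; the study's datasets are far larger):

```python
import random

def auc(pos, neg):
    """Area under the ROC curve as the Mann-Whitney estimate of
    P(score_pos > score_neg), counting ties as 1/2."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_se_auc(pos, neg, n_boot=500, seed=11):
    """Two-sample bootstrap: resample the two score sets independently
    and take the standard deviation of the replicate AUCs."""
    rng = random.Random(seed)
    reps = [auc(rng.choices(pos, k=len(pos)), rng.choices(neg, k=len(neg)))
            for _ in range(n_boot)]
    m = sum(reps) / n_boot
    return (sum((r - m) ** 2 for r in reps) / (n_boot - 1)) ** 0.5

pos = [0.9, 0.8, 0.75, 0.6, 0.85, 0.7, 0.95, 0.65]
neg = [0.4, 0.5, 0.3, 0.55, 0.45, 0.2, 0.35, 0.6]
a = auc(pos, neg)
se = bootstrap_se_auc(pos, neg)
```

Comparing `se` against the analytical Mann-Whitney variance is the kind of check the validation performs, with the bootstrap result forming a distribution over random seeds rather than a single number.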
A Bootstrap Approach to Principal Components Analysis
AKTÜKÜN, Dr. Aylin
2011-01-01
In this study, we present the process of applying bootstrap methods to principal components analysis. Using a hypothetical dataset, we show how some of the confidence intervals used in principal components analysis can be obtained with bootstrap methods. We carried out all the bootstrap procedures in the article with a program we wrote in the Mathematica language.
Habova, K; Smirenin, Eksp; Fetisov, D; Tamberg, Eksp
2015-01-01
The objective of the present study was to determine the diagnostic coefficients (DC) for the injuries to the upper and lower extremities of vehicle drivers inflicted inside the passenger compartment in the case of a traffic accident. We have analysed the archival expert documents collected from 45 regional bureaus of forensic medical expertise during the period from 1995 to 2014 that contained the results of examination of 200 corpses and 300 survivors who had suffered injuries in traffic accidents. The statistical and mathematical treatment of these materials with the use of sequential mathematical analysis based on the Bayes and Wald formulas yielded diagnostic coefficients that make it possible to elucidate the most informative features characterizing the driver of a vehicle. In the case of a lethal outcome, the most significant injuries include bleeding from the posterior left elbow region (DC +7.6), skin scratches on the palm surface of the right wrist (DC +7.6), bleeding from the posterior region of the left lower leg (DC +7.6), wounds on the dorsal surface of the left wrist (DC +6.3), bruises at the anterior surface of the left knee (DC +6.3), etc. The most informative features in the survivors of traffic accidents are bone fractures (DC +7.0), tension of ligaments and dislocation of the right talocrural joint (DC +6.5), fractures of the left kneecap and left tibial epiphysis (DC +5.4), hemorrhage and bruises in the anterior right knee region (DC +5.4 each), and skin scratches in the right posterior carpal region (DC +5.1). It is concluded that the use of the diagnostic coefficients makes it possible to draw the attention of the experts to the above features and to objectively determine the driver's seat position inside the car passenger compartment in the case of a traffic accident. Moreover, such an approach contributes to the improvement of the quality of expert conclusions and the results of forensic medical expertise of the circumstance of traffic
Wells, Russell Frederick
Reported is a study investigating the use of sequential still photographs rather than motion picture or slides as a media for instructional purposes for a botany course at the university level. Six experimental groups of randomly chosen students viewed visual materials presenting given concepts by three modes--motion pictures, slides, and…
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2002-01-01
A sequential injection (SI) on-line matrix removal and trace metal preconcentration procedure by using a novel microcolumn packed with PTFE beads is described, and demonstrated for trace cadmium analysis with detection by electrothermal atomic absorption spectrometry (ETAAS). The analyte...
A voltage biased superconducting quantum interference device bootstrap circuit
International Nuclear Information System (INIS)
Xie Xiaoming; Wang Huiwu; Wang Yongliang; Dong Hui; Jiang Mianheng; Zhang Yi; Krause, Hans-Joachim; Braginski, Alex I; Offenhaeusser, Andreas; Mueck, Michael
2010-01-01
We present a dc superconducting quantum interference device (SQUID) readout circuit operating in the voltage bias mode and called a SQUID bootstrap circuit (SBC). The SBC is an alternative implementation of two existing methods for suppression of room-temperature amplifier noise: additional voltage feedback and current feedback. Two circuit branches are connected in parallel. In the dc SQUID branch, an inductively coupled coil connected in series provides the bias current feedback for enhancing the flux-to-current coefficient. The circuit branch parallel to the dc SQUID branch contains an inductively coupled voltage feedback coil with a shunt resistor in series for suppressing the preamplifier noise current by increasing the dynamic resistance. We show that the SBC effectively reduces the preamplifier noise to below the SQUID intrinsic noise. For a helium-cooled planar SQUID magnetometer with a SQUID inductance of 350 pH, a flux noise of about 3 μΦ₀/√Hz and a magnetic field resolution of less than 3 fT/√Hz were obtained. The SBC leads to a convenient direct readout electronics for a dc SQUID with a wider adjustment tolerance than other feedback schemes.
High efficiency fusion reactor based on bootstrap current
International Nuclear Information System (INIS)
Kikuchi, Mitsuru
1990-01-01
Steady-state operation has been the largest outstanding issue for fusion reactors based on the tokamak confinement principle. Recent research has advanced this goal considerably, and a concept has been established for a power reactor that can operate in steady state with high efficiency by positively exploiting the bootstrap current that flows naturally in a tokamak plasma. Such a power reactor can be realized with near-future technologies, and it may become a clear development target for fusion reactors as electric power generation plants. This report centers on the high-efficiency fusion reactor SSTR, a prototype power reactor for which the Japan Atomic Energy Research Institute is advancing the conceptual design. Fusion energy generates no CO₂, is essentially free of nuclear runaway, and produces comparatively little radioactive waste, so it is expected to become a powerful substitute energy source. The main parameters and features of the steady-state tokamak fusion power reactor (SSTR) are reported. (K.I.)
N = 4 superconformal bootstrap of the K3 CFT
Lin, Ying-Hsuan; Shao, Shu-Heng; Simmons-Duffin, David; Wang, Yifan; Yin, Xi
2017-05-01
We study two-dimensional (4, 4) superconformal field theories of central charge c = 6, corresponding to nonlinear sigma models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N = 4 superconformal blocks with c = 6 and bosonic Virasoro conformal blocks with c = 28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A 1 N = 4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence between a class of BPS N = 2 superconformal blocks and Virasoro conformal blocks in two dimensions, and an upper bound on the four-point functions of operators of sufficiently low scaling dimension in three and four dimensional CFTs.
N=4 Superconformal Bootstrap of the K3 CFT
CERN. Geneva
2015-01-01
We study two-dimensional (4,4) superconformal field theories of central charge c=6, corresponding to nonlinear σ models on K3 surfaces, using the superconformal bootstrap. This is made possible through a surprising relation between the BPS N=4 superconformal blocks with c=6 and bosonic Virasoro conformal blocks with c=28, and an exact result on the moduli dependence of a certain integrated BPS 4-point function. Nontrivial bounds on the non-BPS spectrum in the K3 CFT are obtained as functions of the CFT moduli, that interpolate between the free orbifold points and singular CFT points. We observe directly from the CFT perspective the signature of a continuous spectrum above a gap at the singular moduli, and find numerically an upper bound on this gap that is saturated by the A1 N=4 cigar CFT. We also derive an analytic upper bound on the first nonzero eigenvalue of the scalar Laplacian on K3 in the large volume regime, that depends on the K3 moduli data. As two byproducts, we find an exact equivalence...
Bias Reversal Technique in SQUID Bootstrap Circuit (SBC) Scheme
Rong, Liangliang; Zhang, Yi; Zhang, Guofeng; Wu, Jun; Dong, Hui; Qiu, Longqing; Xie, Xiaoming; Offenhäusser, Andreas
Recently, a SQUID direct readout scheme called the voltage-biased SQUID bootstrap circuit (SBC) was introduced to reduce the preamplifier noise contribution. In this paper, we describe an SBC concept with a bias-reversal technique that can suppress the SQUID intrinsic 1/f noise. When a symmetric rectangular voltage is applied across the SBC, two I-Φ characteristics appear at the amplifier output. In order to return to a single I-Φ curve, a demodulation technique is required. Because of the asymmetry of the typical SBC I-Φ curve, the demodulation is realized by flux compensation with a half-Φ0 flux shift. The output signal is then filtered and returned to one I-Φ curve for ordinary FLL readout. It was found that the reversal frequency fR can be dramatically increased by using a preamplifier consisting of two operational amplifiers. A planar Nb SQUID magnetometer with a loop inductance of 350 pH, fR = 50 kHz, and a second-order low-pass filter with a 10 kHz cut-off frequency was employed in our experiment. The results prove the feasibility of the SBC bias-reversal method. Comparative experiments on noise performance will be carried out in further studies.
International Nuclear Information System (INIS)
Wesseh, Presley K.; Zoumara, Babette
2012-01-01
This contribution investigates the causal interdependence between energy consumption and economic growth in Liberia and proposes the application of a bootstrap methodology. To better reflect causality, employment is incorporated as an additional variable. The study demonstrates evidence of distinct bidirectional Granger causality between energy consumption and economic growth. Additionally, the results show that employment Granger-causes economic growth in Liberia, and this holds irrespective of the short run or the long run. Evidence from a Monte Carlo experiment reveals that the asymptotic Granger causality test suffers from a size distortion problem for the Liberian data, suggesting that the bootstrap technique employed in this study is more appropriate. Given the empirical results, the implication is that energy expansion policies, such as energy subsidies or low energy tariffs, would be necessary to cope with the demand exerted as a result of economic growth in Liberia. Furthermore, the contribution of employment generation to the Liberian economy may be partly determined by an adequate energy supply. It therefore seems fully justified to expect that a quick shift towards energy production based on clean energy sources may significantly slow down economic growth in Liberia. Hence, the government's target to implement a long-term strategy to make Liberia a carbon-neutral country, and eventually less carbon dependent by 2050, is understandable. - Highlights: ► Causality between energy consumption and economic growth in Liberia investigated. ► There is bidirectional causality between energy consumption and economic growth. ► Energy expansion policies are necessary to cope with demand from economic growth. ► Asymptotic Granger causality test suffers size distortion problem for Liberian data. ► The bootstrap methodology employed in our study is more appropriate.
Using the Bootstrap Concept to Build an Adaptable and Compact Subversion Artifice
National Research Council Canada - National Science Library
Lack, Lindsey
2003-01-01
.... Early tiger teams recognized the possibility of this design and compared it to the two-card bootstrap loader used in mainframes since both exhibit the characteristics of compactness and adaptability...
International Nuclear Information System (INIS)
Luiz Raposo, Jorge; Ruella de Oliveira, Silvana; Caldas, Naise Mary; Neto, Jose Anchieta Gomes
2008-01-01
The usefulness of the secondary line at 252.744 nm and the approach of side pixel registration were evaluated for the development of a method for sequential multi-element determination of Cu, Fe, Mn and Zn in soil extracts by high-resolution continuum source flame atomic absorption spectrometry (HR-CS FAAS). The influence of side pixel registration on the sensitivity and linearity was investigated by measuring at wings (248.325, 248.323, 248.321, 248.329, and 248.332 nm) of the main line for Fe at 248.327 nm. For the secondary line at 252.744 nm or side pixel registration at 248.325 nm, main lines for Cu (324.754 nm), Mn (279.482 nm) and Zn (213.875 nm), sample flow-rate of 5.0 mL min⁻¹ and calibration by matrix matching, analytical curves in the 0.2-1.0 mg L⁻¹ Cu, 1.0-20.0 mg L⁻¹ Fe, 0.2-2.0 mg L⁻¹ Mn, 0.1-1.0 mg L⁻¹ Zn ranges were obtained with linear correlations better than 0.998. The proposed method was applied to seven soil samples and two soil reference materials (IAC 277; IAC 280). Results were in agreement at a 95% confidence level (paired t-test) with reference values. Recoveries of analytes added to soil extracts containing 0.15 and 0.30 mg L⁻¹ Cu, 7.0 and 14 mg L⁻¹ Fe, 0.60 and 1.20 mg L⁻¹ Mn, 0.07 and 0.15 mg L⁻¹ Zn, varied within the 94-99, 92-98, 93-101, and 93-103% intervals, respectively. The relative standard deviations (n = 12) were 2.7% (Cu), 1.4% (Fe - 252.744 nm), 5.7% (Fe - 248.325 nm), 3.2% (Mn) and 2.8% (Zn) for an extract containing 0.35 mg L⁻¹ Cu, 14 mg L⁻¹ Fe, 1.1 mg L⁻¹ Mn and 0.12 mg L⁻¹ Zn. Detection limits were 5.4 μg L⁻¹ Cu, 55 μg L⁻¹ Fe (252.744 nm), 147 μg L⁻¹ Fe (248.325 nm), 3.0 μg L⁻¹ Mn and 4.2 μg L⁻¹ Zn.
Blocking for Sequential Political Experiments.
Moore, Ryan T; Moore, Sally A
2013-10-01
In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects "trickle in" to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion.
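The sequential, covariate-based assignment described above can be sketched as a minimal biased-coin minimization rule. This is an illustrative stand-in, not the authors' implementation: the function name, the single scalar covariate, and the 0.75 favouring probability are all assumptions made for the sketch.

```python
import numpy as np

def sequential_assign(new_cov, covs, arms, p_favor=0.75, rng=None):
    """Biased-coin minimization for subjects who 'trickle in'.

    Tentatively place the new unit in each arm, measure the resulting
    gap in mean covariate value between arms, and favour (with
    probability p_favor) the arm that leaves the groups better balanced.
    """
    rng = np.random.default_rng(rng)

    def imbalance(arm):
        same = [c for c, a in zip(covs, arms) if a == arm] + [new_cov]
        other = [c for c, a in zip(covs, arms) if a != arm]
        if not other:          # no units in the other arm yet
            return 0.0
        return abs(np.mean(same) - np.mean(other))

    better = 1 if imbalance(1) < imbalance(0) else 0
    return better if rng.random() < p_favor else 1 - better

# Simulated stream of 200 subjects with one continuous covariate.
rng = np.random.default_rng(7)
covs, arms = [], []
for _ in range(200):
    c = float(rng.normal())
    arms.append(sequential_assign(c, covs, arms, rng=rng))
    covs.append(c)
```

In simulations like this, the biased coin keeps the arm means of the covariate close, which is the balance advantage over complete randomization that the abstract reports.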
Cohen-Adad, Julien; Descoteaux, Maxime; Wald, Lawrence L
2011-05-01
To develop a bootstrap method to assess the quality of High Angular Resolution Diffusion Imaging (HARDI) data using Q-Ball imaging (QBI) reconstruction. HARDI data were re-shuffled using regular bootstrap with jackknife sampling. For each bootstrap dataset, the diffusion orientation distribution function (ODF) was estimated voxel-wise using QBI reconstruction based on spherical harmonics functions. The reproducibility of the ODF was assessed using the Jensen-Shannon divergence (JSD) and the angular confidence interval was derived for the first and the second ODF maxima. The sensitivity of the bootstrap method was evaluated on a human subject by adding synthetic noise to the data, by acquiring a map of image signal-to-noise ratio (SNR) and by varying the echo time and the b-value. The JSD was directly linked to the image SNR. The impact of echo times and b-values was reflected by both the JSD and the angular confidence interval, proving the usefulness of the bootstrap method to evaluate specific features of HARDI data. The bootstrap method can effectively assess the quality of HARDI data and can be used to evaluate new hardware and pulse sequences, perform multifiber probabilistic tractography, and provide reliability metrics to support clinical studies. Copyright © 2011 Wiley-Liss, Inc.
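The reproducibility metric used above, the Jensen-Shannon divergence between bootstrap-replicated ODFs, is straightforward to compute for discretised distributions. The following stand-alone sketch (with base-2 logarithms, so values lie in [0, 1]) is illustrative and not the authors' code:

```python
import numpy as np

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two discretised distributions
    (e.g. ODFs sampled on a common set of directions), in bits."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()          # normalise to probabilities
    m = 0.5 * (p + q)                        # mixture distribution

    def kl(a, b):                            # KL divergence, 0*log0 := 0
        mask = a > 0
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

Identical distributions give 0, distributions with disjoint support give 1, so lower values across bootstrap replicates indicate more reproducible ODFs (higher effective SNR).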
The non-local bootstrap--estimation of uncertainty in diffusion MRI.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2013-01-01
Diffusion MRI is a noninvasive imaging modality that allows for the estimation and visualization of white matter connectivity patterns in the human brain. However, due to the low signal-to-noise ratio (SNR) nature of diffusion data, deriving useful statistics from the data is adversely affected by different sources of measurement noise. This is aggravated by the fact that the sampling distribution of the statistic of interest is often complex and unknown. In such situations, the bootstrap, due to its distribution-independent nature, is an appealing tool for the estimation of the variability of almost any statistic, without relying on complicated theoretical calculations, but purely on computer simulation. In this work, we present new bootstrap strategies for variability estimation of diffusion statistics in association with noise. In contrast to the residual bootstrap, which relies on a predetermined data model, or the repetition bootstrap, which requires repeated signal measurements, our approach, called the non-local bootstrap (NLB), is non-parametric and obviates the need for time-consuming multiple acquisitions. The key assumption of NLB is that local image structures recur in the image. We exploit this self-similarity via a multivariate non-parametric kernel regression framework for bootstrap estimation of uncertainty. Evaluation of NLB using a set of high-resolution diffusion-weighted images, with lower than usual SNR due to the small voxel size, indicates that NLB is markedly more robust to noise and results in more accurate inferences.
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid serious variance under-estimation by conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of cluster event times.
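The cluster-bootstrap idea, resampling whole clusters with replacement and recomputing the statistic on each bootstrap sample, can be sketched as follows. For brevity the statistic here is a simple mean rather than a Cox model coefficient, and all names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cluster_bootstrap_se(values, cluster_ids, stat_fn, n_boot=1000, rng=None):
    """Cluster-bootstrap SE: resample clusters (not individuals) with
    replacement, keep each selected cluster's members intact, and take
    the standard deviation of the statistic over bootstrap samples."""
    rng = np.random.default_rng(rng)
    ids = np.unique(cluster_ids)
    groups = {c: values[cluster_ids == c] for c in ids}
    reps = []
    for _ in range(n_boot):
        chosen = rng.choice(ids, size=len(ids), replace=True)
        reps.append(stat_fn(np.concatenate([groups[c] for c in chosen])))
    return float(np.std(reps, ddof=1))

# Clustered data: 20 clusters of 10 with a strong cluster-level random effect.
rng = np.random.default_rng(0)
cluster_ids = np.repeat(np.arange(20), 10)
values = rng.normal(size=200) + np.repeat(rng.normal(scale=2.0, size=20), 10)
se = cluster_bootstrap_se(values, cluster_ids, np.mean, rng=1)
```

Because the cluster effect is ignored by the naive i.i.d. standard error, the cluster-bootstrap SE comes out larger here, which is exactly the variance under-estimation the abstract warns about.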
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
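A stripped-down version of the significance step, generating an expected distribution for a single cell from its own past counts by bootstrap resampling rather than from population-at-risk data, might look like the following sketch (the function and variable names are illustrative, not the paper's implementation):

```python
import numpy as np

def hotspot_p_value(current_count, past_counts, n_boot=999, rng=None):
    """Bootstrap test for an emerging hotspot in one spatial cell:
    resample the cell's past period counts with replacement and count
    how often a resampled mean reaches the currently observed count."""
    rng = np.random.default_rng(rng)
    past = np.asarray(past_counts, float)
    sims = rng.choice(past, size=(n_boot, past.size), replace=True).mean(axis=1)
    # add-one correction keeps the p-value strictly positive
    return float((1 + np.sum(sims >= current_count)) / (n_boot + 1))
```

A small p-value flags the cell as an emerging hotspot; because the reference distribution comes from the cell's own history, no administrative population data are needed.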
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of the integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood. Copyright © 2013 Elsevier Inc. All rights reserved.
Sequential logic analysis and synthesis
Cavanagh, Joseph
2007-01-01
Until now, there was no single resource for actual digital system design. Using both basic and advanced concepts, Sequential Logic: Analysis and Synthesis offers a thorough exposition of the analysis and synthesis of both synchronous and asynchronous sequential machines. With 25 years of experience in designing computing equipment, the author stresses the practical design of state machines. He clearly delineates each step of the structured and rigorous design principles that can be applied to practical applications. The book begins by reviewing the analysis of combinatorial logic and Boolean algebra.
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
Chaibub Neto, Elias
2015-01-01
In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and numbers of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably faster for small sample sizes and considerably faster for moderate ones. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
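The multinomial-weighting formulation can be illustrated for the sample mean. The paper treats general sample moments and Pearson's correlation in R; this minimal NumPy sketch, with illustrative names, shows only the core trick of replacing resampling with a weight matrix and one matrix product:

```python
import numpy as np

def vectorized_bootstrap_mean(x, n_boot, rng=None):
    """All bootstrap replications of the sample mean at once.

    Rather than materialising resampled datasets, draw multinomial
    counts (how many times each observation appears in each resample),
    turn them into weights, and apply a single matrix product.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, float)
    n = x.size
    # counts[b, i] = multiplicity of observation i in bootstrap sample b
    counts = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot)
    weights = counts / n                  # each row sums to 1
    return weights @ x                    # vector of n_boot replications

reps = vectorized_bootstrap_mean(np.arange(10.0), n_boot=5000, rng=0)
```

The bootstrap distribution `reps` is centred on the sample mean, and the loop over replications has been replaced entirely by the `weights @ x` product, which is where the speed-up in matrix-oriented languages comes from.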
Ibaraki, Masanobu; Matsubara, Keisuke; Nakamura, Kazuhiro; Yamaguchi, Hiroshi; Kinoshita, Toshibumi
2014-02-01
Accurate and validated methods for estimating regional PET image noise are helpful for optimizing image processing. The bootstrap is a data-based simulation method for statistical inference, which can be used to estimate the PET image noise without repeated measurements. The aim of this study was to experimentally validate bootstrap-based methods as a tool for estimating PET image noise and demonstrate its usefulness for evaluating image reconstruction algorithms. Two bootstrap-based methods, the list-mode data bootstrap (LMBS) and the sinogram bootstrap (SNBS), were implemented on a clinical PET scanner. A uniform cylindrical phantom filled with (18)F solution was scanned using list-mode acquisition. A reference standard deviation (SD) map was calculated from 60 statistically independent measured list-mode data. Using one of the 60 list-mode data, 60 bootstrap replicates were generated and used to calculate bootstrap SD maps. Brain (18)F-FDG data from a healthy volunteer were also processed as an example of the bootstrap application. Three reconstruction algorithms, FBP 2D and both 2D and 3D versions of dynamic row-action maximum likelihood algorithm (DRAMA), were assessed. For all the reconstruction algorithms used, the bootstrap SD maps agreed well with the reference SD map, confirming the validity of the bootstrap methods for assessing image noise. The two bootstrap methods were equivalent with respect to the performance of image noise estimation. The bootstrap analysis of the FDG data showed the better contrast-noise relation curve for DRAMA 3D compared to DRAMA 2D and FBP 2D. The bootstrap methods provide the estimates of image noise for various reconstruction algorithms with reasonable accuracy, require only a single measurement, not repeated measures, and are, therefore, applicable to human PET studies.
Reaction probability for sequential separatrix crossings
International Nuclear Information System (INIS)
Cary, J.R.; Skodje, R.T.
1988-01-01
The change of the crossing parameter (essentially the phase) between sequential slow separatrix crossings is calculated for Hamiltonian systems with one degree of freedom. Combined with the previous separatrix crossing analysis, these results reduce the dynamics of adiabatic systems with separatrices to a map. This map determines whether a trajectory leaving a given separatrix lobe is ultimately captured by the other lobe. Averaging these results over initial phase yields the reaction probability, which does not asymptote to the fully phase-mixed result even for arbitrarily long times between separatrix crossings
A sequential/parallel track selector
Bertolino, F; Bressani, Tullio; Chiavassa, E; Costa, S; Dellacasa, G; Gallio, M; Musso, A
1980-01-01
A medium speed (approximately 1 μs) hardware pre-analyzer for the selection of events detected in four planes of drift chambers in the magnetic field of the Omicron Spectrometer at the CERN SC is described. Specific geometrical criteria determine patterns of hits in the four planes of vertical wires that have to be recognized and that are stored as patterns of '1's in random access memories. Pairs of good hits are found sequentially, then the RAMs are used as look-up tables. (6 refs).
Attack Trees with Sequential Conjunction
Jhawar, Ravi; Kordy, Barbara; Mauw, Sjouke; Radomirović, Sasa; Trujillo-Rasua, Rolando
2015-01-01
We provide the first formal foundation of SAND attack trees which are a popular extension of the well-known attack trees. The SAND attack tree formalism increases the expressivity of attack trees by introducing the sequential conjunctive operator SAND. This operator enables the modeling of
Sequential Analysis of Metals in Municipal Dumpsite Composts of ...
African Journals Online (AJOL)
... Ni) in Municipal dumpsite compost were determined by the sequential extraction method. Chemical parameters such as pH, conductivity, and organic carbon contents of the samples were also determined. Analysis of the extracts was carried out by atomic absorption spectrophotometer machine (Buck Scientific VPG 210).
Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A
2017-06-30
Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has a small-sample-size limitation. We used a pooled method in the nonparametric bootstrap test that may overcome the problem related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling to the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the type I error probability under all conditions except for Cauchy and extremely variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means, and for validating one-way analysis of variance test results, for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
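The pooled-resampling idea, resampling both groups from the combined data so that the null hypothesis of a common distribution is imposed, can be sketched as follows. All names and the two-sided p-value convention are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def pooled_bootstrap_t_test(x, y, n_boot=2000, rng=None):
    """Two-sample bootstrap test with pooled resampling: under H0 both
    groups come from the same distribution, so resample both groups
    (with replacement) from the pooled data and compare the resampled
    t statistics with the observed one."""
    rng = np.random.default_rng(rng)
    x, y = np.asarray(x, float), np.asarray(y, float)

    def t_stat(a, b):                     # Welch-type t statistic
        va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
        return (a.mean() - b.mean()) / np.sqrt(va + vb)

    t_obs = t_stat(x, y)
    pooled = np.concatenate([x, y])
    exceed = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=x.size, replace=True)
        by = rng.choice(pooled, size=y.size, replace=True)
        if abs(t_stat(bx, by)) >= abs(t_obs):
            exceed += 1
    return (exceed + 1) / (n_boot + 1)    # two-sided bootstrap p-value

# Two small samples with a clear mean shift.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 15)
y = rng.normal(3.0, 1.0, 15)
p = pooled_bootstrap_t_test(x, y, n_boot=2000, rng=0)
```

Because both bootstrap groups are drawn from the pooled data, the reference distribution of the t statistic reflects the null hypothesis even when the two observed samples are small.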
Thai, Hoai-Thu; Mentré, France; Holford, Nicholas H G; Veyrat-Follet, Christine; Comets, Emmanuelle
2014-02-01
Bootstrap methods are used in many disciplines to estimate the uncertainty of parameters, including in multi-level or linear mixed-effects models. Residual-based bootstrap methods, which resample both random effects and residuals, are an alternative to the case bootstrap, which resamples the individuals. Most PKPD applications use the case bootstrap, for which software is available. In this study, we evaluated the performance of three bootstrap methods (case bootstrap, nonparametric residual bootstrap and parametric bootstrap) in a simulation study and compared them to that of an asymptotic method (Asym) in estimating the uncertainty of parameters in nonlinear mixed-effects models (NLMEM) with heteroscedastic error. The simulation was conducted using the PK model of aflibercept, an anti-angiogenic drug, as an example. As expected, we found that the bootstrap methods provided better estimates of uncertainty for parameters in NLMEM with high nonlinearity and balanced designs compared to the Asym, as implemented in MONOLIX. Overall, the parametric bootstrap performed better than the case bootstrap, as the true model and variance distribution were used. However, the case bootstrap is faster and simpler, as it makes no assumptions on the model and preserves both between-subject and residual variability in one resampling step. The performance of the nonparametric residual bootstrap was found to be limited when applied to NLMEM, due to its failure to re-inflate the variance before resampling in unbalanced designs, where the Asym and the parametric bootstrap performed well and better than the case bootstrap even with stratification.
Automated modal parameter estimation using correlation analysis and bootstrap sampling
Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.
2018-02-01
The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences by the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
Lee, Soojeong; Rajan, Sreeraman; Jeon, Gwanggil; Chang, Joon-Hyuk; Dajani, Hilmi R; Groza, Voicu Z
2017-06-01
Blood pressure (BP) is one of the most important vital indicators and plays a key role in determining the cardiovascular activity of patients. This paper proposes a hybrid approach consisting of nonparametric bootstrap (NPB) and machine learning techniques to obtain the characteristic ratios (CR) used in the blood pressure estimation algorithm, to improve the accuracy of systolic blood pressure (SBP) and diastolic blood pressure (DBP) estimates, and to obtain confidence intervals (CI). The NPB technique is used to circumvent the requirement for a large sample set when obtaining the CI. A mixture of Gaussian densities is assumed for the CRs, and a Gaussian mixture model (GMM) is chosen to estimate the SBP and DBP ratios. The K-means clustering technique is used to obtain the mixture order of the Gaussian densities. The proposed approach achieves grade "A" under the British Society of Hypertension testing protocol and is superior to the conventional approach based on the maximum amplitude algorithm (MAA), which uses fixed CR ratios. The proposed approach also yields a lower mean error (ME) and standard deviation of error (SDE) in the estimates when compared with the conventional MAA method. In addition, CIs obtained through the proposed hybrid approach are narrower, with a lower SDE. Combining the NPB technique with the GMM provides a methodology to derive individualized characteristic ratios. The results show that the proposed approach enhances the accuracy of SBP and DBP estimation and provides narrower confidence intervals for the estimates. Copyright © 2015 Elsevier Ltd. All rights reserved.
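The NPB step described above — resampling a small set of measured ratios with replacement to obtain a confidence interval without distributional assumptions — can be sketched as follows. The ratio values and interval level are illustrative, not the paper's data:

```python
import numpy as np

def npb_percentile_ci(sample, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap (NPB) percentile confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    # Resample with replacement and recompute the statistic on each pseudo-sample
    boots = np.array([stat(rng.choice(sample, size=n, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return stat(sample), lo, hi

# Hypothetical systolic characteristic ratios from a small subject pool
ratios = np.array([0.54, 0.58, 0.51, 0.60, 0.55, 0.57, 0.53, 0.59])
est, lo, hi = npb_percentile_ci(ratios)
```

The percentile interval needs no variance formula, which is exactly why it suits a small sample of individualized CRs.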
Bootstrapped Discovery and Ranking of Relevant Services and Information in Context-aware Systems
Directory of Open Access Journals (Sweden)
Preeti Bhargava
2015-08-01
Full Text Available A context-aware system uses context to provide relevant information and services to the user, where relevancy depends on the user’s situation. This relevant information could include a wide range of heterogeneous content. Many existing context-aware systems determine this information based on pre-defined ontologies or rules. In addition, they rely on users’ context history to filter it. Moreover, they often provide domain-specific information. Such systems are not applicable to a large and varied set of user situations and information needs, and may suffer from cold start for new users. In this paper, we address these limitations and propose a novel, general and flexible approach for bootstrapped discovery and ranking of heterogeneous relevant services and information in context-aware systems. We design and implement four variations of a base algorithm that ranks candidate relevant services, and the information to be retrieved from them, based on the semantic relatedness between the information provided by the services and the user’s situation description. We conduct a live deployment with 14 subjects to evaluate the efficacy of our algorithms. We demonstrate that they have strong positive correlation with human supplied relevance rankings and can be used as an effective means to discover and rank relevant services and information. We also show that our approach is applicable to a wide set of users’ situations and to new users without requiring any user interaction history.
Buonaccorsi, John P; Romeo, Giovanni; Thoresen, Magne
2018-03-01
When fitting regression models, measurement error in any of the predictors typically leads to biased coefficients and incorrect inferences. A plethora of methods have been proposed to correct for this. Obtaining standard errors and confidence intervals using the corrected estimators can be challenging, and, in addition, there is concern about remaining bias in the corrected estimators. The bootstrap, which is one option to address these problems, has received limited attention in this context. It has usually been employed by simply resampling observations, which, while suitable in some situations, is not always formally justified. In addition, the simple bootstrap does not allow for estimating bias in non-linear models, including logistic regression. Model-based bootstrapping, which can potentially estimate bias and is robust to the original sampling design and to whether the measurement error variance is constant, has received limited attention. However, it faces challenges that are not present in handling regression models with no measurement error. This article develops new methods for model-based bootstrapping when correcting for measurement error in logistic regression with replicate measures. The methodology is illustrated using two examples, and a series of simulations is carried out to assess and compare the simple and model-based bootstrap methods, as well as other standard methods. While not always perfect, the model-based approaches offer some distinct improvements over the other methods. © 2017, The International Biometric Society.
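The "simple" bootstrap the abstract contrasts with model-based bootstrapping — resampling (predictor, outcome) pairs and refitting — can be sketched for logistic regression as follows. The Newton fitter and synthetic error-free data are illustrative stand-ins, not the article's replicate-measures methodology:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Logistic regression by Newton-Raphson; X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        H = X.T @ (X * W[:, None]) + 1e-8 * np.eye(X.shape[1])  # Hessian (tiny ridge)
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

def simple_bootstrap_se(X, y, n_boot=200, seed=0):
    """'Simple' bootstrap: resample (x_i, y_i) pairs with replacement and refit."""
    rng = np.random.default_rng(seed)
    n = len(y)
    betas = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        betas.append(fit_logistic(X[idx], y[idx]))
    return np.std(betas, axis=0, ddof=1)

rng = np.random.default_rng(1)
x = rng.normal(size=300)
true_p = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))
y = (rng.random(300) < true_p).astype(float)
X = np.column_stack([np.ones_like(x), x])
beta_hat = fit_logistic(X, y)
se = simple_bootstrap_se(X, y)     # bootstrap standard errors for (intercept, slope)
```

A model-based scheme would instead simulate outcomes (and mismeasured predictors) from the fitted model, which is what enables the bias estimation the article discusses.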
Study of the separate exposure method for bootstrap sensitometry on X-ray cine film
International Nuclear Information System (INIS)
Matsuda, Eiji; Sanada, Taizo; Hitomi, Go; Kakuba, Koki; Kangai, Yoshiharu; Ishii, Koushi
1997-01-01
We developed a new method of bootstrap sensitometry that obtains the characteristic curve over a wide density range with a smaller number of aluminum steps than the conventional bootstrap method. In this method, the density-density curve was obtained from standard and multiplied exposures of the aluminum step wedge and used for the bootstrap manipulation. The curve was acquired from two separate regions added together, e.g., lower and higher photographic density regions. In this study, we evaluated the usefulness of the new method for cinefluorography in comparison with N.D. filter sensitometry. The shapes of the characteristic curve and the gradient curve obtained with the new method were highly similar to those obtained with N.D. filter sensitometry. Also, the average gradient obtained with the new bootstrap sensitometry method was not significantly different from that obtained by the N.D. filter method. The study revealed that the reliability of the characteristic curve was improved by increasing the number of measured values used to calculate the density-density curve. The new method was useful for obtaining a characteristic curve with a sufficient density range, and the results suggested that it could be applied to specific systems for which the conventional bootstrap method is not suitable. (author)
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.
Yin, Guosheng; Ma, Yanyuan
2013-01-01
The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any other explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic that recovers the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adds exactly the right amount of randomness to the test statistic to recover the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE and then constructing the bin counts based on the original data. We examine the size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
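A minimal sketch of the recipe, using a normal model: expected bin counts come from the MLE of a bootstrap sample, observed counts from the original data. The bins, the normal model, and the chi-squared cutoff with (bins − 1) degrees of freedom are our own illustrative choices, not the authors' exact prescription:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x, mu, sd):
    return 0.5 * (1.0 + erf((x - mu) / (sd * sqrt(2.0))))

def bootstrap_pearson_stat(data, edges, seed=0):
    """Pearson statistic with expected counts from the MLE of a bootstrap sample."""
    rng = np.random.default_rng(seed)
    boot = rng.choice(data, size=len(data), replace=True)
    mu, sd = boot.mean(), boot.std()          # normal-model MLE from the BOOTSTRAP sample
    obs, _ = np.histogram(data, bins=edges)   # observed counts from the ORIGINAL data
    probs = np.diff([norm_cdf(e, mu, sd) for e in edges])
    expected = len(data) * probs
    return float(np.sum((obs - expected) ** 2 / expected))

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, 500)
edges = [-10.0, -1.0, -0.3, 0.3, 1.0, 10.0]   # 5 bins covering the data
stat = bootstrap_pearson_stat(data, edges)
# Illustrative cutoff: 9.49 is the 95% point of chi-squared with 4 (= bins - 1) df
reject = stat > 9.49
```

Note that only the MLE changes relative to the classical Pearson test; the binning of the original data is untouched.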
Bootstrap Restricted Likelihood Ratio Test for the Detection of Rare Variants
Zeng, Ping; Wang, Ting
2015-01-01
In this paper the detection of associations between rare variants and continuous phenotypes of interest is investigated via the likelihood-ratio-based variance component test under the framework of linear mixed models. The hypothesis testing is challenging and nonstandard, since under the null the variance component lies on the boundary of its parameter space. In this situation the usual asymptotic chi-squared distribution of the likelihood ratio statistic does not necessarily hold. To circumvent the derivation of the null distribution we resort to the bootstrap method, owing to its generic applicability and ease of implementation. Both parametric and nonparametric bootstrap likelihood ratio tests are studied. Numerical studies are carried out to evaluate the performance of the proposed bootstrap likelihood ratio test and to compare it with some existing methods for the identification of rare variants. To reduce the computational time of the bootstrap likelihood ratio test we propose an effective mixture approximation to the bootstrap null distribution. The GAW17 data are used to illustrate the proposed test. PMID:26069459
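The boundary problem and the parametric bootstrap remedy can be illustrated in a setting far simpler than the rare-variant mixed model: a one-sided test of a variance against a boundary value, where the null distribution of the LRT is the 0.5·χ²₀ + 0.5·χ²₁ mixture rather than a plain chi-squared. This toy model is our own illustration, not the paper's test:

```python
import numpy as np

def lrt_stat(x, sigma0=1.0):
    """LRT for H0: sigma^2 = sigma0^2 vs H1: sigma^2 > sigma0^2 (boundary null)."""
    n = len(x)
    r = np.mean(x ** 2) / sigma0 ** 2        # MLE of sigma^2 (mean known to be 0), scaled
    return n * (r - 1.0 - np.log(r)) if r > 1.0 else 0.0

def parametric_bootstrap_null(x, n_boot=2000, sigma0=1.0, seed=0):
    """Draw datasets under H0 and recompute the LRT to build its null distribution."""
    rng = np.random.default_rng(seed)
    return np.array([lrt_stat(rng.normal(0.0, sigma0, len(x)), sigma0)
                     for _ in range(n_boot)])

rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 100)                # data generated under H0
t_obs = lrt_stat(x)
t_null = parametric_bootstrap_null(x)
pval = np.mean(t_null >= t_obs)
# About half of the null statistics sit exactly at the boundary value 0,
# reproducing the 0.5*chi2_0 + 0.5*chi2_1 mixture instead of a plain chi-squared
frac_zero = np.mean(t_null == 0.0)
```

The point mass at zero is exactly the feature a naive χ²₁ reference would miss, and it is what the bootstrap null distribution captures automatically.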
Random sequential adsorption of cubes.
Cieśla, Michał; Kubala, Piotr
2018-01-14
Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
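A minimal sketch of the random sequential adsorption algorithm, reduced to axis-aligned squares on a 2-D torus (one of the simplest orientation models; the paper's packings are 3-D cubes with orientation sampling). The box size and attempt budget are illustrative, so the run stops short of true saturation:

```python
import numpy as np

def rsa_aligned_squares(L=10.0, a=1.0, attempts=20000, seed=0):
    """Random sequential adsorption of aligned a x a squares on an L x L torus."""
    rng = np.random.default_rng(seed)
    centers = np.empty((0, 2))
    for _ in range(attempts):
        c = rng.uniform(0, L, 2)
        if len(centers):
            d = np.abs(centers - c)
            d = np.minimum(d, L - d)                   # minimum-image (periodic) distances
            if np.any((d[:, 0] < a) & (d[:, 1] < a)):  # aligned squares overlap iff both gaps < a
                continue                               # rejected: adsorption is irreversible
        centers = np.vstack([centers, c])
    return centers

centers = rsa_aligned_squares()
phi = len(centers) * 1.0**2 / 10.0**2   # packing fraction; the saturated 2-D value is about 0.562
```

Saturation studies like the paper's require far longer runs (or event-driven tricks) so that the remaining free area is exhausted rather than merely rarely hit.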
Casellas, J; Tarrés, J; Piedrafita, J; Varona, L
2006-10-01
Given that correct assumptions on the baseline survival function are critical to the validity of further inferences, specific tools to test the fit of a model to real data are essential for proportional hazards models. To this end, we propose a parametric bootstrap to test the fit of survival models. Monte Carlo simulations are used to generate new data sets from the estimates obtained through the assumed models, and bootstrap intervals can then be established for the survival function over the time span studied. Significant fitting deficiencies are revealed when the real survival function is not included within the bootstrap interval. We tested this procedure on a survival data set of Bruna dels Pirineus beef calves, assuming four parametric models (exponential, Weibull, exponential time-dependent, and Weibull time-dependent) and Cox's semiparametric model. Fitting deficiencies were not observed for Cox's model or the exponential time-dependent model, whereas the Weibull time-dependent model suffered from moderate overestimation at different ages. Thus, the exponential time-dependent model appears preferable because of its correct fit to survival data of beef calves and its smaller computational and time requirements. The exponential and Weibull models were completely rejected due to continuous over- and underestimation of the survival probability. These results highlight the flexibility of parametric models with time-dependent effects, which achieve a fit comparable to nonparametric models.
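The procedure can be sketched for the simplest of the four models, an exponential with no censoring: fit the model, simulate new datasets from it, and form a bootstrap band for the survival function; points of the real survival curve falling outside the band would flag misfit. All data below are simulated for illustration, so the model is correct by construction:

```python
import numpy as np

def parametric_bootstrap_band(times, grid, n_boot=1000, seed=0):
    """95% parametric bootstrap band for S(t) under a fitted exponential model."""
    rng = np.random.default_rng(seed)
    lam = 1.0 / times.mean()                           # exponential MLE (no censoring assumed)
    curves = np.empty((n_boot, len(grid)))
    for b in range(n_boot):
        sim = rng.exponential(1.0 / lam, len(times))   # new dataset from the fitted model
        curves[b] = [np.mean(sim > t) for t in grid]   # its empirical survival function
    lo, hi = np.percentile(curves, [2.5, 97.5], axis=0)
    return lo, hi

rng = np.random.default_rng(4)
times = rng.exponential(5.0, 200)               # synthetic survival times (model is correct)
grid = np.linspace(0.5, 12.0, 8)
lo, hi = parametric_bootstrap_band(times, grid)
emp = np.array([np.mean(times > t) for t in grid])
inside = np.mean((emp >= lo) & (emp <= hi))     # misfit would push points outside the band
```

For the time-dependent and censored models of the paper, only the fitting and simulation steps change; the band construction is identical.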
Introduction of Bootstrap Current Reduction in the Stellarator Optimization Using the Algorithm DAB
International Nuclear Information System (INIS)
Castejón, F.; Gómez-Iglesias, A.; Velasco, J. L.
2015-01-01
This work introduces a new optimization criterion into the DAB (Distributed Asynchronous Bees) code. With this addition, DAB now includes the equilibrium and Mercier stability criteria, the minimization of the B×grad(B) criterion, which ensures the reduction of neoclassical transport and the improvement of the confinement of fast particles, and the reduction of bootstrap current. We started from a neoclassically optimised configuration of the helias type and imposed the reduction of bootstrap current. The resulting configuration presents only a modest reduction of the total bootstrap current, but the local current density is reduced along the minor radius. Further investigations are under way to understand the reason for this modest improvement.
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based...... the recommended decisions) to build rational confidence limits were given. Two NIR datasets were used to study the effect of outliers and bimodal distributions on the bootstrap-based limits. The results showed that bootstrapping can give reasonable estimates of distributions for scores and loadings. It can also...... on assumptions that do not always hold in practice. The aim of this thesis was to illustrate the concept of bootstrap-based confidence estimation in PCA and MSPC. It particularly shows how to build bootstrap-based confidence limits in these areas to be used as alternatives to the traditional/asymptotic limits...
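A sketch of the basic ingredient, assuming a plain row-resampling scheme: bootstrap the samples, recompute the first PCA loading vector, align signs against a reference, and take percentile limits. The sign alignment and the synthetic three-variable data are our own illustrative choices, not the thesis' NIR datasets:

```python
import numpy as np

def pca_loading_ci(X, n_boot=500, alpha=0.05, seed=0):
    """Percentile bootstrap confidence limits for the first PCA loading vector."""
    rng = np.random.default_rng(seed)

    def first_loading(M):
        Mc = M - M.mean(axis=0)
        _, _, Vt = np.linalg.svd(Mc, full_matrices=False)
        v = Vt[0]
        return v if v[np.argmax(np.abs(v))] > 0 else -v   # fix the sign ambiguity

    ref = first_loading(X)
    n = X.shape[0]
    boots = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)            # resample samples (rows) with replacement
        v = first_loading(X[idx])
        boots[b] = v if v @ ref > 0 else -v    # align each bootstrap loading to the reference
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    return ref, lo, hi

rng = np.random.default_rng(5)
scores = rng.normal(0, 3, (100, 1))
X = scores @ np.array([[0.8, 0.5, 0.33]]) + rng.normal(0, 0.2, (100, 3))
ref, lo, hi = pca_loading_ci(X)
```

The same resampling loop can carry any MSPC indicator (T², Q residuals) for which no asymptotic theory is available, which is the thesis' motivation.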
The S-matrix bootstrap. Part I: QFT in AdS
Paulos, Miguel F.; Penedones, Joao; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-01
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Constructing Optimal Prediction Intervals by Using Neural Networks and Bootstrap Method.
Khosravi, Abbas; Nahavandi, Saeid; Srinivasan, Dipti; Khosravi, Rihanna
2015-08-01
This brief proposes an efficient technique for the construction of optimized prediction intervals (PIs) by using the bootstrap technique. The method employs an innovative PI-based cost function in the training of neural networks (NNs) used for estimation of the target variance in the bootstrap method. An optimization algorithm is developed for minimization of the cost function and adjustment of NN parameters. The performance of the optimized bootstrap method is examined for seven synthetic and real-world case studies. It is shown that application of the proposed method improves the quality of constructed PIs by more than 28% over the existing technique, leading to narrower PIs with a coverage probability greater than the nominal confidence level.
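The bootstrap-ensemble construction of PIs can be sketched as follows, with small polynomial regressions standing in for the NNs and a constant residual variance standing in for the NN-estimated target variance (both simplifying assumptions, not the brief's optimized cost-function method):

```python
import numpy as np

def bootstrap_pi(x_tr, y_tr, x_te, n_boot=200, z=1.96, seed=0):
    """Bootstrap-ensemble prediction intervals (polynomial models stand in for NNs)."""
    rng = np.random.default_rng(seed)

    def design(x):
        return np.column_stack([np.ones_like(x), x, x**2])

    preds = np.empty((n_boot, len(x_te)))
    for b in range(n_boot):
        idx = rng.integers(0, len(x_tr), len(x_tr))     # bootstrap the training set
        beta, *_ = np.linalg.lstsq(design(x_tr[idx]), y_tr[idx], rcond=None)
        preds[b] = design(x_te) @ beta
    mean = preds.mean(axis=0)
    model_var = preds.var(axis=0, ddof=1)               # spread of the ensemble = model uncertainty
    beta, *_ = np.linalg.lstsq(design(x_tr), y_tr, rcond=None)
    noise_var = np.mean((y_tr - design(x_tr) @ beta) ** 2)   # residual (data noise) variance
    half = z * np.sqrt(model_var + noise_var)
    return mean - half, mean + half

rng = np.random.default_rng(6)
x = rng.uniform(-2, 2, 300)
y = 1 + 2 * x - 0.5 * x**2 + rng.normal(0, 0.3, 300)
lo, hi = bootstrap_pi(x[:250], y[:250], x[250:])
coverage = np.mean((y[250:] >= lo) & (y[250:] <= hi))   # should approach the nominal 95%
```

The brief's contribution sits in the two pieces this sketch treats naively: a PI-based cost for training the variance model, and an optimizer that narrows the intervals while keeping coverage above nominal.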
Aspect Ratio Scaling of Ideal No-wall Stability Limits in High Bootstrap Fraction Tokamak Plasmas
International Nuclear Information System (INIS)
Menard, J.E.; Bell, M.G.; Bell, R.E.; Gates, D.A.; Kaye, S.M.; LeBlanc, B.P.; Maingi, R.; Sabbagh, S.A.; Soukhanovskii, V.; Stutman, D.
2003-01-01
Recent experiments in the low aspect ratio National Spherical Torus Experiment (NSTX) [M. Ono et al., Nucl. Fusion 40 (2000) 557] have achieved normalized beta values twice the conventional tokamak limit at low internal inductance and with significant bootstrap current. These experimental results have motivated a computational re-examination of the plasma aspect ratio dependence of ideal no-wall magnetohydrodynamic stability limits. These calculations find that the profile-optimized no-wall stability limit in high bootstrap fraction regimes is well described by a nearly aspect ratio invariant normalized beta parameter utilizing the total magnetic field energy density inside the plasma. However, the scaling of normalized beta with internal inductance is found to be strongly aspect ratio dependent at sufficiently low aspect ratio. These calculations and detailed stability analyses of experimental equilibria indicate that the nonrotating plasma no-wall stability limit has been exceeded by as much as 30% in NSTX in a high bootstrap fraction regime.
arXiv The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F.; Toledo, Jonathan; van Rees, Balt C.; Vieira, Pedro
2017-11-21
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Closure of the operator product expansion in the non-unitary bootstrap
Energy Technology Data Exchange (ETDEWEB)
Esterlis, Ilya [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States); Fitzpatrick, A. Liam [Department of Physics, Boston University,Commonwealth Ave, Boston, MA, 02215 (United States); Ramirez, David M. [Stanford Institute for Theoretical Physics, Stanford University,Via Pueblo, Stanford, CA 94305 (United States)
2016-11-07
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in http://arxiv.org/abs/1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the “Gliozzi” bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things
Directory of Open Access Journals (Sweden)
Dan Garcia-Carrillo
2016-03-01
Full Text Available The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects in this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
A bootstrap test for comparing two variances: simulation of size and power in small samples.
Sun, Jiajing; Chernick, Michael R; LaBudde, Robert A
2011-11-01
An F statistic was proposed by Good and Chernick (1993), in an unpublished paper, to test the hypothesis of the equality of variances from two independent groups using the bootstrap; see Hall and Padmanabhan (1997) for a published reference in which Good and Chernick (1993) is discussed. We look at various forms of bootstrap tests that use the F statistic to see whether any or all of them maintain the nominal size of the test over a variety of population distributions when the sample size is small. Chernick and LaBudde (2010) and Schenker (1985) showed that bootstrap confidence intervals for variances tend to provide considerably less coverage than their theoretical asymptotic coverage for skewed population distributions such as a chi-squared with 10 degrees of freedom or less or a log-normal distribution. The same difficulties may also be expected when looking at the ratio of two variances. Since bootstrap tests are related to constructing confidence intervals for the ratio of variances, we simulated the performance of these tests when the population distributions are gamma(2,3), uniform(0,1), Student's t with 10 degrees of freedom (df), normal(0,1), and log-normal(0,1), similar to those used in Chernick and LaBudde (2010). We find, surprisingly, that the results for the size of the tests are valid (reasonably close to the nominal value) for all the various bootstrap tests. Hence we also conducted a power comparison, and we find that bootstrap tests appear to have reasonable power for testing equivalence of variances.
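One common way to carry out such a bootstrap test of equal variances — resampling centered, pooled observations to approximate the null distribution of the (log) F ratio — can be sketched as follows. This pooled-null scheme and the synthetic data are illustrative choices, not necessarily the exact test of Good and Chernick (1993):

```python
import numpy as np

def bootstrap_variance_test(x, y, n_boot=2000, seed=0):
    """Bootstrap test of H0: var(x) = var(y) based on the log of the F ratio."""
    rng = np.random.default_rng(seed)
    t_obs = abs(np.log(x.var(ddof=1) / y.var(ddof=1)))
    pooled = np.concatenate([x - x.mean(), y - y.mean()])   # centered and pooled under H0
    count = 0
    for _ in range(n_boot):
        xb = rng.choice(pooled, len(x), replace=True)       # both groups drawn from the
        yb = rng.choice(pooled, len(y), replace=True)       # same pooled null population
        if abs(np.log(xb.var(ddof=1) / yb.var(ddof=1))) >= t_obs:
            count += 1
    return (count + 1) / (n_boot + 1)                       # two-sided bootstrap p-value

rng = np.random.default_rng(7)
same = bootstrap_variance_test(rng.normal(0, 1, 25), rng.normal(5, 1, 25))  # equal variances
diff = bootstrap_variance_test(rng.normal(0, 1, 25), rng.normal(0, 3, 25))  # sd 1 vs sd 3
```

Because the null population is built from the observed (centered) data, the scheme inherits the skewness that defeats normal-theory F tests in the distributions studied above.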
Neoclassical tearing dynamo and self-sustainment of a bootstrapped tokamak
International Nuclear Information System (INIS)
Bhattacharjee, A.; Yuan, Y.
1993-01-01
It has been suggested by Boozer that a completely bootstrapped tokamak which requires no seed current is possible due to the "dynamo effect" caused by tearing modes. Numerical calculations have been carried out by Weening and Boozer confirming the feasibility of a completely bootstrapped tokamak. These calculations use the resistive MHD model, with the pressure profile held arbitrarily fixed. Several questions naturally arise. Is resistive MHD a good model in the low-collisionality regime of present-day tokamaks in which large bootstrap currents have been observed? Is it consistent to rely on pressure gradients to provide the bootstrap current, but then omit pressure gradients in investigating the tearing instabilities that provide the dynamo effect? And how realistic is it to assume that a strong pressure gradient is sustainable in the central region where current relaxation is expected to produce a dynamo effect? In this paper, the authors investigate the dynamo effect in a bootstrapped tokamak within the framework of the neoclassical MHD model, which is more realistic than resistive MHD for the regime in question. Since neoclassical MHD includes trapped-particle effects, it can, in principle, provide an additional mechanism for exciting tearing modes, which are known to be stabilized by temperature gradients. They investigate the properties of the dynamo field ε and find that the original definition ε = ⟨v₁ × b₁⟩ used in incompressible resistive MHD is no longer adequate; neoclassical MHD forces a redefinition of ε due to the requirements imposed by the helicity conservation constraint. Thus a completely steady-state bootstrapped tokamak sustained by a neoclassical tearing dynamo is realizable. However, they are pessimistic that such a tokamak, even if it were resistively stable, would be stable to ideal kink modes.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects in this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction of the message length.
Larriba, Yolanda; Rueda, Cristina; Fernández, Miguel A; Peddada, Shyamal D
2018-01-01
Motivation: Gene-expression data obtained from high-throughput technologies are subject to various sources of noise, and accordingly the raw data are pre-processed before being formally analyzed. Normalization of the data is a key pre-processing step, since it removes systematic variations across arrays. There are numerous normalization methods available in the literature. Based on our experience, in the context of oscillatory systems such as the cell cycle, the circadian clock, etc., the choice of normalization method may substantially impact the determination of a gene as rhythmic. Thus the rhythmicity of a gene can purely be an artifact of how the data were normalized. Since the determination of rhythmic genes is an important component of modern toxicological and pharmacological studies, it is important to identify truly rhythmic genes that are robust to the choice of normalization method. Results: In this paper we introduce a rhythmicity measure and a bootstrap methodology to detect rhythmic genes in an oscillatory system. Although the proposed methodology can be used for any high-throughput gene expression data, in this paper we illustrate it using several publicly available circadian clock microarray gene-expression datasets. We demonstrate that the choice of normalization method has very little effect on the proposed methodology. Specifically, for any pair of normalization methods considered in this paper, the resulting values of the rhythmicity measure are highly correlated. This suggests that the proposed measure is robust to the choice of normalization method, and consequently that the rhythmicity of a gene is potentially not a mere artifact of the normalization method used. Lastly, as demonstrated in the paper, the proposed bootstrap methodology can also be used to simulate data for genes participating in an oscillatory system using a reference dataset.
Availability: A user friendly code implemented in R language can be downloaded from http://www.eio.uva.es/~miguel/robustdetectionprocedure.html.
Directory of Open Access Journals (Sweden)
Paula Cristiane Pinto Mesquita Pardal
2015-01-01
Full Text Available The paper discusses techniques for managing an implementation issue that often arises in the application of particle filters: sample impoverishment. Dealing with this problem can significantly improve the performance of particle filters and can make the difference between success and failure. Sample impoverishment occurs because of the reduction in the number of truly distinct sample values. A simple solution is to increase the number of particles, but this can quickly lead to unreasonable computational demands and only delays the inevitable impoverishment. There are more intelligent ways of dealing with the problem, such as roughening and prior editing, the procedures discussed herein. The nonlinear particle filter is based on the bootstrap filter for implementing recursive Bayesian filters. The application consists of determining the orbit of an artificial satellite using real data from GPS receivers. The standard differential equations describing the orbital motion and the GPS measurement equations are adapted for the nonlinear particle filter, so that the bootstrap algorithm is also used for estimating the orbital state. The evaluation considers convergence speed and computational implementation complexity, comparing the bootstrap algorithm results obtained with each technique that deals with sample impoverishment.
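A minimal bootstrap particle filter with roughening, on a 1-D random-walk state rather than orbital dynamics: after resampling, a small jitter is added to the particles to restore the sample diversity that impoverishment destroys. The jitter rule (Gordon-style, spread-scaled) and all parameters are illustrative assumptions:

```python
import numpy as np

def bootstrap_filter(obs, n_part=500, q=0.1, r=0.5, k_rough=0.2, seed=0):
    """Bootstrap particle filter with roughening for a 1-D random-walk state."""
    rng = np.random.default_rng(seed)
    parts = rng.normal(0.0, 1.0, n_part)
    est = []
    for z in obs:
        parts = parts + rng.normal(0.0, q, n_part)      # propagate through the dynamics
        w = np.exp(-0.5 * ((z - parts) / r) ** 2)       # Gaussian likelihood weights
        w /= w.sum()
        parts = parts[rng.choice(n_part, n_part, p=w)]  # multinomial resampling
        # Roughening: jitter the resampled particles to restore sample diversity
        spread = parts.max() - parts.min()
        parts = parts + rng.normal(0.0, k_rough * spread / n_part, n_part)
        est.append(parts.mean())
    return np.array(est)

rng = np.random.default_rng(8)
truth = np.cumsum(rng.normal(0.0, 0.1, 100))            # hidden random-walk state
obs = truth + rng.normal(0.0, 0.5, 100)                 # noisy measurements
est = bootstrap_filter(obs)
rmse_filter = float(np.sqrt(np.mean((est - truth) ** 2)))
rmse_raw = float(np.sqrt(np.mean((obs - truth) ** 2)))
```

Without the roughening step, repeated resampling of a low-noise model collapses the cloud onto a few duplicated values; the jitter trades a little extra variance for continued diversity.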
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio
2014-01-01
measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study...... the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components as we show by tests on a hollow cylinder workpiece....
Improving Web Learning through model Optimization using Bootstrap for a Tour-Guide Robot
Directory of Open Access Journals (Sweden)
Rafael León
2012-09-01
Full Text Available We review Web Mining techniques and describe a bootstrap statistics methodology applied to pattern model classifier optimization and verification for supervised learning in the knowledge-repository management of a tour-guide robot. It is virtually impossible to thoroughly test Web page classifiers and many other Internet applications with purely empirical data, due to the need for human intervention to generate training and test sets. We propose using the computer-based bootstrap paradigm to design a test environment in which these classifiers can be checked with better reliability.
Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec
2008-07-01
Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatial variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low-activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low-activation factors, and when no corrections were included in the reconstruction algorithm.
El-Showk, Sheer; Poland, David; Rychkov, Slava; Simmons-Duffin, David; Vichi, Alessandro
2014-01-01
We use the conformal bootstrap to perform a precision study of the operator spectrum of the critical 3d Ising model. We conjecture that the 3d Ising spectrum minimizes the central charge c in the space of unitary solutions to crossing symmetry. Because extremal solutions to crossing symmetry are uniquely determined, we are able to precisely reconstruct the first several Z2-even operator dimensions and their OPE coefficients. We observe that a sharp transition in the operator spectrum occurs at the 3d Ising dimension Delta_sigma=0.518154(15), and find strong numerical evidence that operators decouple from the spectrum as one approaches the 3d Ising point. We compare this behavior to the analogous situation in 2d, where the disappearance of operators can be understood in terms of degenerate Virasoro representations.
Native Frames: Disentangling Sequential from Concerted Three-Body Fragmentation
Rajput, Jyoti; Severt, T.; Berry, Ben; Jochim, Bethany; Feizollah, Peyman; Kaderiya, Balram; Zohrabi, M.; Ablikim, U.; Ziaee, Farzaneh; Raju P., Kanaka; Rolles, D.; Rudenko, A.; Carnes, K. D.; Esry, B. D.; Ben-Itzhak, I.
2018-03-01
A key question concerning the three-body fragmentation of polyatomic molecules is the distinction of sequential and concerted mechanisms, i.e., the stepwise or simultaneous cleavage of bonds. Using laser-driven fragmentation of OCS into O⁺ + C⁺ + S⁺ and employing coincidence momentum imaging, we demonstrate a novel method that enables the clear separation of sequential and concerted breakup. The separation is accomplished by analyzing the three-body fragmentation in the native frame associated with each step and taking advantage of the rotation of the intermediate molecular fragment, CO²⁺ or CS²⁺, before its unimolecular dissociation. This native-frame method works for any projectile (electrons, ions, or photons), provides details on each step of the sequential breakup, and enables the retrieval of the relevant spectra for sequential and concerted breakup separately. Specifically, this allows the determination of the branching ratio of all these processes in OCS³⁺ breakup. Moreover, we find that the first step of sequential breakup is tightly aligned along the laser polarization and identify the likely electronic states of the intermediate dication that undergo unimolecular dissociation in the second step. Finally, the separated concerted breakup spectra show clearly that the central carbon atom is preferentially ejected perpendicular to the laser field.
Random sequential adsorption on fractals.
Ciesla, Michal; Barbasz, Jakub
2012-07-28
Irreversible adsorption of spheres on flat collectors of dimension d is studied: on self-similar fractals (1 < d < 2) and on generalized Cantor sets (d < 1). The adsorption process is modeled numerically using the random sequential adsorption (RSA) algorithm. The paper concentrates on the measurement of fundamental properties of the coverages, i.e., the maximal random coverage ratio and the density autocorrelation function, as well as RSA kinetics. The results obtained allow us to improve the phenomenological relation between the maximal random coverage ratio and the collector dimension. Moreover, the simulations show that, in general, most of the known dimensional properties of adsorbed monolayers remain valid for non-integer dimensions.
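The RSA algorithm itself is simple enough to sketch in the integer-dimension d = 1 case (parameters and the line collector are illustrative; the paper runs the same accept/reject loop on fractal collectors):

```python
import numpy as np

def rsa_line(L=100.0, attempts=50_000, seed=0):
    """Random sequential adsorption of unit-length segments on [0, L]
    (a d = 1 collector): drop segment centres uniformly at random and
    accept only those that do not overlap previously adsorbed segments."""
    rng = np.random.default_rng(seed)
    centres = []
    for _ in range(attempts):
        c = rng.uniform(0.5, L - 0.5)
        if all(abs(c - p) >= 1.0 for p in centres):   # overlap check
            centres.append(c)
    return len(centres) / L   # random coverage ratio

cov = rsa_line()
print(cov)   # climbs toward Renyi's d = 1 jamming limit of about 0.7476
```

Tracking the coverage ratio against the number of attempts gives the RSA kinetics, and the jammed-state value as a function of collector dimension is exactly the relation the paper refines.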
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
Warton, David I; Thibaut, Loïc; Wang, Yi Alice
2017-01-01
Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
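The core PIT-trap step for a Poisson response can be sketched as follows. This is only the residual-resampling kernel under assumed fitted means; the full method of Warton et al. refits the model on every resample and, for multivariate data, resamples whole rows of PIT-residuals. The Poisson cdf/quantile helpers are hand-rolled here to keep the sketch self-contained:

```python
import numpy as np

def pois_cdf(k, mu):
    """P(Y <= k) for Y ~ Poisson(mu), elementwise; k = -1 gives 0."""
    k = np.asarray(k, dtype=int)
    cdf = np.zeros(k.shape)
    pmf = np.full(k.shape, np.exp(-mu))
    for j in range(int(k.max()) + 1):
        cdf = cdf + np.where(k >= j, pmf, 0.0)
        pmf = pmf * mu / (j + 1)
    return cdf

def pois_ppf(u, mu):
    """Smallest k with P(Y <= k) >= u, elementwise (0 <= u < 1)."""
    k = np.zeros(u.shape, dtype=int)
    pmf = np.full(u.shape, np.exp(-mu))
    cdf = pmf.copy()
    while np.any(cdf < u):
        mask = cdf < u
        pmf = np.where(mask, pmf * mu / (k + 1), pmf)
        cdf = np.where(mask, cdf + pmf, cdf)
        k = np.where(mask, k + 1, k)
    return k

def pit_trap_resample(y, mu, rng):
    """One PIT-trap resample for Poisson responses with fitted means mu."""
    # PIT residuals: uniform draws on (F(y-1), F(y)) under the fitted marginal
    u = rng.uniform(pois_cdf(y - 1, mu), pois_cdf(y, mu))
    # resample the (asymptotically pivotal) residuals with replacement
    u_star = rng.choice(u, size=u.size, replace=True)
    # invert through the fitted marginal to get bootstrap responses
    return pois_ppf(u_star, mu)

rng = np.random.default_rng(0)
mu = 3.0
y = rng.poisson(mu, 200)
y_star = pit_trap_resample(y, mu, rng)
```

Because the resampled responses are pushed back through the fitted marginal F, they live on the same discrete support as the data, which is the "marginal distribution preserved under PIT-trapping" property the abstract emphasizes.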
A model for bootstrap current calculations with bounce averaged Fokker-Planck codes
Westerhof, E.; Peeters, A.G.
1996-01-01
A model is presented that allows the calculation of the neoclassical bootstrap current originating from the radial electron density and pressure gradients in standard (2+1)D bounce averaged Fokker-Planck codes. The model leads to an electron momentum source located almost exclusively at the
Barrera, Begoña Barrios; Figalli, Alessio; Valdinoci, Enrico
2012-01-01
We prove that $C^{1,\\alpha}$ $s$-minimal surfaces are automatically $C^\\infty$. For this, we develop a new bootstrap regularity theory for solutions of integro-differential equations of very general type, which we believe is of independent interest.
Zhang, Guangjian; Preacher, Kristopher J.; Luo, Shanhong
2010-01-01
This article is concerned with using the bootstrap to assign confidence intervals for rotated factor loadings and factor correlations in ordinary least squares exploratory factor analysis. Coverage performances of "SE"-based intervals, percentile intervals, bias-corrected percentile intervals, bias-corrected accelerated percentile…
Computing Robust, Bootstrap-Adjusted Fit Indices for Use with Nonnormal Data
Walker, David A.; Smith, Thomas J.
2017-01-01
Nonnormality of data presents unique challenges for researchers who wish to carry out structural equation modeling. The subsequent SPSS syntax program computes bootstrap-adjusted fit indices (comparative fit index, Tucker-Lewis index, incremental fit index, and root mean square error of approximation) that adjust for nonnormality, along with the…
Bootstrapping Malmquist indices for Danish seiners in the North Sea and Skagerrak
DEFF Research Database (Denmark)
Hoff, Ayoe
2006-01-01
DEA scores or related parameters. The bootstrap method for estimating confidence intervals of deterministic parameters can however be applied to estimate confidence intervals for DEA scores. This method is applied in the present paper for assessing TFP changes between 1987 and 1999 for the fleet...
Bootstrapping Multifractals: Surrogate Data from Random Cascades on Wavelet Dyadic Trees
Czech Academy of Sciences Publication Activity Database
Paluš, Milan
2008-01-01
Roč. 101, č. 13 (2008), 134101-1-134101-4 ISSN 0031-9007 EU Projects: European Commission(XE) 517133 - BRACCIA Grant - others:GA AV ČR(CZ) 1ET110190504 Institutional research plan: CEZ:AV0Z10300504 Keywords : multifractal * bootstrap * hypothesis testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 7.180, year: 2008
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
Bootstrap Approach To Compare the Slopes of Two Calibrations When Few Standards Are Available.
Estévez-Pérez, Graciela; Andrade, Jose M; Wilcox, Rand R
2016-02-16
Comparing the slopes of aqueous-based and standard-addition calibration procedures is almost a daily task in analytical laboratories. As the usual protocols involve very few standards, sound statistical inference and conclusions are hard to obtain with classical tests (e.g., the t-test), which may greatly affect decision-making. Thus, there is a need for robust statistics that are not distorted by the small samples of experimental values obtained in analytical studies. Several promising alternatives based on bootstrapping are studied in this paper under the typical constraints of laboratory work. The impact of the number of standards, homoscedasticity or heteroscedasticity, three variance patterns, and three error distributions on least-squares fits was considered (in total, 144 simulation scenarios). The Student's t-test is the most valuable procedure when the normality assumption holds and homoscedasticity is present, although it can be highly affected by outliers. A wild bootstrap method leads to average rejection percentages that are closer to the nominal level in almost every situation, and it is recommended for laboratories working with a small number of standards. Finally, the Theil-Sen percentile bootstrap statistic was seen to be very robust, although its rejection percentages depart from the nominal ones.
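A wild-bootstrap slope comparison of the kind recommended above can be sketched as follows. The Rademacher weighting scheme and the centred-difference p-value are one standard wild-bootstrap recipe, not necessarily the authors' exact implementation:

```python
import numpy as np

def slope(x, y):
    xm, ym = x.mean(), y.mean()
    return np.sum((x - xm) * (y - ym)) / np.sum((x - xm) ** 2)

def wild_boot_slope_test(x1, y1, x2, y2, n_boot=2000, seed=0):
    """Wild-bootstrap p-value for H0: the two calibration slopes are
    equal.  Residuals are multiplied by Rademacher weights (+/-1),
    which preserves any heteroscedastic error pattern."""
    rng = np.random.default_rng(seed)
    b1, b2 = slope(x1, y1), slope(x2, y2)
    d_obs = b1 - b2
    r1 = y1 - y1.mean() - b1 * (x1 - x1.mean())    # OLS residuals, line 1
    r2 = y2 - y2.mean() - b2 * (x2 - x2.mean())    # OLS residuals, line 2
    count = 0
    for _ in range(n_boot):
        v1 = rng.choice([-1.0, 1.0], size=len(y1))
        v2 = rng.choice([-1.0, 1.0], size=len(y2))
        d_star = slope(x1, b1 * x1 + r1 * v1) - slope(x2, b2 * x2 + r2 * v2)
        if abs(d_star - d_obs) >= abs(d_obs):      # centred bootstrap diffs
            count += 1
    return count / n_boot

# two six-point calibrations (illustrative concentrations and signals)
x = np.array([0.0, 1, 2, 3, 4, 5])
rng = np.random.default_rng(42)
y1 = 2.0 * x + 1.0 + rng.normal(0, 0.02, x.size)
y2 = 4.0 * x + 1.0 + rng.normal(0, 0.02, x.size)   # clearly different slope
p_diff = wild_boot_slope_test(x, y1, x, y2)
y3 = 2.0 * x + 0.5 + rng.normal(0, 0.02, x.size)   # same slope as y1
p_same = wild_boot_slope_test(x, y1, x, y3)
```

With only six standards per line, the bootstrap distribution is built from the data themselves rather than from a normality assumption, which is the point of the paper's recommendation.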
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size. © The Author(s) 2016.
A common-gate bootstrapped CMOS rectifier for VHF isolated DC-DC converter
Pan, Dongfang; Zhang, Feng; Huang, Lu; Li, Jinliang
2017-06-01
A common-gate bootstrapped CMOS rectifier dedicated to VHF (very high frequency) isolated DC-DC converters is proposed. It uses a common-gate bootstrap technique to compensate for the power loss due to the threshold voltage and to solve the reflux problem of the conventional rectifier circuit. As a result, it improves the power conversion efficiency (PCE) and voltage conversion ratio (VCR). The design saves almost 90% of the area compared to a previously reported double-capacitor structure. In addition, we compare the previous rectifier with the proposed common-gate bootstrapped rectifier for the same area; simulation results show that the PCE and VCR of the proposed structure are superior to those of other structures. The proposed common-gate bootstrapped rectifier was fabricated in a CSMC 0.5 μm BCD process. The measured maximum PCE is 86% and the VCR reaches 77% at an operating frequency of 20 MHz. The average PCE is about 79% and the average VCR reaches 71% in the frequency range of 30-70 MHz. The measured PCE and VCR are improved compared to previous results.
Forward Kinematic Analysis of Tip-Tilt-Piston Parallel Manipulator using Secant-Bootstrap Method
Majidian, A.; Amani, A.; Golipour, M.; Amraei, A.
2014-01-01
This paper deals with the application of the Secant-Bootstrap Method (SBM) to solve the closed-form forward kinematics of a new three-degree-of-freedom (DOF) parallel manipulator with inextensible limbs and base-mounted actuators. The manipulator has higher resolution and precision than the existing
Czech Academy of Sciences Publication Activity Database
Kyselý, Jan
2008-01-01
Roč. 47, č. 12 (2008), s. 3236-3251 ISSN 1558-8424 R&D Projects: GA ČR GA205/06/1535 Institutional research plan: CEZ:AV0Z30420517 Keywords : bootstrap * resampling * extreme value analysis * Generalized extreme value distribution * Gumbel distribution Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 1.761, year: 2008
Maximum non-extensive entropy block bootstrap for non-stationary processes
Czech Academy of Sciences Publication Activity Database
Bergamelli, M.; Novotný, Jan; Urga, G.
2015-01-01
Roč. 91, 1/2 (2015), s. 115-139 ISSN 0001-771X R&D Projects: GA ČR(CZ) GA14-27047S Institutional support: RVO:67985998 Keywords : maximum entropy * bootstrap * Monte Carlo simulations Subject RIV: AH - Economics
A bootstrap procedure to select hyperspectral wavebands related to tannin content
Ferwerda, J.G.; Skidmore, A.K.; Stein, A.
2006-01-01
Detection of hydrocarbons in plants with hyperspectral remote sensing is hampered by overlapping absorption pits, while the `optimal' wavebands for detecting some surface characteristics (e.g. chlorophyll, lignin, tannin) may shift. We combined a phased regression with a bootstrap procedure to find
Abrupt change in mean using block bootstrap and avoiding variance estimation
Czech Academy of Sciences Publication Activity Database
Peštová, Barbora; Pešta, M.
2018-01-01
Roč. 33, č. 1 (2018), s. 413-441 ISSN 0943-4062 Grant - others:GA ČR(CZ) GJ15-04774Y Institutional support: RVO:67985807 Keywords : Block bootstrap * Change in mean * Change point * Hypothesis testing * Ratio type statistics * Robustness Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.434, year: 2016
International Nuclear Information System (INIS)
Li, Ke; Lin, Boqiang
2017-01-01
This paper proposes a total-factor energy consumption performance index (TEPI) for measuring China's energy efficiency across 30 provinces during the period 1997 to 2012. The TEPI is derived by solving an improved non-radial data envelopment analysis (DEA) model, which is based on an energy distance function. The production possibility set is constructed by combining the super-efficiency and sequential DEA models to avoid the “discriminating power problem” and “technical regress”. In order to explore the impacts of technological progress on the TEPI and perform statistical inference on the results, a two-stage double bootstrap approach is adopted. The important findings are that China's energy technology innovation produces a negative effect on the TEPI, while technology import and imitative innovation produce positive effects. Thus, the main contributor to TEPI improvement is technology import. These conclusions imply that technology import, especially foreign direct investment (FDI), is important for imitative innovation and can improve China's energy efficiency. In the long run, as the technical level of China approaches the frontier, energy technology innovation and its wide adoption will become a sustained way to improve energy efficiency. Therefore, it is urgent for China to introduce measures such as technology translation and spillover policies as well as energy pricing reforms to support energy technology innovation. - Highlights: • A total-factor energy consumption performance index (TEPI) is introduced. • Three types of technological progress have various effects on TEPI. • FDI is the main contributor to TEPI improvement. • An improved DEA calculation method is introduced. • A two-stage double-bootstrap non-radial DEA model is used.
Multilevel sequential Monte-Carlo samplers
Jasra, Ajay
2016-01-05
Multilevel Monte-Carlo methods provide a powerful technique for reducing the computational cost of estimating expectations to a given accuracy. They are particularly relevant for computational problems in which approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretically exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even though samples at all resolutions are now correlated.
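The plain (non-SMC) multilevel idea referred to above can be sketched for a scalar SDE: estimate the coarsest level directly, then add corrections that couple each resolution to the next coarser one through shared Brownian increments. Sample sizes, the GBM test problem, and the level schedule are illustrative choices, not the paper's SMC construction:

```python
import numpy as np

def mlmc_gbm(n_levels=4, n0=20_000, T=1.0, mu=0.05, sigma=0.2, seed=0):
    """Multilevel Monte-Carlo estimate of E[X_T] for the SDE
    dX = mu X dt + sigma X dW (X_0 = 1), using Euler steps of size
    T / 2^l and coupling level l with level l-1 via shared increments."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for l in range(n_levels):
        n = max(n0 // 2 ** l, 1000)       # fewer samples on finer levels
        nf = 2 ** l                       # fine time steps on level l
        dt = T / nf
        dW = rng.normal(0.0, np.sqrt(dt), (n, nf))
        xf = np.ones(n)
        for k in range(nf):               # fine Euler path
            xf = xf + mu * xf * dt + sigma * xf * dW[:, k]
        if l == 0:
            total += xf.mean()            # coarsest-level estimate
        else:
            # coarse path driven by the SAME noise, increments summed in pairs
            dWc = dW[:, 0::2] + dW[:, 1::2]
            xc = np.ones(n)
            for k in range(nf // 2):
                xc = xc + mu * xc * (2 * dt) + sigma * xc * dWc[:, k]
            total += (xf - xc).mean()     # telescoping correction term
    return total                          # true value: exp(mu*T) ~ 1.0513

est = mlmc_gbm()
print(est)
```

Because the fine and coarse paths share their randomness, each correction term has small variance and needs far fewer samples than a direct fine-level estimate; the paper's contribution is achieving the analogous coupling within SMC samplers.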
Calculating Power by Bootstrap, with an Application to Cluster-Randomized Trials.
Kleinman, Ken; Huang, Susan S
2016-01-01
A key requirement for a useful power calculation is that the calculation mimic the data analysis that will be performed on the actual data, once that data is observed. Close approximations may be difficult to achieve using analytic solutions, however, and thus Monte Carlo approaches, including both simulation and bootstrap resampling, are often attractive. One setting in which this is particularly true is cluster-randomized trial designs. However, Monte Carlo approaches are useful in many additional settings as well. Calculating power for cluster-randomized trials using analytic or simulation-based methods is frequently unsatisfactory due to the complexity of the data analysis methods to be employed and to the sparseness of data to inform the choice of important parameters in these methods. We propose that among Monte Carlo methods, bootstrap approaches are most likely to generate data similar to the observed data. In bootstrap approaches, real data are resampled to build complete data sets based on real data that resemble the data for the intended analyses. In contrast, simulation methods would use the real data to estimate parameters for the data, and would then simulate data using these parameters. We describe means of implementing bootstrap power calculation. We demonstrate bootstrap power calculation for a cluster-randomized trial with a censored survival outcome and a baseline observation period. Bootstrap power calculation is a natural application of resampling methods. It provides a relatively simple solution to power calculation that is likely to be more accurate than analytic solutions or simulation-based calculations, in the sense that the bootstrap approach does not rely on the assumptions inherent in analytic calculations. This method of calculation has several important strengths. Notably, it is simple to achieve great fidelity to the proposed data analysis method and there is no requirement for parameter estimates, or estimates of their variability
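A toy version of the bootstrap power calculation described above: both arms are resampled from pilot (control) data, the hypothesized effect is added to one arm, and the planned test is run on each resample. The two-arm z-style comparison and all parameter values are illustrative assumptions; the paper's application uses clustered survival data and its actual analysis model:

```python
import numpy as np

def bootstrap_power(pilot, effect, n_per_arm, n_sims=1000, alpha_z=1.96, seed=0):
    """Bootstrap power: fraction of resampled trials in which the
    planned analysis rejects at the |z| > alpha_z threshold."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        a = rng.choice(pilot, n_per_arm, replace=True)           # control arm
        b = rng.choice(pilot, n_per_arm, replace=True) + effect  # + effect
        se = np.sqrt(a.var(ddof=1) / n_per_arm + b.var(ddof=1) / n_per_arm)
        if abs(b.mean() - a.mean()) / se > alpha_z:
            rejections += 1
    return rejections / n_sims

pilot = np.random.default_rng(1).normal(10.0, 2.0, 50)   # illustrative pilot data
power = bootstrap_power(pilot, effect=1.0, n_per_arm=100)
null_rate = bootstrap_power(pilot, effect=0.0, n_per_arm=100)
print(power, null_rate)
```

Because the resamples are built from real data, the calculation inherits whatever skewness or heavy tails the pilot sample carries, which is exactly the fidelity-to-the-planned-analysis argument the abstract makes against purely analytic power formulas.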
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors...
Energy Technology Data Exchange (ETDEWEB)
Silva, Cleomacio Miguel da; Amaral, Romilton dos Santos; Santos Junior, Jose Araujo dos; Vieira, Jose Wilson; Leoterio, Dilmo Marques da Silva [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Radioecologia (RAE)], E-mail: cleomaciomiguel@yahoo.com.br; Amaral, Ademir [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Estudos em Radioprotecao e Radioecologia
2007-07-01
The distribution of natural radionuclides in samples from typically anomalous environments generally shows great asymmetry, as a result of outliers. To diminish statistical fluctuation, researchers in radioecology commonly use the geometric mean or the median, since the arithmetic mean is not stable under the effect of outliers. As the median is not affected by anomalous values, this parameter of central tendency is the one most frequently employed for evaluating a set of data containing discrepant values. On the other hand, Efron presented a non-parametric method, the so-called bootstrap, that can be used to decrease the dispersion around the central-tendency value. Generally, in radioecology, statistical procedures are used to reduce the effect of anomalous values on averages. In this context, the present study aimed to evaluate the application of the non-parametric bootstrap method (BM) for determining the average concentration of {sup 226}Ra in forage palms (Opuntia spp.) cultivated in soils with a uranium anomaly on dairy farms located in the municipalities of Pedra and Venturosa, Pernambuco, Brazil, as well as to discuss the utilization of this method in radioecology. The results for {sup 226}Ra in samples of forage palm varied from 1,300 to 25,000 mBq.kg{sup -1} (dry matter), with an arithmetic mean of 5,965.86 +- 5,903.05 mBq.kg{sup -1}. The result obtained for this average using the BM was 5,963.82 +- 1,202.96 mBq.kg{sup -1} (dry matter). The use of the BM allowed an automatic filtration of the experimental data, without the elimination of outliers, leading to a reduction of the dispersion around the average. As a result, the BM permitted reaching an arithmetic average that is stable against the effects of the outliers. (author)
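The Efron bootstrap used in that study amounts to resampling the measured concentrations with replacement and averaging the resample means. A minimal sketch (the concentration values below are illustrative, not the paper's data):

```python
import numpy as np

def bootstrap_mean(x, n_boot=5000, seed=0):
    """Nonparametric (Efron) bootstrap of the arithmetic mean: returns
    the bootstrap estimate of the mean and its bootstrap standard error,
    without discarding any outliers."""
    rng = np.random.default_rng(seed)
    means = np.array([rng.choice(x, len(x), replace=True).mean()
                      for _ in range(n_boot)])
    return means.mean(), means.std(ddof=1)

# skewed sample with one outlier, in mBq/kg dry matter (illustrative)
ra226 = np.array([1300, 2100, 2900, 3500, 4100, 4800, 5600, 6700, 8900, 25000.0])
est, se = bootstrap_mean(ra226)
print(est, se)
```

The bootstrap estimate stays close to the sample mean, while its dispersion (the standard error of the mean) is far smaller than the raw standard deviation, mirroring the reduction in dispersion the authors report.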
The Bacterial Sequential Markov Coalescent.
De Maio, Nicola; Wilson, Daniel J
2017-05-01
Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e., using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding as it needs to account for the correlations of evolutionary histories of even distant loci. So, with the increasing popularity of whole genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on a similar idea to the sequential Markov coalescent (SMC)-an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is
Immediate Sequential Bilateral Cataract Surgery
DEFF Research Database (Denmark)
Kessel, Line; Andresen, Jens; Erngaard, Ditte
2015-01-01
The aim of the present systematic review was to examine the benefits and harms associated with immediate sequential bilateral cataract surgery (ISBCS) with specific emphasis on the rate of complications, postoperative anisometropia, and subjective visual function in order to formulate evidence......-based national Danish guidelines for cataract surgery. A systematic literature review in PubMed, Embase, and Cochrane central databases identified three randomized controlled trials that compared outcome in patients randomized to ISBCS or bilateral cataract surgery on two different dates. Meta-analyses were...... performed using the Cochrane Review Manager software. The quality of the evidence was assessed using the GRADE method (Grading of Recommendation, Assessment, Development, and Evaluation). We did not find any difference in the risk of complications or visual outcome in patients randomized to ISBCS or surgery...
Susko, Edward
2010-07-01
The most frequent measure of phylogenetic uncertainty for splits is bootstrap support. Although large bootstrap support intuitively suggests that a split in a tree is well supported, it has not been clear how large bootstrap support needs to be to conclude that there is significant evidence that a hypothesized split is present. Indeed, recent work has shown that bootstrap support is not first-order correct and thus cannot be directly used for hypothesis testing. We present methods that adjust bootstrap support values in a maximum likelihood (ML) setting so that they have an interpretation corresponding to P values in conventional hypothesis testing; for instance, adjusted bootstrap support larger than 95% occurs only 5% of the time if the split is not present. Through examples and simulation settings, it is found that adjustments always increase the level of support. We also find that the nature of the adjustment is fairly constant across parameter settings. Finally, we consider adjustments that take into account the data-dependent nature of many hypotheses about splits: the hypothesis that they are present is being tested because they are in the tree estimated through ML. Here, in contrast, we find that bootstrap probability often needs to be adjusted downwards.
Directory of Open Access Journals (Sweden)
Müller Kai F
2005-10-01
Full Text Available Background: For parsimony analyses, the most common way to estimate confidence is by resampling plans (nonparametric bootstrap, jackknife) and Bremer support (decay indices). The recent literature reveals that parameter settings quite commonly employed are not those recommended by theoretical considerations and by previous empirical studies. The optimal search strategy to be applied during resampling was previously addressed solely via standard search strategies available in PAUP*. The question of a compromise between search extensiveness and improved support accuracy for Bremer support received even less attention. A set of experiments was conducted on different datasets to find an empirical cut-off point at which increased search extensiveness no longer significantly changes Bremer support and jackknife or bootstrap proportions. Results: For the number of replicates needed for accurate estimates of support in resampling plans, a diagram is provided that helps to address the question of whether apparently different support values really differ significantly. It is shown that the use of random addition cycles and parsimony ratchet iterations during bootstrapping does not translate into higher support, nor does any extension of the search extensiveness beyond the rather moderate effort of TBR (tree bisection and reconnection) branch swapping plus saving one tree per replicate. Instead, in the case of very large matrices, saving more than one shortest tree per iteration and using a strict consensus tree of these yields decreased support compared to saving only one tree. This can be interpreted as a small risk of overestimating support but should be more than compensated by other factors that counteract an enhanced type I error. With regard to Bremer support, a rule of thumb can be derived stating that not much is gained relative to the surplus computational effort when searches are extended beyond 20 ratchet iterations per
Trial Sequential Methods for Meta-Analysis
Kulinskaya, Elena; Wood, John
2014-01-01
Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…
Sequential association rules in atonal music
Honingh, A.; Weyde, T.; Conklin, D.; Chew, E.; Childs, A.; Chuan, C.-H.
2009-01-01
This paper describes a preliminary study on the structure of atonal music. In the same way as sequential association rules of chords can be found in tonal music, sequential association rules of pitch class set categories can be found in atonal music. It has been noted before that certain pitch class
Multi-agent sequential hypothesis testing
Kim, Kwang-Ki K.
2014-12-15
This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
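The multi-agent framework in this abstract builds on classical sequential hypothesis testing. As background only, here is a minimal sketch of the single-agent Wald sequential probability ratio test (SPRT) that such frameworks generalize; all names are illustrative, not from the paper:

```python
import math

def sprt(samples, logpdf0, logpdf1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test: accumulate the
    log-likelihood ratio sample by sample and stop as soon as it
    crosses either decision threshold."""
    upper = math.log((1.0 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1.0 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += logpdf1(x) - logpdf0(x)
        if llr >= upper:
            return ("H1", n)
        if llr <= lower:
            return ("H0", n)
    return ("undecided", len(samples))
```

The multi-agent setting adds costs for measurements, delay, and disagreement on top of this stopping rule.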
Beam geometry selection using sequential beam addition.
Popple, Richard A; Brezovich, Ivan A; Fiveash, John B
2014-05-01
The selection of optimal beam geometry has been of interest since the inception of conformal radiotherapy. The authors report on sequential beam addition, a simple beam geometry selection method, for intensity modulated radiation therapy. The sequential beam addition algorithm (SBA) requires definition of an objective function (score) and a set of candidate beam geometries (pool). In the first iteration, the optimal score is determined for each beam in the pool and the beam with the best score selected. In the next iteration, the optimal score is calculated for each beam remaining in the pool combined with the beam selected in the first iteration, and the best scoring beam is selected. The process is repeated until the desired number of beams is reached. The authors selected three treatment sites, breast, lung, and brain, and determined beam arrangements for up to 11 beams from a pool comprised of 25 equiangular transverse beams. For the brain, arrangements were additionally selected from a pool of 22 noncoplanar beams. Scores were determined for geometries comprised equiangular transverse beams (EQA), as well as two tangential beams for the breast case. In all cases, SBA resulted in scores superior to EQA. The breast case had the strongest dependence on beam geometry, for which only the 7-beam EQA geometry had a score better than the two tangential beams, whereas all SBA geometries with more than two beams were superior. In the lung case, EQA and SBA scores monotonically improved with increasing number of beams; however, SBA required fewer beams to achieve scores equivalent to EQA. For the brain case, SBA with a coplanar pool was equivalent to EQA, while the noncoplanar pool resulted in slightly better scores; however, the dose-volume histograms demonstrated that the differences were not clinically significant. For situations in which beam geometry has a significant effect on the objective function, SBA can identify arrangements equivalent to equiangular
Beam geometry selection using sequential beam addition
Energy Technology Data Exchange (ETDEWEB)
Popple, Richard A., E-mail: rpopple@uabmc.edu; Brezovich, Ivan A.; Fiveash, John B. [Department of Radiation Oncology, The University of Alabama at Birmingham, 1720 2nd Avenue South, Birmingham, Alabama 35294 (United States)
2014-05-15
Purpose: The selection of optimal beam geometry has been of interest since the inception of conformal radiotherapy. The authors report on sequential beam addition, a simple beam geometry selection method, for intensity modulated radiation therapy. Methods: The sequential beam addition algorithm (SBA) requires definition of an objective function (score) and a set of candidate beam geometries (pool). In the first iteration, the optimal score is determined for each beam in the pool and the beam with the best score selected. In the next iteration, the optimal score is calculated for each beam remaining in the pool combined with the beam selected in the first iteration, and the best scoring beam is selected. The process is repeated until the desired number of beams is reached. The authors selected three treatment sites, breast, lung, and brain, and determined beam arrangements for up to 11 beams from a pool comprised of 25 equiangular transverse beams. For the brain, arrangements were additionally selected from a pool of 22 noncoplanar beams. Scores were determined for geometries comprised equiangular transverse beams (EQA), as well as two tangential beams for the breast case. Results: In all cases, SBA resulted in scores superior to EQA. The breast case had the strongest dependence on beam geometry, for which only the 7-beam EQA geometry had a score better than the two tangential beams, whereas all SBA geometries with more than two beams were superior. In the lung case, EQA and SBA scores monotonically improved with increasing number of beams; however, SBA required fewer beams to achieve scores equivalent to EQA. For the brain case, SBA with a coplanar pool was equivalent to EQA, while the noncoplanar pool resulted in slightly better scores; however, the dose-volume histograms demonstrated that the differences were not clinically significant. Conclusions: For situations in which beam geometry has a significant effect on the objective function, SBA can identify
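The iterative procedure described in the Methods section is a plain greedy search; it can be sketched as follows (a simplified illustration with a generic score function, not the authors' implementation):

```python
def sequential_beam_addition(pool, score, n_beams):
    """Greedy beam-geometry selection: at each iteration, add the
    candidate beam that gives the best score in combination with
    the beams already selected.  `score` maps a list of beams to a
    number (higher is better); `pool` is the candidate beam set."""
    selected = []
    remaining = list(pool)
    while remaining and len(selected) < n_beams:
        best = max(remaining, key=lambda b: score(selected + [b]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In the paper, `score` is the value of the treatment-plan objective function after optimization for the given geometry; each iteration therefore requires one plan optimization per remaining candidate beam.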
Estimating Parameter Uncertainty in Binding-Energy Models by the Frequency-Domain Bootstrap
Bertsch, G. F.; Bingham, Derek
2017-12-01
We propose using the frequency-domain bootstrap (FDB) to estimate errors of modeling parameters when the modeling error is itself a major source of uncertainty. Unlike the usual bootstrap or the simple χ2 analysis, the FDB can take into account correlations between errors. It is also very fast compared to the Gaussian process Bayesian estimate as often implemented for computer model calibration. The method is illustrated with a simple example, the liquid drop model of nuclear binding energies. We find that the FDB gives a more conservative estimate of the uncertainty in liquid drop parameters than the χ2 method, and is in fair accord with more empirical estimates. For the nuclear physics application, there are no apparent obstacles to apply the method to the more accurate and detailed models based on density-functional theory.
Bootstrapping six-gluon scattering in planar ${\\cal N}=4$ super-Yang-Mills theory
Dixon, Lance J; Duhr, Claude; von Hippel, Matt; Pennington, Jeffrey
2014-01-01
We describe the hexagon function bootstrap for solving for six-gluon scattering amplitudes in the large $N_c$ limit of ${\\cal N}=4$ super-Yang-Mills theory. In this method, an ansatz for the finite part of these amplitudes is constrained at the level of amplitudes, not integrands, using boundary information. In the near-collinear limit, the dual picture of the amplitudes as Wilson loops leads to an operator product expansion which has been solved using integrability by Basso, Sever and Vieira. Factorization of the amplitudes in the multi-Regge limit provides additional boundary data. This bootstrap has been applied successfully through four loops for the maximally helicity violating (MHV) configuration of gluon helicities, and through three loops for the non-MHV case.
Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut
2017-05-01
This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés
2018-03-01
Benchmarking the efficiency of water companies is essential to set water tariffs and to promote their sustainability. In doing so, most of the previous studies have applied conventional data envelopment analysis (DEA) models. However, it is a deterministic method that does not allow to identify environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies applying a double-bootstrap DEA model. Results evidenced that the ranking of water and sewerage companies changes notably whether efficiency scores are computed applying conventional or double-bootstrap DEA models. Moreover, it was found that the percentage of non-revenue water and customer density are factors influencing the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
Simulation and sequential dynamical systems
Energy Technology Data Exchange (ETDEWEB)
Mortveit, H.S.; Reidys, C.M.
1999-06-01
Computer simulations have a generic structure. Motivated by this the authors present a new class of discrete dynamical systems that captures this structure in a mathematically precise way. This class of systems consists of (1) a loopfree graph Υ with vertex set {1, 2, …, n} where each vertex has a binary state, (2) a vertex-labeled set of functions (F_{i,Υ}: F_2^n → F_2^n)_i and (3) a permutation π ∈ S_n. The function F_{i,Υ} updates the state of vertex i as a function of the states of vertex i and its Υ-neighbors and leaves the states of all other vertices fixed. The permutation π represents the update ordering, i.e., the order in which the functions F_{i,Υ} are applied. By composing the functions F_{i,Υ} in the order given by π one obtains the dynamical system (equation given in paper), which the authors refer to as a sequential dynamical system, or SDS for short. The authors will present bounds for the number of functionally different systems and for the number of nonisomorphic digraphs Γ[F_Υ, π] that can be obtained by varying the update order and applications of these to specific graphs and graph classes.
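The composition of local update functions in a given order can be illustrated with a small sketch (our own example on a 3-vertex path graph, not from the paper). Note how changing the update order can change the result, which is exactly the dependence the authors study:

```python
def sds_map(update_funcs, order):
    """Compose vertex update functions in the sequence given by
    `order`; each function updates only its own vertex's state as a
    function of the full state vector."""
    def system(state):
        state = list(state)
        for i in order:
            state = update_funcs[i](state)
        return tuple(state)
    return system

def make_or_update(i, neighbours):
    """Local rule: vertex i becomes the OR of itself and its neighbours."""
    def F(state):
        new = list(state)
        new[i] = state[i] or any(state[j] for j in neighbours)
        return new
    return F

# Path graph 0 - 1 - 2, each vertex applying the OR rule.
funcs = {0: make_or_update(0, [1]),
         1: make_or_update(1, [0, 2]),
         2: make_or_update(2, [1])}
step = sds_map(funcs, order=[0, 1, 2])
```

Applying `step` to the state (1, 0, 0) activates all vertices in one pass, whereas the reversed order (2, 1, 0) leaves vertex 2 off, because vertex 2 is updated before the activation reaches its neighbour.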
Sequential provisional implant prosthodontics therapy.
Zinner, Ira D; Markovits, Stanley; Jansen, Curtis E; Reid, Patrick E; Schnader, Yale E; Shapiro, Herbert J
2012-01-01
The fabrication and long-term use of first- and second-stage provisional implant prostheses is critical to create a favorable prognosis for function and esthetics of a fixed-implant supported prosthesis. The fixed metal and acrylic resin cemented first-stage prosthesis, as reviewed in Part I, is needed for prevention of adjacent and opposing tooth movement, pressure on the implant site as well as protection to avoid micromovement of the freshly placed implant body. The second-stage prosthesis, reviewed in Part II, should be used following implant uncovering and abutment installation. The patient wears this provisional prosthesis until maturation of the bone and healing of soft tissues. The second-stage provisional prosthesis is also a fail-safe mechanism for possible early implant failures and also can be used with late failures and/or for the necessity to repair the definitive prosthesis. In addition, the screw-retained provisional prosthesis is used if and when an implant requires removal or other implants are to be placed as in a sequential approach. The creation and use of both first- and second-stage provisional prostheses involve a restorative dentist, dental technician, surgeon, and patient to work as a team. If the dentist alone cannot do diagnosis and treatment planning, surgery, and laboratory techniques, he or she needs help by employing the expertise of a surgeon and a laboratory technician. This team approach is essential for optimum results.
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, that is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
A bootstrap test for instrument validity in heterogeneous treatment effect models
Kitagawa, Toru
2013-01-01
This paper develops a specification test for the instrument validity conditions in the heterogeneous treatment effect model with a binary treatment and a discrete instrument. A necessary testable implication for the joint restriction of instrument exogeneity and instrument monotonicity is given by nonnegativity of point-identifiable complier's outcome densities. Our specification test infers this testable implication using a Kolmogorov-Smirnov type test statistic. We provide a bootstrap algor...
Asymptotic Expansions and Bootstrapping Distributions for Dependent Variables: A Martingale Approach
Mykland, Per Aslak
1992-01-01
The paper develops a one-step triangular array asymptotic expansion for continuous martingales which are asymptotically normal. Mixing conditions are not required, but the quadratic variations of the martingales must satisfy a law of large numbers and a central limit type condition. From this result we derive expansions for the distributions of estimators in asymptotically ergodic differential equation models, and also for the bootstrapping estimators of these distributions.
Parametric bootstrap methods for testing multiplicative terms in GGE and AMMI models.
Forkman, Johannes; Piepho, Hans-Peter
2014-09-01
The genotype main effects and genotype-by-environment interaction effects (GGE) model and the additive main effects and multiplicative interaction (AMMI) model are two common models for analysis of genotype-by-environment data. These models are frequently used by agronomists, plant breeders, geneticists and statisticians for analysis of multi-environment trials. In such trials, a set of genotypes, for example, crop cultivars, are compared across a range of environments, for example, locations. The GGE and AMMI models use singular value decomposition to partition genotype-by-environment interaction into an ordered sum of multiplicative terms. This article deals with the problem of testing the significance of these multiplicative terms in order to decide how many terms to retain in the final model. We propose parametric bootstrap methods for this problem. Models with fixed main effects, fixed multiplicative terms and random normally distributed errors are considered. Two methods are derived: a full and a simple parametric bootstrap method. These are compared with the alternatives of using approximate F-tests and cross-validation. In a simulation study based on four multi-environment trials, both bootstrap methods performed well with regard to Type I error rate and power. The simple parametric bootstrap method is particularly easy to use, since it only involves repeated sampling of standard normally distributed values. This method is recommended for selecting the number of multiplicative terms in GGE and AMMI models. The proposed methods can also be used for testing components in principal component analysis. © 2014, The International Biometric Society.
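The "simple parametric bootstrap method", described above as involving only repeated sampling of standard normally distributed values, can be sketched roughly as follows for the leading multiplicative term. This is a loose illustration of the idea only; the actual method in the paper involves centring and degrees-of-freedom details not shown here:

```python
import numpy as np

def first_term_pvalue(E, n_boot=1000, rng=None):
    """Rough parametric-bootstrap test for the leading multiplicative
    term: compare the observed share of the first squared singular
    value of the interaction matrix E against the same statistic
    computed on matrices of i.i.d. standard-normal entries."""
    rng = rng or np.random.default_rng(0)
    sv = np.linalg.svd(E, compute_uv=False)
    observed = sv[0] ** 2 / np.sum(sv ** 2)
    null = np.empty(n_boot)
    for b in range(n_boot):
        zsv = np.linalg.svd(rng.standard_normal(E.shape), compute_uv=False)
        null[b] = zsv[0] ** 2 / np.sum(zsv ** 2)
    return np.mean(null >= observed)   # bootstrap p-value
```

A small p-value suggests the first term carries more structure than pure noise, so at least one multiplicative term should be retained.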
Forecasting Model for IPTV Service in Korea Using Bootstrap Ridge Regression Analysis
Lee, Byoung Chul; Kee, Seho; Kim, Jae Bum; Kim, Yun Bae
The telecom firms in Korea are taking a new step to prepare for the next generation of convergence services, IPTV. In this paper we describe our analysis of effective methods for demand forecasting for IPTV broadcasting. We tried 3 types of scenarios based on aspects of the potential IPTV market and compared the results. The forecasting method used in this paper is the multi-generation substitution model with bootstrap ridge regression analysis.
Gaussian process regression bootstrapping: exploring the effects of uncertainty in time course data.
Kirk, Paul D W; Stumpf, Michael P H
2009-05-15
Although widely accepted that high-throughput biological data are typically highly noisy, the effects that this uncertainty has upon the conclusions we draw from these data are often overlooked. However, in order to assign any degree of confidence to our conclusions, we must quantify these effects. Bootstrap resampling is one method by which this may be achieved. Here, we present a parametric bootstrapping approach for time-course data, in which Gaussian process regression (GPR) is used to fit a probabilistic model from which replicates may then be drawn. This approach implicitly allows the time dependence of the data to be taken into account, and is applicable to a wide range of problems. We apply GPR bootstrapping to two datasets from the literature. In the first example, we show how the approach may be used to investigate the effects of data uncertainty upon the estimation of parameters in an ordinary differential equations (ODE) model of a cell signalling pathway. Although we find that the parameter estimates inferred from the original dataset are relatively robust to data uncertainty, we also identify a distinct second set of estimates. In the second example, we use our method to show that the topology of networks constructed from time-course gene expression data appears to be sensitive to data uncertainty, although there may be individual edges in the network that are robust in light of present data. Matlab code for performing GPR bootstrapping is available from our web site: http://www3.imperial.ac.uk/theoreticalsystemsbiology/data-software/.
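The core idea, fitting a probabilistic model to the time course and drawing replicate datasets from it, can be sketched with a minimal zero-mean GP written directly in NumPy. This is our own simplified stand-in; the authors' Matlab code at the URL above is the authoritative implementation:

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel matrix between 1-D inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gpr_bootstrap(t, y, n_rep, noise=0.1, length=0.3, rng=None):
    """Fit a zero-mean GP to the time course (t, y), then draw n_rep
    bootstrap replicates from the posterior at the observed times,
    adding back observation noise."""
    rng = rng or np.random.default_rng(0)
    Ktt = rbf(t, t, length)
    K = Ktt + noise ** 2 * np.eye(t.size)        # noisy training kernel
    mean = Ktt @ np.linalg.solve(K, y)           # posterior mean at t
    cov = Ktt - Ktt @ np.linalg.solve(K, Ktt)    # posterior covariance
    cov += noise ** 2 * np.eye(t.size)           # replicate the noise
    return rng.multivariate_normal(mean, cov, size=n_rep)
```

Each row of the returned array is one synthetic replicate of the time course, which can then be fed through the downstream analysis (e.g. ODE parameter estimation) to propagate the data uncertainty.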
Syntactic bootstrapping in children with Down syndrome: the impact of bilingualism.
Cleave, Patricia L; Kay-Raining Bird, Elizabeth; Trudeau, Natacha; Sutton, Ann
2014-01-01
The purpose of the study was to add to our knowledge of bilingual learning in children with Down syndrome (DS) using a syntactic bootstrapping task. Four groups of children and youth matched on non-verbal mental age participated. There were 14 bilingual participants with DS (DS-B, mean age 12;5), 12 monolingual participants with DS (DS-M, mean age 10;10), 9 bilingual typically developing children (TD-B; mean age 4;1) and 11 monolingual typically developing children (TD-M; mean age 4;1). The participants completed a computerized syntactic bootstrapping task involving unfamiliar nouns and verbs. The syntactic cues employed were a for the nouns and ing for the verbs. Performance was better on nouns than verbs. There was also a main effect for group. Follow-up t-tests revealed that there were no significant differences between the TD-M and TD-B or between the DS-M and DS-B groups. However, the DS-M group performed more poorly than the TD-M group with a large effect size. Analyses at the individual level revealed a similar pattern of results. There was evidence that Down syndrome impacted performance; there was no evidence that bilingualism negatively affected the syntactic bootstrapping skills of individuals with DS. These results from a dynamic language task are consistent with those of previous studies that used static or product measures. Thus, the results are consistent with the position that parents should be supported in their decision to provide bilingual input to their children with DS. Readers of this article will identify (1) research evidence regarding bilingual development in children with Down syndrome and (2) syntactic bootstrapping skills in monolingual and bilingual children who are typically developing or who have Down syndrome. Copyright © 2014 Elsevier Inc. All rights reserved.
Statistical error estimation of the Feynman-α method using the bootstrap method
International Nuclear Information System (INIS)
Endo, Tomohiro; Yamamoto, Akio; Yagi, Takahiro; Pyeon, Cheol Ho
2016-01-01
Applicability of the bootstrap method is investigated to estimate the statistical error of the Feynman-α method, which is one of the subcritical measurement techniques based on reactor noise analysis. In the Feynman-α method, the statistical error can be simply estimated from multiple measurements of reactor noise; however, this requires additional measurement time to repeat the measurements. Using a resampling technique called the 'bootstrap method', the standard deviation and confidence interval of measurement results obtained by the Feynman-α method can be estimated as the statistical error, using only a single measurement of reactor noise. In order to validate our proposed technique, we carried out a passive measurement of reactor noise without any external source, i.e. with only the inherent neutron source from spontaneous fission and (α,n) reactions in nuclear fuels, at the Kyoto University Criticality Assembly. Through the actual measurement, it is confirmed that the bootstrap method is applicable for approximately estimating the statistical error of measurement results obtained by the Feynman-α method. (author)
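The procedure described, estimating the statistical error of a single noise measurement by resampling, can be sketched as follows. This simplified illustration resamples the gate counts as if they were independent (the real gate-to-gate correlation structure is ignored here), and all names are ours:

```python
import numpy as np

def feynman_Y(counts):
    """Feynman-Y statistic: variance-to-mean ratio of the gate counts
    minus one (zero for purely Poisson counts)."""
    counts = np.asarray(counts)
    return counts.var(ddof=1) / counts.mean() - 1.0

def bootstrap_error(counts, stat=feynman_Y, n_boot=2000, rng=None):
    """Estimate the statistical error of `stat` from a single
    measurement by resampling the gate counts with replacement:
    returns the bootstrap standard deviation and a 95% percentile
    confidence interval."""
    rng = rng or np.random.default_rng(0)
    counts = np.asarray(counts)
    reps = np.array([stat(rng.choice(counts, size=counts.size, replace=True))
                     for _ in range(n_boot)])
    return reps.std(ddof=1), np.percentile(reps, [2.5, 97.5])
```

The same single recorded sequence thus yields both the point estimate and an approximate error bar, without repeating the measurement.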
Song, Hangke; Liu, Zhi; Du, Huan; Sun, Guangling; Le Meur, Olivier; Ren, Tongwei
2017-09-01
This paper proposes a novel depth-aware salient object detection and segmentation framework via multiscale discriminative saliency fusion (MDSF) and bootstrap learning for RGBD images (RGB color images with corresponding Depth maps) and stereoscopic images. By exploiting low-level feature contrasts, mid-level feature weighted factors and high-level location priors, various saliency measures on four classes of features are calculated based on multiscale region segmentation. A random forest regressor is learned to perform the discriminative saliency fusion (DSF) and generate the DSF saliency map at each scale, and DSF saliency maps across multiple scales are combined to produce the MDSF saliency map. Furthermore, we propose an effective bootstrap learning-based salient object segmentation method, which is bootstrapped with samples based on the MDSF saliency map and learns multiple kernel support vector machines. Experimental results on two large datasets show how various categories of features contribute to the saliency detection performance and demonstrate that the proposed framework achieves the better performance on both saliency detection and salient object segmentation.
A wild bootstrap approach for the selection of biomarkers in early diagnostic trials.
Zapf, Antonia; Brunner, Edgar; Konietschke, Frank
2015-05-01
In early diagnostic trials, particularly in biomarker studies, the aim is often to select diagnostic tests among several methods. In case of metric, discrete, or even ordered categorical data, the area under the receiver operating characteristic (ROC) curve (denoted by AUC) is an appropriate overall accuracy measure for the selection, because the AUC is independent of cut-off points. For selection of biomarkers the individual AUC's are compared with a pre-defined threshold. To keep the overall coverage probability or the multiple type-I error rate, simultaneous confidence intervals and multiple contrast tests are considered. We propose a purely nonparametric approach for the estimation of the AUC's with the corresponding confidence intervals and statistical tests. This approach uses the correlation among the statistics to account for multiplicity. For small sample sizes, a Wild-Bootstrap approach is presented. It is shown that the corresponding intervals and tests are asymptotically exact. Extensive simulation studies indicate that the derived Wild-Bootstrap approach keeps and exploits the nominal type-I error at best, even for high accuracies and in case of small samples sizes. The strength of the correlation, the type of covariance structure, a skewed distribution, and also a moderate imbalanced case-control ratio do not have any impact on the behavior of the approach. A real data set illustrates the application of the proposed methods. We recommend the new Wild Bootstrap approach for the selection of biomarkers in early diagnostic trials, especially for high accuracies and small samples sizes.
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
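Resampling recruitment trees can be sketched with a small recursive routine: draw seeds with replacement, and within each visited node draw as many children as it originally recruited, again with replacement. This is a rough illustration of the idea, not the authors' algorithm:

```python
import random

def resample_tree(node, children, rng):
    """Resample one recruitment subtree: keep `node`, draw as many
    children as it originally had (with replacement), then recurse
    into each drawn child."""
    kids = children.get(node, [])
    drawn = [rng.choice(kids) for _ in kids]
    return {node: [resample_tree(c, children, rng) for c in drawn]}

def tree_bootstrap(seeds, children, n_boot=1000, rng=None):
    """Tree bootstrap for RDS (sketch): each replicate resamples the
    seeds with replacement and then resamples the recruitment chains
    within each tree."""
    rng = rng or random.Random(0)
    return [[resample_tree(rng.choice(seeds), children, rng)
             for _ in seeds] for _ in range(n_boot)]
```

Because the replicates depend only on the tree structure, any respondent attribute can be re-evaluated on each replicate to obtain its sampling variability, in line with the remark at the end of the abstract.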
Nonparametric bootstrap technique for calibrating surgical SmartForceps: theory and application.
Azimaee, Parisa; Jafari Jozani, Mohammad; Maddahi, Yaser; Zareinia, Kourosh; Sutherland, Garnette
2017-10-01
Knowledge of forces, exerted on the brain tissue during the performance of neurosurgical tasks, is critical for quality assurance, case rehearsal, and training purposes. Quantifying the interaction forces has been made possible by developing SmartForceps, a bipolar forceps retrofitted by a set of strain gauges. The forces are estimated using voltages read from strain gauges. We therefore need to quantify the force-voltage relationship to estimate the interaction forces during microsurgery. This problem has been addressed in the literature by following the physical and deterministic properties of the force-sensing strain gauges without obtaining the precision associated with each estimate. In this paper, we employ a probabilistic methodology by using a nonparametric Bootstrap approach to obtain both point and interval estimates of the applied forces at the tool tips, while the precision associated with each estimate is provided. To show proof-of-concept, the Bootstrap technique is employed to estimate unknown forces, and construct necessary confidence intervals using observed voltages in data sets that are measured from the performance of surgical tasks on a cadaveric brain. Results indicate that the Bootstrap technique is capable of estimating tool-tissue interaction forces with acceptable level of accuracy compared to the linear regression technique under the normality assumption.
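The point-and-interval estimation described here resembles a pairs bootstrap around a calibration fit. A minimal sketch follows, under our simplifying assumption of a single linear force-voltage relation (the actual tool has multiple strain gauges and channels):

```python
import numpy as np

def bootstrap_calibration(volts, forces, v_new, n_boot=2000, rng=None):
    """Pairs bootstrap for a linear force-voltage calibration:
    resample (V, F) pairs with replacement, refit the line, and
    collect the predicted force at v_new to form a point estimate
    and a 95% percentile interval."""
    rng = rng or np.random.default_rng(0)
    volts, forces = np.asarray(volts), np.asarray(forces)
    n = volts.size
    preds = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        slope, intercept = np.polyfit(volts[idx], forces[idx], 1)
        preds[b] = slope * v_new + intercept
    return preds.mean(), np.percentile(preds, [2.5, 97.5])
```

Unlike a purely deterministic calibration, each estimated force comes with an interval that reflects the variability of the calibration data.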
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
International Nuclear Information System (INIS)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin
2013-01-01
Surgical resection is the most effective method to recover liver function. However, since most treatment of unresectable hepatocellular carcinoma (HCC) is palliative, Yttrium-90 (Y-90) has been used as a new treatment, because it can be delivered directly to the tumors and results in greater radiation exposure to them than external radiation. Recently, Y-90 has received much interest and has been studied by many researchers. Imaging of Y-90 has most commonly been conducted using a gamma camera, but PET imaging is required because of the gamma camera's low sensitivity and resolution. The purpose of this study was to assess the statistical characteristics of Y-90 PET images and to improve their count rate, and thereby image quality, using a nonparametric bootstrap method. The PET data were improved using the non-parametric bootstrap method, as verified by improved uniformity and SNR. Uniformity improved more under low-count-rate conditions, i.e. Y-90, in the phantom case, and uniformity and SNR improved by 15.6% and 33.8%, respectively, in the mouse case. The bootstrap method applied in this study to PET data increased the count rate of the PET image, so the acquisition time can consequently be reduced. It is expected to improve performance for diagnosis.
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2001-01-01
with the preconcentration process of the ensuing sample. The performance of the system is demonstrated for the determination of bismuth. With 2.4 ml sample loading, an enrichment factor of 33.4, a detection limit of 27 ng/l and a sampling frequency of 10 were obtained. The relative standard deviation was 2.3% for the determination of 2.0 µg/l Bi (n = 7). The procedure was validated by determination of bismuth in the certified reference material CRM 320 (river sediment), and by bismuth spike recoveries in two human urine samples....
Wedenberg, Minna
2013-11-15
To apply a statistical bootstrap analysis to assess the uncertainty in the dose-response relation for the endpoints pneumonitis and myelopathy reported in the QUANTEC review. The bootstrap method assesses the uncertainty of the estimated population-based dose-response relation due to sample variability, which reflects the uncertainty due to the limited number of patients in the studies. A large number of bootstrap replicates of the original incidence data were produced by random sampling with replacement. The analysis requires only the dose, the number of patients, and the number of occurrences of the studied endpoint for each study. Two dose-response models, a Poisson-based model and the Lyman model, were fitted to each bootstrap replicate using maximum likelihood. The bootstrap analysis generates a family of curves representing the range of plausible dose-response relations, and the 95% bootstrap confidence intervals give estimated upper and lower toxicity risks. The curve families for the 2 dose-response models overlap for doses included in the studies at hand but diverge beyond that, with the Lyman model suggesting a steeper slope. The resulting distributions of the model parameters indicate correlation and non-Gaussian distributions. For both data sets, the likelihood of the observed data was higher for the Lyman model in >90% of the bootstrap replicates. The bootstrap method provides a statistical analysis of the uncertainty in the estimated dose-response relations for myelopathy and pneumonitis. It suggests likely model parameter values, their confidence intervals, and how they interrelate for each model. Finally, it can be used to evaluate the extent to which the data support one model over another. For both data sets considered here, the Lyman model was preferred over the Poisson-based model. Copyright © 2013 Elsevier Inc. All rights reserved.
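The resample-and-refit loop described above can be sketched as follows. The study-level data are hypothetical, and a simple logistic dose-response fitted by a grid-search maximum likelihood stands in for the paper's Poisson-based and Lyman models; only the bootstrap mechanics (resampling patients with replacement within each study, refitting, and taking percentile confidence intervals) follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical study-level data: mean dose (Gy), patients, observed events
dose = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
n_pat = np.array([40, 55, 60, 35, 20])
events = np.array([1, 3, 7, 8, 9])

# grid of logistic parameters; the response probability of every grid
# point is precomputed once, so each bootstrap refit is just an argmax
b0 = np.linspace(-8.0, 0.0, 81)
b1 = np.linspace(0.01, 0.4, 81)
B0, B1 = np.meshgrid(b0, b1, indexing="ij")
p = 1.0 / (1.0 + np.exp(-(B0[..., None] + B1[..., None] * dose)))
p = np.clip(p, 1e-9, 1.0 - 1e-9)
log_p, log_q = np.log(p), np.log1p(-p)

def fit(k):
    """Maximum-likelihood fit (binomial likelihood) by grid search."""
    ll = (k * log_p + (n_pat - k) * log_q).sum(axis=-1)
    i, j = np.unravel_index(np.argmax(ll), ll.shape)
    return b0[i], b1[j]

# bootstrap: resample patients with replacement within each study,
# which is equivalent to redrawing each study's event count binomially
B = 500
p_hat = events / n_pat
params = np.array([fit(rng.binomial(n_pat, p_hat)) for _ in range(B)])
ci_lo, ci_hi = np.percentile(params, [2.5, 97.5], axis=0)  # 95% percentile CIs
```

The family of 500 fitted parameter pairs plays the role of the paper's family of plausible dose-response curves; plotting the curve for each row of `params` would reproduce the "curve family" picture.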
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data, without the need to assume a noise model. These methods have previously been combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Because streamline tractography progresses incrementally, however, the local noise-induced disturbance in the diffusion data accumulates additively. Graph-based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, in the present work the bootstrap method is incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is then assigned to each edge. Two path-finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest-path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph-based algorithm. BootGraph performs very well in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of
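The shortest-path variant on a weighted, undirected voxel graph can be sketched generically: if each edge carries a probability (here a made-up stand-in for the wild-bootstrap-derived weight), then running Dijkstra on the costs -log(probability) yields the maximum-probability path. The toy three-vertex graph below is purely illustrative, not the authors' BootGraph implementation.

```python
import heapq
import math

def most_probable_path(probs, start, goal):
    """Dijkstra on costs -log(p): the shortest path in cost space is the
    maximum-probability path. `probs` maps (u, v) -> edge probability in
    (0, 1]; the graph is treated as undirected. `goal` is assumed reachable."""
    adj = {}
    for (u, v), p in probs.items():
        w = -math.log(p)
        adj.setdefault(u, []).append((v, w))
        adj.setdefault(v, []).append((u, w))  # undirected edge
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, math.inf):   # stale queue entry
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [goal]                       # reconstruct the path backwards
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[goal])

# two reliable edges (0.9 each) beat one direct but uncertain edge (0.5)
probs = {("A", "B"): 0.9, ("B", "C"): 0.9, ("A", "C"): 0.5}
path, prob = most_probable_path(probs, "A", "C")
```

Working in -log space turns products of edge probabilities into sums, which is what lets a standard shortest-path algorithm maximize a path probability.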
Sequential extraction applied to Peruibe black mud, SP, Brazil
International Nuclear Information System (INIS)
Torrecilha, Jefferson Koyaishi
2014-01-01
The Peruibe black mud is used in therapeutic treatments of psoriasis, peripheral dermatitis, acne and seborrhoea, as well as of myalgia, arthritis, rheumatism and non-articular processes. Like other medicinal clays, it may not be free of adverse health effects, since possibly hazardous minerals can cause respiratory problems and other effects arising from the presence of toxic elements. Any material used for therapeutic purposes should be fully characterized, and samples of Peruibe black mud were therefore analyzed to determine their physical and chemical properties: moisture content, organic matter and loss on ignition; pH, particle size, cation exchange capacity and swelling index. The elemental composition was determined by neutron activation analysis, graphite furnace atomic absorption spectrometry and X-ray fluorescence; the mineralogical composition was determined by X-ray diffraction. Another tool widely used to evaluate the behavior of trace elements in various environmental matrices is sequential extraction. A sequential extraction procedure was therefore applied to fractionate the mud into specific geochemical forms and to determine how, and in what amounts, the elements are bound in it. Of the several sequential extraction procedures, the BCR-701 (Community Bureau of Reference) method was used, since it is considered the most reproducible of them. A simple extraction with an artificial sweat was also applied, in order to identify which components are potentially available for absorption through the patient's skin during topical treatment. The results indicated that the mud is basically composed of a silty-clay material, rich in organic matter and with good cation exchange capacity. There were no significant variations in mineralogy or elemental composition between the in natura and mature forms of the mud. The analyses by sequential extraction and by simple extraction indicated that the elements possibly available in larger
Mafuba, Kay; Gates, Bob
2012-12-01
This paper explores and advocates the use of sequential multiple methods as a contemporary strategy for undertaking research. Sequential multiple methods involve the use of results obtained through one data collection method to determine the direction and implementation of subsequent stages of a research project (Morse, 1991; Morgan, 1998). This paper will also explore how triangulating research at the epistemological, theoretical and methodological levels could enhance research. Finally, the paper evaluates the significance of sequential multiple methods in learning disability nursing research practice.
Sequential parameter estimation for stochastic systems
Directory of Open Access Journals (Sweden)
G. A. Kivman
2003-01-01
The quality of the prediction of dynamical system evolution is determined by the accuracy to which initial conditions and forcing are known. Availability of future observations permits reducing the effects of errors in assessing the external model parameters by means of a filtering algorithm. Usually, uncertainties in specifying internal model parameters describing the inner system dynamics are neglected. Since these are characterized by strongly non-Gaussian distributions (parameters are positive, as a rule), traditional Kalman filtering schemes are badly suited to reducing the contribution of this type of uncertainty to the forecast errors. An extension of the Sequential Importance Resampling (SIR) filter is proposed to this aim. The filter is verified against the Ensemble Kalman Filter (EnKF) in application to the stochastic Lorenz system. It is shown that the SIR is capable of estimating the system parameters and predicting the evolution of the system with remarkably better accuracy than the EnKF. This highlights a severe drawback of any Kalman filtering scheme: because it uses only the first two statistical moments in the analysis step, it is unable to deal with probability density functions badly approximated by the normal distribution.
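The SIR idea of estimating a positive internal parameter by augmenting the state with it can be sketched on a toy system. A scalar linear stochastic system replaces the paper's Lorenz system, and the dynamics, noise levels, and jitter scheme below are all illustrative assumptions: each particle carries a state and a parameter drawn from a non-Gaussian (uniform, positive) prior, is weighted by the observation likelihood, and is resampled in proportion to that weight.

```python
import numpy as np

rng = np.random.default_rng(2)

# true system: x_{t+1} = a * x_t + 0.5 + process noise, observed with noise
a_true, sig_proc, sig_obs, T = 0.8, 0.1, 0.2, 200
x, obs = 1.0, []
for _ in range(T):
    x = a_true * x + 0.5 + sig_proc * rng.standard_normal()
    obs.append(x + sig_obs * rng.standard_normal())

# SIR filter: state augmented with the unknown positive parameter a
N = 2000
xs = rng.normal(1.0, 1.0, N)
a_s = rng.uniform(0.1, 1.5, N)            # non-Gaussian, positive prior
for y in obs:
    xs = a_s * xs + 0.5 + sig_proc * rng.standard_normal(N)  # propagate
    w = np.exp(-0.5 * ((y - xs) / sig_obs) ** 2)  # importance weights
    w = w + 1e-300                                # guard against total underflow
    w /= w.sum()
    idx = rng.choice(N, N, p=w)                   # resample by weight
    xs = xs[idx]
    # multiplicative jitter keeps particle diversity and preserves a > 0
    a_s = a_s[idx] * np.exp(0.01 * rng.standard_normal(N))

a_hat = a_s.mean()   # posterior mean of the internal parameter
```

The multiplicative jitter step is one common way to avoid sample impoverishment while respecting the positivity constraint that, as the abstract notes, makes Gaussian-moment filters like the EnKF ill-suited here.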
Constrained treatment planning using sequential beam selection
International Nuclear Information System (INIS)
Woudstra, E.; Storchi, P.R.M.
2000-01-01
In this paper an algorithm is described for automated treatment plan generation. The algorithm aims at delivery of the prescribed dose to the target volume without violation of constraints for target, organs at risk and the surrounding normal tissue. Pre-calculated dose distributions for all candidate orientations are used as input. Treatment beams are selected in a sequential way. A score function designed for beam selection is used for the simultaneous selection of beam orientations and weights. In order to determine the optimum choice for the orientation and the corresponding weight of each new beam, the score function is first redefined to account for the dose distribution of the previously selected beams. Addition of more beams to the plan is stopped when the target dose is reached or when no additional dose can be delivered without violating a constraint. In the latter case the score function is modified by importance factor changes to enforce better sparing of the organ with the limiting constraint and the algorithm is run again. (author)
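The greedy loop described above (score candidate beams against pre-calculated dose distributions, add the best one, stop when the target dose is reached or a constraint blocks further progress) can be sketched as follows. The beam data, the two-voxel dose model, and the simple gain-per-cost score function are all hypothetical stand-ins for the paper's score function and constraint handling.

```python
import numpy as np

# hypothetical pre-calculated dose per unit beam weight:
# rows = candidate beams, columns = [target, organ-at-risk] dose contribution
beam_dose = np.array([
    [1.0, 0.40],
    [0.9, 0.10],
    [0.8, 0.05],
    [0.7, 0.30],
])
target_rx, oar_limit = 60.0, 20.0   # prescribed target dose, OAR constraint (Gy)

def select_beams(beam_dose, target_rx, oar_limit, step=5.0):
    """Greedy sequential selection: each iteration adds `step` Gy of target
    dose from the beam with the best score (target dose gained per unit
    OAR dose), skipping beams whose weight would violate the constraint."""
    total, plan = np.zeros(2), {}
    while total[0] < target_rx:
        best, best_score = None, -np.inf
        for i, (dt, doar) in enumerate(beam_dose):
            w = step / dt                         # weight giving `step` Gy to target
            if total[1] + w * doar > oar_limit:   # would violate OAR constraint
                continue
            score = dt / (doar + 1e-6)
            if score > best_score:
                best, best_score = i, score
        if best is None:                          # no feasible beam remains
            break
        w = step / beam_dose[best][0]
        plan[best] = plan.get(best, 0.0) + w      # accumulate beam weight
        total += w * beam_dose[best]
    return plan, total

plan, total = select_beams(beam_dose, target_rx, oar_limit)
```

Re-weighting the score when a constraint becomes limiting, as the abstract describes, would correspond here to changing the importance factor in the denominator and rerunning the loop.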
Stationary Anonymous Sequential Games with Undiscounted Rewards.
Więcek, Piotr; Altman, Eitan
Stationary anonymous sequential games with undiscounted rewards are a special class of games that combine features of population games (infinitely many players) and stochastic games. We extend the theory for these games to the cases of total expected reward and of expected average reward. We show that in the anonymous sequential game, equilibria correspond to the limits of those of related finite-population games as the number of players grows to infinity. We provide examples to illustrate our results.
Nyasulu, Frazier; Barlag, Rebecca
2011-01-01
The well-known colorimetric determination of the equilibrium constant of the iron(III)-thiocyanate complex is simplified by preparing solutions in a cuvette. For the calibration plot, 0.10 mL increments of 0.00100 M KSCN are added to 4.00 mL of 0.200 M Fe(NO3)3, and for the equilibrium solutions, 0.50 mL increments of…
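The dilution arithmetic implied by this in-cuvette procedure is just C1·V1/(V1 + V2) for each species; the helper below is an illustrative sketch, not part of the published activity.

```python
def diluted_conc(c_added, v_added_ml, c_initial, v_initial_ml):
    """Concentrations of both species after mixing, by simple dilution:
    each concentration is scaled by its own volume over the total volume."""
    v_total = v_initial_ml + v_added_ml
    return (c_added * v_added_ml / v_total,
            c_initial * v_initial_ml / v_total)

# first 0.10 mL increment of 0.00100 M KSCN into 4.00 mL of 0.200 M Fe(NO3)3
scn, fe = diluted_conc(0.00100, 0.10, 0.200, 4.00)
```

Because Fe(III) is in huge excess over SCN-, essentially all of the added thiocyanate is converted to the colored complex, which is what makes these mixtures usable as calibration standards.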
Use of sequential extraction to assess metal partitioning in soils
International Nuclear Information System (INIS)
Kaasalainen, Marika; Yli-Halla, Markku
2003-01-01
The state of heavy metal pollution and the mobility of Cd, Cu, Ni, Cr, Pb and Zn were studied in three texturally different agricultural soil profiles near a Cu-Ni smelter in Harjavalta, Finland. The pseudo-total concentrations were determined by an aqua regia procedure. Metals were also determined after division into four fractions by sequential extraction with (1) acetic acid (exchangeable and specifically adsorbed metals), (2) a reducing agent (bound to Fe/Mn hydroxides), (3) an oxidizing agent (bound to soil organic matter) and (4) aqua regia (bound to mineral structures). Fallout from the smelter has increased the concentrations of Cd, Cu and Ni in the topsoil, where 75-90% of Cd, 49-72% of Cu and 22-52% of Ni occurred in the first two fractions. Slight Pb and Zn pollution was evident as well. High proportions of mobile Cd, Cu and Ni deeper in the profile of the sandy soil closest to the smelter indicated some downward movement of metals. The hydroxide-bound fraction of Pb dominated in almost all soils and horizons, while Ni, Cr and Zn mostly occurred in mineral structures. Aqua regia extraction is usefully supplemented with sequential extraction, particularly in less polluted soils and in soils that exhibit substantial textural differences within the profiles. - Sequential extraction is most useful with soils with low metal pollutant levels
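The percentages quoted above (e.g. 75-90% of Cd in the first two fractions) come from expressing each extraction step's concentration as a share of the summed fractions. A minimal sketch of that bookkeeping, with hypothetical Cd concentrations chosen only for illustration:

```python
def fraction_shares(fractions):
    """fractions: dict of extraction step -> concentration (mg/kg) from a
    four-step sequential extraction. Returns each step's share (%) of the
    summed (pseudo-total) concentration."""
    total = sum(fractions.values())
    return {step: 100.0 * c / total for step, c in fractions.items()}

# hypothetical Cd results (mg/kg) for a polluted topsoil sample
cd = {"acetic acid": 4.2, "reducible": 1.1, "oxidisable": 0.5, "residual": 0.6}
shares = fraction_shares(cd)
# the "mobile" share is conventionally the sum of the first two fractions
mobile = shares["acetic acid"] + shares["reducible"]
```

Comparing `total` with the aqua regia pseudo-total is the usual internal check on a sequential extraction's recovery.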