Bootstrap Sequential Determination of the Co-integration Rank in VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
Determining the co-integrating rank of a system of variables has become a fundamental aspect of applied research in macroeconomics and finance. It is well known that the standard asymptotic likelihood ratio tests for co-integration rank of Johansen (1996) can be unreliable in small samples, with empirical rejection frequencies often very much in excess of the nominal level. As a consequence, bootstrap versions of these tests have been developed. To be useful, however, sequential procedures for determining the co-integrating rank based on these bootstrap tests need to be consistent, in the sense that the probability of selecting a rank smaller than (equal to) the true co-integrating rank will converge to zero (one minus the marginal significance level) as the sample size diverges, for general I(1) processes. No such likelihood-based procedure is currently known to be available. In this paper we fill this gap...
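The sequential selection rule these abstracts refer to (test H(0), H(1), ... and stop at the first non-rejection) can be sketched generically; the `p_values` below are hypothetical bootstrap p-values, not output of the actual PLR test:

```python
def select_rank(p_values, alpha=0.05):
    """Sequential co-integration rank selection (generic decision rule).

    p_values[r] is the (bootstrap) p-value of the PLR test of H(r):
    'rank <= r' against the unrestricted alternative, for r = 0, ..., p-1.
    The procedure returns the first r whose test is NOT rejected; if every
    test rejects, the full rank p is selected.
    """
    for r, p_val in enumerate(p_values):
        if p_val > alpha:
            return r
    return len(p_values)


# Hypothetical bootstrap p-values for a 4-variable system:
# H(0) and H(1) are rejected at the 5% level, H(2) is not -> rank 2.
rank = select_rank([0.001, 0.012, 0.341, 0.872])
```

The consistency property discussed in the abstract concerns exactly this rule: the tests for r below the true rank should reject with probability tending to one, so the loop stops at the true rank with probability one minus the significance level.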
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, G.; Rahbek, Anders; Taylor, A.M.R.
2014-01-01
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio (PLR) co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates... We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability that the associated bootstrap sequential procedures select a rank smaller than the true rank converges to zero. This result is shown to hold for both the i.i.d. and wild bootstrap variants under conditional heteroskedasticity but only for the latter under unconditional heteroskedasticity.
Bootstrap confidence intervals in a complex situation: A sequential paired clinical trial
Energy Technology Data Exchange (ETDEWEB)
Morton, S.C.
1988-06-01
This paper considers the problem of determining a confidence interval for the difference between two treatments in a simplified sequential paired clinical trial, which is analogous to setting an interval for the drift of a random walk subject to a parabolic stopping boundary. Three bootstrap methods of construction are applied: Efron's accelerated bias-corrected, the DiCiccio-Romano, and the bootstrap-t. The results are compared with a theoretical approximate interval due to Siegmund. Difficulties inherent in the use of these bootstrap methods in a complex situation are illustrated. The DiCiccio-Romano method is shown to be the easiest to apply and to work well. 13 refs.
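As a rough illustration of the third method compared in the paper, here is a minimal bootstrap-t interval for a plain i.i.d. mean, not the sequential-trial drift problem studied above; the data are invented:

```python
import random
import statistics


def bootstrap_t_interval(data, alpha=0.05, n_boot=2000, seed=42):
    """Bootstrap-t (studentized) confidence interval for the mean.

    Each bootstrap mean is studentized by its own standard error, and the
    empirical t-quantiles are inverted around the original estimate.
    """
    rng = random.Random(seed)
    n = len(data)
    mean = statistics.fmean(data)
    se = statistics.stdev(data) / n ** 0.5
    t_stats = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in range(n)]
        s = statistics.stdev(sample) / n ** 0.5
        if s > 0:  # skip degenerate resamples
            t_stats.append((statistics.fmean(sample) - mean) / s)
    t_stats.sort()
    t_hi = t_stats[int((1 - alpha / 2) * len(t_stats)) - 1]
    t_lo = t_stats[int((alpha / 2) * len(t_stats))]
    # Note the quantile reversal: the upper t-quantile sets the lower endpoint.
    return mean - t_hi * se, mean - t_lo * se


data = [4.1, 5.2, 3.9, 6.0, 5.5, 4.7, 5.1, 4.4, 5.8, 4.9]
low, high = bootstrap_t_interval(data)
```

The quantile reversal is the defining feature of the bootstrap-t construction and is what distinguishes it from a naive percentile interval.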
Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Rahbek, Anders; Taylor, A. M. Robert
In a recent paper Cavaliere et al. (2012) develop bootstrap implementations of the (pseudo-) likelihood ratio [PLR] co-integration rank test and associated sequential rank determination procedure of Johansen (1996). The bootstrap samples are constructed using the restricted parameter estimates... We show that the bootstrap PLR tests are asymptotically correctly sized and, moreover, that the probability that the associated bootstrap sequential procedures select a rank smaller than the true rank converges to zero. This result is shown to hold for both the i.i.d. and wild bootstrap variants under conditional heteroskedasticity but only for the latter under unconditional heteroskedasticity.
Energy Technology Data Exchange (ETDEWEB)
Niehof, Jonathan T.; Morley, Steven K.
2012-01-01
We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
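A minimal sketch of the general idea, assuming a simplified null in which one event series is regenerated uniformly at random over the observation window (the paper's bootstrap treats the null more carefully, and the toy series here are invented):

```python
import random


def association_count(a_times, b_times, window):
    """Number of events in series B falling within +/-window of any event in A."""
    return sum(any(abs(b - a) <= window for a in a_times) for b in b_times)


def association_p_value(a_times, b_times, window, t_max, n_boot=1000, seed=1):
    """Significance of the observed association by resampling.

    Null model (an assumption of this sketch): events in B occur uniformly
    at random over [0, t_max], independently of A.
    """
    rng = random.Random(seed)
    observed = association_count(a_times, b_times, window)
    exceed = 0
    for _ in range(n_boot):
        fake_b = [rng.uniform(0.0, t_max) for _ in b_times]
        if association_count(a_times, fake_b, window) >= observed:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)


# Strongly associated toy series: each B event sits right next to an A event.
a = [10.0, 30.0, 50.0, 70.0, 90.0]
b = [10.1, 29.8, 50.2, 69.9, 90.3]
p = association_p_value(a, b, window=1.0, t_max=100.0)
```

The wide spacing of series A relative to the window is what makes the resampled statistic well behaved, mirroring the key requirement identified in the abstract.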
Dynamics of bootstrap percolation
Indian Academy of Sciences (India)
Prabodh Shukla
2008-08-01
Bootstrap percolation transition may be first order or second order, or it may have a mixed character where a first-order drop in the order parameter is preceded by critical fluctuations. Recent studies have indicated that the mixed transition is characterized by power-law avalanches, while the continuous transition is characterized by truncated avalanches in a related sequential bootstrap process. We explain this behaviour on the basis of an analytical and numerical study of the avalanche distributions on a Bethe lattice.
Niska, Christoffer
2014-01-01
Practical and instruction-based, this concise book will take you from understanding what Bootstrap is, to creating your own Bootstrap theme in no time! If you are an intermediate front-end developer or designer who wants to learn the secrets of Bootstrap, this book is perfect for you.
Echeverri, Alejandro Castedo; von Harling, Benedict; Serone, Marco
2016-09-01
We study the numerical bounds obtained using a conformal-bootstrap method - advocated in ref. [1] but never implemented so far - where different points in the plane of conformal cross ratios $z$ and $\bar{z}$ are sampled. In contrast to the most used method based on derivatives evaluated at the symmetric point $z=\bar{z}=1/2$, we can consistently "integrate out" higher-dimensional operators and get a reduced, simpler, and faster-to-solve set of bootstrap equations. We test this "effective" bootstrap by studying the 3D Ising and $O(n)$ vector models and bounds on generic 4D CFTs, for which extensive results are already available in the literature. We also determine the scaling dimensions of certain scalar operators in the $O(n)$ vector models, with $n = 2, 3, 4$, which have not yet been computed using bootstrap techniques.
Bhaumik, Snig
2015-01-01
If you are a web developer who designs and develops websites and pages using HTML, CSS, and JavaScript, but have very little familiarity with Bootstrap, this is the book for you. Previous experience with HTML, CSS, and JavaScript will be helpful, while knowledge of jQuery would be an extra advantage.
Magno, Alexandre
2013-01-01
A practical, step-by-step tutorial on developing websites for mobile using Bootstrap. This book is for anyone who wants to get acquainted with the new features available in Bootstrap 3 and who wants to develop websites with the mobile-first feature of Bootstrap. The reader should have a basic knowledge of Bootstrap as a frontend framework.
Buist, Peter J.; Teunissen, Peter J. G.; Verhagen, Sandra; Giorgi, Gabriele
2011-04-01
Traditionally in multi-spacecraft missions (e.g. formation flying, rendezvous) the GNSS-based relative positioning and attitude determination problems are treated as independent. In this contribution we investigate the possibility of using multi-antenna data from each spacecraft not only for attitude determination but also to improve the relative positioning between spacecraft. For both ambiguity resolution and the accuracy of the baseline solution, we show the theoretical improvement achievable as a function of the number of antennas on each platform. We concentrate on ambiguity resolution as the key to precise relative positioning and attitude determination and show the theoretical limit of this kind of approach. We use mission parameters of the European Proba-3 satellites in a software-based algorithm verification and a hardware-in-the-loop simulation. The software simulations indicated that this approach can improve single-epoch ambiguity resolution by up to 50% for relative positioning, applying the typical antenna configurations for attitude determination. The hardware-in-the-loop simulations show that, for the same antenna configurations, the accuracy of the relative positioning solution can improve by up to 40%.
Collier, Scott; Yin, Xi
2016-01-01
We constrain the spectrum of two-dimensional unitary, compact conformal field theories with central charge c > 1 using modular bootstrap. Upper bounds on the gap in the dimension of primary operators of any spin, as well as in the dimension of scalar primaries, are computed numerically as functions of the central charge using semi-definite programming. Our bounds refine those of Hellerman and Friedan-Keller, and are in some cases saturated by known CFTs. In particular, we show that unitary CFTs with c < 8 must admit relevant deformations, and that a nontrivial bound on the gap of scalar primaries exists for c < 25. We also study bounds on the dimension gap in the presence of twist gaps, bounds on the degeneracy of operators, and demonstrate how "extremal spectra" which maximize the degeneracy at the gap can be determined numerically.
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Angelis, Luca De; Rahbek, Anders
2015-01-01
...work done for the latter in Cavaliere, Rahbek and Taylor [Econometric Reviews (2014) forthcoming], we establish the asymptotic properties of the procedures based on information criteria in the presence of heteroskedasticity (conditional or unconditional) of a quite general and unknown form. The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms...
Rejon-Barrera, Fernando
2015-01-01
We work out all of the details required for implementation of the conformal bootstrap program applied to the four-point function of two scalars and two vectors in an abstract conformal field theory in arbitrary dimension. This includes a review of which tensor structures make appearances, a construction of the projectors onto the required mixed symmetry representations, and a computation of the conformal blocks for all possible operators which can be exchanged. These blocks are presented as differential operators acting upon the previously known scalar conformal blocks. Finally, we set up the bootstrap equations which implement crossing symmetry. Special attention is given to the case of conserved vectors, where several simplifications occur.
The Local Fractional Bootstrap
DEFF Research Database (Denmark)
Bennedsen, Mikkel; Hounyo, Ulrich; Lunde, Asger;
We introduce a bootstrap procedure for high-frequency statistics of Brownian semistationary processes. More specifically, we focus on a hypothesis test on the roughness of sample paths of Brownian semistationary processes, which uses an estimator based on a ratio of realized power variations. ... In simulations we observe that the bootstrap-based hypothesis test provides considerable finite-sample improvements over an existing test that is based on a central limit theorem. This is important when studying the roughness properties of time series data; we illustrate this by applying the bootstrap method to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data.
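The ratio-of-realized-power-variations idea can be illustrated with a change-of-frequency roughness estimator; this is a simplified sketch, tested on an ordinary Brownian path (roughness H = 1/2) rather than a Brownian semistationary process:

```python
import math
import random


def roughness_estimate(x):
    """Change-of-frequency roughness estimator (a simplified sketch).

    Compares realized quadratic variation at two sampling frequencies:
    for a path with roughness index H, second-order power variation over
    lag 2*dt is 2^(2H - 1) times that over lag dt, which is inverted here.
    """
    v1 = sum((x[i + 1] - x[i]) ** 2 for i in range(len(x) - 1))
    v2 = sum((x[i + 2] - x[i]) ** 2 for i in range(0, len(x) - 2, 2))
    return 0.5 * (math.log(v2 / v1, 2) + 1.0)


# Standard Brownian motion has roughness H = 1/2.
rng = random.Random(3)
path = [0.0]
for _ in range(20000):
    path.append(path[-1] + rng.gauss(0.0, 1.0))
h_hat = roughness_estimate(path)
```

The bootstrap procedure in the paper would resample such a statistic to approximate its finite-sample distribution instead of relying on the central limit theorem.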
On the Impact of Bootstrap in Stratified Random Sampling
Institute of Scientific and Technical Information of China (English)
LIU Cheng; ZHAO Lian-wen
2009-01-01
In general, the accuracy of the mean estimator can be improved by stratified random sampling. In this paper, we provide an idea, different from empirical methods, that the accuracy can be further improved through the bootstrap resampling method under some conditions. The determination of sample size by the bootstrap method is also discussed, and a simulation is made to verify the accuracy of the proposed method. The simulation results show that the sample size based on bootstrapping is smaller than that based on the central limit theorem.
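A minimal sketch of within-stratum bootstrap resampling for the stratified mean estimator; the strata, weights, and data are illustrative:

```python
import random
import statistics


def stratified_bootstrap_se(strata, weights, n_boot=2000, seed=7):
    """Bootstrap standard error of a stratified mean estimator.

    Resampling is done within each stratum (an i.i.d. bootstrap per
    stratum), mirroring the stratified design; `weights` are the
    population stratum proportions and must sum to 1.
    """
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        est = 0.0
        for stratum, w in zip(strata, weights):
            resample = [rng.choice(stratum) for _ in stratum]
            est += w * statistics.fmean(resample)
        estimates.append(est)
    return statistics.stdev(estimates)


# Two strata with different means and sizes.
strata = [[1.0, 1.2, 0.9, 1.1, 1.0, 1.3], [5.0, 5.4, 4.8, 5.2]]
se = stratified_bootstrap_se(strata, weights=[0.7, 0.3])
```

A sample-size rule in the spirit of the paper would then grow or shrink the per-stratum sample until this bootstrap standard error meets a target accuracy.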
Bootstrap Percolation on Random Geometric Graphs
Bradonjić, Milan
2012-01-01
Bootstrap percolation has been used effectively to model phenomena as diverse as the emergence of magnetism in materials, spread of infection, diffusion of software viruses in computer networks, adoption of new technologies, and emergence of collective action and cultural fads in human societies. It is defined on an (arbitrary) network of interacting agents whose state is determined by the state of their neighbors according to a threshold rule. In a typical setting, bootstrap percolation starts by random and independent "activation" of nodes with a fixed probability $p$, followed by a deterministic process for additional activations based on the density of active nodes in each neighborhood (at least $\theta$ activated nodes). Here, we study bootstrap percolation on random geometric graphs in the regime when the latter are (almost surely) connected. Random geometric graphs provide an appropriate model in settings where the neighborhood structure of each node is determined by geographical distance, as in wireless ad hoc ...
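The threshold activation rule described above is easy to state in code; this sketch runs on an arbitrary adjacency structure (here a 5-node path graph) rather than a random geometric graph:

```python
import random


def bootstrap_percolation(neighbors, p_seed, theta, seed=0):
    """Bootstrap percolation with activation threshold theta.

    neighbors: dict mapping node -> list of neighbor nodes.
    Nodes activate initially with probability p_seed; thereafter any
    inactive node with at least `theta` active neighbors activates,
    iterating until no further change.
    """
    rng = random.Random(seed)
    active = {v for v in neighbors if rng.random() < p_seed}
    changed = True
    while changed:
        changed = False
        for v in neighbors:
            if v not in active and sum(u in active for u in neighbors[v]) >= theta:
                active.add(v)
                changed = True
    return active


# A path graph 0-1-2-3-4 with threshold 1: any nonempty seed spreads everywhere.
path = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 4] for i in range(5)}
final = bootstrap_percolation(path, p_seed=0.5, theta=1)
```

On a random geometric graph the `neighbors` map would instead link all pairs of nodes within a fixed geographical distance, as in the wireless ad hoc setting the abstract mentions.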
Noel, James D; Biswas, Pratim; Giammar, Daniel E
2007-07-01
Leaching of mercury from coal combustion byproducts is a concern because of the toxicity of mercury. Leachability of mercury can be assessed by using sequential extraction procedures. Sequential extraction procedures are commonly used to determine the speciation and mobility of trace metals in solid samples and are designed to differentiate among metals bound by different mechanisms and to different solid phases. This study evaluated the selectivity and effectiveness of a sequential extraction process used to determine mercury binding mechanisms to various materials. A six-step sequential extraction process was applied to laboratory-synthesized materials with known mercury concentrations and binding mechanisms. These materials were calcite, hematite, goethite, and titanium dioxide. Fly ash from a full-scale power plant was also investigated. The concentrations of mercury were measured using inductively coupled plasma (ICP) mass spectrometry, whereas the major elements were measured by ICP atomic emission spectrometry. The materials were characterized by X-ray powder diffraction and scanning electron microscopy with energy dispersive spectroscopy. The sequential extraction procedure provided information about the solid phases with which mercury was associated in the solid sample. The procedure effectively extracted mercury from the target phases. The procedure was generally selective in extracting mercury. However, some steps in the procedure extracted mercury from nontarget phases, and others resulted in mercury redistribution. Iron from hematite and goethite was only leached in the reducible and residual extraction steps. Some mercury associated with goethite was extracted in the ion exchangeable step, whereas mercury associated with hematite was extracted almost entirely in the residual step. Calcium in calcite and mercury associated with calcite were primarily removed in the acid-soluble extraction step. Titanium in titanium dioxide and mercury adsorbed onto
Orawon Chailapakul; Toshihiko Imato; Narong Praphairaksit; Kulwadee Pinwattana; Chakorn Chinvongamorn
2008-01-01
A gas diffusion sequential injection system with amperometric detection using a boron-doped diamond electrode was developed for the determination of sulfite. A gas diffusion unit (GDU) was used to prevent interference from sample matrices for the electrochemical measurement. The sample was mixed with an acid solution to generate gaseous sulfur dioxide prior to its passage through the donor channel of the GDU. The sulfur dioxide diffused through the PTFE hydrophobic membrane into a carrier solution...
Mesquita, Raquel B R; Rangel, António O S S
2004-08-01
A sequential injection methodology for the spectrophotometric determination of calcium, magnesium and alkalinity in water samples is proposed. A single manifold is used for the determination of the three analytes, and the same protocol sequence allows the sequential determination of calcium and magnesium (the sum corresponds to the water hardness). The determination of both metals is based on their reaction with cresolphthalein complexone; mutual interference is minimized by using 8-hydroxyquinoline for the determination of calcium and ethylene glycol-bis(beta-aminoethyl ether)-N,N,N',N'-tetraacetic acid (EGTA) for the determination of magnesium. Alkalinity determination is based on a reaction with acetic acid and the corresponding color change of bromocresol green. Working ranges of 0.5-5 mg dm(-3) for Ca, 0.5-10 mg dm(-3) for Mg, and 10-100 mg HCO3- dm(-3) for alkalinity have been achieved. The results for water samples were comparable to those of the reference methods and to a certified reference water sample. RSDs lower than 5% were obtained, and a low reagent consumption and a reduced volume of effluent have been accomplished. The determination rate for calcium and magnesium is 80 h(-1), corresponding to 40 h(-1) per element, while 65 determinations of alkalinity per hour could be carried out.
Analytic bootstrap at large spin
Kaviraj, Apratim; Sinha, Aninda
2015-01-01
We use analytic conformal bootstrap methods to determine the anomalous dimensions and OPE coefficients for large spin operators in general conformal field theories in four dimensions containing a scalar operator of conformal dimension $\Delta_\phi$. It is known that such theories will contain an infinite sequence of large spin operators with twists approaching $2\Delta_\phi+2n$ for each integer $n$. By considering the case where such operators are separated by a twist gap from other operators at large spin, we analytically determine the $n$, $\Delta_\phi$ dependence of the anomalous dimensions. We find that for all $n$, the anomalous dimensions are negative for $\Delta_\phi$ satisfying the unitarity bound, thus extending the Nachtmann theorem to non-zero $n$. In the limit when $n$ is large, we find agreement with the AdS/CFT prediction corresponding to the Eikonal limit of a 2-2 scattering with dominant graviton exchange.
Alhadeff, Eliana M.; Salgado, Andrea M.; Cos, Oriol; Pereira, Nei; Valdman, Belkis; Valero, Francisco
A sequential injection analysis system with two enzymatic microreactors for the determination of ethanol has been designed. Alcohol oxidase and horseradish peroxidase were separately immobilized on glass aminopropyl beads and packed in 0.91-mL volume microreactors, working in line with the sequential injection analysis system. A stop flow of 120 s was selected for a linear ethanol range of 0.005-0.04 g/L (±0.6% relative standard deviation) with a throughput of seven analyses per hour. The system was applied to measure ethanol concentrations in samples of distilled and nondistilled alcoholic beverages, and of alcoholic fermentation, with good performance and no significant difference compared with other analytical procedures (gas chromatography and high-performance liquid chromatography).
Toral, M Inés; Paine, Maximiliano; Leyton, Patricio; Richter, Pablo
2004-01-01
A new method for the sequential determination of attapulgite and nifuroxazide in pharmaceutical formulations by first- and second-derivative spectrophotometry, respectively, has been developed. In order to obtain the optimal conditions for nifuroxazide stability, studies of solvent, light, and temperature effects were performed. The results show that a previous hydrolysis of 2 h in 1.0 x 10(-1)M NaOH solution is necessary in order to obtain stable compounds for analytical purposes. Subsequently, the first- and second-derivative spectra were evaluated directly in the same samples. The sequential determination of the drugs can be performed using the zero-crossing method; the attapulgite determination was carried out using the first derivative at 278.0 nm and the nifuroxazide determination, using the second derivative at 282.0 nm. The determination ranges were 5.7 x 10(-6)-1.0 x 10(-4) and 3.7 x 10(-8) -1.2 x 10(-4)M for attapulgite and nifuroxazide, respectively. Repeatability (relative standard deviation) values of 1.2 and 3.0% were observed for attapulgite and nifuroxazide, respectively. The ingredients commonly found in commercial pharmaceutical formulations do not interfere. The proposed method was applied to the determination of these drugs in tablets. Further, infrared spectroscopy and cyclic voltammetry studies were carried out in order to obtain knowledge of the decomposition products of nifuroxazide.
Energy Technology Data Exchange (ETDEWEB)
Echols, R. T.; James, R. R.; Aldstadt, J. H.; Environmental Research; Univ. of Minnesota
1997-01-01
The application of flow injection methodology to the determination of trace concentrations of primary explosives is presented. The approach is demonstrated with a sequential injection amperometric method for the determination of the azide ion (N₃⁻). The proposed method can be applied to the determination of sodium azide or lead azide, a primary explosive, without regard to other sources of lead in environmental samples. The sequential injection system used for the analysis forms the basis for a proposed field-portable instrument for the analysis of primary explosives. A microporous gas-permeable membrane in a gas diffusion unit (GDU) is used to separate the analyte from other anions that can also be oxidized at the amperometric cell. The behavior of the GDU was optimized with respect to the pH of the donor stream and the timing of the preconcentration step. A study of anions that are commonly found in environmental samples showed that the species that will interfere with the analytical signal can be removed by the GDU. Results from three water samples that were spiked with 0.40 ppm of azide are presented. RSDs in the range 3-5% were typically obtained using the method. The useful working range of the method was linear up to 0.5 ppm and non-linear up to 20 ppm (second-order model). The limit of detection was 24.6 ppb.
Bootstrapping quarks and gluons
Energy Technology Data Exchange (ETDEWEB)
Chew, G.F.
1979-04-01
Dual topological unitarization (DTU) - the approach to S-matrix causality and unitarity through combinatorial topology - is reviewed. Amplitudes associated with triangulated spheres are shown to constitute the core of particle physics. Each sphere is covered by triangulated disc faces corresponding to hadrons. The leading current candidate for the hadron-face triangulation pattern employs 3-triangle basic subdiscs whose orientations correspond to baryon number and topological color. Additional peripheral triangles lie along the hadron-face perimeter. Certain combinations of peripheral triangles with a basic-disc triangle can be identified as quarks, the flavor of a quark corresponding to the orientation of its edges that lie on the hadron-face perimeter. Both baryon number and flavor are additively conserved. Quark helicity, which can be associated with triangle-interior orientation, is not uniformly conserved and interacts with particle momentum, whereas flavor does not. Three different colors attach to the 3 quarks associated with a single basic subdisc, but there is no additive physical conservation law associated with color. There is interplay between color and quark helicity. In hadron faces with more than one basic subdisc, there may occur pairs of adjacent flavorless but colored triangles with net helicity ±1 that are identifiable as gluons. Broken symmetry is an automatic feature of the bootstrap. T, C and P symmetries, as well as up-down flavor symmetry, persist on all orientable surfaces.
Bootstrap Dynamical Symmetry Breaking
Directory of Open Access Journals (Sweden)
Wei-Shu Hou
2013-01-01
Despite the emergence of a 125 GeV Higgs-like particle at the LHC, we explore the possibility of dynamical electroweak symmetry breaking by strong Yukawa coupling of very heavy new chiral quarks $Q$. Taking the 125 GeV object to be a dilaton with suppressed couplings, we note that the Goldstone bosons $G$ exist as longitudinal modes $V_L$ of the weak bosons and would couple to $Q$ with Yukawa coupling $\lambda_Q$. With $m_Q \gtrsim 700$ GeV from the LHC, the strong $\lambda_Q \gtrsim 4$ could lead to deeply bound $Q\bar{Q}$ states. We postulate that the leading "collapsed state," the color-singlet (heavy isotriplet), pseudoscalar $Q\bar{Q}$ meson $\pi_1$, is $G$ itself, and a gap equation without Higgs is constructed. Dynamical symmetry breaking is effected via strong $\lambda_Q$, generating $m_Q$ while self-consistently justifying treating $G$ as massless in the loop; hence, "bootstrap." Solving such a gap equation, we find that $m_Q$ should be several TeV, or $\lambda_Q \gtrsim 4\pi$, and would become much heavier if there is a light Higgs boson. For such heavy chiral quarks, we find analogy with the $\pi$-$N$ system, by which we conjecture the possible annihilation phenomena of $Q\bar{Q} \to n V_L$ with high multiplicity, the search for which might be aided by Yukawa-bound $Q\bar{Q}$ resonances.
Bootstrapping quality of Web Services
Directory of Open Access Journals (Sweden)
Zainab Aljazzaf
2015-07-01
A distributed application may be composed of global services provided by different organizations and having different properties. To select a service from many similar services, it is important to distinguish between them. Quality of service (QoS) has been used as a distinguishing factor between similar services and plays an important role in service discovery, selection, and composition. Moreover, QoS is an important contributing factor to the evolution of distributed paradigms, such as service-oriented computing and cloud computing. There are many research works that assess services and justify the QoS at the finding, composition, or binding stages of services. However, there is a need to justify the QoS once new services are registered and before any requestors use them; this is called bootstrapping QoS. Bootstrapping QoS is the process of evaluating the QoS of newly registered services at the time of publishing the services. Thus, this paper proposes a QoS bootstrapping solution for Web Services and builds a QoS bootstrapping framework. In addition, Service Oriented Architecture (SOA) is extended and a prototype is built to support QoS bootstrapping. Experiments are conducted and a case study is presented to test the proposed QoS bootstrapping solution.
Extremal bootstrapping: go with the flow
El-Showk, Sheer
2016-01-01
The extremal functional method determines approximate solutions to the constraints of crossing symmetry, which saturate bounds on the space of unitary CFTs. We show that such solutions are characterized by extremality conditions, which may be used to flow continuously along the boundaries of parameter space. Along the flow there is generically no further need for optimization, which dramatically reduces computational requirements, bringing calculations from the realm of computing clusters to laptops. Conceptually, extremality sheds light on possible ways to bootstrap without positivity, extending the method to non-unitary theories, and implies that theories saturating bounds, and especially those sitting at kinks, have unusually sparse spectra. We discuss several applications, including the first high-precision bootstrap of a non-unitary CFT.
N=1 Supersymmetric Boundary Bootstrap
Toth, G Z
2004-01-01
We investigate the boundary bootstrap programme for finding exact reflection matrices of integrable boundary quantum field theories with N=1 boundary supersymmetry. The bulk S-matrix and the reflection matrix are assumed to take the form S=S_1S_0, R=R_1R_0, where S_0 and R_0 are the S-matrix and reflection matrix of some integrable non-supersymmetric boundary theory that is assumed to be known, and S_1 and R_1 describe the mixing of supersymmetric indices. Under the assumption that the bulk particles transform in the kink and boson/fermion representations and the ground state is a singlet we present rules by which the supersymmetry representations and reflection factors for excited boundary bound states can be determined. We apply these rules to the boundary sine-Gordon model, to the boundary a_2^(1) and a_4^(1) affine Toda field theories, to the boundary sinh-Gordon model and to the free particle.
Fertility determinants in Indonesia: a sequential analysis of the proximate determinants.
Ananta, A; Lim, T; Molyneaux, J W; Kantner, A
1992-06-01
Data from the Indonesian Contraceptive Prevalence Survey in 1987 were used to examine the extent to which socioeconomic factors affect the direct association between proximate determinants and fertility. The Bongaarts framework was applied to individual-level data on married women who had at least one birth between 1982 and 1987. The fertility measure was the probability of having a birth in the last 12 months before the survey. Proximate determinants were breast feeding, fertile period (non-amenorrhea), sexual exposure, and contraceptive use. Socioeconomic variables were husband's education, wife's education, husband's occupation, religion, urban/rural status, and region of residence. The logit regression analysis controls for the age of the respondent and the number of children ever born at the time of the survey. There is a possibility that socioeconomic variables may have a direct impact on fertility and that the logit framework does not model perfectly the true stochastic model. Thus, a regression is specified in which the probability of experiencing a birth is regressed on both proximate determinants and socioeconomic determinants and on socioeconomic determinants alone. Results show that fertility is lower when the duration of breast feeding and the level of contraceptive use are higher. Fertility is higher when the length of the fertile period and sexual exposure are greater. Education showed no significant impact on duration of breast feeding, but when both parents' education is considered, women's lack of education is related to having longer fertile periods (an average of 64 months). When the wife's education is considered alone, women with no schooling and less education have 56-44 more months of sexual exposure. The husband's education considered alone followed the same pattern. As the level of parents' education rose, the probability of contraception increased. Women have shorter fertile periods when husbands are farmers. Religion explains duration of breast feeding...
Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.
Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil
2015-02-01
A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L(-1) of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L(-1) for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations.
Rapid 90Sr/90Y determination in water samples using a sequential injection method
Mateos; Gomez; Garcias; Casas; Cerda
2000-07-01
We have developed a semiautomatic procedure based on a sequential injection method for 90Sr/90Y determination that allows their radiochemical separation in about 30 min. The method has been tested using 90Sr/90Y solutions with activities lower than 12 Bq. The source is eluted in a pH = 6.5 medium through a MnO2-impregnated cotton filter, where 90Y is preferentially preconcentrated by adsorption. 90Y is extracted from the column with hydroxylamine; some 90Sr has also been found in the leached solution. After the radiochemical separation, the total beta activity of the leached solution has been determined using a low-background alpha-beta proportional counter and, assuming the presence of 90Sr and 90Y at t = 0, the solution of the Bateman equations allows the initial concentration of both isotopes to be obtained. We have verified that the addition of some ions usually found in water samples (Cl-, HCO3-, NO3-, SO4(2-), Ca2+, Mg2+) does not interfere with the yield of the radiochemical process, (90 +/- 10)%. The method has been applied to 90Sr/90Y determination in mineral waters, and even in thermal waters, where the salt concentration can be about 3500 mg/l, the radiochemical yield remains greater than 80%.
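The last step of this procedure — recovering the initial 90Sr and 90Y activities from two total-beta measurements via the Bateman equations — can be sketched as below. This is an illustrative reconstruction, not the authors' code: the half-lives are standard literature values, and the measurement times and activities are invented.

```python
import math

# Decay constants (per hour) from standard half-lives:
# 90Sr: ~28.8 years; 90Y: ~64.1 hours.
LAM_SR = math.log(2) / (28.8 * 365.25 * 24)
LAM_Y = math.log(2) / 64.1

def total_beta(a_sr0, a_y0, t):
    """Total beta activity (Bq) at time t (hours) for a 90Sr/90Y mixture,
    from the Bateman solution of the parent-daughter chain."""
    a_sr = a_sr0 * math.exp(-LAM_SR * t)
    ingrow = LAM_Y / (LAM_Y - LAM_SR) * (math.exp(-LAM_SR * t) - math.exp(-LAM_Y * t))
    a_y = a_y0 * math.exp(-LAM_Y * t) + a_sr0 * ingrow
    return a_sr + a_y

def initial_activities(t1, m1, t2, m2):
    """Invert two total-beta measurements (t in hours, m in Bq) into the
    initial activities (A_Sr0, A_Y0) by solving a 2x2 linear system."""
    def coeffs(t):
        c_sr = math.exp(-LAM_SR * t) + LAM_Y / (LAM_Y - LAM_SR) * (
            math.exp(-LAM_SR * t) - math.exp(-LAM_Y * t))
        c_y = math.exp(-LAM_Y * t)
        return c_sr, c_y
    a1, b1 = coeffs(t1)
    a2, b2 = coeffs(t2)
    det = a1 * b2 - b1 * a2
    return (m1 * b2 - b1 * m2) / det, (a1 * m2 - m1 * a2) / det

# Synthetic check: 5 Bq of 90Sr and 7 Bq of 90Y at t = 0, counted at 24 h and 120 h.
m1 = total_beta(5.0, 7.0, 24.0)
m2 = total_beta(5.0, 7.0, 120.0)
a_sr0, a_y0 = initial_activities(24.0, m1, 120.0, m2)
```

With noise-free synthetic counts the inversion recovers the initial activities exactly; with real counting data one would instead fit the same two coefficients by least squares over many time points.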
The bootstrap in bioequivalence studies.
Pigeot, Iris; Hauschke, Dieter; Shao, Jun
2011-11-01
In 1997, the U.S. Food and Drug Administration (FDA) suggested in its draft guidance the use of new concepts for assessing the bioequivalence of two drug formulations, namely, the concepts of population and individual bioequivalence. Aggregate moment-based and probability-based measures of bioequivalence were introduced to derive criteria in order to decide whether two formulations should be regarded as bioequivalent or not. The statistical decision may be made via a nonparametric bootstrap percentile interval. In this article, we review the history of population and individual bioequivalence with special focus on the role of the bootstrap in this context.
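The nonparametric bootstrap percentile interval mentioned above can be sketched in a few lines. The resampling logic is generic; the `diffs` data (paired log-AUC differences between test and reference formulations) and the 95% level are invented for illustration.

```python
import random

def percentile_ci(data, stat, b=2000, alpha=0.05, seed=1):
    """Nonparametric bootstrap percentile interval for statistic `stat`:
    resample the data with replacement b times and take empirical quantiles."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(b))
    lo = reps[int((alpha / 2) * b)]
    hi = reps[int((1 - alpha / 2) * b) - 1]
    return lo, hi

# Hypothetical paired log-AUC differences (test minus reference):
diffs = [0.05, -0.02, 0.11, 0.03, -0.07, 0.09, 0.00, 0.04, -0.01, 0.06,
         0.02, 0.08, -0.04, 0.05, 0.01, 0.07, -0.03, 0.10, 0.02, 0.03]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = percentile_ci(diffs, mean)
```

In a bioequivalence decision one would compare such an interval against a pre-specified equivalence margin rather than merely checking whether it covers zero.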
Directory of Open Access Journals (Sweden)
Orawon Chailapakul
2008-03-01
Full Text Available A gas diffusion sequential injection system with amperometric detection using a boron-doped diamond electrode was developed for the determination of sulfite. A gas diffusion unit (GDU) was used to prevent interference from sample matrices in the electrochemical measurement. The sample was mixed with an acid solution to generate gaseous sulfur dioxide prior to its passage through the donor channel of the GDU. The sulfur dioxide diffused through the PTFE hydrophobic membrane into a carrier solution of 0.1 M phosphate buffer (pH 8)/0.1% sodium dodecyl sulfate in the acceptor channel of the GDU and turned back into sulfite. The sulfite was then carried to the electrochemical flow cell and detected directly by amperometry using the boron-doped diamond electrode at 0.95 V (versus Ag/AgCl). Sodium dodecyl sulfate was added to the carrier solution to prevent electrode fouling. This method was applicable in the concentration range of 0.2-20 mg SO3(2-)/L and a detection limit (S/N = 3) of 0.05 mg SO3(2-)/L was achieved. This method was successfully applied to the determination of sulfite in wines and the analytical results agreed well with those obtained by iodimetric titration. The relative standard deviations for the analysis of sulfite in wines were in the range of 1.0-4.1%. The sampling frequency was 65 h(-1).
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
On the basis of the oxidative decoloration of bromopyrogallol red (BPR) with H2O2, catalyzed by horseradish peroxidase (HRP), and the sequential injection renewable surface technique (SI-RST), a highly sensitive optical-fiber sensor spectrophotometric method for the enzymatic determination of hydrogen peroxide was proposed. By coupling with a glucose oxidase (GOD)-catalyzed reaction, the method was used to determine glucose in human serum. The considerations in system and flow cell design, and the factors that influence the determination performance, are discussed. With 100 μL of sample loaded and 0.6 mg of beads trapped, a linear response range from 5.0 × 10(-8) to 5.2 × 10(-6) mol/L BPR with a detection limit (3σ) of 2.5 × 10(-8) mol/L BPR, a precision of 1.1% RSD (n = 11) and a throughput of 80 samples per hour can be achieved. Under the conditions of an 8.7 × 10(-6) mol/L BPR substrate, 0.04 unit/mL HRP, 600 s reaction time and a reaction temperature of 37 ℃, the linear response range for H2O2 was from 5.0 × 10(-8) to 7.0 × 10(-6) mol/L with a detection limit (3σ) of 1.0 × 10(-8) mol/L and a precision of 3.7% RSD (n = 11). The linear response range by coupling with a GOD-catalyzed reaction was from 1.0 × 10(-7) to 1.0 × 10(-5) mol/L. The method was directly applied to determine glucose in human serum. Glucose contents obtained by the proposed procedure were compared with those obtained by the phenol-4-AAP method; the error was found to be less than 3%.
1979-02-15
A simple approximate formula is shown to be remarkably accurate for the determination of the regions of the sequential test for the correlation coefficient, rho, when the variates follow a bivariate normal distribution. The approximate results are compared with the exact values and with an
Li, Songqing; Gao, Peng; Zhang, Jiaheng; Li, Yubo; Peng, Bing; Gao, Haixiang; Zhou, Wenfeng
2012-12-01
A novel dispersive liquid-liquid microextraction (DLLME) method followed by HPLC analysis, termed sequential DLLME, was developed for the preconcentration and determination of aryloxyphenoxy-propionate herbicides (i.e. haloxyfop-R-methyl, cyhalofop-butyl, fenoxaprop-P-ethyl, and fluazifop-P-butyl) in aqueous samples. The method is based on the combination of ultrasound-assisted DLLME with in situ ionic liquid (IL) DLLME into one extraction procedure and achieved better performance than widely used DLLME procedures. Chlorobenzene was used as the extraction solvent during the first extraction. Hydrophilic IL 1-octyl-3-methylimidazolium chloride was used as a dispersive solvent during the first extraction and as an extraction solvent during the second extraction after an in situ chloride exchange by bis[(trifluoromethane)sulfonyl]imide. Several experimental parameters affecting the extraction efficiency were studied and optimized with the design of experiments using MINITAB® 16 software. Under the optimized conditions, the extractions resulted in analyte recoveries of 78-91%. The correlation coefficients of the calibration curves ranged from 0.9994 to 0.9997 at concentrations of 10-300, 15-300, and 20-300 μg L(-1). The relative SDs (n = 5) ranged from 2.9 to 5.4%. The LODs for the four herbicides were between 1.50 and 6.12 μg L(-1).
Miró, Manuel; Gómez, Enrique; Estela, José Manuel; Casas, Montserrat; Cerdà, Victor
2002-02-15
A sequential injection procedure involving a flow-reversal wetting-film extraction method for the determination of the radionuclide 90Sr has been developed. The methodology is based on coating the inner walls of an open tubular reactor with a film prepared from a 0.14 M 4,4'(5')-bis(tert-butylcyclohexano)-18-crown-6 (BCHC) solution in 1-octanol, which allows the selective isolation of strontium from the sample matrix. Selection of the optimum extractant diluent according to its physical properties, investigation of the extraction kinetics, and choice of the proper elution procedure are discussed in detail in this paper. The noteworthy aspects of using a wetting-film phase instead of the solid-phase materials described to date in the literature are the reduced crown ether consumption and the simplification of both the operational sequence and the automation of extractant-phase renewal between consecutive samples, which is of interest to avoid analyte carryover and reduction of the resin capacity factor caused by irreversible interferences. The proposed method has been successfully applied to different spiked environmental samples (water, milk, and soil), with 90Sr total activities ranging between 0.07 and 0.30 Bq, measured using a low-background proportional counter. The standard deviation of the automated analytical separation procedure is lower than 3% (n = 10), and the 90Sr isolation process under the studied conditions may be carried out with a yield of up to 80%.
Coefficient Omega Bootstrap Confidence Intervals: Nonnormal Distributions
Padilla, Miguel A.; Divers, Jasmin
2013-01-01
The performance of the normal theory bootstrap (NTB), the percentile bootstrap (PB), and the bias-corrected and accelerated (BCa) bootstrap confidence intervals (CIs) for coefficient omega was assessed through a Monte Carlo simulation under conditions not previously investigated. Of particular interests were nonnormal Likert-type and binary items.…
Bootstrap Current in Spherical Tokamaks
Institute of Scientific and Technical Information of China (English)
王中天; 王龙
2003-01-01
A variational principle for the neoclassical theory has been developed by including a momentum restoring term in the electron-electron collisional operator, which gives an additional free parameter maximizing the heat production rate. All transport coefficients are obtained, including the bootstrap current. The essential feature of the study is that the aspect ratio affects the function of the electron-electron collision operator through a geometrical factor. When the aspect ratio approaches unity, the fraction of circulating particles goes to zero and the contribution to particle flux from the electron-electron collision vanishes. The resulting diffusion coefficient is in rough agreement with Hazeltine. When the aspect ratio approaches infinity, the results are in agreement with Rosenbluth. The formalism connects the two extreme cases. The theory is particularly important for the calculation of bootstrap current in spherical tokamaks and in present tokamaks, in which the square root of the inverse aspect ratio, in general, is not small.
Conformal Bootstrap in Mellin Space
Gopakumar, Rajesh; Sen, Kallol; Sinha, Aninda
2016-01-01
We propose a new approach towards analytically solving for the dynamical content of Conformal Field Theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the epsilon expansion of the Wilson-Fisher fixed point by computing operator dimensions and, strikingly, OPE coefficients to higher orders in epsilon than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement of certain observables in the 3d Ising model, with the precise numerical values that...
Institute of Scientific and Technical Information of China (English)
无
2002-01-01
Sequential chemical extraction procedures have been widely used to partition particulate trace metals into various fractions and to describe the distribution and status of trace metals in the geo-environment. One sequential chemical extraction procedure was employed here to partition various fractions of Mn in soils. The experiment was designed with quality control in mind in order to quantify sampling and analytical error. Experimental results obtained on duplicate analysis of all soil samples demonstrated that the precision was less than 10% (at the 95% confidence level). The accuracy was estimated by comparing the accepted total concentration of Mn in standard reference materials (SRMs) with the measured sum of the individual fractions. The recovery of Mn from SRM1 and SRM2 was 94.1% and 98.4%, respectively. The detection limit, accuracy and precision of the sequential chemical extraction procedure are discussed in detail. All the results suggest that the trueness of the analytical method is satisfactory.
Unbiased bootstrap error estimation for linear discriminant analysis.
Vu, Thang; Sima, Chao; Braga-Neto, Ulisses M; Dougherty, Edward R
2014-12-01
Convex bootstrap error estimation is a popular tool for classifier error estimation in gene expression studies. A basic question is how to determine the weight for the convex combination between the basic bootstrap estimator and the resubstitution estimator such that the resulting estimator is unbiased at finite sample sizes. The well-known 0.632 bootstrap error estimator uses asymptotic arguments to propose a fixed 0.632 weight, whereas the more recent 0.632+ bootstrap error estimator attempts to set the weight adaptively. In this paper, we study the finite sample problem in the case of linear discriminant analysis under Gaussian populations. We derive exact expressions for the weight that guarantee unbiasedness of the convex bootstrap error estimator in the univariate and multivariate cases, without making asymptotic simplifications. Using exact computation in the univariate case and an accurate approximation in the multivariate case, we obtain the required weight and show that it can deviate significantly from the constant 0.632 weight, depending on the sample size and Bayes error for the problem. The methodology is illustrated by application on data from a well-known cancer classification study.
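For context, the classical fixed-weight 0.632 estimator that this paper takes as its starting point can be sketched as follows. This is not the paper's exact-weight method: the nearest-mean classifier (a simple stand-in for LDA) and the synthetic one-dimensional two-class data are assumptions for illustration.

```python
import random

def nearest_mean_classifier(train):
    """Fit class means on (x, label) pairs; predict by the closer mean."""
    xs0 = [x for x, y in train if y == 0]
    xs1 = [x for x, y in train if y == 1]
    mu0, mu1 = sum(xs0) / len(xs0), sum(xs1) / len(xs1)
    return lambda x: 0 if abs(x - mu0) <= abs(x - mu1) else 1

def error(clf, data):
    return sum(clf(x) != y for x, y in data) / len(data)

def err_632(data, b=200, seed=7):
    """Fixed-weight 0.632 bootstrap error estimate:
    0.368 * resubstitution error + 0.632 * out-of-bag bootstrap error."""
    rng = random.Random(seed)
    n = len(data)
    resub = error(nearest_mean_classifier(data), data)
    oob_errs = []
    for _ in range(b):
        idx = [rng.randrange(n) for _ in range(n)]
        inbag = set(idx)
        oob = [data[i] for i in range(n) if i not in inbag]
        boot = [data[i] for i in idx]
        # Skip degenerate resamples (empty out-of-bag set or a single class).
        if oob and len({y for _, y in boot}) == 2:
            oob_errs.append(error(nearest_mean_classifier(boot), oob))
    e0 = sum(oob_errs) / len(oob_errs)
    return 0.368 * resub + 0.632 * e0

# Hypothetical two-class Gaussian sample (means 0 and 1.5, unit variance):
rng = random.Random(0)
data = [(rng.gauss(0.0, 1.0), 0) for _ in range(20)] + \
       [(rng.gauss(1.5, 1.0), 1) for _ in range(20)]
est = err_632(data)
```

The paper's contribution is precisely to replace the constant 0.368/0.632 split above with a sample-size- and Bayes-error-dependent weight that makes the convex combination unbiased.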
Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Grassi, Viviane; Zagatto, Elias A G
2006-06-14
A sequential injection system for the automatic determination of glycerol in wine and beer was developed. The method is based on the rate of formation of NADH from the reaction of glycerol and NAD+ catalyzed by the enzyme glycerol dehydrogenase in solution. The determination of glycerol was performed between 0.3 and 3.0 mmol L(-1) (0.028 and 0.276 g L(-1)), and good repeatability was attained (rsd production was 2.12 mL per assay. Results obtained for samples were in agreement with those obtained with the batch enzymatic method.
Bootstrap percolation on spatial networks
Gao, Jian; Zhou, Tao; Hu, Yanqing
2015-10-01
Bootstrap percolation is a general representation of some networked activation process, which has found applications in explaining many important social phenomena, such as the propagation of information. Inspired by some recent findings on spatial structure of online social networks, here we study bootstrap percolation on undirected spatial networks, with the probability density function of long-range links’ lengths being a power law with tunable exponent. Setting the size of the giant active component as the order parameter, we find a parameter-dependent critical value for the power-law exponent, above which there is a double phase transition, mixed of a second-order phase transition and a hybrid phase transition with two varying critical points, otherwise there is only a second-order phase transition. We further find a parameter-independent critical value around -1, about which the two critical points for the double phase transition are almost constant. To our surprise, this critical value -1 is just equal or very close to the values of many real online social networks, including LiveJournal, HP Labs email network, Belgian mobile phone network, etc. This work helps us in better understanding the self-organization of spatial structure of online social networks, in terms of the effective function for information spreading.
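The activation rule underlying bootstrap percolation is easy to state: starting from a seed set, any node with at least k active neighbours becomes active, iterated to a fixed point. A minimal sketch on a plain undirected graph (ignoring the spatial link-length structure that is the paper's actual subject), with an invented toy graph:

```python
def bootstrap_percolation(edges, seeds, k):
    """Iteratively activate any node with at least k active neighbours,
    starting from the seed set; return the final active set."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node in adj:
            if node not in active and len(adj[node] & active) >= k:
                active.add(node)
                changed = True
    return active

# Toy example: seeds 0 and 1, activation threshold k = 2.
# Node 2 activates (neighbours 0 and 1), then node 3 (neighbours 0 and 2);
# node 4 has only one neighbour and can never reach the threshold.
edges = [(0, 2), (1, 2), (0, 3), (2, 3), (3, 4)]
final = bootstrap_percolation(edges, {0, 1}, k=2)
```

The order parameter studied in the paper corresponds to the size of the giant component of `final` when the graph is large and the seeds are chosen at random.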
Lee, Jin-Ho; Kim, Dong-Jin; Ahn, Byung-Koo
2015-06-01
The objectives of this study were to investigate the distribution of thallium in soils collected near suspected areas such as cement plants, active and closed mines, and smelters and to examine the extraction of thallium in the soils using 19 single chemical and sequential chemical extraction procedures. Thallium concentrations in soils near cement plants were distributed between 1.20 and 12.91 mg kg(-1). However, soils near mines and smelters contained relatively low thallium concentrations ranging from 0.18 to 1.09 mg kg(-1). Thallium extractability with 19 single chemical extractants from selected soils near cement plants ranged from 0.10% to 8.20% of the total thallium concentration. In particular, 1.0 M NH4Cl, 1.0 M (NH4)2SO4, and 1.0 M CH3COONH4 extracted more thallium than other extractants. Sequential fractionation results of thallium from different soils such as industrially and artificially contaminated soils varied with the soil properties, especially soil pH and the duration of thallium contamination.
DEFF Research Database (Denmark)
Shi, Keliang; Qiao, Jixin; Wu, Wangsuo
2012-01-01
An automated method was developed for rapid determination of 99Tc in large volume seawater samples. The analytical procedure involves preconcentration of technetium with coprecipitation, online separation using extraction chromatography (two TEVA columns) implemented in a sequential injection set...
Statistical Analysis of Random Simulations : Bootstrap Tutorial
Deflandre, D.; Kleijnen, J.P.C.
2002-01-01
The bootstrap is a simple but versatile technique for the statistical analysis of random simulations. This tutorial explains the basics of that technique, and applies it to the well-known M/M/1 queuing simulation. In that numerical example, different responses are studied. For some responses, bootstrap
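A minimal version of the kind of experiment the tutorial describes: simulate M/M/1 waiting times with the Lindley recursion, then bootstrap the mean waiting time. The traffic intensity, the sample size, and the use of i.i.d. resampling (which ignores the serial dependence of queueing output) are simplifying assumptions, not the tutorial's exact setup.

```python
import random

def mm1_waits(lam, mu, n, seed=42):
    """Waiting times of n M/M/1 customers via the Lindley recursion
    W[i+1] = max(0, W[i] + S[i] - A[i+1])."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n):
        waits.append(w)
        s = rng.expovariate(mu)    # service time of current customer
        a = rng.expovariate(lam)   # interarrival time to next customer
        w = max(0.0, w + s - a)
    return waits

def bootstrap_se(data, stat, b=1000, seed=0):
    """Bootstrap standard error of `stat` via i.i.d. resampling.
    Note: this treats autocorrelated simulation output as i.i.d.,
    which understates the true variability; it is only a first pass."""
    rng = random.Random(seed)
    n = len(data)
    reps = [stat([data[rng.randrange(n)] for _ in range(n)]) for _ in range(b)]
    m = sum(reps) / b
    return (sum((r - m) ** 2 for r in reps) / (b - 1)) ** 0.5

waits = mm1_waits(lam=0.8, mu=1.0, n=500)
mean = lambda xs: sum(xs) / len(xs)
se = bootstrap_se(waits, mean)
```

A more faithful treatment would resample independent replications (or batches) of the simulation rather than individual waiting times.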
Efficient bootstrap with weakly dependent processes
Bravo, Francesco; Crudu, Federico
2012-01-01
The efficient bootstrap methodology is developed for overidentified moment condition models with weakly dependent observations. The resulting bootstrap procedure is shown to be asymptotically valid and can be used to approximate the distributions of t-statistics, the J-statistic for overidentifying
Ezoe, Kentaro; Ohyama, Seiichi; Hashem, Md Abul; Ohira, Shin-Ichi; Toda, Kei
2016-02-01
After the Fukushima disaster, power generation from nuclear power plants in Japan was completely stopped and old coal-based power plants were re-commissioned to compensate for the decrease in power generation capacity. Although coal is a relatively inexpensive fuel for power generation, it contains high levels (mg kg(-1)) of selenium, which could contaminate the wastewater from thermal power plants. In this work, an automated selenium monitoring system was developed based on sequential hydride generation and chemiluminescence detection. This method could be applied to the control of wastewater contamination. In this method, selenium is vaporized as H2Se, which reacts with ozone to produce chemiluminescence. However, interference from arsenic is of concern because the ozone-induced chemiluminescence intensity of H2Se is much lower than that of AsH3. This problem was successfully addressed by vaporizing arsenic and selenium individually in a sequential procedure using a syringe pump equipped with an eight-port selection valve and hot and cold reactors. Oxidative decomposition of organoselenium compounds and pre-reduction of the selenium were performed in the hot reactor, and vapor generation of arsenic and selenium was performed separately in the cold reactor. Sample transfers between the reactors were carried out pneumatically by switching three-way solenoid valves. The detection limit for selenium was 0.008 mg L(-1) and the calibration curve was linear up to 1.0 mg L(-1), which provided suitable performance for controlling selenium in wastewater to around the allowable limit (0.1 mg L(-1)). This system consumes few chemicals and is stable for more than a month without any maintenance. Wastewater samples from thermal power plants were collected, and data obtained by the proposed method were compared with those from batchwise water treatment followed by hydride generation-atomic fluorescence spectrometry.
Zhang, Zhuomin; Zhao, Cheng; Li, Gongke
2016-07-01
Achieving reproducible signals is key to improving the analytical precision and accuracy of the surface enhanced Raman scattering (SERS) technique and to expanding the application scope of SERS for on-site and rapid analysis of real samples with complex matrices. In this work, a novel Au@hydroxyl-functionalized polystyrene (Au@PS-OH) substrate was prepared by atom transfer radical polymerization and a chemical assembly method, showing promising potential for the rapid and sequential analysis of multiple samples by SERS. The Au@PS-OH substrate, with its regular nanoarrayed morphology, possessed excellent anti-agglomeration capability even for testing solutions with strong basicity or acidity, as well as mechanical and chemical stability, owing to the large number of Au nanoparticles homogeneously and stably fixed on the substrate surface. Moreover, the excellent hydrophobicity of the Au@PS-OH substrate kept testing droplets of multiple samples in a stable and uniform spherical shape with similar contact angles to the substrate, which guaranteed reproducible SERS light paths and SERS signals during real sequential analysis. An Au@PS-OH-based SERS analytical method was then developed and applied to the sequential determination of trace 4-aminoazobenzene in various textiles. The contents of trace 4-aminoazobenzene in black woolen, green woolen and yellow fiber cloth were found to be 106.4, 120.9 and 140.8 mg/kg, with good recoveries of 76.0-118.9% and relative standard deviations of 1.6-5.1%. This SERS method is expected to be suitable for on-site and rapid analysis of multiple samples in a short period.
Perrotta, Allison R; Kumaraswamy, Rajkumari; Bastidas-Oyanedel, Juan R; Alm, Eric J; Rodríguez, Jorge
2017-01-01
The sustainable recovery of resources from wastewater streams can provide many social and environmental benefits. A common strategy to recover valuable resources from wastewater is to harness the products of fermentation by complex microbial communities. In these fermentation bioreactors, high microbial community diversity within the inoculum source is commonly assumed to be sufficient for the selection of a functional microbial community. However, variability of the product profile obtained from these bioreactors is a persistent challenge in this field. In an attempt to address this variability, the impact of the inoculum on the microbial community structure and function within the bioreactor was evaluated using controlled laboratory experiments. In the course of this work, sequential batch reactors were inoculated with three complex microbial inocula and the chemical and microbial compositions were monitored by HPLC and 16S rRNA amplicon analysis, respectively. Microbial community dynamics and chemical profiles were found to be distinct to the initial inoculum and highly reproducible. Additionally, we found that the generation of a complex volatile fatty acid profile was not specific to the diversity of the initial microbial inoculum. Our results suggest that the composition of the original inoculum predictably contributes to bioreactor community structure and function.
Lenehan, Claire E; Barnett, Neil W; Lewis, Simon W
2002-01-01
LabVIEW-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 x 10(-10) to 5 x 10(-6) M) with a line of best fit of y = 1.05(x) + 8.9164 (R(2) = 0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of a morphine standard (5 x 10(-8) M). The limit of detection (3σ) was determined as 5 x 10(-11) M morphine.
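The log-log calibration reported here is an ordinary least-squares fit of log10(signal) on log10(concentration), inverted to read concentrations from measured signals. A sketch with synthetic standards generated exactly on the reported line of best fit (the data points themselves are not from the paper):

```python
import math

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def predict_concentration(signal_mv, a, b):
    """Invert the calibration log10(signal) = a*log10(conc) + b."""
    return 10 ** ((math.log10(signal_mv) - b) / a)

# Synthetic standards spanning the reported linear range (5e-10 to 5e-6 M),
# with signals placed exactly on the reported line y = 1.05 x + 8.9164.
concs = [5e-10, 5e-9, 5e-8, 5e-7, 5e-6]
xs = [math.log10(c) for c in concs]
ys = [1.05 * x + 8.9164 for x in xs]
a, b = linear_fit(xs, ys)
```

With real instrument readings the fitted slope and intercept would of course differ slightly run to run; the inverse function is what turns a chemiluminescence signal into a reported morphine concentration.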
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Roos, Per;
2013-01-01
An analytical method was developed for simultaneous determination of ultratrace level plutonium (Pu) and neptunium (Np) using iron hydroxide coprecipitation in combination with automated sequential injection extraction chromatography separation and accelerator mass spectrometry (AMS) measurement...... show that preboiling and aging are important for obtaining high chemical yields for both Pu and Np, which is possibly related to the aggregation and adsorption behavior of organic substances contained in urine. Although the optimal condition for Np and Pu simultaneous determination requires 5-day aging...... time, an immediate coprecipitation without preboiling and aging could also provide fairly satisfactory chemical yields for both Np and Pu (50-60%) with high sample throughput (4 h/sample). Within the developed method, (242)Pu was exploited as chemical yield tracer for both Pu and Np isotopes. (242)Pu...
Bootstrap Learning and Visual Processing Management on Mobile Robots
Directory of Open Access Journals (Sweden)
Mohan Sridharan
2010-01-01
Full Text Available A central goal of robotics and AI is to enable a team of robots to operate autonomously in the real world and collaborate with humans over an extended period of time. Though developments in sensor technology have resulted in the deployment of robots in specific applications, the ability to accurately sense and interact with the environment is still missing. Key challenges to the widespread deployment of robots include the ability to learn models of environmental features based on sensory inputs, bootstrap off of the learned models to detect and adapt to environmental changes, and autonomously tailor the sensory processing to the task at hand. This paper summarizes a comprehensive effort towards such bootstrap learning, adaptation, and processing management using visual input. We describe probabilistic algorithms that enable a mobile robot to autonomously plan its actions to learn models of color distributions and illuminations. The learned models are used to detect and adapt to illumination changes. Furthermore, we describe a probabilistic sequential decision-making approach that autonomously tailors the visual processing to the task at hand. All algorithms are fully implemented and tested on robot platforms in dynamic environments.
Detection of Wideband Signal Number Based on Bootstrap Resampling
Directory of Open Access Journals (Sweden)
Jiaqi Zhen
2016-01-01
Full Text Available Knowing the source number correctly is the precondition for most spatial spectrum estimation methods; however, many snapshots are needed when determining the number of wideband signals. Therefore, a new method based on Bootstrap resampling is proposed in this paper. First, signals are divided into some nonoverlapping subbands, and coherent signal methods (CSM) are applied to focus them on a single frequency. Then, the eigenvalues are fused with the corresponding eigenvectors of the focused covariance matrix. Subsequently, Bootstrap is used to construct the new resampling matrix. Finally, the number of wideband signals can be calculated with the obtained vector sequences according to a clustering technique. The method has a high probability of success under low signal to noise ratio (SNR) and a small number of snapshots.
Conformal bootstrap: non-perturbative QFT's under siege
CERN. Geneva
2016-01-01
[Exceptionally in Council Chamber] Originally formulated in the 70's, the conformal bootstrap is the ambitious idea that one can use internal consistency conditions to carve out, and eventually solve, the space of conformal field theories. In this talk I will review recent developments in the field which have boosted this program to a new level. I will present a method to extract quantitative information in strongly-interacting theories, such as the 3D Ising model, O(N) vector models and even systems without a Lagrangian formulation. I will explain how these techniques have led to the world-record determination of several critical exponents. Finally, I will review exact analytical results obtained using bootstrap techniques.
DEFF Research Database (Denmark)
Liu, Xuezhu; Hansen, Elo Harald
1996-01-01
...... The operating parameters were optimised by fractional factorial screening and response surface modelling. The linear range of D-glucose determination was 30-600 μM, with a detection limit of 15 μM using a photodiode detector. The sampling frequency was 54 h(-1). A lower LOD (0.5 μM D-glucose) could......
Soybean yield modeling using bootstrap methods for small samples
Energy Technology Data Exchange (ETDEWEB)
Dalposso, G.A.; Uribe-Opazo, M.A.; Johann, J.A.
2016-11-01
One of the problems that occurs when working with regression models concerns the sample size: since the statistical methods used in inferential analyses are asymptotic, if the sample is small the analysis may be compromised because the estimates will be biased. An alternative is to use the bootstrap methodology, which in its non-parametric version does not need to guess or know the probability distribution that generated the original sample. In this work we used a set of soybean yield data and physical and chemical soil properties, formed from few samples, to determine a multiple linear regression model. Bootstrap methods were used for variable selection, identification of influential points, and determination of confidence intervals for the model parameters. The results showed that the bootstrap methods enabled us to select the physical and chemical soil properties that were significant in the construction of the soybean yield regression model, to construct confidence intervals for the parameters, and to identify the points that had great influence on the estimated parameters. (Author)
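The non-parametric (case-resampling) bootstrap described above can be sketched as follows for a multiple linear regression. The data here are synthetic stand-ins (the soybean/soil dataset is not reproduced), and the 2000 replicates with percentile intervals are illustrative choices, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the soil-property / yield data (hypothetical values).
n = 30
X = np.column_stack([np.ones(n), rng.normal(6.0, 0.5, n), rng.normal(25, 5, n)])
beta_true = np.array([1.0, 0.3, 0.05])
y = X @ beta_true + rng.normal(0, 0.2, n)

def ols(X, y):
    # Ordinary least squares fit via lstsq.
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Non-parametric bootstrap: resample (x_i, y_i) rows with replacement
# and refit the model on each replicate.
B = 2000
boot_betas = np.empty((B, X.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, n)
    boot_betas[b] = ols(X[idx], y[idx])

# Percentile 95% confidence intervals for each coefficient.
lo, hi = np.percentile(boot_betas, [2.5, 97.5], axis=0)
for j, (l, h) in enumerate(zip(lo, hi)):
    print(f"beta[{j}]: 95% CI [{l:.3f}, {h:.3f}]")
```

The same replicate matrix `boot_betas` can also drive variable selection (e.g. dropping coefficients whose interval covers zero), which is one of the uses the abstract mentions.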
Dametto, Patrícia Roberta; Franzini, Vanessa Pezza; Gomes Neto, José Anchieta
2007-07-25
A flow injection spectrophotometric system is proposed for phosphite determination in fertilizers by the molybdenum blue method after the processing of each sample two times on-line, without and with an oxidizing step. The flow system was designed to add sulfuric acid or permanganate solutions alternately into the system by simply displacing the injector-commutator from one resting position to another, allowing the determination of phosphate and total phosphate, respectively. The concentration of phosphite is then obtained by difference between the two measurements. The influence of flow rates, sample volume, and the dimension of the flow line connecting the injector-commutator to the main analytical channel was evaluated. The proposed method was applied to phosphite determination in commercial liquid fertilizers. Results obtained with the proposed FIA system were not statistically different from those obtained by titrimetry at the 95% confidence level. In addition, recoveries within 94 and 100% of spiked fertilizers were found. The relative standard deviation (n = 12) related to the phosphite-converted-phosphate peak alone was determinations per hour, and the reagent consumption was about 6.3 mg of KMnO4, 200 mg of (NH4)6Mo7O24.4H2O, and 40 mg of ascorbic acid per measurement.
Sequential determination of lead and cobalt in tap water and foods samples by fluorescence.
Talio, María Carolina; Alesso, Magdalena; Acosta, María Gimena; Acosta, Mariano; Fernández, Liliana P
2014-09-01
In this work, a new procedure was developed for the separation and preconcentration of lead(II) and cobalt(II) in several water and food samples. Complexes of the metal ions with 8-hydroxyquinoline (8-HQ) were formed in aqueous solution. The proposed methodology is based on the preconcentration/separation of Pb(II) by solid-phase extraction using paper filter, followed by spectrofluorimetric determination of both metals, on the solid support and in the filtered aqueous solution, respectively. The solid-surface fluorescence determination was carried out at λem=455 nm (λex=385 nm) for the Pb(II)-8-HQ complex, and the fluorescence of Co(II)-8-HQ was determined in aqueous solution using λem=355 nm (λex=225 nm). The calibration graphs are linear in the range 0.14-8.03×10(4) μg L(-1) and 7.3×10(-2)-4.12×10(3) μg L(-1) for Pb(II) and Co(II), respectively, with detection limits of 4.3×10(-2) and 2.19×10(-2) μg L(-1) (S/N=3). The developed methodology showed good sensitivity and adequate selectivity, and it was successfully applied to the determination of trace amounts of lead and cobalt in tap waters from different regions of Argentina and in food samples (milk powder, express coffee, cocoa powder) with satisfactory results. The new methodology was validated by electrothermal atomic absorption spectroscopy with adequate agreement. The proposed methodology represents a novel application of fluorescence to Pb(II) and Co(II) quantification with sensitivity and accuracy similar to atomic spectroscopies.
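Detection limits quoted at S/N = 3, as above, are commonly computed as three times the standard deviation of replicate blank measurements divided by the calibration slope. A minimal sketch with hypothetical calibration and blank values (not the paper's data):

```python
import numpy as np

# Hypothetical calibration data (concentration vs. fluorescence signal);
# these numbers are illustrative, not taken from the paper.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])      # ug/L
signal = np.array([0.02, 0.55, 1.04, 2.11, 4.05, 8.02])

# Linear calibration: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Standard deviation of repeated blank measurements (illustrative values).
blanks = np.array([0.018, 0.022, 0.020, 0.019, 0.021])
sigma_blank = blanks.std(ddof=1)

# S/N = 3 detection limit: LOD = 3 * sigma_blank / slope.
lod = 3 * sigma_blank / slope
print(f"slope = {slope:.3f}, LOD = {lod:.4f} ug/L")
```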
DEFF Research Database (Denmark)
Hansen, Elo Harald
Determination of low or trace-level amounts of metals by electrothermal atomic absorption spectrometry (ETAAS) often requires the use of suitable preconcentration and/or separation procedures in order to attain the necessary sensitivity and selectivity. Such schemes are advantageously executed....../preconcentration procedures have been suggested and applied, such as liquid-liquid extraction, (co)precipitation with collection in knotted reactors, adsorption, hydride generation, or ion-exchange. Selected examples of some of these procedures will be discussed. Emphasis will be placed on the use of FI...
Saad, Bahruddin; Wai, Wan Tatt; Ali, Abdussalam Salhin M; Saleh, Muhammad Idiris
2006-01-01
A flow injection analysis (FIA) method for the determination of four residual chlorine species, namely combined available chlorine (CAC), free available chlorine (FAC), total available chlorine (TAC) and chlorite (ClO2-), was developed using a flow-through triiodide-selective electrode as a detector. An important strategy of the speciation studies utilized the kinetic discrimination of the reactions of CAC and FAC with Fe2+, which was applied to the speciation of FAC, CAC and TAC. The speciation of available chlorine species and chlorite (an oxychlorine species) was achieved using the same set-up, but with flow streams of different pH. The effects of the pH of the carrier stream, the flow rate and the sample volume were studied. The method exhibited linearity from 2.8 x 10(-6) to 2.8 x 10(-4) M active chlorine (expressed as OCl-) with a detection limit of 1.4 x 10(-6) M. The selectivity of the method was studied by examining the minimum pH for the oxidation of iodide by other oxidants, and also by assessing the potentiometric selectivity coefficients. The proposed method was successfully applied to the determination of chlorine species in tap water and disinfecting formulations, where good agreement between the proposed and standard methods was found.
Energy Technology Data Exchange (ETDEWEB)
Silva, Sidnei G. [Universidade de Sao Paulo, Instituto de Quimica, Sao Paulo (Brazil); Morales-Rubio, Angel; Guardia, Miguel de la [Universidad de Valencia, Department of Analytical Chemistry, Burjassot, Valencia (Spain); Rocha, Fabio R.P. [Universidade de Sao Paulo, Centro de Energia Nuclear na Agricultura, Piracicaba (Brazil)
2011-07-15
A new procedure for spectrofluorimetric determination of free and total glycerol in biodiesel samples is presented. It is based on the oxidation of glycerol by periodate, forming formaldehyde, which reacts with acetylacetone, producing the luminescent 3,5-diacetyl-1,4-dihydrolutidine. A flow system with solenoid micro-pumps is proposed for solution handling. Free glycerol was extracted off-line from biodiesel samples with water, and total glycerol was converted to free glycerol by saponification with sodium ethylate under sonication. For free glycerol, a linear response was observed from 5 to 70 mg L{sup -1} with a detection limit of 0.5 mg L{sup -1}, which corresponds to 2 mg kg{sup -1} in biodiesel. The coefficient of variation was 0.9% (20 mg L{sup -1}, n = 10). For total glycerol, samples were diluted on-line, and the linear response range was 25 to 300 mg L{sup -1}. The detection limit was 1.4 mg L{sup -1} (2.8 mg kg{sup -1} in biodiesel) with a coefficient of variation of 1.4% (200 mg L{sup -1}, n = 10). The sampling rate was ca. 35 samples h{sup -1} and the procedure was applied to determination of free and total glycerol in biodiesel samples from soybean, cottonseed, and castor beans. (orig.)
Albert, Anastasia; Eksteen, J Johannes; Isaksson, Johan; Sengee, Myagmarsuren; Hansen, Terkel; Vasskog, Terje
2016-10-04
Within the field of bioprospecting, disulfide-rich peptides are a promising group of compounds that has the potential to produce important leads for new pharmaceuticals. The disulfide bridges stabilize the tertiary structure of the peptides and often make them superior drug candidates to linear peptides. However, determination of disulfide connectivity in peptides with many disulfide bridges has proven to be laborious and general methods are lacking. This study presents a general approach for structure elucidation of disulfide-rich peptides. The method features sequential reduction and alkylation of a peptide on solid phase combined with sequencing of the fully alkylated peptide by tandem mass spectrometry. Subsequently, the disulfide connectivity is assigned on the basis of the determined alkylation pattern. The presented method is especially suitable for peptides that are prone to disulfide scrambling or are unstable in solution with partly reduced bridges. Additionally, the use of small amounts of peptide in the lowest nmol range makes the method ideal for structure elucidation of unknown peptides from the bioprospecting process. This study successfully demonstrates the new method for seven different peptides with two to four disulfide bridges. Two peptides with previous contradicting publications, μ-conotoxin KIIA and hepcidin-25, are included, and their disulfide connectivity is confirmed in accordance with the latest published results.
Burguera, José L; Burguera, Marcela; Antón, Raquel E; Salager, Jean-Louis; Arandia, María A; Rondón, Carlos; Carrero, Pablo; de Peña, Yaneira Petit; Brunetto, Rosario; Gallignani, Máximo
2005-12-15
The sequential injection (SIA) technique was applied for the on-line preparation of an oil-in-water microemulsion and for the determination of aluminum in new and used lubricating oils by electrothermal atomic absorption spectrometry (ET AAS) with Zeeman-effect background correction. Respectively, 1.0, 0.5 and 1.0 ml of surfactant mixture, sample and co-surfactant (sec-butanol) solutions were sequentially aspirated into a holding coil. Sonication and repetitive change of the flow direction improved the stability of the different emulsion types (oil in water, water in oil and microemulsion). The emulsified zone was pumped to fill the sampling arm of the spectrometer with a sub-sample of 200 μl. Then, 10 μl of this sample solution were introduced by means of air displacement into the graphite tube atomizer. This sequence was timed to synchronize with the previous introduction of 15 μg of Mg(NO3)2 (in 10 μl) by the spectrometer autosampler. The entire SIA system was controlled by a computer, independent of the spectrometer. The furnace program employed a heating cycle in four steps: drying (two steps, at 110 and 130 degrees C), pyrolysis (at 1500 degrees C), atomization (at 2400 degrees C) and cleaning (at 2400 degrees C). The calibration graph was linear from 7.7 to 120 μg Al l(-1). The characteristic mass (m0) was 33.2 pg/0.0044 s and the detection limit was 2.3 μg Al l(-1). The relative standard deviation (RSD) of the method, evaluated by replicate analyses of different lubricating oil samples, varied in all cases between 1.5 and 1.7%, and the recovery values found in the analysis of spiked samples ranged from 97.2 to 100.4%. The agreement between the observed and reference values obtained from two NIST certified standard materials was good. The method was simple and satisfactory for determining aluminum in new and used lubricating oils.
BOOTSTRAPPING FOR EXTRACTING RELATIONS FROM LARGE CORPORA
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A new approach to relation extraction is described in this paper. It adopts a bootstrapping model with a novel iteration strategy, which generates more precise examples of a specific relation. Compared with previous methods, the proposed method has three main advantages: first, it needs less manual intervention; second, more abundant and reasonable information is introduced to represent a relation pattern; third, it reduces the risk of circular dependency occurring in bootstrapping. A scalable evaluation methodology and metrics are developed for our task with comparable techniques over the TianWang 100G corpus. The experimental results show that it can achieve 90% precision and has excellent expansibility.
Conference on Bootstrapping and Related Techniques
Rothe, Günter; Sendler, Wolfgang
1992-01-01
This book contains 30 selected, refereed papers from an international conference on bootstrapping and related techniques held in Trier in 1990. The purpose of the book is to inform about recent research in the area of bootstrap, jackknife and Monte Carlo tests. Addressing both the novice and the expert, it covers theoretical as well as practical aspects of these statistical techniques. Potential users in different disciplines such as biometry, epidemiology, computer science, economics and sociology, but also theoretical researchers, should consult the book to be informed on the state of the art in this area.
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Roos, Per
2010-01-01
This paper reports an automated analytical method for rapid and simultaneous determination of plutonium isotopes (239Pu and 240Pu) and neptunium (237Np) in environmental samples. An extraction chromatographic column packed with TrisKem TEVA® resin was incorporated in a sequential injection (SI...
Qiao, Jixin; Hou, Xiaolin; Roos, Per; Lachner, Johannes; Christl, Marcus; Xu, Yihong
2013-09-17
An analytical method was developed for simultaneous determination of ultratrace level plutonium (Pu) and neptunium (Np) using iron hydroxide coprecipitation in combination with automated sequential injection extraction chromatography separation and accelerator mass spectrometry (AMS) measurement. Several experimental parameters affecting the analytical performance were investigated and compared including sample preboiling operation, aging time, amount of coprecipitating reagent, reagent for pH adjustment, sedimentation time, and organic matter decomposition approach. The overall analytical results show that preboiling and aging are important for obtaining high chemical yields for both Pu and Np, which is possibly related to the aggregation and adsorption behavior of organic substances contained in urine. Although the optimal condition for Np and Pu simultaneous determination requires 5-day aging time, an immediate coprecipitation without preboiling and aging could also provide fairly satisfactory chemical yields for both Np and Pu (50-60%) with high sample throughput (4 h/sample). Within the developed method, (242)Pu was exploited as chemical yield tracer for both Pu and Np isotopes. (242)Pu was also used as a spike in the AMS measurement for quantification of (239)Pu and (237)Np concentrations. The results show that, under the optimal experimental condition, the chemical yields of (237)Np and (242)Pu are nearly identical, indicating the high feasibility of (242)Pu as a nonisotopic tracer for (237)Np determination in real urine samples. The analytical method was validated by analysis of a number of urine samples spiked with different levels of (237)Np and (239)Pu. The measured values of (237)Np and (239)Pu by AMS exhibit good agreement (R(2) ≥ 0.955) with the spiked ones confirming the reliability of the proposed method.
Energy Technology Data Exchange (ETDEWEB)
Gine, Maria Fernanda; Patreze, Aparecida F.; Silva, Edson L. [Centro de Energia Nuclear na Agricultura (CENA-USP), Piracicaba, SP (Brazil)]. E-mail: mfgine@cena.usp.br; Sarkis, Jorge E.S.; Kakazu, Mauricio H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)
2008-07-01
A two-step sequential cloud point extraction (CPE) of trace elements from small sample volumes of human serum, animal blood, and food diet is proposed to gain analytical information in the analysis by inductively coupled plasma mass spectrometry. The first CPE was attained by adding O,O-diethyldithiophosphate, the non ionic surfactant Triton{sup R} X-114 followed by heating at 40 deg C, centrifugation and cooling at 0 deg C. The resulting surfactant-rich phase was separated to determine Cd, Pb and Cu by isotope dilution. Isotope ratio measurements presented RSD < 0.7%. The residual surfactant-poor phase solution had the pH adjusted in the range 4 to 5 before the chelating reagent, 4-(2-pyridylazo) resorcinol plus surfactant Triton{sup R} X-114 were added followed by the sequence to attain the CPE. Co and Ni were quantified in the second extracted surfactant-rich phases by standard additions method with RSD < 2%. Recoveries from 85 to 96% were obtained for all elements. Analyzing reference materials with certified and recommended values assessed accuracy. (author)
Directory of Open Access Journals (Sweden)
Mohtasham MOHAMMADI
2014-03-01
An experiment was conducted to evaluate 295 wheat genotypes in an alpha-lattice design with two replications. The arithmetic mean and standard deviation of grain yield were 2706 and 950 kg/ha, respectively. The correlation coefficients indicated that grain yield had significant positive associations with plant height, spike length, early growth vigor and agronomic score, whereas the correlations between grain yield and days to physiological maturity and canopy temperature before and during anthesis were negative. Path analysis indicated that agronomic score and plant height had high positive direct effects on grain yield, while canopy temperature before and during anthesis and days to maturity had negative direct effects on grain yield. The results of sequential path analysis showed that the traits that served as criterion variables for high grain yield were agronomic score, plant height, canopy temperature, spike length, chlorophyll content and early growth vigor, which were determined as first-, second- and third-order variables and had strong effects on grain yield via one or more paths. More importantly, as canopy temperature, agronomic score and early growth vigor can be evaluated quickly and easily, these traits may be used for the evaluation of large populations.
Energy Technology Data Exchange (ETDEWEB)
Anthemidis, Aristidis N.; Ioannou, Kallirroy-Ioanna G. [Aristotle University, Laboratory of Analytical Chemistry, Department of Chemistry, Thessaloniki (Greece)
2012-08-15
A novel, automatic on-line sequential injection dispersive liquid-liquid microextraction (SI-DLLME) method, based on the 1-hexyl-3-methylimidazolium hexafluorophosphate ([Hmim][PF{sub 6}]) ionic liquid as extractant solvent, was developed and demonstrated for trace thallium determination by flame atomic absorption spectrometry. The ionic liquid was fully dispersed on-line into the aqueous solution in a continuous flow format, while the TlBr{sub 4}{sup -} complex easily migrated into the fine droplets of the extractant due to their huge contact area with the aqueous phase. Furthermore, the extractant was simply retained on the surface of polyurethane foam packed into a microcolumn. No specific conditions, such as low temperature, are required for extractant isolation. All analytical parameters of the proposed method were investigated and optimized. For 15 mL of sample solution, an enhancement factor of 290, a detection limit of 0.86 {mu}g L{sup -1} and a precision (RSD) of 2.7% at the 20.0 {mu}g L{sup -1} Tl(I) concentration level were obtained. The developed method was evaluated by analyzing certified reference materials, while good recoveries from environmental and biological samples proved that the present method is competitive in practical applications. (orig.)
How to Bootstrap a Human Communication System
Fay, Nicolas; Arbib, Michael; Garrod, Simon
2013-01-01
How might a human communication system be bootstrapped in the absence of conventional language? We argue that motivated signs play an important role (i.e., signs that are linked to meaning by structural resemblance or by natural association). An experimental study is then reported in which participants try to communicate a range of pre-specified…
Pulling Econometrics Students up by Their Bootstraps
O'Hara, Michael E.
2014-01-01
Although the concept of the sampling distribution is at the core of much of what we do in econometrics, it is a concept that is often difficult for students to grasp. The thought process behind bootstrapping provides a way for students to conceptualize the sampling distribution in a way that is intuitive and visual. However, teaching students to…
Deterministic bootstrap percolation in high dimensional grids
Huang, Hao; Lee, Choongbum
2013-01-01
In this paper, we study the k-neighbor bootstrap percolation process on the d-dimensional grid [n]^d, and show that the minimum number of initial vertices that percolate is (1-d/k)n^d + O(n^{d-1}) when d...
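The k-neighbor bootstrap percolation process itself is simple to state: starting from a set of initially infected vertices, a vertex becomes infected once at least k of its grid neighbors are infected, and infection is never removed. A small brute-force sketch (the grid size and seed set below are illustrative, not from the paper):

```python
import itertools

def bootstrap_percolation(n, d, k, initial):
    """Run k-neighbor bootstrap percolation on the grid [n]^d:
    a vertex becomes infected once it has >= k infected neighbors
    (neighbors differ by 1 in exactly one coordinate)."""
    infected = set(initial)
    changed = True
    while changed:
        changed = False
        for v in itertools.product(range(n), repeat=d):
            if v in infected:
                continue
            cnt = 0
            for i in range(d):
                for delta in (-1, 1):
                    w = v[:i] + (v[i] + delta,) + v[i + 1:]
                    if w in infected:
                        cnt += 1
            if cnt >= k:
                infected.add(v)
                changed = True
    return infected

# Classic 2-neighbor example on the 2D grid: the diagonal percolates.
n = 5
full = bootstrap_percolation(n, d=2, k=2, initial=[(i, i) for i in range(n)])
print(len(full) == n * n)  # the whole grid becomes infected
```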
Wald, Abraham
2013-01-01
In 1943, while in charge of Columbia University's Statistical Research Group, Abraham Wald devised Sequential Design, an innovative statistical inference system. Because the decision to terminate an experiment is not predetermined, sequential analysis can arrive at a decision much sooner and with substantially fewer observations than equally reliable test procedures based on a predetermined number of observations. The system's immense value was immediately recognized, and its use was restricted to wartime research and procedures. In 1945, it was released to the public and has since revolutio
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference.
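The resampling scheme is straightforward: draw whole clusters with replacement and recompute the estimate on each replicate. A minimal sketch with a toy longitudinal dataset and a simple slope estimator (all values synthetic; the paper treats general GEE estimators, not this toy statistic):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy longitudinal data: 20 clusters (subjects), 5 observations each.
n_clusters, m = 20, 5
cluster_effect = rng.normal(0, 1, n_clusters)
x = rng.normal(size=(n_clusters, m))
y = 2.0 * x + cluster_effect[:, None] + rng.normal(0, 0.5, (n_clusters, m))

def slope(x, y):
    # Simple marginal slope estimate on the pooled observations.
    xf, yf = x.ravel(), y.ravel()
    return np.cov(xf, yf)[0, 1] / np.var(xf, ddof=1)

# Cluster bootstrap: resample whole clusters with replacement, keeping
# all observations of a sampled cluster together so that within-cluster
# dependence is preserved in every replicate.
B = 1000
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n_clusters, n_clusters)
    boot[b] = slope(x[idx], y[idx])

se = boot.std(ddof=1)
print(f"slope = {slope(x, y):.3f}, cluster-bootstrap SE = {se:.3f}")
```

Resampling individual observations here would ignore the shared `cluster_effect` and typically understate the standard error, which is exactly what the cluster bootstrap avoids.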
Bootstrapping Relational Affordances of Object Pairs using Transfer
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert;
2016-01-01
leverage past knowledge to accelerate current learning (which we call bootstrapping). We learn Random Forest based affordance predictors from visual inputs and demonstrate two approaches to knowledge transfer for bootstrapping. In the first approach (direct bootstrapping), the state-space for a new...... affordance predictor is augmented with the output of previously learnt affordances. In the second approach (category based bootstrapping), we form categories that capture underlying commonalities of a pair of existing affordances and augment the state-space with this category classifier’s output. In addition....... We also show that there is no significant difference in performance between direct and category based bootstrapping....
T.A. Knoch (Tobias)
2003-01-01
textabstractGenomes are one of the major foundations of life due to their role in information storage, process regulation and evolution. However, the sequential and three-dimensional structure of the human genome in the cell nucleus as well as its interplay with and embedding into the cell and organ
Bootstrap for the case-cohort design.
Huang, Yijian
2014-06-01
The case-cohort design facilitates economical investigation of risk factors in a large survival study, with covariate data collected only from the cases and a simple random subset of the full cohort. Methods that accommodate the design have been developed for various semiparametric models, but most inference procedures are based on asymptotic distribution theory. Such inference can be cumbersome to derive and implement, and does not permit confidence band construction. While bootstrap is an obvious alternative, how to resample is unclear because of complications from the two-stage sampling design. We establish an equivalent sampling scheme, and propose a novel and versatile nonparametric bootstrap for robust inference with an appealingly simple single-stage resampling. Theoretical justification and numerical assessment are provided for a number of procedures under the proportional hazards model.
Bootstrapping ${\mathcal N}=2$ chiral correlators
Lemos, Madalena
2016-01-01
We apply the numerical bootstrap program to chiral operators in four-dimensional ${\mathcal N}=2$ SCFTs. In the first part of this work we study four-point functions in which all fields have the same conformal dimension. We give special emphasis to bootstrapping a specific theory: the simplest Argyres-Douglas fixed point with no flavor symmetry. In the second part we generalize our setup and consider correlators of fields with unequal dimension. This is an example of a mixed correlator and allows us to probe new regions in the parameter space of ${\mathcal N}=2$ SCFTs. In particular, our results put constraints on relations in the Coulomb branch chiral ring and on the curvature of the Zamolodchikov metric.
The $(2,0)$ superconformal bootstrap
Beem, Christopher; Rastelli, Leonardo; van Rees, Balt C
2016-01-01
We develop the conformal bootstrap program for six-dimensional conformal field theories with $(2,0)$ supersymmetry, focusing on the universal four-point function of stress tensor multiplets. We review the solution of the superconformal Ward identities and describe the superconformal block decomposition of this correlator. We apply numerical bootstrap techniques to derive bounds on OPE coefficients and scaling dimensions from the constraints of crossing symmetry and unitarity. We also derive analytic results for the large spin spectrum using the lightcone expansion of the crossing equation. Our principal result is strong evidence that the $A_1$ theory realizes the minimal allowed central charge $(c=25)$ for any interacting $(2,0)$ theory. This implies that the full stress tensor four-point function of the $A_1$ theory is the unique unitary solution to the crossing symmetry equation at $c=25$. For this theory, we estimate the scaling dimensions of the lightest unprotected operators appearing in the stress tenso...
A comparison of four different block bootstrap methods
Directory of Open Access Journals (Sweden)
Boris Radovanov
2014-12-01
The paper contains a description of four different block bootstrap methods, i.e., the non-overlapping block bootstrap, the overlapping block bootstrap (moving block bootstrap), the stationary block bootstrap and subsampling. Furthermore, the basic goal of this paper is to quantify the relative efficiency of each mentioned block bootstrap procedure and then to compare those methods. To achieve this goal, we measure the mean square errors of the estimated variance of returns. The returns are calculated from 1250 daily observations of the Serbian stock market index BELEX15 from April 2009 to April 2014. Considering the effects of potential changes in decisions due to variations in the sample length and the purposes of use, this paper also introduces a stability analysis comprising robustness testing over different sample sizes and block lengths. The testing results indicate some changes in bootstrap method efficiencies when altering the sample size or the block length.
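Of the four methods, the moving (overlapping) block bootstrap is perhaps the easiest to sketch: overlapping blocks of fixed length are drawn uniformly and concatenated up to the original series length, preserving short-range dependence within blocks. The sketch below uses a synthetic AR(1)-like series rather than the BELEX15 data, and the block length of 20 is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)

def moving_block_bootstrap(series, block_len, rng):
    """One moving-block bootstrap replicate: draw overlapping blocks of
    length block_len uniformly and concatenate until the original length
    is reached (truncating the last block if necessary)."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

# Illustrative AR(1)-like return series (synthetic, not BELEX15 data).
n = 1250
eps = rng.normal(0, 0.01, n)
returns = np.empty(n)
returns[0] = eps[0]
for t in range(1, n):
    returns[t] = 0.2 * returns[t - 1] + eps[t]

# Bootstrap distribution of the variance of returns.
B = 500
var_boot = np.array([moving_block_bootstrap(returns, 20, rng).var(ddof=1)
                     for _ in range(B)])
print(f"sample var = {returns.var(ddof=1):.6g}, "
      f"bootstrap SE of var = {var_boot.std(ddof=1):.3g}")
```

The non-overlapping variant differs only in drawing starts from the grid 0, block_len, 2*block_len, ..., and the stationary bootstrap replaces the fixed block length with a geometrically distributed one.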
Bootstrapping Deep Lexical Resources: Resources for Courses
Baldwin, Timothy
2007-01-01
We propose a range of deep lexical acquisition methods which make use of morphological, syntactic and ontological language resources to model word similarity and bootstrap from a seed lexicon. The different methods are deployed in learning lexical items for a precision grammar, and shown to each have strengths and weaknesses over different word classes. A particular focus of this paper is the relative accessibility of different language resource types, and the predicted "bang for the buck" associated with each in deep lexical acquisition applications.
Bootstrap Estimation for Nonparametric Efficiency Estimates
1995-01-01
This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...
TASI Lectures on the Conformal Bootstrap
Simmons-Duffin, David
2016-01-01
These notes are from courses given at TASI and the Advanced Strings School in summer 2015. Starting from principles of quantum field theory and the assumption of a traceless stress tensor, we develop the basics of conformal field theory, including conformal Ward identities, radial quantization, reflection positivity, the operator product expansion, and conformal blocks. We end with an introduction to numerical bootstrap methods, focusing on the 2d and 3d Ising models.
Directory of Open Access Journals (Sweden)
Miriam Martinez-Biarge
The evolution of non-hemorrhagic white matter injury (WMI) based on sequential magnetic resonance imaging (MRI) has not been well studied. Our aim was to describe sequential MRI findings in preterm infants with non-hemorrhagic WMI and to develop an MRI classification system for preterm WMI based on these findings. Eighty-two preterm infants (gestation ≤35 weeks) were retrospectively included. WMI was diagnosed and classified based on sequential cranial ultrasound (cUS) and confirmed on MRI. 138 MRIs were obtained at three time-points: early (<2 weeks; n = 32), mid (2-6 weeks; n = 30) and term equivalent age (TEA; n = 76). 63 infants (77%) had 2 MRIs during the neonatal period. WMI was non-cystic in 35 and cystic in 47 infants. In infants with cystic WMI, early MRI showed extensive restricted diffusion abnormalities; cysts were already present in 3 infants. Mid MRI showed focal or extensive cysts, without acute diffusion changes. A significant reduction in the size and/or extent of the cysts was observed in 32% of the infants between early/mid and TEA MRI. In 4 of 9 infants, previously seen focal cysts were no longer identified at TEA. All infants with cystic WMI showed ≥2 additional findings at TEA: significant reduction in WM volume, mild-moderate irregular ventriculomegaly, several areas of increased signal intensity on T1-weighted images, abnormal myelination of the PLIC, and small thalami. In infants with extensive WM cysts at 2-6 weeks, cysts may be reduced in number or may even no longer be seen at TEA. A single MRI at TEA, without taking sequential cUS data and pre-TEA MRI findings into account, may underestimate the extent of WMI; based on these results we propose a new MRI classification for preterm non-hemorrhagic WMI.
Energy Technology Data Exchange (ETDEWEB)
Zdenek Klika; Lenka Ambruzova; Ivana Sykorova; Jana Seidlerova; Ivan Kolomaznik [VSB-Technical University Ostrava, Ostrava (Czech Republic)
2009-10-15
The affinities of Ga and Ge in lignite were determined using sequential extraction (SE) and element affinity calculation (EAC) based on sink-float data. For this study a bulk lignite sample was fractionated into two sets. The first set of samples (A) consisted of different grain size fractions; the second set (B) was prepared by density fractionation. Sequential extractions (1) were performed on both sets of fractions, with very good agreement between the determined organic element affinities (OEA of Ga evaluated from the A data is 32%, from the B data 35%; OEA of Ge evaluated from the A data is 31% and from the B data 26%). The data for the B lignite fractions were evaluated using two element affinity calculations: (a) the EAC (I) of Klika and Kolomaznik (2) and (b) a newly prepared subroutine, EAC (II), based on the quantitative contents of lignite macerals and minerals. Good agreement was also obtained between the two methods (OEA of Ga calculated by EAC (I) is 83% and by EAC (II) 77%; OEA of Ge calculated by EAC (I) is 89% and by EAC (II) 97%). The significant differences in the organic element affinities of Ga and Ge evaluated by sequential extraction and by element affinity calculation based on sink-float data are discussed. 34 refs., 7 figs., 6 tabs.
Bootstrapping Density-Weighted Average Derivatives
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Crump, Richard K.; Jansson, Michael
Employing the "small bandwidth" asymptotic framework of Cattaneo, Crump, and Jansson (2009), this paper studies the properties of a variety of bootstrap-based inference procedures associated with the kernel-based density-weighted averaged derivative estimator proposed by Powell, Stock, and Stoker......" variance estimator derived from the "small bandwidth" asymptotic framework. The results of a small-scale Monte Carlo experiment are found to be consistent with the theory and indicate in particular that sensitivity with respect to the bandwidth choice can be ameliorated by using the "robust...
Accidental Symmetries and the Conformal Bootstrap
Chester, Shai M; Iliesiu, Luca V; Klebanov, Igor R; Pufu, Silviu S; Yacoby, Ran
2015-01-01
We study an ${\\cal N} = 2$ supersymmetric generalization of the three-dimensional critical $O(N)$ vector model that is described by $N+1$ chiral superfields with superpotential $W = g_1 X \\sum_i Z_i^2 + g_2 X^3$. By combining the tools of the conformal bootstrap with results obtained through supersymmetric localization, we argue that this model exhibits a symmetry enhancement at the infrared superconformal fixed point due to $g_2$ flowing to zero. This example is special in that the existence of an infrared fixed point with $g_1,g_2\
DEFF Research Database (Denmark)
Hansen, Elo Harald
in the increasing number of papers appearing in the scientific literature. These novel generations of flow injection analysis have demonstrated themselves as attractive substitutes for labour-intensive, manual sample pre-treatment and solution handling methods prior to analyte detection by atomic absorption...... to handle solid samples of environmental interest as demonstrated by the accommodation of both single and sequential extraction schemes for metal fractionation of solid samples of environmental concern (e.g. soils and sediments) packed within dedicated microcartridges. A brief account of the construction...
DEFF Research Database (Denmark)
Hansen, Elo Harald; Miró, Manuel; Petersen, Roongrat
in the substantial number of papers that have emerged in the scientific literature. These novel generations of flow injection analysis have demonstrated themselves as attractive substitutes for labour-intensive, manual sample pre-treatment and solution handling methods prior to analyte detection by atomic absorption...... of environmental interest as demonstrated by the accommodation of both single and sequential extraction schemes for metal fractionation of solid samples of environmental concern (e.g. soils and sediments) packed within dedicated microcartridges. An account of the construction and the experimental modes of operandi...
Giving the Boot to the Bootstrap: How Not to Learn the Natural Numbers
Rips, Lance J.; Asmuth, Jennifer; Bloomfield, Amber
2006-01-01
According to one theory about how children learn the concept of natural numbers, they first determine that "one", "two", and "three" denote the size of sets containing the relevant number of items. They then make the following inductive inference (the Bootstrap): The next number word in the counting series denotes the size of the sets you get by…
Energy Technology Data Exchange (ETDEWEB)
Mesquita, Raquel B.R. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); Ferreira, M. Teresa S.O.B. [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal); Toth, Ildiko V. [REQUIMTE, Departamento de Quimica, Faculdade de Farmacia, Universidade de Porto, Rua Anibal Cunha, 164, 4050-047 Porto (Portugal); Bordalo, Adriano A. [Laboratory of Hydrobiology, Institute of Biomedical Sciences Abel Salazar (ICBAS) and Institute of Marine Research (CIIMAR), Universidade do Porto, Lg. Abel Salazar 2, 4099-003 Porto (Portugal); McKelvie, Ian D. [School of Chemistry, University of Melbourne, Victoria 3010 (Australia); Rangel, Antonio O.S.S., E-mail: aorangel@esb.ucp.pt [CBQF/Escola Superior de Biotecnologia, Universidade Catolica Portuguesa, R. Dr. Antonio Bernardino de Almeida, 4200-072 Porto (Portugal)
2011-09-02
Highlights: → Sequential injection determination of phosphate in estuarine and freshwaters. → Alternative spectrophotometric flow cells are compared. → Minimization of schlieren effect was assessed. → Proposed method can cope with wide salinity ranges. → Multi-reflective cell shows clear advantages. - Abstract: A sequential injection system with a dual analytical line was developed and applied in the comparison of two different detection systems, viz. a conventional spectrophotometer with a commercial flow cell, and a multi-reflective flow cell coupled with a photometric detector, under the same experimental conditions. The study was based on the spectrophotometric determination of phosphate using the molybdenum-blue chemistry. The two alternative flow cells were compared in terms of their response to variation of sample salinity, susceptibility to interferences and to refractive index changes. The developed method was applied to the determination of phosphate in natural waters (estuarine, river, well and ground waters). The achieved detection limit (0.007 μM PO₄³⁻) is consistent with the requirement of the target water samples, and a wide quantification range (0.024-9.5 μM) was achieved using both detection systems.
Learning web development with Bootstrap and AngularJS
Radford, Stephen
2015-01-01
Whether you know a little about Bootstrap or AngularJS, or you're a complete beginner, this book will enhance your capabilities in both frameworks and you'll build a fully functional web app. A working knowledge of HTML, CSS, and JavaScript is required to fully get to grips with Bootstrap and AngularJS.
Bootstrapping pre-averaged realized volatility under market microstructure noise
DEFF Research Database (Denmark)
Hounyo, Ulrich; Goncalves, Sílvia; Meddahi, Nour
-averaged returns implies that these are kn-dependent with kn growing slowly with the sample size n. This motivates the application of a blockwise bootstrap method. We show that the "blocks of blocks" bootstrap method suggested by Politis and Romano (1992) (and further studied by Bühlmann and Künsch (1995...
Luan, Fubo; Burgos, William D
2012-11-06
Iron-bearing phyllosilicates strongly influence the redox state and mobility of uranium because of their limited hydraulic conductivity, high specific surface area, and redox reactivity. Standard extraction procedures cannot be accurately applied for the determination of clay-Fe(II/III) and U(IV/VI) in clay mineral-U suspensions, so advanced spectroscopic techniques are ordinarily required. Instead, we developed and validated a sequential extraction method for determination of clay-Fe(II/III) and U(IV/VI) in clay-U suspensions. In our so-called "H(3)PO(4)-HF-H(2)SO(4) sequential extraction" method, H(3)PO(4)-H(2)SO(4) is used first to solubilize and remove U, and the remaining clay pellet is subjected to HF-H(2)SO(4) digestion. Physical separation of U and clay eliminates valence cycling between U(IV/VI) and clay-Fe(II/III) that otherwise occurred in the extraction solutions and caused analytical discrepancies. We further developed an "automated anoxic KPA" method to measure soluble U(VI) and total U (with U(IV) calculated by difference) and modified the conventional HF-H(2)SO(4) digestion method to eliminate a series of time-consuming weighing steps. We measured the kinetics of uraninite oxidation by nontronite using this sequential extraction method and the anoxic KPA method, and measured a stoichiometric ratio of 2.19 ± 0.05 mol clay-Fe(II) produced per mol U(VI) produced (theoretical value of 2.0). We were able to recover 98.0-98.5% of the clay Fe and 98.1-98.5% of the U through the sequential extractions. Compared to the theoretical stoichiometric ratio of 2.0, the parallel extractions of 0.5 M HCl for clay-Fe(II) and 1 M NaHCO(3) for U(VI) leached two times more Fe(II) than U(VI). The parallel extractions of HF-H(2)SO(4) for clay Fe(II) and 1 M NaHCO(3) for U(VI) leached six times more Fe(II) than U(VI).
Conformal bootstrap, universality and gravitational scattering
Directory of Open Access Journals (Sweden)
Steven Jackson
2015-12-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large c and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Conformal Bootstrap, Universality and Gravitational Scattering
Jackson, Steven; Verlinde, Herman
2014-01-01
We use the conformal bootstrap equations to study the non-perturbative gravitational scattering between infalling and outgoing particles in the vicinity of a black hole horizon in AdS. We focus on irrational 2D CFTs with large $c$, a sparse light spectrum and only Virasoro symmetry. The scattering process is described by the matrix element of two light operators (particles) between two heavy states (BTZ black holes). We find that the operator algebra in this regime is (i) universal and identical to that of Liouville CFT, and (ii) takes the form of an exchange algebra, specified by an R-matrix that exactly matches with the scattering amplitude of 2+1 gravity. The R-matrix is given by a quantum 6j-symbol and the scattering phase by the volume of a hyperbolic tetrahedron. We comment on the relevance of our results to scrambling and the holographic reconstruction of the bulk physics near black hole horizons.
Uncertainty estimation in diffusion MRI using the nonlocal bootstrap.
Yap, Pew-Thian; An, Hongyu; Chen, Yasheng; Shen, Dinggang
2014-08-01
In this paper, we propose a new bootstrap scheme, called the nonlocal bootstrap (NLB) for uncertainty estimation. In contrast to the residual bootstrap, which relies on a data model, or the repetition bootstrap, which requires repeated signal measurements, NLB is not restricted by the data structure imposed by a data model and obviates the need for time-consuming multiple acquisitions. NLB hinges on the observation that local imaging information recurs in an image. This self-similarity implies that imaging information coming from spatially distant (nonlocal) regions can be exploited for more effective estimation of statistics of interest. Evaluations using in silico data indicate that NLB produces distribution estimates that are in closer agreement with those generated using Monte Carlo simulations, compared with the conventional residual bootstrap. Evaluations using in vivo data demonstrate that NLB produces results that are in agreement with our knowledge on white matter architecture.
Effects of magnetic islands on bootstrap current in toroidal plasmas
Dong, G.; Lin, Z.
2017-03-01
The effects of magnetic islands on the electron bootstrap current in toroidal plasmas are studied using gyrokinetic simulations. The magnetic islands cause little change in the bootstrap current level in the banana regime because of trapped electron effects. In the plateau regime, the bootstrap current is completely suppressed at the island centers due to the destruction of trapped electron orbits by collisions and the flattening of pressure profiles by the islands. In the collisional regime, a small but finite bootstrap current can exist inside the islands because of the pressure gradients created by large collisional transport across the islands. Finally, simulation results show that the bootstrap current level increases near the island separatrix due to steeper local density gradients.
Directory of Open Access Journals (Sweden)
Izabela CHMIEL
2012-03-01
Aim: To determine and analyse an alternative methodology for the analysis of a set of Likert responses measured on a common attitudinal scale when the primary focus of interest is on the relative importance of items in the set, with primary application to health-related quality of life (HRQOL) measures. HRQOL questionnaires usually generate data that manifest evident departures from the fundamental assumptions of the Analysis of Variance (ANOVA) approach, not only because of their discrete, bounded and skewed distributions, but also due to significant correlation between mean scores and their variances. Material and Methods: A questionnaire survey with the SF-36 was conducted among 142 convalescents after acute pancreatitis. The estimated HRQOL scores were compared using multiple comparison procedures under Bonferroni-like adjustment, and with bootstrap procedures. Results: In the data set studied, with the SF-36 outcome, the use of the multiple comparison and bootstrap procedures for analysing HRQOL data provides results quite similar to the conventional ANOVA and Rasch methods suggested within the frameworks of Classical Test Theory and Item Response Theory. Conclusions: These results suggest that multiple comparisons and the bootstrap are both valid methods for analysing HRQOL outcome data, particularly in cases of doubt about the appropriateness of the standard methods. Moreover, from a practical point of view, the multiple comparison and bootstrap procedures seem much easier to interpret for non-statisticians aiming to practise evidence-based health care.
Naghshineh, Mahsa; Larsen, Jan; Georgiou, Constantinos; Olsen, Karsten
2016-08-01
A novel method for automated determination of the pectin degree of esterification (DE) using a micro sequential injection lab-on-valve (μSI-LOV) system is developed. A face-centered central composite response surface methodology (RSM) was used to optimise system parameters. A calibration graph for determination of the non-esterified galacturonic acid (GalA) content in pectin solutions, with a linear range of 0.08-0.34% (w/v) and a limit of detection (LOD) of 0.057% (w/v), was achieved under optimal conditions. The difference between the concentrations (w/v, %) of total GalA and non-esterified GalA was used to estimate the DE (%) of pectin samples. Results indicated good agreement with the manual reference method. The current method provided a precision of less than 6% RSD (n=10) with a sample throughput of 15 samples h⁻¹. The presented method requires remarkably low consumption of sample and reagents and is thereby environmentally friendly.
Bootstrap consistency for general semiparametric M-estimation
Cheng, Guang
2010-10-01
Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general purpose approach to statistical inferences, the bootstrap has found wide applications in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to the inference approach based on the asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of the bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
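The exchangeable-weights bootstrap mentioned in the abstract above can be illustrated with a minimal sketch. This is not the paper's method: it applies Dirichlet(1, ..., 1) weights (the Bayesian bootstrap, one member of the exchangeable-weights class) to the sample mean, the simplest M-estimate, and all data values below are invented for illustration.

```python
import random
import math

def bayesian_bootstrap_means(data, n_boot, seed=0):
    """Weighted bootstrap with exchangeable Dirichlet(1,...,1) weights.

    Each replicate reweights the sample instead of resampling it; the
    weighted mean plays the role of the M-estimate in this toy example.
    """
    rng = random.Random(seed)
    n = len(data)
    means = []
    for _ in range(n_boot):
        # Exponential(1) draws normalised to sum to 1 are Dirichlet(1,...,1).
        raw = [rng.expovariate(1.0) for _ in range(n)]
        total = sum(raw)
        means.append(sum((w / total) * x for w, x in zip(raw, data)))
    return means

data = [1.2, 0.7, 2.3, 1.9, 0.4, 1.1, 2.8, 1.5]  # hypothetical observations
reps = bayesian_bootstrap_means(data, n_boot=2000)
mean_rep = sum(reps) / len(reps)
boot_se = math.sqrt(sum((m - mean_rep) ** 2 for m in reps) / (len(reps) - 1))
print(boot_se)
```

The spread of the 2000 reweighted means serves as a bootstrap standard error for the mean, comparable to the classical s/√n.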
Directory of Open Access Journals (Sweden)
Ristya Widi Endah Yani
2008-12-01
Background: The bootstrap is a computer simulation-based method that provides estimation accuracy in estimating inferential statistical parameters. Purpose: This article describes a study using secondary data (n = 30) aimed to elucidate the bootstrap method as an estimator for the linear regression test, based on the computer programs MINITAB 13, SPSS 13, and MacroMINITAB. Methods: The bootstrap regression method determines β̂ and Ŷ from OLS (ordinary least squares), computes the residuals εᵢ = Yᵢ − Ŷᵢ, chooses the number of bootstrap repetitions B, draws n bootstrap residuals ε*ᵢ with replacement from the εᵢ, forms Y*ᵢ = Ŷᵢ + ε*ᵢ, and re-estimates β̂ from each bootstrap sample. If fewer than B repetitions have been completed, the procedure returns to drawing n residuals with replacement from the εᵢ; otherwise, the bootstrap β̂ is taken as the average of the β̂ values over the B samples. Result: The result is similar to the linear regression equation obtained with the OLS method (α = 5%). The resulting regression equation for caries was Ŷ = 1.90 + 2.02 (OHI-S), indicating that every one-unit increase in OHI-S results in a caries increase of 2.02 units. Conclusion: The study was conducted with B = 10,500 and 10 iterations.
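The residual-bootstrap steps described above can be sketched as follows. This is a generic illustration in Python rather than the MINITAB/SPSS macros used in the study; the data are invented, and a simple-regression OLS fit stands in for the study's caries/OHI-S model.

```python
import random

def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

def residual_bootstrap(x, y, n_boot=1000, seed=1):
    """Residual bootstrap: refit OLS on y* = yhat + resampled residuals."""
    rng = random.Random(seed)
    a, b = ols_fit(x, y)
    yhat = [a + b * xi for xi in x]
    resid = [yi - yh for yi, yh in zip(y, yhat)]
    coefs = []
    for _ in range(n_boot):
        e_star = [rng.choice(resid) for _ in resid]       # resample residuals
        y_star = [yh + e for yh, e in zip(yhat, e_star)]  # rebuild response
        coefs.append(ols_fit(x, y_star))                  # re-estimate betas
    a_bar = sum(c[0] for c in coefs) / n_boot
    b_bar = sum(c[1] for c in coefs) / n_boot
    return (a, b), (a_bar, b_bar)  # OLS estimate and bootstrap average

x = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]  # hypothetical predictor
y = [2.1, 2.6, 3.2, 3.4, 4.1, 4.3, 5.0, 5.2]  # hypothetical response
ols_ab, boot_ab = residual_bootstrap(x, y)
print(ols_ab, boot_ab)
```

As in the abstract, the bootstrap-averaged coefficients should be very close to the plain OLS coefficients.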
Bootstrap Power of Time Series Goodness of fit tests
Directory of Open Access Journals (Sweden)
Sohail Chand
2013-10-01
In this article, we looked at the power of various versions of the Box-Pierce statistic and the Cramer-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison has also been made between the semiparametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation is reversed for the Cramer-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed-design bootstrap method.
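As a rough illustration of the kind of power comparison described above, the sketch below simulates the size and power of the Box-Pierce test against a linear AR(1) alternative. It uses the asymptotic chi-square critical value (hardcoded for 5 lags) rather than the bootstrap critical values studied in the article, and all settings (sample size, number of simulations, AR coefficient) are illustrative.

```python
import random

CHI2_95_DF5 = 11.07  # asymptotic 95% point of chi-square with 5 degrees of freedom

def box_pierce(series, m=5):
    """Box-Pierce portmanteau statistic Q = n * sum of squared autocorrelations."""
    n = len(series)
    mean = sum(series) / n
    c0 = sum((v - mean) ** 2 for v in series) / n
    q = 0.0
    for k in range(1, m + 1):
        ck = sum((series[i] - mean) * (series[i - k] - mean)
                 for i in range(k, n)) / n
        q += (ck / c0) ** 2
    return n * q

def rejection_rate(phi, n=200, n_sim=200, seed=2):
    """Fraction of simulated AR(1) series for which Q exceeds the 5% critical value."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sim):
        x, prev = [], 0.0
        for _ in range(n):
            prev = phi * prev + rng.gauss(0.0, 1.0)
            x.append(prev)
        if box_pierce(x) > CHI2_95_DF5:
            rejections += 1
    return rejections / n_sim

size = rejection_rate(phi=0.0)   # white-noise null: should be near the 0.05 level
power = rejection_rate(phi=0.5)  # linear AR(1) alternative: should be near 1
print(size, power)
```

A bootstrap version would replace the fixed chi-square cutoff with a critical value obtained by resampling under the fitted null model.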
DEFF Research Database (Denmark)
Qiao, Jixin; Hou, Xiaolin; Steier, Peter;
2013-01-01
An automated analytical method implemented in a novel dual-column tandem sequential injection (SI) system was developed for simultaneous determination of 236U, 237Np, 239Pu, and 240Pu in seawater samples. A combination of TEVA and UTEVA extraction chromatography was exploited to separate and purify the target analytes, whereupon plutonium and neptunium were simultaneously isolated and purified on TEVA, while uranium was collected on UTEVA. The separation behavior of U, Np, and Pu on TEVA-UTEVA columns was investigated in detail in order to achieve high chemical yields and complete purification for the radionuclides of interest. 242Pu was used as a chemical yield tracer for both plutonium and neptunium. 238U was quantified in the sample before the separation for deducing the 236U concentration from the measured 236U/238U atomic ratio in the separated uranium target using accelerator mass spectrometry...
Meeravali, Noorbasha N; Madhavi, K; Manjusha, R; Kumar, Sunil Jai
2014-01-01
A sequential extraction procedure is developed for the separation of trace levels of hexachloroplatinate, cisplatin and carboplatin from soil, which are then pre-concentrated using a vesicular coacervative cloud point extraction method prior to their determination as platinum by continuum source ETAAS. Sequential extraction of carboplatin, cisplatin and hexachloroplatinate from a specific red soil is achieved by using 20% HCl, aqua regia at room temperature, and a combination of aqua regia and HF with microwave digestion, respectively. The pre-concentration of these species from the extracted solutions is based on the formation of extractable hydrophobic complexes of PtCl₆²⁻ anionic species with the free cationic head groups (solubilizing sites) of Triton X-114 co-surfactant-stabilized TOMAC (tri-octyl methyl ammonium chloride) vesicles through electrostatic attraction. This process separates the platinum from the bulk aqueous solution into a small vesicular rich phase. The parameters affecting the extraction procedures are optimized. Under the optimized conditions, the achieved pre-concentration factor is 20 and the detection limit is 0.5 ng g⁻¹ for soil and 0.02 ng mL⁻¹ for water samples. The spiked recoveries of hexachloroplatinate, cisplatin and carboplatin in water and soil extracts in the vesicular coacervative extraction are in the range of 96-102% at 0.5-1 ng mL⁻¹ with a relative standard deviation of 1-3%. The accuracy of the method for platinum determination is evaluated by analyzing CCRMP PTC-1a copper-nickel sulfide concentrate and BCR 723 road dust certified reference materials, and the obtained results agree with the certified values at the 95% confidence level of Student's t-test. The results were also compared to the mixed-micelle (MM)-CPE method reported in the literature.
Bootstrapping Object Coreferencing on the Semantic Web
Institute of Scientific and Technical Information of China (English)
Wei Hu; Yu-Zhong Qu; Xing-Zhi Sun
2011-01-01
An object on the Semantic Web is likely to be denoted with several URIs by different parties. Object coreferencing is a process to identify "equivalent" URIs of objects for achieving a better Data Web. In this paper, we propose a bootstrapping approach for object coreferencing on the Semantic Web. For an object URI, we firstly establish a kernel that consists of semantically equivalent URIs from the same-as, (inverse) functional properties and (max-)cardinalities, and then extend the kernel with respect to the textual descriptions (e.g., labels and local names) of URIs. We also propose a trustworthiness-based method to rank the coreferent URIs in the kernel, as well as a similarity-based method for ranking the URIs in the extension of the kernel. We implement the proposed approach, called ObjectCoref, on a large-scale dataset that contains 76 million URIs collected by the Falcons search engine until 2008. The evaluation on precision, relative recall and response time demonstrates the feasibility of our approach. Additionally, we apply the proposed approach to investigate the popularity of the URI alias phenomenon on the current Semantic Web.
Dhamodharan, K; Pius, Anitha
2016-01-01
A simple potentiometric method for determining the free acidity without complexation in the presence of hydrolysable metal ions, and sequentially determining the plutonium concentration by a direct spectrophotometric method using a single aliquot, was developed. Interference from the major fission products, which are susceptible to hydrolysis at lower acidities, was investigated in the free acidity measurement. This method is applicable for determining the free acidity over a wide range of nitric acid concentrations, as well as the plutonium concentration in the irradiated fuel solution prior to solvent extraction. Since no complexing agent is introduced during the measurement of the free acidity, the purification step is eliminated during the plutonium estimation, and the resultant analytical waste is free from corrosive chemicals and any complexing agent. Hence, uranium and plutonium can be easily recovered from the analytical waste by the conventional solvent extraction method. The error involved in determining the free acidity and plutonium is within ±1%; thus this method is superior to the complexation method for routine analysis of plant samples and is also amenable to remote analysis.
Chu, Ning; Fan, Shihua
2009-12-01
A new analytical method was developed for the simultaneous kinetic spectrophotometric determination of a quaternary carbamate pesticide mixture consisting of carbofuran, propoxur, metolcarb and fenobucarb using sequential injection analysis (SIA). The procedure was based upon the different kinetic properties of the analytes reacting with the reagent in the flow system in the non-stopped-flow mode, in which their hydrolysis products couple with diazotized p-nitroaniline in an alkaline medium to form the corresponding colored complexes. The absorbance data from the SIA peak time profile were recorded at 510 nm and resolved by the use of back-propagation artificial neural network (BP-ANN) algorithms for multivariate quantitative analysis. The experimental variables and main network parameters were optimized, and each of the pesticides could be determined in the concentration range of 0.5-10.0 μg mL⁻¹, at a sampling frequency of 18 h⁻¹. The proposed method was compared to other spectrophotometric methods for simultaneous determination of mixtures of carbamate pesticides, and it proved to be adequately reliable; it was successfully applied to the simultaneous determination of the four pesticide residues in water and fruit samples, obtaining satisfactory results based on recovery studies (84.7-116.0%).
Variance estimation in neutron coincidence counting using the bootstrap method
Energy Technology Data Exchange (ETDEWEB)
Dubi, C., E-mail: chendb331@gmail.com [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Ocherashvilli, A.; Ettegui, H. [Physics Department, Nuclear Research Center of the Negev, P.O.B. 9001 Beer Sheva (Israel); Pedersen, B. [Nuclear Security Unit, Institute for Transuranium Elements, Via E. Fermi, 2749 JRC, Ispra (Italy)
2015-09-11
In this study, we demonstrate the implementation of the "bootstrap" method for a reliable estimation of the statistical error in Neutron Multiplicity Counting (NMC) on plutonium samples. The "bootstrap" method estimates the variance of a measurement through a re-sampling process, in which a large number of pseudo-samples are generated, from which the so-called bootstrap distribution is computed. The present study gives a full description of the bootstrapping procedure and validates, through experimental results, the reliability of the estimated variance. Results indicate both a very good agreement between the measured variance and the variance obtained through the bootstrap method, and a robustness of the method with respect to the duration of the measurement and the bootstrap parameters.
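The re-sampling process described above can be condensed into a short sketch. The snippet estimates the variance of a statistic by generating pseudo-samples with replacement; the "counts" are invented toy values, not NMC measurements, and the plain mean stands in for the multiplicity statistics used in the paper.

```python
import random

def bootstrap_variance(sample, statistic, n_boot=2000, seed=3):
    """Estimate Var[statistic(sample)] by resampling with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    reps = []
    for _ in range(n_boot):
        pseudo = [rng.choice(sample) for _ in range(n)]  # one pseudo-sample
        reps.append(statistic(pseudo))
    mean = sum(reps) / n_boot
    # Spread of the bootstrap distribution of the statistic.
    return sum((r - mean) ** 2 for r in reps) / (n_boot - 1)

# Toy "counting" data: counts recorded in equal time gates (hypothetical values).
counts = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14, 11, 13]
rate = lambda s: sum(s) / len(s)
var_boot = bootstrap_variance(counts, rate)
print(var_boot)
```

For the sample mean this should land near the classical s²/n, which is the kind of agreement the study reports for its measured variances.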
Simulation-optimization via Kriging and bootstrapping : A survey
Kleijnen, Jack P.C.
2014-01-01
This article surveys optimization of simulated systems. The simulation may be either deterministic or random. The survey reflects the author's extensive experience with simulation-optimization through Kriging (or Gaussian process) metamodels, analysed through parametric bootstrapping for deterministic simulation...
Bootstrapping the statistical uncertainties of NN scattering data
Perez, R Navarro; Arriola, E Ruiz
2014-01-01
We use the Monte Carlo bootstrap as a method to simulate pp and np scattering data below pion production threshold from an initial set of over 6700 experimental mutually $3\\sigma$ consistent data. We compare the results of the bootstrap, with 1020 statistically generated samples of the full database, with the standard covariance matrix method of error propagation. No significant differences in scattering observables and phase shifts are found. This suggests alternative strategies for propagating errors of nuclear forces in nuclear structure calculations.
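The comparison described above, between bootstrap resampling and covariance-matrix error propagation, can be illustrated on a toy linear fit. The data below are synthetic (not the NN scattering database), and a pairs bootstrap of (x, y) points stands in for the paper's resampling of the full database; the point is that the two standard errors should roughly agree.

```python
import random
import math

def fit_slope(x, y):
    """Least-squares slope (intercept included in the fit)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

rng = random.Random(4)
x = [i * 0.2 for i in range(40)]                       # synthetic abscissae
y = [0.7 * xi + rng.gauss(0.0, 0.3) for xi in x]       # true slope 0.7, noise 0.3

# Analytic (covariance-matrix) standard error of the slope.
b = fit_slope(x, y)
mx, my = sum(x) / len(x), sum(y) / len(y)
resid = [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]
s2 = sum(e * e for e in resid) / (len(x) - 2)
sxx = sum((xi - mx) ** 2 for xi in x)
se_analytic = math.sqrt(s2 / sxx)

# Bootstrap standard error: resample (x, y) pairs with replacement.
reps = []
for _ in range(1000):
    idx = [rng.randrange(len(x)) for _ in range(len(x))]
    reps.append(fit_slope([x[i] for i in idx], [y[i] for i in idx]))
mean_b = sum(reps) / len(reps)
se_boot = math.sqrt(sum((r - mean_b) ** 2 for r in reps) / (len(reps) - 1))
print(se_analytic, se_boot)
```

The agreement between the two numbers mirrors the paper's finding that bootstrap and covariance-matrix error propagation give no significant differences.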
Control of bootstrap current in the pedestal region of tokamaks
Energy Technology Data Exchange (ETDEWEB)
Shaing, K. C. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China); Department of Engineering Physics, University of Wisconsin, Madison, Wisconsin 53796 (United States); Lai, A. L. [Institute for Space and Plasma Sciences, National Cheng Kung University, Tainan City 70101, Taiwan (China)
2013-12-15
The high confinement mode (H-mode) plasmas in the pedestal region of tokamaks are characterized by a steep gradient of the radial electric field, and a sonic poloidal flow U_{p,m} that consists of the poloidal components of the E×B flow and of the plasma flow velocity parallel to the magnetic field B. Here, E is the electric field. The bootstrap current, which is important for the equilibrium and stability of the pedestal of H-mode plasmas, is shown to have an expression different from that in the conventional theory. In the limit where ‖U_{p,m}‖ ≫ 1, the bootstrap current is driven by the electron temperature gradient and the inductive electric field, fundamentally different from the conventional theory. The bootstrap current in the pedestal region can be controlled by manipulating U_{p,m} and the gradient of the radial electric field. This, in turn, can control plasma stability such as edge-localized modes. Quantitative evaluations of various coefficients are shown to illustrate that the bootstrap current remains finite when ‖U_{p,m}‖ approaches infinity and to provide indications of how to control the bootstrap current. Approximate analytic expressions for the viscous coefficients that join the results in the banana and plateau-Pfirsch-Schlüter regimes are presented to facilitate bootstrap and neoclassical transport simulations in the pedestal region.
Energy Technology Data Exchange (ETDEWEB)
Andrade, Maria Celia Ramos; Ludwig, Gerson Otto [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Lab. Associado de Plasma]. E-mail: mcr@plasma.inpe.br
2004-07-01
Different bootstrap current formulations are implemented in a self-consistent equilibrium calculation obtained from a direct variational technique in fixed-boundary tokamak plasmas. The total plasma current profile is assumed to have contributions from the diamagnetic, Pfirsch-Schlueter, and the neoclassical Ohmic and bootstrap currents. The Ohmic component is calculated in terms of the neoclassical conductivity, compared here among different expressions, and the loop voltage is determined consistently in order to give the prescribed value of the total plasma current. A comparison among several bootstrap current models, for different viscosity coefficient calculations and distinct forms of the Coulomb collision operator, is performed for a variety of plasma parameters of the small-aspect-ratio tokamak ETE (Experimento Tokamak Esferico) at the Associated Plasma Laboratory of INPE, in Brazil. We have performed this comparison for the ETE tokamak so that the differences among all the models reported here, mainly regarding plasma collisionality, can be better illustrated. The dependence of the bootstrap current ratio upon some plasma parameters within the self-consistent calculation is also analysed. We emphasize in this paper what we call the Hirshman-Sigmar/Shaing model, valid for all collisionality regimes and aspect ratios, and a fitted formulation proposed by Sauter, which has the same range of validity but is faster to compute than the previous one. The advantages and possible limitations of all these different formulations for the bootstrap current estimate are analysed throughout this work. (author)
DEFF Research Database (Denmark)
Buanuam, Janya; Miró, Manuel; Hansen, Elo Harald;
2006-01-01
Sequential injection microcolumn extraction (SI-MCE) based on the implementation of a soil containing microcartridge as external reactor in a sequential injection network is, for the first time, proposed for dynamic fractionation of macronutrients in environmental solids, as exemplified by the pa...... atomic absorption spectrometry....
Technical and scale efficiency in public and private Irish nursing homes - a bootstrap DEA approach.
Ni Luasa, Shiovan; Dineen, Declan; Zieba, Marta
2016-10-27
This article provides methodological and empirical insights into the estimation of technical efficiency in the nursing home sector. Focusing on long-stay care and using primary data, we examine technical and scale efficiency in 39 public and 73 private Irish nursing homes by applying an input-oriented data envelopment analysis (DEA). We employ robust bootstrap methods to validate our nonparametric DEA scores and to integrate the effects of potential determinants in estimating the efficiencies. Both the homogeneous and two-stage double bootstrap procedures are used to obtain confidence intervals for the bias-corrected DEA scores. Importantly, the application of the double bootstrap approach affords true DEA technical efficiency scores after adjusting for the effects of ownership, size, case-mix, and other determinants such as location and quality. Based on our DEA results for variable-returns-to-scale technology, the average technical efficiency score is 62%, and the mean scale efficiency is 88%, with nearly all units operating on the increasing-returns-to-scale part of the production frontier. Moreover, based on the double bootstrap results, Irish nursing homes are less technically efficient and more scale efficient than the conventional DEA estimates suggest. Regarding the efficiency determinants, in terms of ownership we find that private facilities are less efficient than the public units. Furthermore, the size of the nursing home has a positive effect, which reinforces our finding that Irish homes produce at increasing returns to scale. Notably, we also find that a tendency towards quality improvements can lead to poorer technical efficiency performance.
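As a rough sketch of the machinery behind the record above, the Python fragment below computes input-oriented, variable-returns-to-scale DEA scores by linear programming and a naive resampling interval for one unit. The data and the naive resampling scheme are purely illustrative: the study itself uses the smoothed homogeneous and double-bootstrap procedures of Simar and Wilson, which add kernel-smoothing and second-stage regression steps not reproduced here.

```python
# Input-oriented VRS DEA via linear programming, plus a naive bootstrap
# interval for one unit's efficiency score (illustrative sketch only).
import numpy as np
from scipy.optimize import linprog

def dea_vrs(X, Y, o):
    """Efficiency of unit o given inputs X (n x m) and outputs Y (n x s)."""
    n = X.shape[0]
    c = np.zeros(n + 1); c[0] = 1.0            # variables [theta, lam_1..lam_n]
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):                # sum_j lam_j x_ij <= theta x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(Y.shape[1]):                # sum_j lam_j y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    A_eq = [np.r_[0.0, np.ones(n)]]            # convexity constraint (VRS)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

def naive_bootstrap_ci(X, Y, o, B=200, alpha=0.05, seed=1):
    """Resample the reference set, keeping the evaluated unit fixed."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    scores = []
    for _ in range(B):
        idx = rng.integers(0, n, n)
        Xb = np.vstack([X[o], X[idx]])         # evaluated unit is always row 0
        Yb = np.vstack([Y[o], Y[idx]])
        scores.append(dea_vrs(Xb, Yb, 0))
    return np.quantile(scores, [alpha / 2, 1 - alpha / 2])
```

The naive resample is known to be inconsistent for DEA frontiers, which is precisely why the smoothed bootstrap is used in the paper; the sketch only shows where the resampling loop sits.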
DEFF Research Database (Denmark)
Sørensen, P.; Jensen, E.S.
1991-01-01
A novel diffusion method was used for preparation of NH4+- and NO3--N samples from soil extracts for N-15 determination. Ammonium, and nitrate following reduction to ammonia, are allowed to diffuse to an acid-wetted glass filter enclosed in polytetrafluoroethylene tape. The method was evaluated with simulated soil extracts obtained using 50 ml of 2 M potassium chloride solution containing 130 μg of NH4+-N (2.3 atom% N-15) and 120 μg of NO3--N (natural N-15 abundance). No cross-over in the N-15 abundances of NH4+-N and NO3--N was observed, indicating a quantitative diffusion process (72 h, 25 °C). Owing to the presence of inorganic nitrogen impurities in the potassium chloride, the N-15 enrichments should be corrected for the blank nitrogen content.
Einecke, Sabrina; Bissantz, Nicolai; Clevermann, Fabian; Rhode, Wolfgang
2016-01-01
Astroparticle experiments such as IceCube or MAGIC require a deconvolution of their measured data with respect to the response function of the detector to provide the distributions of interest, e.g. energy spectra. In this paper, appropriate uncertainty limits that also allow one to draw conclusions on the geometric shape of the underlying distribution are determined using bootstrap methods, which are frequently applied in statistical applications. Bootstrap is a collective term for resampling methods that can be employed to approximate unknown probability distributions or features thereof. A clear advantage of bootstrap methods is their wide range of applicability. For instance, they yield reliable results even if the usual normality assumption is violated. The use, meaning and construction of uncertainty limits to any user-specific confidence level in the form of confidence intervals and levels are discussed. The precise algorithms for the implementation of these methods, applicable for any deconvolution algor...
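The core bootstrap idea the record refers to, resampling with replacement and reading uncertainty limits off empirical quantiles, can be sketched in a few lines of Python. The statistic and the skewed sample below are illustrative stand-ins, not the deconvolved spectra of the paper:

```python
# Percentile bootstrap uncertainty limits for an arbitrary statistic.
# Works without a normality assumption, which is the point made above.
import numpy as np

def bootstrap_interval(data, statistic, n_boot=2000, conf=0.68, seed=0):
    rng = np.random.default_rng(seed)
    n = len(data)
    # Recompute the statistic on n_boot resamples drawn with replacement.
    reps = np.array([statistic(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [(1 - conf) / 2, (1 + conf) / 2])
    return lo, hi

# Example on a deliberately non-normal (exponential) sample:
sample = np.random.default_rng(1).exponential(scale=2.0, size=500)
lo, hi = bootstrap_interval(sample, np.median)
```

In the paper's setting the same loop would run per bin of the unfolded spectrum, yielding band-shaped uncertainty limits around the deconvolved distribution.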
Gómez-Nieto, Beatriz; Gismera, Mª Jesús; Sevilla, Mª Teresa; Procopio, Jesús R
2017-03-15
A simple method based on FAAS was developed for the sequential multi-element determination of Cu, Zn, Mn, Mg and Si in beverages and food supplements with successful results. The main absorption lines for Cu, Zn and Si and secondary lines for Mn and Mg were selected to carry out the measurements. The sample introduction was performed using a flow injection system. By exploiting the absorption line wings, the upper limit of the linear range increased up to 110 mg L(-1) for Mg, 200 mg L(-1) for Si and 13 mg L(-1) for Zn. The determination of the five elements was carried out, in triplicate, without the need for additional sample dilutions and/or re-measurements, using less than 3.5 mL of sample to perform the complete analysis. The LODs were 0.008 mg L(-1) for Cu, 0.017 mg L(-1) for Zn, 0.011 mg L(-1) for Mn, 0.16 mg L(-1) for Si and 0.11 mg L(-1) for Mg.
Energy Technology Data Exchange (ETDEWEB)
Mitani, Constantina [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University, Thessaloniki 54124 (Greece); Anthemidis, Aristidis N., E-mail: anthemid@chem.auth.gr [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University, Thessaloniki 54124 (Greece)
2013-04-10
Highlights: ► Drop-in-plug micro-extraction based on SI-LAV platform for metal preconcentration. ► Automatic liquid phase micro-extraction coupled with FAAS. ► Organic solvents with density higher than water are used. ► Lead determination in environmental water and urine samples. -- Abstract: A novel automatic on-line liquid phase micro-extraction method based on drop-in-plug sequential injection lab-at-valve (LAV) platform was proposed for metal preconcentration and determination. A flow-through micro-extraction chamber mounted at the selection valve was adopted without the need of sophisticated lab-on-valve components. Coupled to flame atomic absorption spectrometry (FAAS), the potential of this lab-at-valve scheme is demonstrated for trace lead determination in environmental and biological water samples. A hydrophobic complex of lead with ammonium pyrrolidine dithiocarbamate (APDC) was formed on-line and subsequently extracted into an 80 μL plug of chloroform. The extraction procedure was performed by forming micro-droplets of aqueous phase into the plug of the extractant. All critical parameters that affect the efficiency of the system were studied and optimized. The proposed method offered good performance characteristics and high preconcentration ratios. For 10 mL sample consumption an enhancement factor of 125 was obtained. The detection limit was 1.8 μg L{sup −1} and the precision expressed as relative standard deviation (RSD) at 50.0 μg L{sup −1} of lead was 2.9%. The proposed method was evaluated by analyzing certified reference materials and applied for lead determination in natural waters and urine samples.
Berlinger, B; Náray, M; Sajó, I; Záray, G
2009-06-01
In this work, welding fume samples were collected in a welding plant, where corrosion-resistant steel and unalloyed structural steel were welded by gas metal arc welding (GMAW) and manual metal arc welding (MMAW) techniques. The welding fumes were sampled with a fixed-point sampling strategy applying Higgins-Dewell cyclones. The following solutions were used to dissolve the different species of Ni and Mn: ammonium citrate solution [1.7% (m/v) diammonium hydrogen citrate and 0.5% (m/v) citric acid monohydrate] for 'soluble' Ni, 50:1 methanol-bromine solution for metallic Ni, 0.01 M ammonium acetate for soluble Mn, 25% acetic acid for Mn(0) and Mn(2+) and 0.5% hydroxylammonium chloride in 25% acetic acid for Mn(3+) and Mn(4+). 'Insoluble' Ni and Mn contents of the samples were determined after microwave-assisted digestion with a mixture of concentrated HNO(3), HCl and HF. The sample solutions were analysed by inductively coupled plasma quadrupole mass spectrometry and inductively coupled plasma atomic emission spectrometry. The levels of total Ni and Mn measured in the workplace air differed because of significant differences in the fume generation rates and the distributions of the components in the welding fumes between the welding processes. For quality control of the leaching process, dissolution of the pure stoichiometric Mn and Ni compounds and of their weighed mixtures was investigated under the optimized leaching conditions. The results showed the adequacy of the procedure for the pure metal compounds. Based on the extraction procedures, the predominant oxidation states of Ni and Mn proved to be very different depending on the welding techniques and the type of welded steel. The largest amount of Mn in GMAW fumes was found as insoluble Mn (46 and 35% in the case of corrosion-resistant steel and unalloyed structural steel, respectively), while MMAW fumes contain mainly soluble Mn, Mn(0) and Mn(2+) (78%) and Mn(3+) and Mn(4+) (54%) in case of
A bootstrap estimation scheme for chemical compositional data with nondetects
Palarea-Albaladejo, J; Martín-Fernández, J.A; Olea, Ricardo A.
2014-01-01
The bootstrap method is commonly used to estimate the distribution of estimators and their associated uncertainty when explicit analytic expressions are not available or are difficult to obtain. It has been widely applied in environmental and geochemical studies, where the data generated often represent parts of a whole, typically chemical concentrations. This kind of constrained data is generically called compositional data, and it requires specialised statistical methods to properly account for its particular covariance structure. On the other hand, it is not unusual in practice that such data contain labels denoting nondetects, that is, concentrations falling below detection limits. Nondetects impede the implementation of the bootstrap and represent an additional source of uncertainty that must be taken into account. In this work, a bootstrap scheme is devised that handles nondetects by adding an imputation step within the resampling process and conveniently propagates their associated uncertainty. In doing so, it considers the constrained relationships between chemical concentrations arising from their compositional nature. Bootstrap estimates using a range of imputation methods, including new stochastic proposals, are compared across scenarios of increasing difficulty. They are formulated to meet compositional principles following the log-ratio approach, and an adjustment is introduced in the multivariate case to deal with nonclosed samples. Results suggest that nondetect bootstrap based on model-based imputation is generally preferable. A robust approach based on isometric log-ratio transformations appears to be particularly suited in this context. Computer routines in the R statistical programming language are provided.
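The structural idea, an imputation step inside each bootstrap replicate so that nondetect uncertainty is propagated rather than fixed once, can be sketched as follows. The uniform draw below the detection limit used here is a deliberately simple stand-in for the model-based, log-ratio-aware imputation methods the authors actually recommend:

```python
# Bootstrap with per-replicate imputation of nondetects (sketch).
import numpy as np

def nondetect_bootstrap(values, detected, dl, statistic=np.mean,
                        n_boot=1000, seed=0):
    """values: observations; detected: bool mask; dl: detection limit."""
    rng = np.random.default_rng(seed)
    n = len(values)
    reps = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)           # resample rows first
        v, d = values[idx].copy(), detected[idx]
        # Impute each resampled nondetect anew, so the extra variability
        # from censoring enters every replicate (naive uniform stand-in).
        v[~d] = rng.uniform(0.0, dl, size=(~d).sum())
        reps.append(statistic(v))
    return np.quantile(reps, [0.025, 0.5, 0.975])
```

A compositional version would additionally map each imputed replicate through a log-ratio transformation before computing the statistic, as the paper's R routines do.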
Multivariate bootstrapped relative positioning of spacecraft using GPS L1/Galileo E1 signals
Buist, Peter J.; Teunissen, Peter J. G.; Giorgi, Gabriele; Verhagen, Sandra
2011-03-01
GNSS-based precise relative positioning between spacecraft normally requires dual frequency observations, whereas attitude determination of the spacecraft, mainly due to the stronger model given by the a priori knowledge of the length and geometry of the baselines, can be performed precisely using only single frequency observations. When the Galileo signals become available, the number of observations at the L1 frequency will increase as we will have a GPS and Galileo multi-constellation. Moreover, the L1 observations of the Galileo system and modernized GPS are more precise than legacy GPS and this, combined with the increased number of observations, will result in a stronger model for single frequency relative positioning. In this contribution we develop an even stronger model by combining the attitude determination problem with relative positioning. The attitude determination problem is solved by the recently developed Multivariate Constrained (MC-) LAMBDA method. We do this for each spacecraft and use the outcome for an ambiguity-constrained solution on the baseline between the spacecraft. In this way the solution for the unconstrained baseline is bootstrapped from the MC-LAMBDA solutions of each spacecraft in what is called multivariate bootstrapped relative positioning. The developed approach is compared in simulations with relative positioning using a single antenna at each spacecraft (standard LAMBDA) and with a vectorial bootstrapping approach. In the simulations we analyse single epoch, single frequency success rates as the most challenging application. The difference in performance of the approaches for single epoch solutions is a good indication of the strength of the underlying models. As the multivariate bootstrapping approach has a stronger model by applying information on the geometry of the constrained baselines, for applications with large observation noise and a limited number of observations this will result in a better
Horstkotte, Burkhard; Jarošová, Patrícia; Chocholouš, Petr; Sklenářová, Hana; Solich, Petr
2015-05-01
In this work, the applicability of Sequential Injection Chromatography for the determination of transition metals in water is evaluated for the separation of copper(II), zinc(II), and iron(II) cations. Separations were performed using a Dionex IonPAC™ guard column (50 mm × 2 mm i.d., 9 µm). Mobile phase composition and post-column reaction were optimized by the modified SIMPLEX method with subsequent study of the concentration of each component. The mobile phase consisted of 2,6-pyridinedicarboxylic acid as analyte-selective compound, sodium sulfate, and formic acid/sodium formate buffer. Post-column addition of 4-(2-pyridylazo)resorcinol was carried out for spectrophotometric detection of the analytes' complexes at 530 nm. Approaches to achieve higher robustness, baseline stability, and detection sensitivity by on-column stacking of the analytes and initial gradient implementation as well as air-cushion pressure damping for post-column reagent addition were studied. The method allowed the rapid separation of copper(II), zinc(II), and iron(II) within 6.5 min including pump refilling and aspiration of sample and 1 mmol HNO3 for analyte stacking on the separation column. High sensitivity was achieved applying an injection volume of up to 90 µL. A signal repeatability of < 2% RSD of peak height was found. Analyte recovery evaluated by spiking of different natural water samples was well suited for routine analysis with sub-micromolar limits of detection.
Giakisikli, Georgia; Ayala Quezada, Alejandro; Tanaka, Junpei; Anthemidis, Aristidis N; Murakami, Hiroya; Teshima, Norio; Sakai, Tadao
2015-01-01
A fully automated sequential injection column preconcentration method for the on-line determination of trace vanadium, cadmium and lead in urine samples was successfully developed, utilizing electrothermal atomic absorption spectrometry (ETAAS). Polyamino-polycarboxylic acid chelating resin (Nobias chelate PA-1) packed into a handmade minicolumn was used as sorbent material. Effective on-line retention of chelate complexes of the analytes was achieved at pH 6.0, while the highest elution effectiveness was observed with 1.0 mol L(-1) HNO3 in the reverse phase. Several analytical parameters, such as the sample acidity, the concentration and volume of the eluent, and the loading/elution flow rates, were studied with regard to the efficiency of the method, providing appropriate conditions for the analysis of real samples. For a 4.5 mL sample volume, the sampling frequency was 27 h(-1). The detection limits were found to be 3.0, 0.06 and 2.0 ng L(-1) for V(V), Cd(II) and Pb(II), respectively, with the relative standard deviations ranging between 1.9 and 3.7%. The accuracy of the proposed method was evaluated by analyzing a certified reference material (Seronorm(TM) trace elements urine) and spiked urine samples.
Grotti, Marco; Ianni, Carmela; Frache, Roberto
2002-07-19
The interfering effects due to the reagents and matrix elements associated with a four-step sequential extraction procedure on ICP-OES determination of trace elements were investigated in a systematic way. The emission lines were selected so as to include the most interesting elements for environmental studies (Zn, Pb, Ni, Cr, V and Cu), and the concentrations ranged according to the values occurring in real samples. In order to distinguish between chemical and physical interfering effects, the Mg 280.270-Mg 285.213 line intensity ratio was measured in each condition. Both pneumatic and ultrasonic nebulization were considered for comparison. It was found that both the elements which constitute the sample and the reagents which are added during the sample preparation steps significantly influence the emission intensity of all the analytes, depending on the analytical concentration and the nebulization system. Generally, the signal variations were higher with ultrasonic nebulization. Concerning the interference mechanism, it was found that the effect of the major elements (Na, K, Mg, Ca, Al and Fe) is essentially related to a change in the aerosol generation and transport processes. In contrast, acetic acid, ammonium acetate and hydroxylamine hydrochloride significantly improved the plasma excitation conditions, depending on their concentration. A change in the sample introduction efficiency due to the presence of these reagents was also evident. On the contrary, the effect of hydrochloric and nitric acid was found to be related only to the processes occurring in the sample introduction system.
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
Energy Technology Data Exchange (ETDEWEB)
Lavorante, Andre F. [Centro de Energia Nuclear na Agricultura, Universidade de Sao Paulo, Av. Centenario 303, CP 96, 13400-970, Piracicaba, SP (Brazil); Morales-Rubio, Angel; Guardia, Miguel de la [Department of Analytical Chemistry, Faculty of Chemistry, University of Valencia, Research Building, 50 Dr. Moliner St., 46100 Burjassot, Valencia (Spain); Reis, Boaventura F. [Centro de Energia Nuclear na Agricultura, Universidade de Sao Paulo, Av. Centenario 303, CP 96, 13400-970, Piracicaba, SP (Brazil)], E-mail: reis@cena.usp.br
2007-09-26
An automatic stop-flow procedure has been developed for the sequential photometric determination of anionic and cationic surfactants in the same water sample. The flow system was based on a multicommutation process designed with two solenoid micro-pumps and six solenoid pinch valves, which, under microcomputer control, carry out fluid propelling and handling of the reagent solutions. A homemade photometer, using a photodiode as detector and two light-emitting diodes (LEDs) with emission at 470 nm (blue) and 650 nm (red) as radiation sources, was tailored to allow the determination of anionic and cationic surfactants in waters. The procedure for anionic surfactant determination was based on the substitution reaction of methyl orange (MO) by the anionic surfactant sodium dodecylbenzene sulfonate (DBS) to form an ion pair with cetyl pyridine chloride (CPC). Features such as a linear response ranging from 0.35 to 10.5 mg L{sup -1} DBS (R = 0.999), a detection limit of 0.06 mg L{sup -1} DBS and a relative standard deviation of 0.6% (n = 11) were achieved. For cationic surfactant determination, the procedure was based on ternary complex formation between the cationic surfactant, Fe(III) and chromazurol S (CAS), using CPC as the reference standard solution. A linear response range between 0.34 and 10.2 mg L{sup -1} CPC (R = 0.999), a detection limit of 0.05 mg L{sup -1} CPC and a relative standard deviation of 0.5% (n = 11) were obtained. In both cases, the sampling throughput was 60 determinations per hour. Reagent consumption was 7.8 {mu}g MO, 8.2 {mu}g CPC, 37.2 {mu}g CAS and 21.6 {mu}g Fe(III) per determination. Analysis of river water samples and a t-test between the results found and those obtained using reference procedures for both surfactant types showed no significant differences at the 95% confidence level.
2005-07-01
a constant factor of K + 2. (To see this, note sequential stacking requires training K + 2 classifiers: the classifiers f1, . . . , fK used in cross... on the non-sequential learners (ME and VP) but improves performance of the sequential learners (CRFs and VP-HMMs) less consistently. This pattern
Benchmark of the bootstrap current simulation in helical plasmas
Huang, Botsz; Kanno, Ryutaro; Sugama, Hideo; Goto, Takuya
2016-01-01
The importance of parallel momentum conservation for the bootstrap current evaluation in nonaxisymmetric systems is demonstrated by benchmarks among the local drift-kinetic equation solvers, i.e., the Zero-Orbit-Width (ZOW), DKES, and PENTA codes. The ZOW model is extended to include the effect of the ion parallel mean flow on the electron-ion parallel friction. Compared to the DKES model, in which only the pitch-angle-scattering term is included in the collision operator, the PENTA model employs the Sugama-Nishimura method to correct the momentum balance. The ZOW and PENTA models agree well with each other on the calculations of the bootstrap current. The DKES results without parallel momentum conservation deviate significantly from those of the ZOW and PENTA models. This work verifies the reliability of the bootstrap current calculation with the ZOW and PENTA models for helical plasmas.
Point Set Denoising Using Bootstrap-Based Radial Basis Function
Ramli, Ahmad; Abd. Majid, Ahmad
2016-01-01
This paper examines the application of a bootstrap test error estimation of radial basis functions, specifically thin-plate spline fitting, in surface smoothing. The presence of noisy data is a common issue of the point set model that is generated from 3D scanning devices, and hence, point set denoising is one of the main concerns in point set modelling. Bootstrap test error estimation, which is applied when searching for the smoothing parameters of radial basis functions, is revisited. The main contribution of this paper is a smoothing algorithm that relies on a bootstrap-based radial basis function. The proposed method incorporates a k-nearest neighbour search and then projects the point set to the approximated thin-plate spline surface. Therefore, the denoising process is achieved, and the features are well preserved. A comparison of the proposed method with other smoothing methods is also carried out in this study. PMID:27315105
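The parameter-selection step described above, estimating test error by bootstrap and keeping the smoothing value that minimises it, can be sketched with SciPy's thin-plate RBF interpolator. The candidate smoothing values, data, and the out-of-bag scoring rule are illustrative assumptions, not the paper's exact procedure:

```python
# Bootstrap out-of-bag selection of the thin-plate-spline smoothing
# parameter (sketch). Fitting uses unique resampled indices so the
# RBF linear system stays nonsingular.
import numpy as np
from scipy.interpolate import Rbf

def bootstrap_smooth_select(x, y, z, candidates, n_boot=20, seed=0):
    rng = np.random.default_rng(seed)
    n = len(z)
    best, best_err = None, np.inf
    for s in candidates:
        errs = []
        for _ in range(n_boot):
            idx = np.unique(rng.integers(0, n, n))   # bootstrap sample
            oob = np.setdiff1d(np.arange(n), idx)    # out-of-bag points
            if oob.size == 0:
                continue
            f = Rbf(x[idx], y[idx], z[idx], function="thin_plate", smooth=s)
            errs.append(np.mean((f(x[oob], y[oob]) - z[oob]) ** 2))
        err = np.mean(errs)
        if err < best_err:
            best, best_err = s, err
    return best
```

In the denoising pipeline of the paper, the selected parameter would then be used to fit the spline on the k-nearest-neighbour patch and project the noisy points onto it.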
Design and Implementation of a Bootstrap Trust Chain
Institute of Scientific and Technical Information of China (English)
YU Fajiang; ZHANG Huanguo
2006-01-01
The chain of trust in the bootstrap process is the basis of whole-system trust in the trusted computing group (TCG) definition. This paper presents a design and implementation of a bootstrap trust chain in a PC based on Windows and today's commodity hardware, depending merely on the availability of an embedded security module (ESM). The ESM and a security-enhanced BIOS form the root of trust; PMBR (Pre-MBR), a checking agent stored in the ESM, checks the integrity of the boot data and the Windows kernel. Finally, the paper analyses the mathematical expression of the chain of trust and the runtime performance compared with the common booting process. The trust-chain bootstrap greatly strengthens the security of the personal computer system, and affects runtime performance only by adding about 12% to the booting time.
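The measured-boot idea, each stage verifying a hash of the next before handing over control, can be illustrated with a toy Python sketch. Stage names and blobs are hypothetical; a real ESM/BIOS chain measures binaries in firmware, not strings in a script:

```python
# Toy chain-of-trust verification: compare each stage's measurement
# against a table of known-good hashes and stop at the first mismatch.
import hashlib

def measure(blob):
    """SHA-256 measurement of a boot-stage image."""
    return hashlib.sha256(blob).hexdigest()

def verify_chain(stages, expected):
    """stages: list of (name, blob); expected: name -> known-good hash."""
    for name, blob in stages:
        if measure(blob) != expected.get(name):
            return False, name        # chain breaks at the first bad stage
    return True, None
```

The point of the design above is that the table of expected hashes lives in the tamper-resistant ESM, so an attacker cannot alter both a stage and its reference measurement.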
Addressing the P2P Bootstrap Problem for Small Networks
Wolinsky, David Isaac; Boykin, P Oscar; Figueiredo, Renato
2010-01-01
P2P overlays provide a framework for building distributed applications consisting of few to many resources with features including self-configuration, scalability, and resilience to node failures. Such systems have been successfully adopted in large-scale services for content delivery networks, file sharing, and data storage. In small-scale systems, they can be useful to address privacy concerns and for network applications that lack dedicated servers. The bootstrap problem, finding an existing peer in the overlay, remains a challenge to enabling these services for small-scale P2P systems. In large networks, the solution to the bootstrap problem has been the use of dedicated services, though creating and maintaining these systems requires expertise and resources, which constrain their usefulness and make them unappealing for small-scale systems. This paper surveys and summarizes requirements that allow peers potentially constrained by network connectivity to bootstrap small-scale overlays through the use of e...
A Statistical Mechanics Approach to Approximate Analytical Bootstrap Averages
DEFF Research Database (Denmark)
Malzahn, Dorthe; Opper, Manfred
2003-01-01
We apply the replica method of Statistical Physics combined with a variational method to the approximate analytical computation of bootstrap averages for estimating the generalization error. We demonstrate our approach on regression with Gaussian processes and compare our results with averages obtained by Monte-Carlo sampling.
PyCFTBoot: A flexible interface for the conformal bootstrap
Behan, Connor
2016-01-01
We introduce PyCFTBoot, a wrapper designed to reduce the barrier to entry in conformal bootstrap calculations that require semidefinite programming. Symengine and SDPB are used for the most intensive symbolic and numerical steps respectively. After reviewing the built-in algorithms for conformal blocks, we explain how to use the code through a number of examples that verify past results. As an application, we show that the multi-correlator bootstrap still appears to single out the Wilson-Fisher fixed points as special theories in dimensions between 3 and 4 despite the recent proof that they violate unitarity.
Teixeira, Leonel Silva; Vieira, Heulla Pereira; Windmöller, Cláudia Carvalhinho; Nascentes, Clésia Cristina
2014-02-01
A fast and accurate method based on ultrasound-assisted extraction in a cup-horn sonoreactor was developed to determine the total content of Cd, Co, Cr, Cu, Mn, Ni, Pb and Zn in organic fertilizers by fast sequential flame atomic absorption spectrometry (FS FAAS). Multivariate optimization was used to establish the optimal conditions for the extraction procedure. An aliquot containing approximately 120 mg of the sample was added to a 500 µL volume of an acid mixture (HNO3/HCl/HF, 5:3:3, v/v/v). After a few minutes, 500 µL of deionized water was added and eight samples were simultaneously sonicated for 10 min at 50% amplitude, allowing a sample throughput of 32 extractions per hour. The performance of the method was evaluated with a certified reference material of sewage sludge (CRM 029). The precision, expressed as the relative standard deviation, ranged from 0.58% to 5.6%. The recoveries of analytes were found to be 100%, 109%, 96%, 92%, 101%, 104% and 102% for Cd, Cr, Cu, Mn, Ni, Pb and Zn, respectively. The linearity, limit of detection and limit of quantification were calculated and the values obtained were adequate for the quality control of organic fertilizers. The method was applied to the analysis of several commercial organic fertilizers and organic wastes used as fertilizers, and the results were compared with those obtained using the microwave digestion procedure. A good agreement was found between the results obtained by microwave and ultrasound procedures, with recoveries ranging from 80.4% to 117%. Two organic waste samples were not in accordance with the Brazilian legislation regarding the acceptable levels of contaminants.
Batista, Bruno Lemos; Rodrigues, Jairo Lisboa; Nunes, Juliana Andrade; Souza, Vanessa Cristina de Oliveira; Barbosa, Fernando
2009-04-20
Inductively coupled plasma mass spectrometry with quadrupole (q-ICP-MS) and dynamic reaction cell (DRC-ICP-MS) were evaluated for sequential determination of As, Cd, Co, Cr, Cu, Mn, Pb, Se, Tl, V and Zn in blood. The method requires as little as 100 microL of blood. Prior to analysis, samples (100 microL) were diluted 1:50 in a solution containing 0.01% (v/v) Triton X-100 and 0.5% (v/v) nitric acid. The use of the DRC was only mandatory for Cr, Cu, V and Zn. For the other elements the equipment may be operated in standard mode (q-ICP-MS). Ammonia was used as reaction gas. Selection of the best flow rate of ammonia gas and optimization of the quadrupole dynamic band-pass tuning parameter (RPq) were carried out using an ovine base blood for Cr and V and a synthetic matrix solution (SMS) for Zn and Cu, diluted 1:50 and spiked to contain 1 microg L(-1) of each element. Method detection limits (3 s) for (75)As, (114)Cd, (59)Co, (51)Cr, (63)Cu, (55)Mn, (208)Pb, (82)Se, (205)Tl, (51)V, and (64)Zn were 14.0, 3.0, 11.0, 7.0, 280, 9.0, 3.0, 264, 0.7, 6.0 and 800 ng L(-1), respectively. Method validation was accomplished by the analysis of blood Reference Materials produced by the L'Institut National de Santé Publique du Québec (Canada).
Anthemidis, Aristidis N; Ioannou, Kallirroy-Ioanna G
2009-06-30
A simple, sensitive and powerful on-line sequential injection (SI) dispersive liquid-liquid microextraction (DLLME) system was developed as an alternative approach for on-line metal preconcentration and separation, using extraction solvent at microlitre volume. The potential of this novel scheme, coupled to flame atomic absorption spectrometry (FAAS), was demonstrated for trace copper and lead determination in water samples. The stream of methanol (disperser solvent) containing 2.0% (v/v) xylene (extraction solvent) and 0.3% (m/v) ammonium diethyldithiophosphate (chelating agent) was merged on-line with the stream of sample (aqueous phase), resulting in a cloudy mixture, which consisted of fine droplets of the extraction solvent dispersed entirely throughout the aqueous phase. By this continuous process, metal chelate complexes were formed and extracted into the fine droplets of the extraction solvent. The hydrophobic droplets of organic phase were retained in a microcolumn packed with PTFE turnings. A portion of 300 microL isobutylmethylketone was used for quantitative elution of the analytes, which was transported directly to the nebulizer of the FAAS. All the critical parameters of the system, such as the type of extraction solvent, flow rate of disperser and sample, and extraction time, as well as the chemical parameters, were studied. Under the optimum conditions the enhancement factors for copper and lead were 560 and 265, respectively. For copper, the detection limit and the precision (R.S.D.) were 0.04 microg L(-1) and 2.1% at 2.0 microg L(-1) Cu(II), respectively, while for lead they were 0.54 microg L(-1) and 1.9% at 30.0 microg L(-1) Pb(II), respectively. The developed method was evaluated by analyzing certified reference material and applied successfully to the analysis of environmental water samples.
C*-Algebras over Topological Spaces: The Bootstrap Class
Meyer, Ralf
2007-01-01
We carefully define and study C*-algebras over topological spaces, possibly non-Hausdorff, and review some relevant results from point-set topology along the way. We explain the triangulated category structure on the bivariant Kasparov theory over a topological space. We introduce and describe an analogue of the bootstrap class for C*-algebras over a finite topological space.
Bootstrapping Rapidity Anomalous Dimensions for Transverse-Momentum Resummation
Energy Technology Data Exchange (ETDEWEB)
Li, Ye; Zhu, Hua Xing
2017-01-01
The soft function relevant to transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. An intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
Sidecoin: a snapshot mechanism for bootstrapping a blockchain
Krug, Joseph; Peterson, Jack
2015-01-01
Sidecoin is a mechanism that allows a snapshot to be taken of Bitcoin's blockchain. We compile a list of Bitcoin's unspent transaction outputs, then use these outputs and their corresponding balances to bootstrap a new blockchain. This allows the preservation of Bitcoin's economic state in the context of a new blockchain, which may provide new features and technical innovations.
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
Bootstrapping Rapidity Anomalous Dimension for Transverse-Momentum Resummation
Energy Technology Data Exchange (ETDEWEB)
Li, Ye [Fermilab; Zhu, Hua Xing [MIT, Cambridge, CTP
2016-04-05
The soft function relevant to transverse-momentum resummation for Drell-Yan or Higgs production at hadron colliders is computed through three loops in the expansion of the strong coupling, with the help of the bootstrap technique and supersymmetric decomposition. The corresponding rapidity anomalous dimension is extracted. An intriguing relation between the anomalous dimensions for transverse-momentum resummation and threshold resummation is found.
A neural network based reputation bootstrapping approach for service selection
Wu, Quanwang; Zhu, Qingsheng; Li, Peng
2015-10-01
With the concept of service-oriented computing becoming widely accepted in enterprise application integration, more and more computing resources are encapsulated as services and published online. Reputation mechanisms have been studied to establish trust in previously unknown services. One limitation of current reputation mechanisms is that they cannot assess the reputation of newly deployed services, as no record of their previous behaviour exists. Most current bootstrapping approaches merely assign default reputation values to newcomers; however, with this kind of method either newcomers or existing services are favoured. In this paper, we present a novel reputation bootstrapping approach in which correlations between the features and the performance of existing services are learned through an artificial neural network (ANN) and then generalised to establish a tentative reputation when evaluating new, unknown services. Reputations of services published previously by the same provider are also incorporated into reputation bootstrapping, where available. The proposed reputation bootstrapping approach is seamlessly embedded into an existing reputation model and implemented in the extended service-oriented architecture. Empirical studies of the proposed approach conclude the paper.
Bootstrapping the energy flow in the beginning of life
Hengeveld, R.; Fedonkin, M.A.
2007-01-01
This paper suggests that the energy flow on which all living structures depend only started up slowly, the low-energy, initial phase starting up a second, slightly more energetic phase, and so on. In this way, the build up of the energy flow follows a bootstrapping process similar to that found in t
Smirenin, S A; Khabova, Z S; Fetisov, V A
2015-01-01
The objective of the present study was to determine the diagnostic coefficients (DC) of injuries to the upper and lower extremities of the passengers inside the car passenger compartment, based on the analysis of 599 archival expert documents available from 45 regional state bureaus of forensic medical examination of the Russian Federation for the period from 1995 to 2014. These materials included the data obtained by the examination of 200 corpses and 300 live persons involved in traffic accidents. The statistical and mathematical treatment of these materials, with the use of the sequential analysis method based on the Bayes and Wald formulas, yielded the diagnostic coefficients that made it possible to identify the most important signs characterizing the risk of injuries for the passenger occupying the front seat of the vehicle. In the case of a lethal outcome, such injuries include fractures of the right femur (DC -8.9), bleeding (DC -7.1), wounds in the soft tissues of the right thigh (DC -5.0) with the injurious force applied to its anterior surface, bruises on the posterior surface of the right shoulder (DC -6.2), the right deltoid region (DC -5.9), and the posterior surface of the right forearm (DC -5.5), fractures of the right humerus (DC -5.), etc. When both the driver and the passengers survive, the most informative signs in the latter are bleeding and scratches (DC -14.5 and 11.5, respectively) in the soft tissues at the posterior surface of the right shoulder, fractures of the right humerus (DC -10.0), bruises on the anterior surface of the right thigh (DC -13.0), the posterior surface of the right forearm (DC -10.0) and the frontal region of the right lower leg (DC -10.0), bleeding in the posterior region of the right forearm (DC -9.0) and the anterior region of the left thigh (DC -8.6), fractures of the right femur (DC -8.1), etc. It is concluded that the knowledge of diagnostic coefficients helps to draw attention of the experts to the analysis of the
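The Bayes-Wald sequential procedure described above admits a compact sketch. The fragment below is illustrative only: it assumes, as is common in this literature, that a diagnostic coefficient is ten times the decimal logarithm of the likelihood ratio of a sign under the two competing hypotheses, accumulated until a Wald stopping boundary is crossed. The function names and error levels are hypothetical, not taken from the study.

```python
import math

def diagnostic_coefficient(p_a: float, p_b: float) -> float:
    """Diagnostic coefficient: 10 * log10 of the likelihood ratio of a sign."""
    return 10.0 * math.log10(p_a / p_b)

def sequential_decision(dcs, alpha=0.05, beta=0.05):
    """Sum diagnostic coefficients in order; stop once a Wald boundary is crossed."""
    upper = 10.0 * math.log10((1 - beta) / alpha)   # boundary for accepting hypothesis A
    lower = 10.0 * math.log10(beta / (1 - alpha))   # boundary for accepting hypothesis B
    total = 0.0
    for dc in dcs:
        total += dc
        if total >= upper:
            return "A", total
        if total <= lower:
            return "B", total
    return "undecided", total
```

With alpha = beta = 0.05 the boundaries are roughly ±12.8, so three signs with DC = 5 each suffice for a decision, while two weak signs leave the case undecided.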
DEFF Research Database (Denmark)
Hansen, Elo Harald; Miró, Manuel; Long, Xiangbao;
2006-01-01
are presented as based on the exploitation of micro-sequential injection (μSI-LOV) using hydrophobic as well as hydrophilic bead materials. The examples given comprise the presentation of a universal approach for SPE-assays, front-end speciation of Cr(III) and Cr(VI) in a fully automated and enclosed set...
Franco, Glaura C.; Reisen, Valderio A.
2007-03-01
This paper deals with different bootstrap approaches and bootstrap confidence intervals in the fractionally integrated autoregressive moving average (ARFIMA(p,d,q)) process [J. Hosking, Fractional differencing, Biometrika 68(1) (1981) 165-175], using parametric and semi-parametric estimation techniques for the memory parameter d. The bootstrap procedures considered are: the classical bootstrap in the residuals of the fitted model [B. Efron, R. Tibshirani, An Introduction to the Bootstrap, Chapman and Hall, New York, 1993], the bootstrap in the spectral density function [E. Paparoditis, D.N. Politis, The local bootstrap for periodogram statistics, J. Time Ser. Anal. 20(2) (1999) 193-222], the bootstrap in the residuals resulting from the regression equation of the semi-parametric estimators [G.C. Franco, V.A. Reisen, Bootstrap techniques in semiparametric estimation methods for ARFIMA models: a comparison study, Comput. Statist. 19 (2004) 243-259] and the sieve bootstrap [P. Bühlmann, Sieve bootstrap for time series, Bernoulli 3 (1997) 123-148]. The performance of these procedures and confidence intervals for d in the stationary and non-stationary ranges is obtained empirically through Monte Carlo experiments. The bootstrap confidence intervals proposed here are alternative procedures that achieve reasonable accuracy in obtaining confidence intervals for d.
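For readers unfamiliar with the residual bootstrap that underlies several of the procedures compared here, the following sketch applies it to a plain AR(1) model rather than the ARFIMA case studied in the paper; the model, coefficient value and variable names are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series; phi = 0.6 is an assumed illustrative value
n, phi = 500, 0.6
y = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

def fit_ar1(x):
    """OLS estimate of the AR(1) coefficient and the centred residuals."""
    phi_hat = x[:-1] @ x[1:] / (x[:-1] @ x[:-1])
    resid = x[1:] - phi_hat * x[:-1]
    return phi_hat, resid - resid.mean()

phi_hat, resid = fit_ar1(y)

# Residual bootstrap: rebuild the series from resampled residuals, refit each time
boot = []
for _ in range(500):
    e = rng.choice(resid, size=n, replace=True)
    yb = np.zeros(n)
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + e[t]
    boot.append(fit_ar1(yb)[0])

# Percentile confidence interval for the coefficient
lo, hi = np.percentile(boot, [2.5, 97.5])
```

The same resample-refit loop carries over to ARFIMA estimation of d, with the fit step replaced by a fractional-differencing estimator.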
Institute of Scientific and Technical Information of China (English)
Fang-Ling Tao; Shi-Fan Min; Wei-Jian Wu; Guang-Wen Liang; Ling Zeng
2008-01-01
Taking a published natural-population life table of the rice leaf roller, Cnaphalocrocis medinalis (Lepidoptera: Pyralidae), as an example, we estimated the population trend index, I, via re-sampling methods (jackknife and bootstrap), determined its statistical properties and illustrated the application of these methods in determining the control effectiveness of bio-agents and chemical insecticides. Depending on the simulation outputs, the smoothed distribution pattern of the estimates of I by delete-1 jackknife is visually distinguishable from the normal density, but the smoothed pattern produced by delete-d jackknife, and the logarithm-transformed smoothed patterns produced by both empirical and parametric bootstraps, matched well the corresponding normal density. Thus, the estimates of I produced by delete-1 jackknife were not used to determine the suppressive effect of wasps and insecticides. The 95% confidence intervals, or the narrowest 95 percentiles, and the Z-test criterion were employed to compare the effectiveness of Trichogramma japonicum Ashmead and insecticides (powder, 1.5% mevinphos + 3% alpha-hexachlorocyclohexane) against the rice leaf roller, based on the estimates of I produced by delete-d jackknife and bootstrap techniques. At the α = 0.05 level, there were statistical differences between wasp treatment and control, and between wasp and insecticide treatments, if normality is ensured, or by the narrowest 95 percentiles. However, there was still no difference between insecticide treatment and control. By the Z-test criterion, wasp treatment was better than control and insecticide treatment with P-value < 0.01. Insecticide treatment was similar to control with P-value > 0.2, indicating that the 95% confidence interval procedure is more conservative. Although similar conclusions may be drawn by re-sampling techniques, such as the delta method, about the suppressive effect of Trichogramma and insecticides, the normality of the estimates can be checked and guaranteed
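The delete-1 jackknife used above has a standard generic form; a minimal sketch follows, assuming the usual Tukey variance formula. A useful sanity check is that, for the sample mean, the jackknife standard error reduces exactly to s/√n.

```python
import numpy as np

def jackknife(stat, x):
    """Delete-1 jackknife: mean of leave-one-out estimates and Tukey standard error."""
    n = len(x)
    thetas = np.array([stat(np.delete(x, i)) for i in range(n)])
    theta_dot = thetas.mean()
    se = np.sqrt((n - 1) / n * np.sum((thetas - theta_dot) ** 2))
    return theta_dot, se

# Toy data (hypothetical); for np.mean the identity se == s / sqrt(n) holds exactly
x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
est, se = jackknife(np.mean, x)
```

For a nonlinear statistic such as the trend index I, the same function applies with `stat` replaced by the index computation, though, as the abstract notes, the delete-1 distribution need not be close to normal.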
Language specific bootstraps for UG categories
van Kampen, N.J.
2005-01-01
This paper argues that the universal categories N/V are not applied to content words before the grammatical markings for reference D(eterminers) and predication I(nflection) have been acquired (van Kampen, 1997, contra Pinker, 1984). Child grammar starts as proto-grammar with language-specific opera
BOOTSTRAP WAVELET IN THE NONPARAMETRIC REGRESSION MODEL WITH WEAKLY DEPENDENT PROCESSES
Institute of Scientific and Technical Information of China (English)
林路; 张润楚
2004-01-01
This paper introduces a method of bootstrap wavelet estimation in a nonparametric regression model with weakly dependent processes for both fixed and random designs. The asymptotic bounds for the bias and variance of the bootstrap wavelet estimators are given in the fixed design model. The conditional normality of a modified version of the bootstrap wavelet estimators is obtained in the fixed design model. The consistency of the bootstrap wavelet estimator is also proved in the random design model. These results show that the bootstrap wavelet method is valid for models with weakly dependent processes.
Model-Consistent Sparse Estimation through the Bootstrap
Bach, Francis
2009-01-01
We consider the least-square linear regression problem with regularization by the $\\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
Sisvar: a Guide for its Bootstrap procedures in multiple comparisons
Directory of Open Access Journals (Sweden)
Daniel Furtado Ferreira
2014-04-01
Sisvar is a statistical analysis system widely used by the scientific community to produce statistical analyses and to produce scientific results and conclusions. The wide use of the statistical procedures of Sisvar by the scientific community is due to its being accurate, precise, simple and robust. Among its many analysis options, Sisvar offers one that is not so widely used: multiple comparison procedures based on bootstrap approaches. This paper aims to review this subject and to show some advantages of using Sisvar to perform such analyses to compare treatment means. Tests like Dunnett, Tukey, Student-Newman-Keuls and Scott-Knott can alternatively be performed by bootstrap methods and show greater power and better control of experimentwise type I error rates under non-normal, asymmetric, platykurtic or leptokurtic distributions.
Bootstrapping GEE models for fMRI regional connectivity.
D'Angelo, Gina M; Lazar, Nicole A; Zhou, Gongfu; Eddy, William F; Morris, John C; Sheline, Yvette I
2012-12-01
An Alzheimer's fMRI study has motivated us to evaluate inter-regional correlations during rest between groups. We apply generalized estimating equation (GEE) models to test for differences in regional correlations across groups. Both the GEE marginal model and GEE transition model are evaluated and compared to the standard pooling Fisher-z approach using simulation studies. Standard errors of all methods are estimated both theoretically (model-based) and empirically (bootstrap). Of all the methods, we find that the transition models have the best statistical properties. Overall, the model-based standard errors and bootstrap standard errors perform about the same. We also demonstrate the methods with a functional connectivity study in a healthy cognitively normal population of ApoE4+ participants and ApoE4- participants who are recruited from the Adult Children's Study conducted at the Washington University Knight Alzheimer's Disease Research Center.
Bolasso: model consistent Lasso estimation through the bootstrap
Bach, Francis
2008-01-01
We consider the least-square linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, is compared favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning rep...
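The Bolasso recipe described in the abstract (run the Lasso on bootstrapped replications of the sample and intersect the selected supports) can be sketched in a few lines. The coordinate-descent Lasso below is a minimal stand-in for a production solver, and the synthetic data, penalty level and replication count are illustrative assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=100):
    """Minimal coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]            # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(1)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.5 * rng.normal(size=n)

# Bolasso: intersect the Lasso supports across bootstrap replications
support = set(range(p))
for _ in range(32):
    idx = rng.integers(0, n, size=n)                  # bootstrap resample of rows
    b = lasso_cd(X[idx], y[idx], lam=0.2)
    support &= {j for j in range(p) if abs(b[j]) > 1e-8}
```

The intersection step is what removes the irrelevant variables that any single Lasso run selects with strictly positive probability.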
Bootstrap bound for conformal multi-flavor QCD on lattice
Nakayama, Yu
2016-01-01
The recent work by Iha et al. shows an upper bound on the mass anomalous dimension $\gamma_m$ of multi-flavor massless QCD at the renormalization group fixed point from the conformal bootstrap in $SU(N_F)_V$ symmetric conformal field theories, under the assumption that the fixed point is realizable with the lattice regularization based on staggered fermions. We show that an almost identical but slightly stronger bound applies to the regularization based on Wilson fermions (or domain wall fermions) by studying the conformal bootstrap in $SU(N_f)_L \times SU(N_f)_R$ symmetric conformal field theories. For $N_f=8$, our bound implies $\gamma_m < 1.31$ to avoid dangerously irrelevant operators that are not compatible with the lattice symmetry.
A conformal bootstrap approach to critical percolation in two dimensions
Picco, Marco; Santachiara, Raoul
2016-01-01
We study four-point functions of critical percolation in two dimensions, and more generally of the Potts model. We propose an exact ansatz for the spectrum: an infinite, discrete and non-diagonal combination of representations of the Virasoro algebra. Based on this ansatz, we compute four-point functions using a numerical conformal bootstrap approach. The results agree with Monte-Carlo computations of connectivities of random clusters.
Bootstrapping a Five-Loop Amplitude from Steinmann Relations
Caron-Huot, Simon; McLeod, Andrew; von Hippel, Matt
2016-01-01
The analytic structure of scattering amplitudes is restricted by Steinmann relations, which enforce the vanishing of certain discontinuities of discontinuities. We show that these relations dramatically simplify the function space for the hexagon function bootstrap in planar maximally supersymmetric Yang-Mills theory. Armed with this simplification, along with the constraints of dual conformal symmetry and Regge exponentiation, we obtain the complete five-loop six-particle amplitude.
Non-critical string, Liouville theory and geometric bootstrap hypothesis
Hadasz, L; Hadasz, Leszek; Jaskolski, Zbigniew
2003-01-01
Starting from the standard construction of critical string amplitudes, we analyze properties of the longitudinal sector of the non-critical Nambu-Goto string. We demonstrate that it cannot be described by standard (in the sense of BPZ) conformal field theory. As an alternative we propose a new version of the geometric approach to Liouville theory and formulate its basic consistency condition: the geometric bootstrap equation.
Necessary Condition for Emergent Symmetry from the Conformal Bootstrap
Nakayama, Yu; Ohtsuki, Tomoki
2016-09-01
We use the conformal bootstrap program to derive necessary conditions for emergent symmetry enhancement from discrete symmetry (e.g., Zn) to continuous symmetry [e.g., U(1)] under the renormalization group flow. In three dimensions, in order for Z2 symmetry to be enhanced to U(1) symmetry, the conformal bootstrap program predicts that the scaling dimension of the order parameter field at the infrared conformal fixed point must satisfy Δ1 > 1.08. We also obtain similar necessary conditions for Z3 symmetry, with Δ1 > 0.580, and Z4 symmetry, with Δ1 > 0.504, from the simultaneous conformal bootstrap analysis of multiple four-point functions. As applications, we show that our necessary conditions impose severe constraints on the nature of the chiral phase transition in QCD, the deconfinement criticality in Néel valence bond solid transitions, and anisotropic deformations in critical O(n) models. We prove that some fixed points proposed in the literature are unstable under perturbations that cannot be forbidden by the discrete symmetry. In these situations, the second-order phase transition with enhanced symmetry cannot happen.
Truncatable bootstrap equations in algebraic form and critical surface exponents
Gliozzi, Ferdinando
2016-01-01
We describe examples of drastic truncations of conformal bootstrap equations encoding much more information than that obtained by a direct numerical approach. A three-term truncation of the four-point function of a free scalar in any space dimension provides algebraic identities among conformal block derivatives which generate the exact spectrum of the infinitely many primary operators contributing to it. In boundary conformal field theories, we point out that the appearance of free parameters in the solutions of bootstrap equations is not an artifact of truncations; rather, it reflects a physical property of permeable conformal interfaces, which are described by the same equations. Surface transitions correspond to isolated points in the parameter space. We are able to locate them in the case of the 3d Ising model, thanks to a useful algebraic form of the 3d boundary bootstrap equations. It turns out that the low-lying spectra of the surface operators in the ordinary and the special transitions of the 3d Ising model form...
On Bootstrap Tests of Symmetry About an Unknown Median.
Zheng, Tian; Gastwirth, Joseph L
2010-07-01
It is important to examine the symmetry of an underlying distribution before applying some statistical procedures to a data set. For example, in the Zuni School District case, a formula originally developed by the Department of Education trimmed 5% of the data symmetrically from each end. The validity of this procedure was questioned at the hearing by Chief Justice Roberts. Most tests of symmetry (even nonparametric ones) are not distribution-free in finite sample sizes. Hence, using the asymptotic distribution may not yield an accurate type I error rate and may result in a loss of power in small samples. Bootstrap resampling from a symmetric empirical distribution function fitted to the data is proposed to improve the accuracy of the calculated p-value of several tests of symmetry. The results show that the bootstrap method is superior to previously used approaches relying on the asymptotic distribution of the tests, which assumed the data come from a normal distribution. Incorporating the bootstrap estimate in a recently proposed test due to Miao, Gel and Gastwirth (2006) preserves its level and shows that it has reasonable power properties on the family of distributions evaluated.
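The resampling scheme proposed above (resample from an empirical distribution symmetrized about the sample median, so the null hypothesis holds by construction) can be sketched as follows. The asymmetry statistic used here, a standardized mean-median difference, is a simple placeholder rather than one of the tests studied in the paper.

```python
import numpy as np

def symmetry_pvalue(x, n_boot=2000, seed=0):
    """Bootstrap p-value for symmetry about an unknown median.

    Resamples from the empirical distribution symmetrized about the sample
    median, which satisfies the null hypothesis by construction."""
    rng = np.random.default_rng(seed)
    n = len(x)
    med = np.median(x)
    t_obs = abs(np.mean(x) - med) / np.std(x, ddof=1)   # simple asymmetry statistic
    sym = np.concatenate([x, 2 * med - x])              # symmetrized sample of size 2n
    count = 0
    for _ in range(n_boot):
        xb = rng.choice(sym, size=n, replace=True)
        count += abs(np.mean(xb) - np.median(xb)) / np.std(xb, ddof=1) >= t_obs
    return count / n_boot

# A clearly skewed sample should yield a small p-value
rng = np.random.default_rng(42)
p_skewed = symmetry_pvalue(rng.exponential(size=200))
```

Reflecting the data about the median doubles the sample but guarantees the resampling distribution is exactly symmetric, which is the point of the construction.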
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2001-01-01
A sequential injection system for on-line ion-exchange separation and preconcentration of trace-level amounts of metal ions with ensuing detection by electrothermal atomic absorption spectrometry (ETAAS) is described. Based on the use of a renewable microcolumn incorporated within an integrated l.......3% for the determination of 2.0 mug/l Bi (n = 7). The procedure was validated by determination of bismuth in a certified reference material CRM 320 (river sediment), and by bismuth spike recoveries in two human urine samples....
Institute of Scientific and Technical Information of China (English)
王晖; 刘大有; 等
1994-01-01
In this paper we consider the problem of sequential processing and present a sequential model based on the back-propagation algorithm. This model is intended to deal with intrinsically sequential problems, such as word recognition, speech recognition and natural language understanding. The model can be used to train a network to learn a sequence of input patterns, in a fixed or random order. Moreover, the model is open- and partial-associative, characterized as "recognizing while accumulating", which, as we argue, is oriented to the mental cognition process.
Žemberyová, M.; Jankovič, R.; Hagarová, I.; Kuss, H.-M.
2007-05-01
A modified three-step sequential extraction procedure proposed by the Commission of European Communities Bureau of Reference (BCR) was applied to certified reference materials of three different soil groups (rendzina, luvisol, cambisol) and to sewage sludge of different composition originating from a municipal water treatment plant, in order to assess the potential mobility and distribution of vanadium in the resulting fractions. Analysis of the extracts was carried out by electrothermal atomic absorption spectrometry with Zeeman background correction using transversely heated graphite atomizers. The extracts showed significant matrix interferences, which were overcome by the standard addition technique. The original soil and sludge certified reference materials (CRMs) and the extraction residue from the sequential extraction were decomposed by a mixture of HNO3-HClO4-HF in an open system. The content of V determined after decomposition of the samples was in very good agreement with the certified total values. The accuracy of the sequential extraction procedure was checked by comparing the sum of the vanadium contents in the three fractions and in the extraction residue with the certified total content of V. The amounts of vanadium leached were in good correlation with the certified total contents of V in the CRMs of soils and sewage sludge. In the soils examined, vanadium was present almost entirely in the mineral lattice, while in the sewage sludge samples 9-14% was found in the oxidizable and almost 25% in the reducible fractions. The recovery ranged from 93 to 106% and the precision (RSD) was below 10%.
Dowall, Stuart; Taylor, Irene; Yeates, Paul; Smith, Leonie; Rule, Antony; Easterbrook, Linda; Bruce, Christine; Cook, Nicola; Corbin-Lickfett, Kara; Empig, Cyril; Schlunegger, Kyle; Graham, Victoria; Dennis, Mike; Hewson, Roger
2013-02-01
Sequential sampling from animals challenged with highly pathogenic organisms, such as haemorrhagic fever viruses, is required for many pharmaceutical studies. Using the guinea pig model of Ebola virus infection, a catheterized system was used which had the benefits of allowing repeated sampling of the same cohort of animals, and also a reduction in the use of sharps at high biological containment. Levels of a PS-targeting antibody (Bavituximab) were measured in Ebola-infected animals and uninfected controls. Data showed that the pharmacokinetics were similar in both groups, therefore Ebola virus infection did not have an observable effect on the half-life of the antibody.
A universal property for sequential measurement
Westerbaan, Abraham; Westerbaan, Bas
2016-09-01
We study the sequential product, the operation p ∗ q = √p q √p, on the set of effects, [0, 1]𝒜, of a von Neumann algebra 𝒜, which represents sequential measurement of first p and then q. In their work [J. Math. Phys. 49(5), 052106 (2008)], Gudder and Latrémolière give a list of axioms based on physical grounds that completely determines the sequential product on a von Neumann algebra of type I, that is, a von Neumann algebra ℬ(ℋ) of all bounded operators on some Hilbert space ℋ. In this paper we give a list of axioms that completely determines the sequential product on all von Neumann algebras simultaneously (Theorem 4).
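For matrix effects (positive operators below the identity), the sequential product has a direct numerical rendering; a small sketch follows, assuming the matrix square root is taken via eigendecomposition. For the identity effect, p ∗ 1 = p, and for commuting effects the sequential product reduces to the ordinary product.

```python
import numpy as np

def psd_sqrt(p):
    """Square root of a positive semidefinite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(p)
    return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.T

def seq_product(p, q):
    """Sequential product p * q = sqrt(p) q sqrt(p) of two effects."""
    s = psd_sqrt(p)
    return s @ q @ s

# Two commuting (diagonal) effects and the identity, for illustration
p = np.diag([0.5, 0.25])
q = np.diag([0.8, 0.4])
I = np.eye(2)
```

For non-commuting p and q the operation is neither commutative nor associative, which is why an axiomatic characterization of the kind proved in the paper is needed.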
Bootstrap inversion for Pn wave velocity in North-Western Italy
Directory of Open Access Journals (Sweden)
C. Eva
1997-06-01
An inversion of Pn arrival times from regional-distance earthquakes (180-800 km), recorded by 94 seismic stations operating in North-Western Italy and surrounding areas, was carried out to image lateral variations of P-wave velocity at the crust-mantle boundary and to estimate the static delay time at each station. The reliability of the obtained results was assessed using both synthetic tests and the bootstrap Monte Carlo resampling technique. Numerical simulations demonstrated the existence of a trade-off between cell velocities and estimated station delay times along the edge of the model. Bootstrap inversions were carried out to determine the standard deviation of velocities and time terms. Low Pn velocity anomalies are detected beneath the outer side of the Alps (-6%) and the Western Po plain (-4%), in correspondence with two regions of strong crustal thickening and negative Bouguer anomaly. In contrast, high Pn velocities are imaged beneath the inner side of the Alps (+4%), indicating the presence of high-velocity, high-density lower crust-upper mantle. The Ligurian sea shows high Pn velocities close to the Ligurian coastlines (+3%) and low Pn velocities (-1.5%) in the middle of the basin, in agreement with the upper-mantle velocity structure revealed by seismic refraction profiles.
Ramírez-Prado, Dolores; Cortés, Ernesto; Aguilar-Segura, María Soledad; Gil-Guillén, Vicente Francisco
2016-01-01
In January 2012, a review of cases of chromosome 15q24 microdeletion syndrome was published; however, that study did not include inferential statistics. The aims of the present study were to update the literature search and to calculate confidence intervals for the prevalence of each phenotype using bootstrap methodology. Published case reports of patients with the syndrome that included detailed information about breakpoints and phenotype were sought, and 36 were included. Deletions in megabase (Mb) pairs were determined to calculate the size of the interstitial deletion for the phenotypes studied in 2012. To determine confidence intervals for the prevalence of the phenotypes and the interstitial loss, we used bootstrap methodology. Using the bootstrap percentile method, we found wide variability in the prevalence of the different phenotypes (3-100%). The mean interstitial deletion size was 2.72 Mb (95% CI [2.35-3.10 Mb]). In comparison with the earlier work, our study, which extended the literature search by 45 months, found differences in the prevalence of 17% of the phenotypes, indicating that more studies are needed to analyze this rare disease. PMID:26925314
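The bootstrap percentile method used above, applied to a phenotype prevalence, can be sketched briefly; the counts below are hypothetical and chosen only to mirror the 36-case setting.

```python
import numpy as np

def percentile_ci(x, stat=np.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic of the sample."""
    rng = np.random.default_rng(seed)
    boot = [stat(rng.choice(x, size=len(x), replace=True)) for _ in range(n_boot)]
    return tuple(np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# Hypothetical data: a phenotype present in 22 of 36 reported cases
pheno = np.array([1] * 22 + [0] * 14)
lo, hi = percentile_ci(pheno)
```

With only 36 cases the interval is wide, which is exactly the variability in phenotype prevalence (3-100%) that the study reports.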
A Bootstrap Approach to an Affordable Exploration Program
Oeftering, Richard C.
2011-01-01
This paper examines the potential to build an affordable, sustainable exploration program by adopting an approach that requires investing in technologies that can be used to build a space infrastructure from very modest initial capabilities. Human exploration has a history of flight programs with high development and operational costs. Since Apollo, human exploration has had very constrained budgets, and they are expected to be constrained in the future. Due to their high operations costs, it becomes necessary to consider retiring established space facilities in order to move on to the next exploration challenge. This practice may save cost in the near term, but it does so by sacrificing part of the program's future architecture. Human exploration also has a history of sacrificing fully functional flight hardware to achieve mission objectives. An affordable exploration program cannot be built when it involves billions of dollars of discarded space flight hardware; instead, the program must emphasize preserving its high-value space assets and building a suitable permanent infrastructure. Further, this infrastructure must reduce operational and logistics costs. The paper examines the importance of achieving a high level of logistics independence by minimizing resource consumption, minimizing the dependency on external logistics, and maximizing the utility of available resources. The approach involves the development and deployment of a core suite of technologies that have minimal initial needs yet are able to expand upon initial capability in an incremental bootstrap fashion. The bootstrap approach incrementally creates an infrastructure that grows, becomes self-sustaining, and eventually begins producing the energy, products and consumable propellants that support human exploration. The bootstrap technologies involve new methods of delivering and manipulating energy and materials. These technologies will exploit the space environment, minimize dependencies, and
Two novel applications of bootstrap currents: Snakes and jitter stabilization
Energy Technology Data Exchange (ETDEWEB)
Thyagaraja, A.; Haas, F.A. (AEA Fusion/Euratom Fusion Association, Culham Laboratory, Abingdon, OX14 3DB, United Kingdom)
1993-09-01
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of noninductive current are considered in this work. In the first, an earlier model of the nonlinearly saturated m = 1 tearing mode is extended to include the stabilizing effect of a bootstrap current inside the island. This is used to explain several observed features of the so-called "snake" reported in the Joint European Torus (JET) [R. D. Gill, A. W. Edwards, D. Pasini, and A. Weller, Nucl. Fusion 32, 723 (1992)]. The second application involves an alternating-current (ac) form of bootstrap current, produced by pressure-gradient fluctuations. It is suggested that a time-dependent (in the plasma frame), radio-frequency (rf) power source can be used to produce localized pressure fluctuations of suitable frequency and amplitude to implement the dynamic stabilization method for suppressing gross modes in tokamaks suggested in a recent paper [A. Thyagaraja, R. D. Hazeltine, and A. Y. Aydemir, Phys. Fluids B 4, 2733 (1992)]. This method works by "detuning" the resonant layer through rapid current/shear fluctuations. Estimates made for the power-source requirements, both for small machines such as COMPASS and for larger machines like JET, suggest that the method could be practically feasible. This "jitter" (i.e., dynamic) stabilization method could provide a useful form of active instability control to avoid both gross/disruptive and fine-scale/transportive instabilities, which may set severe operating/safety constraints in the reactor regime. The results are also capable, in principle, of throwing considerable light on the local properties of current generation and diffusion in tokamaks, which may be enhanced by turbulence, as has been suggested recently by several researchers.
Estimation of reference intervals by bootstrap methods
Institute of Scientific and Technical Information of China (English)
李小佩; 王建新; 施秀英; 王惠民
2016-01-01
Objective: To assess the feasibility of bootstrap methods for establishing biological reference intervals and to determine the minimum sample size they require. Methods: Serum urea nitrogen (BUN) was measured in 120 healthy reference subjects, and reference intervals were calculated with the parametric and nonparametric methods recommended by the Clinical and Laboratory Standards Institute (CLSI) guideline C28-A3 and with bootstrap methods. The interval limits, the root mean square error (RMSE) and the precision ratio (Id) were computed to identify the minimum number of samples. Results: The reference intervals obtained by the nonparametric, parametric, percentile-bootstrap and standard-bootstrap methods were 3.00-7.39 mmol/L, 3.05-7.33 mmol/L, 3.12-7.18 mmol/L and 3.10-7.33 mmol/L, respectively; the overlap between intervals exceeded 95%, so the bootstrap results were consistent with the CLSI C28-A3 methods. As the sample size decreased, the reference intervals widened and the RMSE and Id increased; results remained stable down to a sample size of 60. Conclusion: Bootstrap methods can be used to establish biological reference intervals for data of any distribution type, offer an advantage with small samples, and provide clinical laboratories with a new approach to setting reference intervals.
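The percentile-bootstrap calculation described in this abstract can be sketched in Python. This is a minimal illustration on synthetic data: the simulated values stand in for the study's 120 BUN measurements, and the distribution parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand-in for the 120 healthy-subject BUN values (mmol/L).
bun = rng.normal(loc=5.1, scale=1.1, size=120)

# Percentile bootstrap: resample with replacement, take the 2.5th and
# 97.5th percentiles of each replicate, and average across replicates.
B = 2000
lows = np.empty(B)
highs = np.empty(B)
for b in range(B):
    resample = rng.choice(bun, size=bun.size, replace=True)
    lows[b], highs[b] = np.percentile(resample, [2.5, 97.5])

lower, upper = lows.mean(), highs.mean()
print(f"bootstrap reference interval: {lower:.2f}-{upper:.2f} mmol/L")
```

With real laboratory data one would, as in the study, compare this interval against the CLSI C28-A3 parametric and nonparametric estimates before adopting it.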
Intervalos de confiança bootstrap e subamostragem
Sebastião, João; Nunes, Sara
2003-01-01
This work considers resampling and subsampling methodologies. These computationally intensive methodologies are now widely used in statistical inference to compute confidence intervals for a parameter of interest. In a simulation study, the bootstrap and subsampling are applied to sets of observations drawn from normal populations in order to construct confidence intervals for the mean.
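A minimal Python sketch of the two methodologies compared above: bootstrap and subsampling confidence intervals for the mean of a normal sample. The sample, block size and replication counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=100)  # draw from a normal population
n = x.size

# Bootstrap: resample n points WITH replacement, B times, and take
# percentiles of the replicate means.
boot_means = np.array([rng.choice(x, size=n, replace=True).mean()
                       for _ in range(2000)])
boot_ci = np.percentile(boot_means, [2.5, 97.5])

# Subsampling: draw blocks of size b < n WITHOUT replacement, and rescale
# the subsample deviations by sqrt(b) to approximate sqrt(n)*(mean - mu).
b = 25
sub_means = np.array([rng.choice(x, size=b, replace=False).mean()
                      for _ in range(2000)])
dev = np.sqrt(b) * (sub_means - x.mean())
q_lo, q_hi = np.percentile(dev, [2.5, 97.5])
sub_ci = (x.mean() - q_hi / np.sqrt(n), x.mean() - q_lo / np.sqrt(n))

print("bootstrap CI:  ", boot_ci)
print("subsampling CI:", sub_ci)
```

For the sample mean both intervals should roughly agree; subsampling pays for its weaker assumptions with wider, noisier intervals, which is the kind of trade-off the simulation study examines.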
Two novel applications of bootstrap currents: snakes and jitter stabilization
Energy Technology Data Exchange (ETDEWEB)
Thyagaraja, A. [AEA Technology, Culham (United Kingdom); Haas, F.A. [The Open University, Oxford Research Unit, Oxford (United Kingdom)
1993-12-31
Both neoclassical theory and certain turbulence theories of particle transport in tokamaks predict the existence of bootstrap (i.e., pressure-driven) currents. Two new applications of this form of non-inductive current are considered in this work. The first is an explanation of the "snake" phenomenon observed in JET based on steady-state nonlinear tearing theory. The second is an active method of dynamic stabilization of the m=1 mode using the "jitter" approach suggested by Thyagaraja et al. in a recent paper. (author) 11 refs.
Comparing groups randomization and bootstrap methods using R
Zieffler, Andrew S; Long, Jeffrey D
2011-01-01
A hands-on guide to using R to carry out key statistical practices in educational and behavioral sciences research. Computing has become an essential part of the day-to-day practice of statistical work, broadening the types of questions that can now be addressed by research scientists applying newly derived data analytic techniques. Comparing Groups: Randomization and Bootstrap Methods Using R emphasizes the direct link between scientific research questions and data analysis. Rather than relying on mathematical calculations, this book focuses on conceptual explanations and
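As a flavor of the randomization methods this book covers, here is a sketch of a two-group randomization (permutation) test for a difference in means. The book works in R; this sketch uses Python for consistency with the other examples here, and the groups and effect size are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
# Two hypothetical groups; the true mean difference is 0.8.
a = rng.normal(0.0, 1.0, size=30)
b = rng.normal(0.8, 1.0, size=30)
observed = a.mean() - b.mean()

# Under the null hypothesis the group labels are exchangeable, so we
# repeatedly shuffle the pooled data and recompute the mean difference.
pooled = np.concatenate([a, b])
B = 5000
count = 0
for _ in range(B):
    perm = rng.permutation(pooled)
    diff = perm[:30].mean() - perm[30:].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / B
print(f"randomization p-value: {p_value:.4f}")
```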
Remuestreo Bootstrap y Jackknife en confiabilidad: Caso Exponencial y Weibull
Directory of Open Access Journals (Sweden)
Javier Ramírez-Montoya
2016-01-01
The Bootstrap-t and the delete-I and delete-II Jackknife resampling methods are compared using the nonparametric Kaplan-Meier and Nelson-Aalen estimators, which are frequently used in practice, taking into account different censoring percentages, sample sizes and times of interest. The comparison is carried out via simulation, using the mean squared error.
The S-matrix Bootstrap II: Two Dimensional Amplitudes
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We consider constraints on the S-matrix of any gapped, Lorentz invariant quantum field theory in 1 + 1 dimensions due to crossing symmetry and unitarity. In this way we establish rigorous bounds on the cubic couplings of a given theory with a fixed mass spectrum. In special cases we identify interesting integrable theories saturating these bounds. Our analytic bounds match precisely with numerical bounds obtained in a companion paper where we consider massive QFT in an AdS box and study boundary correlators using the technology of the conformal bootstrap.
Directory of Open Access Journals (Sweden)
Kristina Novak Zelenika
2014-07-01
Geostatistical methods were used very successfully in modelling the Upper Miocene (Lower Pontian) Kloštar structure. Mapping of two variables (porosity and thickness), and their joint examination at certain cut-off values, gave insight into the location of the depositional channel, the transitional lithofacies, the direction of material transport and the distribution of the variables within a representative Lower Pontian reservoir. It was possible to observe the direction of the turbidites and the role of the normal fault in the detritus flow direction in the analysed structure. The greatest thicknesses occurred where the turbiditic sandstones intercalate with basinal pelitic marls. Sequential Indicator Simulations used the porosity maps as the primary and the thickness maps as the secondary (additional) data source (the paper is published in Croatian).
Bootstrap embedding: An internally consistent fragment-based method
Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy
2016-08-01
Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments "embedded" in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed "Bootstrap Embedding," a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.
Bootstrap Percolation on Complex Networks with Community Structure
Chong, Wu; Rui, Zhang; Liujun, Chen; Jiawei, Chen; Xiaobin, Li; Yanqing, Hu
2014-01-01
Real complex networks usually involve community structure. How innovations and new products spread on social networks with internal structure is a practically interesting and fundamental question. In this paper we study bootstrap percolation on a single network with community structure, in which we initiate the bootstrap process by activating a different fraction of nodes in each community. A previously inactive node becomes active if it detects at least $k$ active neighbors. The fraction of active nodes in community $i$ in the final state, $S_i$, and its giant component size, $S_{gci}$, are obtained theoretically as functions of the initial fractions of active nodes, $f_i$. We show that such functions undergo multiple discontinuous transitions; the discontinuous jump of $S_i$ or $S_{gci}$ in one community may trigger a simultaneous jump of that in the other, which leads to multiple discontinuous transitions for the total fraction of active nodes $S$ and its associated giant component size $S_{gc}$...
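The activation rule described above (a previously inactive node becomes active once it has at least $k$ active neighbors) can be sketched for a single network without community structure; the graph and threshold below are toy assumptions, not the paper's model.

```python
from collections import deque

def bootstrap_percolation(adj, seeds, k):
    """Activate the seed nodes, then repeatedly activate any node that
    has at least k active neighbours; return the final active set."""
    active = set(seeds)
    queue = deque(adj)  # every node gets checked at least once
    while queue:
        v = queue.popleft()
        if v in active:
            continue
        if sum(u in active for u in adj[v]) >= k:
            active.add(v)
            queue.extend(adj[v])  # neighbours may now cross the threshold
    return active

# Toy example: a 6-node path graph 0-1-2-3-4-5 with threshold k = 1.
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j < 6] for i in range(6)}
print(sorted(bootstrap_percolation(adj, seeds={0}, k=1)))  # → [0, 1, 2, 3, 4, 5]
```

With k = 2 the same seed activates nothing further, which is the simplest illustration of how the final active fraction can jump discontinuously with the initial conditions.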
A Parsimonious Bootstrap Method to Model Natural Inflow Energy Series
Directory of Open Access Journals (Sweden)
Fernando Luiz Cyrino Oliveira
2014-01-01
The Brazilian energy generation and transmission system is quite peculiar in its dimension and characteristics, and as such can be considered unique in the world. It is a high-dimension hydrothermal system with a huge share of hydro plants. Such strong dependency on hydrological regimes implies uncertainties in energy planning, requiring adequate modeling of the hydrological time series. This is carried out via stochastic simulation of monthly inflow series using the family of Periodic Autoregressive models, PAR(p), one for each period (month) of the year. This paper shows the problems in fitting these models in the current system, particularly the identification of the autoregressive order "p" and the corresponding parameter estimation. It then proposes a new approach to setting both the model order and the parameter estimates of the PAR(p) models, using a nonparametric computational technique known as the bootstrap. This technique allows the estimation of reliable confidence intervals for the model parameters. The results obtained with the Parsimonious Bootstrap Method of Moments (PBMOM) produced not only more parsimonious model orders but also adherent stochastic scenarios and, in the long range, lead to better use of water resources in energy operation planning.
Bootstrapping Q Methodology to Improve the Understanding of Human Perspectives.
Zabala, Aiora; Pascual, Unai
2016-01-01
Q is a semi-qualitative methodology to identify typologies of perspectives. It is appropriate to address questions concerning diverse viewpoints, plurality of discourses, or participation processes across disciplines. Perspectives are interpreted based on rankings of a set of statements. These rankings are analysed using multivariate data reduction techniques in order to find similarities between respondents. Discussing the analytical process and looking for progress in Q methodology is becoming increasingly relevant. While its use is growing in social, health and environmental studies, the analytical process has received little attention in the last decades and it has not benefited from recent statistical and computational advances. Specifically, the standard procedure provides overall and arguably simplistic variability measures for perspectives and none of these measures are associated to individual statements, on which the interpretation is based. This paper presents an innovative approach of bootstrapping Q to obtain additional and more detailed measures of variability, which helps researchers understand better their data and the perspectives therein. This approach provides measures of variability that are specific to each statement and perspective, and additional measures that indicate the degree of certainty with which each respondent relates to each perspective. This supplementary information may add or subtract strength to particular arguments used to describe the perspectives. We illustrate and show the usefulness of this approach with an empirical example. The paper provides full details for other researchers to implement the bootstrap in Q studies with any data collection design.
Bootstrapping One-Loop QCD Amplitudes with General Helicities
Energy Technology Data Exchange (ETDEWEB)
Berger, Carola F.; Bern, Zvi; Dixon, Lance J.; Forde, Darren; Kosower, David A.
2006-04-25
The recently developed on-shell bootstrap for computing one-loop amplitudes in non-supersymmetric theories such as QCD combines the unitarity method with loop-level on-shell recursion. For generic helicity configurations, the recursion relations may involve undetermined contributions from non-standard complex singularities or from large values of the shift parameter. Here we develop a strategy for sidestepping difficulties through use of pairs of recursion relations. To illustrate the strategy, we present sets of recursion relations needed for obtaining n-gluon amplitudes in QCD. We give a recursive solution for the one-loop n-gluon QCD amplitudes with three or four color-adjacent gluons of negative helicity and the remaining ones of positive helicity. We provide an explicit analytic formula for the QCD amplitude A_{6;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+), as well as numerical results for A_{7;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+, 7^+), A_{8;1}(1^-, 2^-, 3^-, 4^+, 5^+, 6^+, 7^+, 8^+), and A_{8;1}(1^-, 2^-, 3^-, 4^-, 5^+, 6^+, 7^+, 8^+). We expect the on-shell bootstrap approach to have widespread applications to phenomenological studies at colliders.
Assessing Uncertainty in LULC Classification Accuracy by Using Bootstrap Resampling
Directory of Open Access Journals (Sweden)
Lin-Hsuan Hsiao
2016-08-01
Supervised land-use/land-cover (LULC) classifications are typically conducted using class assignment rules derived from a set of multiclass training samples. Consequently, classification accuracy varies with the training data set and is thus associated with uncertainty. In this study, we propose a bootstrap resampling and reclassification approach that can be applied for assessing not only the uncertainty in the classification results of the bootstrap-training data sets, but also the classification uncertainty of individual pixels in the study area. Two measures of pixel-specific classification uncertainty, namely the maximum class probability and the Shannon entropy, were derived from the class probability vector of individual pixels and used for the identification of unclassified pixels. Unclassified pixels identified using the traditional chi-square threshold technique represent outliers of individual LULC classes, but they are not necessarily associated with higher classification uncertainty. By contrast, unclassified pixels identified using the equal-likelihood technique are associated with higher classification uncertainty, and they mostly occur on or near the borders of different land-cover classes.
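The two pixel-specific uncertainty measures named above, maximum class probability and Shannon entropy, can be computed directly from a pixel's class probability vector. A minimal sketch (the probability vectors below are hypothetical, not from the study):

```python
import numpy as np

def uncertainty_measures(p):
    """Return the two pixel-specific measures for one class-probability
    vector: the maximum class probability and the Shannon entropy (bits)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalise, just in case
    nz = p[p > 0]                        # 0 * log(0) is taken as 0
    return float(p.max()), float(-(nz * np.log2(nz)).sum())

# A confidently classified pixel vs. an ambiguous one:
print(uncertainty_measures([0.90, 0.05, 0.05]))  # high max-prob, low entropy
print(uncertainty_measures([0.40, 0.35, 0.25]))  # low max-prob, high entropy
```

A low maximum class probability or a high entropy flags the pixel as a candidate "unclassified" pixel under the equal-likelihood style of thresholding described above.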
Nose, Holliness; Chen, Yu; Rodgers, M T
2013-05-23
The third sequential binding energies of the late first-row divalent transition metal cations to 1,10-phenanthroline (Phen) are determined by energy-resolved collision-induced dissociation (CID) techniques using a guided ion beam tandem mass spectrometer. Five late first-row transition metal cations in their +2 oxidation states are examined including: Fe(2+), Co(2+), Ni(2+), Cu(2+), and Zn(2+). The kinetic energy dependent CID cross sections for loss of an intact Phen ligand from the M(2+)(Phen)3 complexes are modeled to obtain 0 and 298 K bond dissociation energies (BDEs) after accounting for the effects of the internal energy of the complexes, multiple ion-neutral collisions, and unimolecular decay rates. Electronic structure theory calculations at the B3LYP, BHandHLYP, and M06 levels of theory are employed to determine the structures and theoretical estimates for the first, second, and third sequential BDEs of the M(2+)(Phen)x complexes. B3LYP was found to deliver results that are most consistent with the measured values. Periodic trends in the binding of these complexes are examined and compared to the analogous complexes to the late first-row monovalent transition metal cations, Co(+), Ni(+), Cu(+), and Zn(+), previously investigated.
Sequential stochastic optimization
Cairoli, Renzo
1996-01-01
Sequential Stochastic Optimization provides mathematicians and applied researchers with a well-developed framework in which stochastic optimization problems can be formulated and solved. Offering much material that is either new or has never before appeared in book form, it lucidly presents a unified theory of optimal stopping and optimal sequential control of stochastic processes. This book has been carefully organized so that little prior knowledge of the subject is assumed; its only prerequisites are a standard graduate course in probability theory and some familiarity with discrete-parameter
Sequential memory: Binding dynamics
Afraimovich, Valentin; Gong, Xue; Rabinovich, Mikhail
2015-10-01
Temporal order memories are critical for everyday animal and human functioning. Experiments and our own experience show that the binding or association of the various features of an event, and the maintaining of multimodality events in sequential order, are the key components of any sequential memory: episodic, semantic, working, etc. We study the robustness of binding sequential dynamics based on our previously introduced model in the form of generalized Lotka-Volterra equations. In the phase space of the model, there exists a multi-dimensional binding heteroclinic network consisting of saddle equilibrium points and heteroclinic trajectories joining them. We prove here the robustness of the binding sequential dynamics, i.e., the feasibility phenomenon for coupled heteroclinic networks: for each collection of successive heteroclinic trajectories inside the unified networks, there is an open set of initial points such that the trajectory going through each of them follows the prescribed collection, staying in a small neighborhood of it. We show also that the symbolic complexity function of the system restricted to this neighborhood is a polynomial of degree L - 1, where L is the number of modalities.
Forced Sequence Sequential Decoding
DEFF Research Database (Denmark)
Jensen, Ole Riis; Paaske, Erik
1998-01-01
the iteration process provides the sequential decoders with side information that allows a smaller average load and minimizes the probability of computational overflow. Analytical results for the probability that the first RS word is decoded after C computations are presented. These results are supported...
Renyi Entropies, the Analytic Bootstrap, and 3D Quantum Gravity at Higher Genus
Headrick, Matthew; Perlmutter, Eric; Zadeh, Ida G
2015-01-01
We compute the contribution of the vacuum Virasoro representation to the genus-two partition function of an arbitrary CFT with central charge $c>1$. This is the perturbative pure gravity partition function in three dimensions. We employ a sewing construction, in which the partition function is expressed as a sum of sphere four-point functions of Virasoro vacuum descendants. For this purpose, we develop techniques to efficiently compute correlation functions of holomorphic operators, which by crossing symmetry are determined exactly by a finite number of OPE coefficients; this is an analytic implementation of the conformal bootstrap. Expanding the results in $1/c$, corresponding to the semiclassical bulk gravity expansion, we find that, unlike at genus one, the result does not truncate at finite loop order. Our results also allow us to extend earlier work on multiple-interval Renyi entropies and on the partition function in the separating degeneration limit.
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Timmerman, Marieke E.; Kiers, Henk A.L.; Smilde, Age K.
2007-01-01
Confidence intervals (CIs) in principal component analysis (PCA) can be based on asymptotic standard errors and on the bootstrap methodology. The present paper offers an overview of possible strategies for bootstrapping in PCA. A motivating example shows that CI estimates for the component loadings
Institute of Scientific and Technical Information of China (English)
2000-01-01
In this paper, the author studies the asymptotic accuracy of the one-term Edgeworth expansions and the bootstrap approximation for the studentized MLE from a randomly censored exponential population. It is shown that the Edgeworth expansions and the bootstrap approximation are asymptotically close to the exact distribution of the studentized MLE, with a certain rate.
Spanning Trees and bootstrap reliability estimation in correlation based networks
Tumminello, M; Lillo, F; Micciché, S; Mantegna, R N
2006-01-01
We introduce a new technique to associate a spanning tree to the average linkage cluster analysis. We term this tree as the Average Linkage Minimum Spanning Tree. We also introduce a technique to associate a value of reliability to links of correlation based graphs by using bootstrap replicas of data. Both techniques are applied to the portfolio of the 300 most capitalized stocks traded at New York Stock Exchange during the time period 2001-2003. We show that the Average Linkage Minimum Spanning Tree recognizes economic sectors and sub-sectors as communities in the network slightly better than the Minimum Spanning Tree does. We also show that the average reliability of links in the Minimum Spanning Tree is slightly greater than the average reliability of links in the Average Linkage Minimum Spanning Tree.
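A sketch of the bootstrap link-reliability idea described above: build a Minimum Spanning Tree from a correlation matrix (here via Kruskal's algorithm on the distances d_ij = sqrt(2(1 - rho_ij)), a standard choice for correlation-based networks), then count how often each link of the original tree reappears in trees built from bootstrap replicas of the data. The data below are synthetic, not the NYSE portfolio analysed in the paper, and only the plain MST is shown, not the Average Linkage variant.

```python
import numpy as np

def mst_edges(corr):
    """Kruskal's algorithm on the complete graph with distances
    d_ij = sqrt(2 * (1 - corr_ij)); returns the MST edge set."""
    n = corr.shape[0]
    edges = sorted((np.sqrt(2 * (1 - corr[i, j])), i, j)
                   for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = set()
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.add((i, j))
    return frozenset(tree)

rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 5))          # synthetic "return" series
base = mst_edges(np.corrcoef(returns.T))

# Bootstrap reliability: fraction of replica MSTs containing each base link.
B = 200
counts = {e: 0 for e in base}
for _ in range(B):
    idx = rng.integers(0, returns.shape[0], size=returns.shape[0])
    rep = mst_edges(np.corrcoef(returns[idx].T))
    for e in base:
        counts[e] += e in rep

reliability = {e: counts[e] / B for e in base}
print(reliability)
```

Links that survive in most replicas are the ones the paper would regard as reliable; on unstructured noise, as here, the reliabilities are typically low.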
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs.
González-Tablas, Ana I; Tapiador, Juan E
2016-05-11
We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN) composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user's security preferences implicitly captured by policies already in place.
A bootstrap method for estimating uncertainty of water quality trends
Hirsch, Robert M.; Archfield, Stacey A.; DeCicco, Laura
2015-01-01
Estimation of the direction and magnitude of trends in surface water quality remains a problem of great scientific and practical interest. The Weighted Regressions on Time, Discharge, and Season (WRTDS) method was recently introduced as an exploratory data analysis tool to provide flexible and robust estimates of water quality trends. This paper enhances the WRTDS method through the introduction of the WRTDS Bootstrap Test (WBT), an extension of WRTDS that quantifies the uncertainty in WRTDS-estimates of water quality trends and offers various ways to visualize and communicate these uncertainties. Monte Carlo experiments are applied to estimate the Type I error probabilities for this method. WBT is compared to other water-quality trend-testing methods appropriate for data sets of one to three decades in length with sampling frequencies of 6–24 observations per year. The software to conduct the test is in the EGRETci R-package.
Bootstrapping Security Policies for Wearable Apps Using Attributed Structural Graphs
Directory of Open Access Journals (Sweden)
Ana I. González-Tablas
2016-05-01
We address the problem of bootstrapping security and privacy policies for newly-deployed apps in wireless body area networks (WBAN) composed of smartphones, sensors and other wearable devices. We introduce a framework to model such a WBAN as an undirected graph whose vertices correspond to devices, apps and app resources, while edges model structural relationships among them. This graph is then augmented with attributes capturing the features of each entity together with user-defined tags. We then adapt available graph-based similarity metrics to find the closest app to a new one to be deployed, with the aim of reusing, and possibly adapting, its security policy. We illustrate our approach through a detailed smartphone ecosystem case study. Our results suggest that the scheme can provide users with a reasonably good policy that is consistent with the user's security preferences implicitly captured by policies already in place.
Bootstrapping Inductive and Coinductive Types in HasCASL
Schröder, Lutz
2008-01-01
We discuss the treatment of initial datatypes and final process types in the wide-spectrum language HasCASL. In particular, we present specifications that illustrate how datatypes and process types arise as bootstrapped concepts using HasCASL's type class mechanism, and we describe constructions of types of finite and infinite trees that establish the conservativity of datatype and process type declarations adhering to certain reasonable formats. The latter amounts to modifying known constructions from HOL to avoid unique choice; in categorical terminology, this means that we establish that quasitoposes with an internal natural numbers object support initial algebras and final coalgebras for a range of polynomial functors, thereby partially generalising corresponding results from topos theory. Moreover, we present similar constructions in categories of internal complete partial orders in quasitoposes.
Bootstrapping Mixed Correlators in the 3D Ising Model
Kos, Filip; Simmons-Duffin, David
2014-01-01
We study the conformal bootstrap for systems of correlators involving non-identical operators. The constraints of crossing symmetry and unitarity for such mixed correlators can be phrased in the language of semidefinite programming. We apply this formalism to the simplest system of mixed correlators in 3D CFTs with a $\\mathbb{Z}_2$ global symmetry. For the leading $\\mathbb{Z}_2$-odd operator $\\sigma$ and $\\mathbb{Z}_2$-even operator $\\epsilon$, we obtain numerical constraints on the allowed dimensions $(\\Delta_\\sigma, \\Delta_\\epsilon)$ assuming that $\\sigma$ and $\\epsilon$ are the only relevant scalars in the theory. These constraints yield a small closed region in $(\\Delta_\\sigma, \\Delta_\\epsilon)$ space compatible with the known values in the 3D Ising CFT.
Bootstrapping Pure Quantum Gravity in AdS3
Bae, Jin-Beom; Lee, Sungjay
2016-01-01
The three-dimensional pure quantum gravity with negative cosmological constant is supposed to be dual to the extremal conformal field theory of central charge $c=24k$ in two dimensions. We employ the conformal bootstrap method to analyze the extremal CFTs, and find numerical evidence for the non-existence of the extremal CFTs for sufficiently large central charge ($k \ge 20$). We also explore near-extremal CFTs, a small modification of extremal ones, and find similar evidence for their non-existence at large central charge. This indicates that, under the assumption of holomorphic factorization, pure gravity in weakly curved AdS$_3$ does not exist as a consistent quantum theory.
The Inverse Bagging Algorithm: Anomaly Detection by Inverse Bootstrap Aggregating
Vischia, Pietro
2016-01-01
For data sets populated by a very well modeled process and by another process of unknown probability density function (PDF), a desired feature when manipulating the fraction of the unknown process (either to enhance or to suppress it) is to avoid modifying the kinematic distributions of the well modeled one. A bootstrap technique is used to identify sub-samples rich in the well modeled process, and to classify each event according to the frequency with which it appears in such sub-samples. Comparisons with general MVA algorithms are shown, as well as a study of the asymptotic properties of the method, making use of a public domain data set that models a typical search for new physics as performed at hadronic colliders such as the Large Hadron Collider (LHC).
Fuichtjohann, L; Jakubowski, N; Gladtke, D; Klocko, D; Broekaert, J A
2001-12-01
A four-stage sequential extraction procedure for the speciation of nickel has been applied to ambient aerosol samples. The determination of the soluble, sulfidic, metallic and oxidic Ni fractions in particulate matter was carried out by graphite furnace (electrothermal) atomic absorption spectrometry (ETAAS) and inductively coupled plasma mass spectrometry (ICP-MS). An EDTA solution, a mixture of diammonium citrate and hydrogen peroxide, and a KCuCl3 solution were used as leaching agents for the determination of the soluble, sulfidic and metallic species, respectively, and nitric acid was used for the determination of oxidic compounds after microwave digestion of particulate matter sampled on filters. A new micro-scale filter holder placed in a closed flow injection analysis (FIA) system for use in nickel speciation by sequential extraction, and the results of the optimisation of the extraction conditions, are described. The temperature program for ETAAS was optimised for all extraction solutions with the aid of temperature curves. Pyrolysis temperatures of 900, 600 and 1,000 degrees C were found to be optimum for the EDTA, hydrogen peroxide plus ammonium citrate, and KCuCl3-containing solutions, respectively. Airborne dust was sampled on filters at two locations near a metallurgical plant in Dortmund, Germany. Concentrations in the low ng m^-3 range, down to the detection limits (0.1-0.3 ng m^-3), and various nickel species were found in the collected dust. The mean fractions of total nickel (sampling period of one month) were found to contain 36 ± 20% soluble, 6 ± 4% sulfidic, 11 ± 15% metallic and 48 ± 18% oxidic nickel.
Sequential measurements of conjugate observables
Energy Technology Data Exchange (ETDEWEB)
Carmeli, Claudio [Dipartimento di Fisica, Universita di Genova, Via Dodecaneso 33, 16146 Genova (Italy)]; Heinosaari, Teiko [Department of Physics and Astronomy, Turku Centre for Quantum Physics, University of Turku, 20014 Turku (Finland)]; Toigo, Alessandro, E-mail: claudio.carmeli@gmail.com, E-mail: teiko.heinosaari@utu.fi, E-mail: alessandro.toigo@polimi.it [Dipartimento di Matematica 'Francesco Brioschi', Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)]
2011-07-15
We present a unified treatment of sequential measurements of two conjugate observables. Our approach is to derive a mathematical structure theorem for all the relevant covariant instruments. As a consequence of this result, we show that every Weyl-Heisenberg covariant observable can be implemented as a sequential measurement of two conjugate observables. This method is applicable both in finite- and infinite-dimensional Hilbert spaces, therefore covering sequential spin component measurements as well as position-momentum sequential measurements.
Modelling sequentially scored item responses
Akkermans, W.
2000-01-01
The sequential model can be used to describe the variable resulting from a sequential scoring process. In this paper two more item response models are investigated with respect to their suitability for sequential scoring: the partial credit model and the graded response model. The investigation is c
Sequential cloning of chromosomes
Lacks, S.A.
1995-07-18
A method for sequential cloning of chromosomal DNA of a target organism is disclosed. A first DNA segment homologous to the chromosomal DNA to be sequentially cloned is isolated. The first segment has a first restriction enzyme site on either side. A first vector product is formed by ligating the homologous segment into a suitably designed vector. The first vector product is circularly integrated into the target organism's chromosomal DNA. The resulting integrated chromosomal DNA segment includes the homologous DNA segment at either end of the integrated vector segment. The integrated chromosomal DNA is cleaved with a second restriction enzyme and ligated to form a vector-containing plasmid, which is replicated in a host organism. The replicated plasmid is then cleaved with the first restriction enzyme. Next, a DNA segment containing the vector and a segment of DNA homologous to a distal portion of the previously isolated DNA segment is isolated. This segment is then ligated to form a plasmid which is replicated within a suitable host. This plasmid is then circularly integrated into the target chromosomal DNA. The chromosomal DNA containing the circularly integrated vector is treated with a third, retrorestriction (class IIS) enzyme. The cleaved DNA is ligated to give a plasmid that is used to transform a host permissive for replication of its vector. The sequential cloning process continues by repeated cycles of circular integration and excision. The excision is carried out alternately with the second and third enzymes. 9 figs.
Fernandez, C; Conceição, Antonio C L; Rial-Otero, R; Vaz, C; Capelo, J L
2006-04-15
A new concept is presented for green analytical applications based on coupling on-line high-intensity focused ultrasound (HIFU) with a sequential injection/flow injection analysis (SIA/FIA) system. The potential of the SIA/HIFU/FIA scheme is demonstrated by taking mercury as a model analyte. Using inorganic mercury, methylmercury, phenylmercury, and diphenylmercury, the usefulness of the proposed methodology for the determination of inorganic and total mercury in waters and urine was demonstrated. The procedure requires low sample volumes and reagents and can be further applied to all chemical reactions involving HIFU. The inherent advantages of SIA, FIA, and HIFU applications in terms of high throughput, automation, low reagent consumption, and green chemistry are accomplished together for the first time in the present work.
Sequential Analysis in High Dimensional Multiple Testing and Sparse Recovery
Malloy, Matt
2011-01-01
This paper studies the problem of high-dimensional multiple testing and sparse recovery from the perspective of sequential analysis. In this setting, the probability of error is a function of the dimension of the problem. A simple sequential testing procedure for this problem is proposed. We derive necessary conditions for reliable recovery in the non-sequential setting and contrast them with sufficient conditions for reliable recovery using the proposed sequential testing procedure. Applications of the main results to several commonly encountered models show that sequential testing can be exponentially more sensitive to the difference between the null and alternative distributions (in terms of the dependence on dimension), implying that subtle cases can be much more reliably determined using sequential methods.
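The sensitivity gain from sequential testing described above can be illustrated with a toy simulation. This is a hedged sketch, not the authors' procedure: the multi-pass thresholding rule, signal amplitude, dimension and noise model below are all illustrative assumptions. Each pass re-measures only the surviving coordinates with fresh noise and discards those that look null, so null coordinates are roughly halved per pass while strong signals survive:

```python
import numpy as np

rng = np.random.default_rng(0)

def sequential_thresholding(mu, n_passes=4, noise_sd=1.0):
    # At each pass, take a fresh noisy measurement of the surviving
    # coordinates and discard those that fall below zero: null coordinates
    # are (on average) halved each pass, while strong signals survive.
    alive = np.arange(len(mu))
    for _ in range(n_passes):
        y = mu[alive] + noise_sd * rng.standard_normal(alive.size)
        alive = alive[y > 0.0]
    return alive

n, k, amp = 10_000, 20, 3.0
mu = np.zeros(n)
mu[:k] = amp                      # k true (positive-mean) components
found = sequential_thresholding(mu)
true_pos = int(np.sum(found < k))
false_pos = int(found.size - true_pos)
print(true_pos, false_pos)
```

With these settings the true signals survive essentially every pass, while the roughly 10,000 null coordinates are whittled down geometrically, which is the qualitative advantage the abstract attributes to sequential methods.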
Energy Technology Data Exchange (ETDEWEB)
Reis Junior, Aluisio S.; Temba, Eliane S.C.; Kastner, Geraldo F.; Monteiro, Roberto P.G., E-mail: reisas@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (SERTA/CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico do Reator e Tecnicas Analiticas
2015-07-01
Alpha spectrometry analyses were used for activity determinations of Pu and Am isotopes in evaporator concentrate samples from nuclear power plants. In this work, tracers were used for isotope determination and quantification. The radiometric yields ranged from 60% to 100% and the Lower Limit of Detection was estimated as 2.05 mBq kg{sup -1}. The relative standard deviations for replicate analyses were calculated for {sup 241}Am, 15.13% (sample J) and 9.38% (sample N), and for {sup 239+240}Pu, 8.16% (sample J) and 7.95% (sample N). (author)
DEFF Research Database (Denmark)
Hounyo, Ulrich
We propose a bootstrap method for estimating the distribution (and functionals of it, such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method suggested for the pre-averaged realized volatility estimator......-studentized statistics, our results justify using the bootstrap to estimate the covariance matrix of a broad class of covolatility estimators. The bootstrap variance estimator is positive semi-definite by construction, an appealing feature that is not always shared by existing variance estimators of the integrated...
JuliBootS: a hands-on guide to the conformal bootstrap
Paulos, Miguel F
2014-01-01
We introduce {\\tt JuliBootS}, a package for numerical conformal bootstrap computations coded in {\\tt Julia}. The centre-piece of {\\tt JuliBootS} is an implementation of Dantzig's simplex method capable of handling arbitrary precision linear programming problems with continuous search spaces. Current supported features include conformal dimension bounds, OPE bounds, and bootstrap with or without global symmetries. The code is trivially parallelizable on one or multiple machines. We exemplify usage extensively with several real-world applications. In passing we give a pedagogical introduction to the numerical bootstrap methods.
Forced Sequence Sequential Decoding
DEFF Research Database (Denmark)
Jensen, Ole Riis
In this thesis we describe a new concatenated decoding scheme based on iterations between an inner sequentially decoded convolutional code of rate R=1/4 and memory M=23, and block interleaved outer Reed-Solomon codes with non-uniform profile. With this scheme decoding with good performance is pos...... of computational overflow. Analytical results for the probability that the first Reed-Solomon word is decoded after C computations are presented. This is supported by simulation results that are also extended to other parameters....
DEFF Research Database (Denmark)
Hansen, Elo Harald
2005-01-01
Despite their excellent analytical chemical capacities for determination of low levels of metal species, electrothermal atomic absorption spectrometry and inductively coupled plasma mass spectrometry often require suitable pretreatment (separation and preconcentration) of the sample material....../preconcentration procedures are feasible, such as liquid-liquid extraction (possibly including back extraction), (co)precipitation with collection in knotted coils, adsorption on hydrophilic or hydrophobic reactors, hydride generation, or the use of ionexchange/chelating packed columns. After describing the particulars...... and characteristics of FI and SI, we present, via selected examples, various separation/preconcentration FI/Sl schemes for the determination of trace levels of metals, with particular emphasis on the use of the novel extension of FI/SI, that is, the so-called lab-on-valve concept....
Energy Technology Data Exchange (ETDEWEB)
Santos, Inês C. [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal); Mesquita, Raquel B.R. [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal); Laboratório de Hidrobiologia, Instituto de Ciências Biomédicas Abel Salazar (ICBAS), Universidade do Porto, Rua de Jorge Viterbo Ferreira no. 228, 4050-313 Porto (Portugal); Rangel, António O.S.S., E-mail: arangel@porto.ucp.pt [CBQF–Centro de Biotecnologia e Química Fina – Laboratório Associado, Escola Superior de Biotecnologia, Universidade Católica Portuguesa/Porto, Rua Arquiteto Lobão Vital, Apartado 2511, 4202-401 Porto (Portugal)
2015-09-03
This work describes the development of a solid phase spectrophotometry method in a μSI-LOV system for cadmium, zinc, and copper determination in freshwaters. NTA (Nitrilotriacetic acid) beads with 60–160 μm diameter were packed in the flow cell of the LOV for a μSPE column of 1 cm length. The spectrophotometric determination is based on the colourimetric reaction between dithizone and the target metals, previously retained on NTA resin. The absorbance of the coloured product formed is measured, at 550 nm, on the surface of the NTA resin beads in a solid phase spectrophotometry approach. The developed method presented preconcentration factors in the range of 11–21 for the metal ions. A LOD of 0.23 μg L{sup −1} for cadmium, 2.39 μg L{sup −1} for zinc, and 0.11 μg L{sup −1} for copper and a sampling rate of 12, 13, and 15 h{sup −1} for cadmium, zinc, and copper were obtained, respectively. The proposed method was successfully applied to freshwater samples. - Highlights: • Multi-parametric determination of cadmium, zinc, and copper at the μg L{sup −1} level. • In-line metal ions preconcentration using NTA resin. • Minimization of matrix interferences by performing solid phase spectrometry in a SI-LOV platform. • Successful application to metal ions determination in freshwaters.
Agrawal, Y K; Sharma, K R
2005-07-15
A new functionalized calix[6]crown hydroxamic acid is reported for the speciation, liquid-liquid extraction, sequential separation and trace determination of Cr(III), Mo(VI) and W(VI). Chromium(III), molybdenum(VI) and tungsten(VI) are extracted at pH 4.5, 1.5 M HCl and 6.0 M HCl, respectively, with the calixcrown hydroxamic acid (37,38,39,40,41,42-hexahydroxy-7,25,31-calix[6]crown hydroxamic acid) in chloroform in the presence of a large number of cations and anions. The extraction mechanism is investigated. The various extraction parameters, appropriate pH/M HCl, choice of solvent, effect of the reagent concentration, temperature and distribution constant, have been studied. The speciation, preconcentration and kinetics of transport have been investigated. Maximum transport is observed at 35, 45 and 30 min for chromium(III), molybdenum(VI) and tungsten(VI), respectively. For trace determination, the extracts were directly inserted into the plasma for inductively coupled plasma atomic emission spectrometry (ICP-AES) measurements of chromium, molybdenum and tungsten, which increases the sensitivity 30-fold, with detection limits of 3 ng ml(-1). The method is applied to the determination of chromium, molybdenum and tungsten in high-purity-grade ores and in biological and environmental samples. Chromium was also recovered from the effluent of electroplating industries.
Language bootstrapping: learning word meanings from perception-action association.
Salvi, Giampiero; Montesano, Luis; Bernardino, Alexandre; Santos-Victor, José
2012-06-01
We address the problem of bootstrapping language acquisition for an artificial system, similarly to what is observed in experiments with human infants. Our method works by associating meanings to words in manipulation tasks, as a robot interacts with objects and listens to verbal descriptions of the interactions. The model is based on an affordance network, i.e., a mapping between robot actions, robot perceptions, and the perceived effects of these actions upon objects. We extend the affordance model to incorporate spoken words, which allows us to ground the verbal symbols to the execution of actions and the perception of the environment. The model takes verbal descriptions of a task as the input and uses temporal co-occurrence to create links between speech utterances and the involved objects, actions, and effects. We show that the robot is able to form useful word-to-meaning associations, even without considering grammatical structure in the learning process and in the presence of recognition errors. These word-to-meaning associations are embedded in the robot's own understanding of its actions. Thus, they can be directly used to instruct the robot to perform tasks, and they also allow context to be incorporated into the speech recognition task. We believe that the encouraging results with our approach may afford robots the capacity to acquire language descriptors in their operating environment, as well as shed some light on how this challenging process develops in human infants.
A Mellin space approach to the conformal bootstrap
Gopakumar, Rajesh; Sen, Kallol; Sinha, Aninda
2016-01-01
We describe in more detail our approach to the conformal bootstrap which uses the Mellin representation of $CFT_d$ four point functions and expands them in terms of crossing symmetric combinations of $AdS_{d+1}$ Witten exchange functions. We consider arbitrary external scalar operators and set up the conditions for consistency with the operator product expansion. Namely, we demand cancellation of spurious powers (of the cross ratios, in position space) which translate into spurious poles in Mellin space. We discuss two contexts in which we can immediately apply this method by imposing the simplest set of constraint equations. The first is the epsilon expansion. We mostly focus on the Wilson-Fisher fixed point as studied in an epsilon expansion about $d=4$. We reproduce Feynman diagram results for operator dimensions to $O(\\epsilon^3)$ rather straightforwardly. This approach also yields new analytic predictions for OPE coefficients to the same order which fit nicely with recent numerical estimates for the Isin...
Bootstrap equations for $\\mathcal{N}=4$ SYM with defects
Liendo, Pedro
2016-01-01
This paper focuses on the analysis of $4d$ $\mathcal{N}=4$ superconformal theories in the presence of a defect from the point of view of the conformal bootstrap. We will concentrate first on the case of codimension one, where the defect is a boundary that preserves half of the supersymmetry. After studying the constraints imposed by supersymmetry, we will write the Ward identities associated to two-point functions of $\tfrac{1}{2}$-BPS operators and write their solution as a superconformal block expansion. Due to a surprising connection between spacetime and R-symmetry conformal blocks, our results not only apply to $4d$ $\mathcal{N}=4$ superconformal theories with a boundary, but also to three more systems that have the same symmetry algebra: $4d$ $\mathcal{N}=4$ superconformal theories with a line defect, $3d$ $\mathcal{N}=4$ superconformal theories with no defect, and $OSP(4^*|4)$ superconformal quantum mechanics. The superconformal algebra implies that all these systems possess a closed subsector of operators in which the bootst...
Bootstrapping Multi-Parton Loop Amplitudes in QCD
Energy Technology Data Exchange (ETDEWEB)
Bern, Zvi; /UCLA; Dixon, Lance J.; /SLAC; Kosower, David A.; /Saclay, SPhT
2005-07-06
We present a new method for computing complete one-loop amplitudes, including their rational parts, in non-supersymmetric gauge theory. This method merges the unitarity method with on-shell recursion relations. It systematizes a unitarity-factorization bootstrap approach that we previously applied to the one-loop amplitudes required for next-to-leading order QCD corrections to the processes e{sup +}e{sup -} {yields} Z, {gamma}* {yields} 4 jets and pp {yields} W + 2 jets. We illustrate the method by reproducing the one-loop color-ordered five-gluon helicity amplitudes in QCD that interfere with the tree amplitude, namely A{sub 5;1}(1{sup -}, 2{sup -}, 3{sup +}, 4{sup +}, 5{sup +}) and A{sub 5;1}(1{sup -}, 2{sup +}, 3{sup -}, 4{sup +}, 5{sup +}). Then we describe the construction of the six- and seven-gluon amplitudes with two adjacent negative-helicity gluons, A{sub 6;1}(1{sup -}, 2{sup -}, 3{sup +}, 4{sup +}, 5{sup +}, 6{sup +}) and A{sub 7;1}(1{sup -}, 2{sup -}, 3{sup +}, 4{sup +}, 5{sup +}, 6{sup +}, 7{sup +}), which uses the previously computed logarithmic parts of the amplitudes as input. We present a compact expression for the six-gluon amplitude. No loop integrals are required to obtain the rational parts.
DEFF Research Database (Denmark)
Hansen, Elo Harald
, such as liquid-liquid extraction, (co)precipitation with collection in knotted reactors, adsorption, hydride generation, or the use of ion-exchange columns. Apart from hydride generation, where the analyte is converted into a gaseous species, the common denominator for these approaches is that the analyte...... solvent and secondly is back-extracted into an aqueous phase before being presented to the plasma. Selected examples of separation/preconcentration FI/SI-procedures will be presented, emphasis being placed on the determination of trace levels of metals in elevated salt-containing matrices. Such samples...
Santos, Inês C; Mesquita, Raquel B R; Rangel, António O S S
2015-09-03
This work describes the development of a solid phase spectrophotometry method in a μSI-LOV system for cadmium, zinc, and copper determination in freshwaters. NTA (Nitrilotriacetic acid) beads with 60-160 μm diameter were packed in the flow cell of the LOV for a μSPE column of 1 cm length. The spectrophotometric determination is based on the colourimetric reaction between dithizone and the target metals, previously retained on NTA resin. The absorbance of the coloured product formed is measured, at 550 nm, on the surface of the NTA resin beads in a solid phase spectrophotometry approach. The developed method presented preconcentration factors in the range of 11-21 for the metal ions. A LOD of 0.23 μg L(-1) for cadmium, 2.39 μg L(-1) for zinc, and 0.11 μg L(-1) for copper and a sampling rate of 12, 13, and 15 h(-1) for cadmium, zinc, and copper were obtained, respectively. The proposed method was successfully applied to freshwater samples.
High Bootstrap Current Fraction during the Synergy of LHCD and IBW on the HT-7 Tokamak
Institute of Scientific and Technical Information of China (English)
ZHANG Xian-Mei; WAN Bao-Nian; WU Zhen-Wei; HT-7 Team
2005-01-01
More than 70% of the total plasma current is sustained by the bootstrap current and current drive during the synergy of lower hybrid current drive (LHCD) and ion Bernstein wave (IBW) heating on the HT-7 tokamak. The lower hybrid non-inductive current source is off-axis and well localized, and a bootstrap current fraction of more than 35% has been obtained. IBW control of the electron pressure profile can be integrated into the LHCD target plasma. The largest steep gradient of the electron pressure profile, in the region ρ ~ 0.5-0.7, mostly comes from the electron temperature profile, which may induce the large bootstrap current fraction. The large off-axis bootstrap current can help to create negative magnetic shear, and good plasma confinement is achieved.
Variable selection under multiple imputation using the bootstrap in a prognostic study
Heymans, M.W.; Buuren, S. van; Knol, D.L.; Mechelen, W. van; Vet, H.C.W. de
2007-01-01
Background. Missing data is a challenging problem in many prognostic studies. Multiple imputation (MI) accounts for imputation uncertainty and thereby allows for adequate statistical testing. We developed and tested a methodology combining MI with bootstrapping techniques for studying prognostic variable selection.
BOOTSTRAP TECHNIQUE FOR ROC ANALYSIS: A STABLE EVALUATION OF FISHER CLASSIFIER PERFORMANCE
Institute of Scientific and Technical Information of China (English)
Xie Jigang; Liu Zhengding
2007-01-01
This paper presents a novel bootstrap-based method for Receiver Operating Characteristic (ROC) analysis of the Fisher classifier. By defining the Fisher classifier's output as a statistic, the bootstrap technique is used to obtain the sampling distributions of the outputs for the positive class and the negative class, respectively. As a result, the ROC curve is a plot of all the (False Positive Rate (FPR), True Positive Rate (TPR)) pairs obtained by varying the decision threshold over the whole range of the bootstrap sampling distributions. The advantage of this method is that the bootstrap-based ROC curves are much more stable than those of holdout or cross-validation, indicating a more stable ROC analysis of the Fisher classifier. Experiments on five publicly available data sets demonstrate the effectiveness of the proposed method.
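The scheme just described can be sketched end to end: fit a Fisher discriminant, treat its projected output as a statistic, pool bootstrap resamples of the outputs per class into sampling distributions, then sweep the decision threshold to trace the ROC. The synthetic two-class data, dimensions and resample counts below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class Gaussian data (means and dimensions are illustrative).
n = 200
X_pos = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(n, 2))
X_neg = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))

# Fisher discriminant direction: w proportional to Sw^{-1} (m1 - m0).
m1, m0 = X_pos.mean(axis=0), X_neg.mean(axis=0)
Sw = np.cov(X_pos, rowvar=False) + np.cov(X_neg, rowvar=False)
w = np.linalg.solve(Sw, m1 - m0)

def bootstrap_scores(X, B=200):
    # Treat the classifier output s = X @ w as a statistic and pool B
    # bootstrap resamples of it into one sampling distribution per class.
    s = X @ w
    idx = rng.integers(0, len(s), size=(B, len(s)))
    return s[idx].ravel()

s_pos, s_neg = bootstrap_scores(X_pos), bootstrap_scores(X_neg)

# Sweep the decision threshold over both distributions to trace the ROC.
thresholds = np.linspace(min(s_neg.min(), s_pos.min()),
                         max(s_neg.max(), s_pos.max()), 201)
roc = [((s_neg > t).mean(), (s_pos > t).mean()) for t in thresholds]

# AUC via the rank statistic P(score_neg < score_pos).
auc = np.searchsorted(np.sort(s_neg), s_pos).mean() / len(s_neg)
print(round(float(auc), 3))
```

Because every (FPR, TPR) pair is computed over the pooled bootstrap distributions rather than a single holdout split, repeated runs of the procedure vary much less, which is the stability property the abstract claims.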
Darling, Stephen; Parker, Mary-Jane; Goodall, Karen E; Havelka, Jelena; Allen, Richard J
2014-03-01
When participants carry out visually presented digit serial recall, their performance is better if they are given the opportunity to encode extra visuospatial information at encoding, a phenomenon that has been termed visuospatial bootstrapping. This bootstrapping is the result of integration of information from different modality-specific short-term memory systems and visuospatial knowledge in long-term memory, and it can be understood in the context of recent models of working memory that address multimodal binding (e.g., models incorporating an episodic buffer). Here we report a cross-sectional developmental study that demonstrated visuospatial bootstrapping in adults (n=18) and 9-year-old children (n=15) but not in 6-year-old children (n=18). This is the first developmental study addressing visuospatial bootstrapping, and the results demonstrate that the developmental trajectory of bootstrapping is different from that of basic verbal and visuospatial working memory. This pattern suggests that bootstrapping (and hence integrative functions such as those associated with the episodic buffer) emerges independently of the development of the basic working memory slave systems during childhood.
A bootstrap based space-time surveillance model with an application to crime occurrences
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
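The core idea, a null distribution built from a location's own past occurrences rather than population-at-risk data, can be sketched as a single-site significance test. This is a minimal illustration under assumed Poisson counts, not the authors' model; the window length, baseline rate and thresholds are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def flag_hotspot(past_counts, current_count, B=2000, alpha=0.05):
    # The null distribution for the location's current count is built by
    # resampling its own past counts -- no population-at-risk data needed,
    # so the test is not tied to administrative area units.
    boot = rng.choice(past_counts, size=B, replace=True)
    p_value = (np.sum(boot >= current_count) + 1) / (B + 1)
    return bool(p_value < alpha), float(p_value)

past = rng.poisson(lam=3.0, size=52)       # a year of weekly counts at one site
ok_low, p_low = flag_hotspot(past, current_count=3)
ok_high, p_high = flag_hotspot(past, current_count=15)
print(ok_low, ok_high)
```

A count in line with history yields a large p-value and no alarm, while a surge far above anything resampled from the past is flagged; running this per grid cell as new counts arrive gives the near-real-time surveillance the abstract describes.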
Heald, S L; Tilton, R F; Hammond, L J; Lee, A; Bayney, R M; Kamarck, M E; Ramabhadran, T V; Dreyer, R N; Davis, G; Unterbeck, A
1991-10-29
Certain precursor proteins (APP751 and APP770) of the amyloid beta-protein (AP) present in Alzheimer's disease contain a Kunitz-type serine protease inhibitor domain (APPI). In this study, the domain is obtained as a functional inhibitor through both recombinant (APPIr) and synthetic (APPIs) methodologies, and the solution structure of APPI is determined by 1H 2D NMR techniques. Complete sequence-specific resonance assignments (except for P13 and G37 NH) for both APPIr and APPIs are achieved using standard procedures. Ambiguities arising from degeneracies in the NMR resonances are resolved by varying sample conditions. Qualitative interpretation of short- and long-range NOEs reveals secondary structural features similar to those extensively documented by NMR for bovine pancreatic trypsin inhibitor (BPTI). A more rigorous interpretation of the NOESY spectra yields NOE-derived interresidue distance restraints which are used in conjunction with dynamic simulated annealing to generate a family of APPI structures. Within this family, the beta-sheet and helical regions are in good agreement with the crystal structure of BPTI, whereas portions of the protease-binding loops deviate from those in BPTI. These deviations are consistent with those recently described in the crystal structure of APPI (Hynes et al., 1990). Also supported in the NMR study is the hydrophobic patch in the protease-binding domain created by side chain-side chain NOE contacts between M17 and F34. In addition, the NMR spectra indicate that the rotation of the W21 ring in APPI is hindered, unlike Y21 in BPTI, showing a greater than 90% preference for one orientation in the hydrophobic groove.
Active Sequential Hypothesis Testing
Naghshvar, Mohammad
2012-01-01
Consider a decision maker who is responsible for dynamically collecting observations so as to enhance his information in a speedy manner about an underlying phenomenon of interest, while accounting for the penalty of wrong declaration. Special cases of the problem are shown to be variable-length coding with feedback and noisy dynamic search. Due to the sequential nature of the problem, the decision maker relies on his current information state to adaptively select the most "informative" sensing action among the available ones. In this paper, using results in dynamic programming, a lower bound for the optimal total cost is established. Moreover, upper bounds are obtained via an analysis of heuristic policies for dynamic selection of actions. It is shown that the proposed heuristics achieve asymptotic optimality in many practically relevant problems including the problems of variable-length coding with feedback and noisy dynamic search; where asymptotic optimality implies that the relative difference betw...
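The passive special case of this problem, where there is only one sensing action, is Wald's sequential probability ratio test: accumulate the log-likelihood ratio and declare as soon as it crosses a boundary set by the tolerated error probabilities. A minimal Bernoulli sketch follows; the hypotheses and error levels are illustrative, not from the paper:

```python
import math
import random

random.seed(5)

def sprt(samples, p0=0.2, p1=0.8, alpha=0.05, beta=0.05):
    # Wald's SPRT for Bernoulli H0: p = p0 vs H1: p = p1.  The stopping
    # boundaries follow from the tolerated error probabilities alpha, beta.
    lo, hi = math.log(beta / (1 - alpha)), math.log((1 - beta) / alpha)
    llr, n = 0.0, 0
    for n, x in enumerate(samples, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= hi:
            return "H1", n
        if llr <= lo:
            return "H0", n
    return "undecided", n

data = [random.random() < 0.8 for _ in range(200)]   # generated under H1
decision, n_used = sprt(data)
print(decision, n_used)
```

The test stops as soon as the evidence is decisive, typically after far fewer samples than a fixed-size test of the same error levels; the active problem in the abstract adds the choice of *which* observation to collect at each step.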
Predicting disease risk using bootstrap ranking and classification algorithms.
Manor, Ohad; Segal, Eran
2013-01-01
Genome-wide association studies (GWAS) are widely used to search for genetic loci that underlie human disease. Another goal is to predict disease risk for different individuals given their genetic sequence. Such predictions could either be used as a "black box" in order to promote changes in life-style and screening for early diagnosis, or as a model that can be studied to better understand the mechanism of the disease. Current methods for risk prediction typically rank single nucleotide polymorphisms (SNPs) by the p-value of their association with the disease, and use the top-associated SNPs as input to a classification algorithm. However, the predictive power of such methods is relatively poor. To improve the predictive power, we devised BootRank, which uses bootstrapping in order to obtain a robust prioritization of SNPs for use in predictive models. We show that BootRank improves the ability to predict disease risk of unseen individuals in the Wellcome Trust Case Control Consortium (WTCCC) data and results in a more robust set of SNPs and a larger number of enriched pathways being associated with the different diseases. Finally, we show that combining BootRank with seven different classification algorithms improves performance compared to previous studies that used the WTCCC data. Notably, diseases for which BootRank results in the largest improvements were recently shown to have more heritability than previously thought, likely due to contributions from variants with low minor allele frequency (MAF), suggesting that BootRank can be beneficial in cases where SNPs affecting the disease are poorly tagged or have low MAF. Overall, our results show that improving disease risk prediction from genotypic information may be a tangible goal, with potential implications for personalized disease screening and treatment.
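Bootstrap-aggregated SNP ranking in the spirit of BootRank can be sketched as follows. Everything here is an invented stand-in, not the authors' implementation: the genotype data, effect sizes, the simple two-sample association statistic, and the rank-averaging rule are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy genotype matrix: 500 individuals x 50 SNPs coded 0/1/2; the first
# three SNPs are causal (effect sizes are illustrative, not from the paper).
n, p = 500, 50
G = rng.integers(0, 3, size=(n, p)).astype(float)
logit = -2.5 + G[:, :3] @ np.array([1.0, 0.8, 0.7])
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

def association_ranks(G, y):
    # Rank SNPs by a simple two-sample t-like statistic (rank 0 = strongest).
    diff = G[y].mean(axis=0) - G[~y].mean(axis=0)
    se = np.sqrt(G[y].var(axis=0) / y.sum() + G[~y].var(axis=0) / (~y).sum())
    return np.argsort(np.argsort(-np.abs(diff / se)))

def bootrank(G, y, B=100):
    # Average each SNP's rank over B bootstrap resamples of the individuals,
    # giving a prioritisation that is robust to sampling noise.
    ranks = np.zeros((B, G.shape[1]))
    for b in range(B):
        idx = rng.integers(0, len(y), size=len(y))
        ranks[b] = association_ranks(G[idx], y[idx])
    return ranks.mean(axis=0)

avg_rank = bootrank(G, y)
top3 = sorted(int(i) for i in np.argsort(avg_rank)[:3])
print(top3)
```

Averaging ranks over resamples dampens the noise that lets null SNPs leapfrog weakly associated causal ones in a single p-value ranking, which is the robustness property the abstract attributes to BootRank.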
Bootstrap finance: the art of start-ups.
Bhide, A
1992-01-01
Entrepreneurship is more popular than ever: courses are full, policymakers emphasize new ventures, managers yearn to go off on their own. Would-be founders often misplace their energies, however. Believing in a "big money" model of entrepreneurship, they spend a lot of time trying to attract investors instead of using wits and hustle to get their ideas off the ground. A study of 100 of the 1989 Inc. "500" list of fastest growing U.S. start-ups attests to the value of bootstrapping. In fact, what it takes to start a business often conflicts with what venture capitalists require. Investors prefer solid plans, well-defined markets, and track records. Entrepreneurs are heavy on energy and enthusiasm but may be short on credentials. They thrive in rapidly changing environments where uncertain prospects may scare off established companies. Rolling with the punches is often more important than formal plans. Striving to adhere to investors' criteria can diminish the flexibility--the try-it, fix-it approach--an entrepreneur needs to make a new venture work. Seven principles are basic for successful start-ups: get operational fast; look for quick break-even, cash-generating projects; offer high-value products or services that can sustain direct personal selling; don't try to hire the crack team; keep growth in check; focus on cash; and cultivate banks early. Growth and change are the start-up's natural environment. But change is also the reward for success: just as ventures grow, their founders usually have to take a fresh look at everything again: roles, organization, even the very policies that got the business up and running.
Institute of Scientific and Technical Information of China (English)
梁幸甜; 杨承祥
2008-01-01
Objective: To determine the median effective dose (ED50) of ketamine in paediatric anaesthesia for magnetic resonance imaging (MRI) scanning with the up-and-down sequential method. Methods: 34 children (ASA grade I-II) scheduled to receive MRI scanning under ketamine anaesthesia were enrolled. The intramuscular dose was set at 6 mg/kg for the first child, and the dose for each subsequent child was adjusted up or down by 0.5 mg/kg, according to the response of the previous child to the administered dose. Satisfactory anaesthesia was defined as scanning completed continuously and without interruption after a single intramuscular injection of ketamine; otherwise the case was counted as a failure, and the next child received a higher dose. The ED50 of ketamine was derived from the data using the weighted-mean method. Results: The ED50 of ketamine was 7.058 mg/kg, with a 95% confidence interval of 6.8336-7.2898 mg/kg. Conclusion: The sequential method for determining the median effective dose is simple, efficient, and gives credible results; the ED50 of ketamine obtained can guide more rapid and safer paediatric anaesthesia for MRI scanning.
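The up-and-down design in this study is straightforward to simulate. The sketch below assumes a logistic dose-response (the true ED50, slope, and the plain mean of administered doses as the estimator are all illustrative stand-ins for the study's weighted-mean method):

```python
import numpy as np

rng = np.random.default_rng(4)

TRUE_ED50, SLOPE = 7.0, 4.0   # assumed logistic dose-response (illustrative)

def responds(dose):
    # Simulated child: probability of satisfactory anaesthesia rises with dose.
    return rng.random() < 1.0 / (1.0 + np.exp(-SLOPE * (dose - TRUE_ED50)))

def up_down_ed50(start=6.0, step=0.5, n=34):
    # Dixon up-and-down rule: decrease the dose after a success, increase it
    # after a failure; the mean dose over the sequence is a simple stand-in
    # for the weighted-mean estimator used in the study.
    doses, dose = [], start
    for _ in range(n):
        doses.append(dose)
        dose += -step if responds(dose) else step
    return float(np.mean(doses))

est = up_down_ed50()
print(round(est, 2))
```

Because each failure pushes the next dose up and each success pushes it down, the sequence oscillates around the ED50, so even a short series of 34 children concentrates its observations near the quantity being estimated, which is why the design is so sample-efficient.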
Energy Technology Data Exchange (ETDEWEB)
Michel, H.; Levent, D.; Barci, V.; Barci-Funel, G.; Hurel, C. [Laboratoire de Radiochimie, Sciences Analytiques et Environnement (LRSAE), Universite de Nice Sophia-Antipolis 06108 Nice Cedex (France)
2008-07-01
A new sequential method for the determination of both natural (U, Th) and anthropogenic (Sr, Cs, Pu, Am) radionuclides has been developed for application to soil and sediment samples. The procedure was optimised using a reference sediment (IAEA-368) and reference soils (IAEA-375 and IAEA-326). Reference materials were first digested using acids (leaching), 'total' acids on a hot plate, and acids in a microwave, in order to compare the different digestion techniques. The separation and purification were then performed by anion-exchange resin and selective extraction chromatography: Transuranic (TRU) and Strontium (SR) resins. Natural and anthropogenic alpha radionuclides were separated by Uranium and Tetravalent Actinide (UTEVA) resin, considering different acid elution media. Finally, alpha and gamma semiconductor spectrometers and a liquid scintillation spectrometer were used to measure radionuclide activities. The results obtained for the strontium-90, cesium-137, thorium-232, uranium-238, plutonium-239+240 and americium-241 isotopes by the proposed method for the reference materials are in excellent agreement with the recommended values, with good chemical recoveries. (authors)
Economou, Anastasios; Voulgaropoulos, Anastasios
2003-01-01
The development of a dedicated automated sequential-injection analysis apparatus for anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (AdSV) is reported. The instrument comprised a peristaltic pump, a multiposition selector valve, and a home-made potentiostat, and used a mercury-film electrode as the working electrode in a thin-layer electrochemical detector. Programming of the experimental sequence was performed in LabVIEW 5.1. The sequence of operations included formation of the mercury film, electrolytic or adsorptive accumulation of the analyte on the electrode surface, recording of the voltammetric current-potential response, and cleaning of the electrode. The stripping step was carried out by applying a square-wave (SW) potential-time excitation signal to the working electrode. The instrument allowed unattended operation, since multiple-step sequences could be readily implemented through the purpose-built software. The utility of the analyser was tested for the determination of copper(II), cadmium(II), lead(II) and zinc(II) by SWASV and of nickel(II), cobalt(II) and uranium(VI) by SWAdSV.
Alfaro, Michael E; Zoller, Stefan; Lutzoni, François
2003-02-01
Bayesian Markov chain Monte Carlo sampling has become increasingly popular in phylogenetics as a method both for estimating the maximum likelihood topology and for assessing nodal confidence. Despite the growing use of posterior probabilities, the relationship between the Bayesian measure of confidence and the most commonly used confidence measure in phylogenetics, the nonparametric bootstrap proportion, is poorly understood. We used computer simulation to investigate the behavior of three phylogenetic confidence methods: Bayesian posterior probabilities calculated via Markov chain Monte Carlo sampling (BMCMC-PP), maximum likelihood bootstrap proportion (ML-BP), and maximum parsimony bootstrap proportion (MP-BP). We simulated the evolution of DNA sequences on 17-taxon topologies under 18 evolutionary scenarios, examined the performance of these methods in assigning confidence to correct and incorrect monophyletic groups, and examined the effects of increasing character number on support values. BMCMC-PP and ML-BP were often strongly correlated with one another but could provide substantially different estimates of support on short internodes. In contrast, BMCMC-PP correlated poorly with MP-BP across most of the simulation conditions that we examined. For a given threshold value, more correct monophyletic groups were supported by BMCMC-PP than by either ML-BP or MP-BP. When threshold values were chosen that fixed the rate of accepting incorrect monophyletic relationships as true at 5%, all three methods recovered most of the correct relationships on the simulated topologies, although BMCMC-PP and ML-BP performed better than MP-BP. BMCMC-PP was usually a less biased predictor of phylogenetic accuracy than either bootstrapping method. BMCMC-PP provided high support values for correct topological bipartitions with fewer characters than nonparametric bootstrapping required.
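The nonparametric bootstrap proportion underlying ML-BP and MP-BP rests on a simple resampling principle: resample the data with replacement many times and record how often the re-estimated quantity satisfies the hypothesis of interest. A minimal sketch with a scalar statistic standing in for a tree topology (the data and condition are hypothetical):

```python
import random

def bootstrap_support(data, statistic, condition, n_boot=2000, seed=1):
    """Fraction of bootstrap resamples (drawn with replacement) whose
    statistic satisfies `condition` -- the resampling principle behind
    nonparametric bootstrap proportions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        if condition(statistic(resample)):
            hits += 1
    return hits / n_boot

# Hypothetical measurements; the "hypothesis" is that the mean exceeds 2.0.
data = [2.1, 2.5, 1.9, 2.8, 2.4, 2.2, 2.6, 2.0]
mean = lambda xs: sum(xs) / len(xs)
support = bootstrap_support(data, mean, lambda m: m > 2.0)
print(support)  # high support that the mean exceeds 2.0
```

In phylogenetics the statistic is the estimated tree and the condition is "contains the clade of interest", but the resampling machinery is the same.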
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The level of superficial freshwater provision depends on the methods chosen for 'Environmental Flow Requirement' (EFR) estimation, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil, monitored by one daily streamflow gage (24-year period). The original streamflow time series was randomly resampled for different numbers of repetitions or sample sizes (N = 500; ...; 1000) and submitted to the conventional bootstrap approach and to variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We analyzed the impact of sampling uncertainty on five EFR methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day, 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare the EFR methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, computed by averaging the EFR values of the five methods over the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies and the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the assessment of the EFR methods will be ...
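A minimal sketch of the 'moving blocks bootstrap' applied to a flow-duration statistic such as Q90 may help; the streamflow series, block length, and quantile convention below are illustrative assumptions, not the Cantareira data:

```python
import random

def moving_blocks_bootstrap(series, block_len, rng):
    """One moving-blocks bootstrap replicate: concatenate randomly chosen
    overlapping blocks, preserving short-range serial dependence."""
    n = len(series)
    starts = list(range(n - block_len + 1))
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

def q_exceed(series, p):
    """Flow exceeded p% of the time, e.g. Q90 with p=90 (simple quantile)."""
    srt = sorted(series, reverse=True)
    idx = min(len(srt) - 1, int(p / 100 * len(srt)))
    return srt[idx]

rng = random.Random(42)
# Hypothetical daily streamflows (m3/s), not the monitored record.
flows = [max(0.5, 10 + 5 * random.Random(i).gauss(0, 1)) for i in range(365)]

q90_reps = sorted(
    q_exceed(moving_blocks_bootstrap(flows, 30, rng), 90) for _ in range(500)
)
lo, hi = q90_reps[int(0.025 * 500)], q90_reps[int(0.975 * 500)]
print(f"Q90 = {q_exceed(flows, 90):.2f}, 95% CI approx [{lo:.2f}, {hi:.2f}]")
```

Resampling blocks rather than individual days is what distinguishes this variant from the conventional bootstrap, which would destroy the autocorrelation of daily streamflow.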
A NEW INEXACT SEQUENTIAL QUADRATIC PROGRAMMING ALGORITHM
Institute of Scientific and Technical Information of China (English)
倪勤
2002-01-01
This paper presents an inexact sequential quadratic programming (SQP) algorithm for solving nonlinear programming (NLP) problems. An inexact solution of the quadratic programming subproblem is determined by a projection and contraction method, so that only matrix-vector products are required. Truncation criteria are chosen so that the algorithm is suitable for large-scale NLP problems. The global convergence of the algorithm is proved.
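For orientation, a bare-bones textbook SQP iteration for an equality-constrained problem is sketched below, solving each QP subproblem exactly via its KKT system; the paper's contribution, by contrast, is to solve that subproblem inexactly with a projection and contraction method using only matrix-vector products. The toy problem is an assumption for illustration:

```python
# A minimal textbook SQP iteration for an equality-constrained problem,
# solving each QP subproblem exactly via its KKT system (the paper instead
# solves the subproblem *inexactly* by projection and contraction).
# Illustrative problem: minimize x^2 + y^2  subject to  x + y = 1.
import numpy as np

def f_grad(z):            # gradient of the objective
    return 2 * z

def f_hess(z):            # Hessian of the Lagrangian (constant here)
    return 2 * np.eye(2)

def c(z):                 # equality constraint c(z) = 0
    return np.array([z[0] + z[1] - 1.0])

def c_jac(z):             # constraint Jacobian
    return np.array([[1.0, 1.0]])

z = np.array([3.0, -2.0])
for _ in range(20):
    H, g, A, r = f_hess(z), f_grad(z), c_jac(z), c(z)
    # KKT system of the QP subproblem: [H A^T; A 0] [d; lam] = [-g; -r]
    K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([-g, -r]))
    d = sol[:2]
    z = z + d
    if np.linalg.norm(d) < 1e-10:
        break

print(z)  # -> approximately [0.5, 0.5]
```

For large problems the dense KKT solve above is exactly what becomes impractical, which motivates matrix-free inexact subproblem solvers of the kind the paper proposes.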
Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M
2016-01-01
Diffusion tensor imaging suffers from an intrinsic low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship of the cone of uncertainty and common diffusion parameters, such as the mean diffusivity and the fractional anisotropy, is not linear, the cone of uncertainty has a different sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide more comprehensive analysis.
Liang, Rong; Zhou, Shu-dong; Li, Li-xia; Zhang, Jun-guo; Gao, Yan-hui
2013-09-01
This paper aims to implement bootstrapping for hierarchical data and to provide a method for estimating the confidence interval (CI) of the intraclass correlation coefficient (ICC). First, we use mixed-effects models to estimate the ICCs of repeated-measurement data and of two-stage sampling data. Then, we use the bootstrap method to estimate the CIs of the corresponding ICCs. Finally, the influence of different bootstrapping strategies on the ICCs' CIs is compared. In the repeated-measurement example, the CI obtained by cluster bootstrapping contains the true ICC value, whereas random bootstrapping, which ignores the hierarchical structure of the data, yields an invalid CI. In the two-stage sampling example, the cluster-bootstrap ICC means are biased, while the ICC of the original sample is the smallest but has a wide CI. When hierarchical data are resampled, it is essential to take the structure of the data into account; bootstrapping at the higher level appears to perform better than at the lower level.
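The cluster bootstrap discussed above resamples whole clusters rather than individual observations. A minimal sketch with hypothetical two-level data, using the grand mean as a stand-in statistic (the ICC's mixed-model estimation is beyond a short example):

```python
import random

def cluster_bootstrap(clusters, n_boot, stat, seed=0):
    """Resample whole clusters with replacement (not individual
    observations), preserving the hierarchical structure."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        chosen = [rng.choice(clusters) for _ in clusters]
        reps.append(stat(chosen))
    return reps

def grand_mean(clusters):
    flat = [x for c in clusters for x in c]
    return sum(flat) / len(flat)

# Hypothetical two-level data: 6 clusters of repeated measurements.
data = [[5.1, 5.3, 5.0], [6.2, 6.0, 6.4], [4.8, 4.9, 5.1],
        [5.9, 6.1, 5.8], [5.2, 5.4, 5.3], [6.0, 5.8, 6.2]]

reps = sorted(cluster_bootstrap(data, 1000, grand_mean))
ci = (reps[24], reps[975])  # 95% percentile CI
print(f"95% CI for the grand mean: [{ci[0]:.2f}, {ci[1]:.2f}]")
```

Resampling at the cluster level keeps each cluster's internal correlation intact, which is exactly what a naive observation-level resample destroys.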
Sequential Testing: Basics and Benefits
1978-03-01
103-109. 44. A. Wald, Sequential Analysis, John Wiley and Sons, 1947. 45. A. Wald and J. Wolfowitz, "Optimum Character of The Sequential Probability Ratio... work done by A. Wald [44]. Wald's work on sequential analysis can be used virtually without modification in a situation where decisions are made... Wald can be used. The decision to accept, reject, or continue the test depends on: B < (θ0/θ1)^r exp[-(1/θ1 - 1/θ0)V(t)] < A (1), where B and A are
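Equation (1) is Wald's sequential probability ratio test (SPRT) for exponential life testing, with r the number of failures and V(t) the accumulated test time. A minimal sketch, using the standard Wald thresholds A = (1-β)/α and B = β/(1-α) and hypothetical test parameters:

```python
import math

def sprt_exponential(r, V, theta0, theta1, alpha=0.05, beta=0.10):
    """Wald SPRT for the mean life of an exponential distribution:
    H0: theta = theta0 vs H1: theta = theta1 (theta1 < theta0).
    r = failures observed so far, V = accumulated total time on test.
    Likelihood ratio L1/L0 = (theta0/theta1)^r * exp(-(1/theta1 - 1/theta0)*V)."""
    A = (1 - beta) / alpha      # upper threshold: cross it -> reject H0
    B = beta / (1 - alpha)      # lower threshold: cross it -> accept H0
    lr = (theta0 / theta1) ** r * math.exp(-(1 / theta1 - 1 / theta0) * V)
    if lr >= A:
        return "reject H0"
    if lr <= B:
        return "accept H0"
    return "continue"

# Hypothetical reliability test: H0 mean life 1000 h vs H1 mean life 250 h.
print(sprt_exponential(r=1, V=400.0, theta0=1000.0, theta1=250.0))
print(sprt_exponential(r=5, V=400.0, theta0=1000.0, theta1=250.0))
print(sprt_exponential(r=0, V=2000.0, theta0=1000.0, theta1=250.0))
```

The test is evaluated after each failure or at any elapsed time, so a decision can be reached far sooner than with a fixed-sample-size plan, which is the benefit the report's title refers to.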
Sequential operators in computability logic
Japaridze, Giorgi
2007-01-01
Computability logic (CL) (see http://www.cis.upenn.edu/~giorgi/cl.html) is a semantical platform and research program for redeveloping logic as a formal theory of computability, as opposed to the formal theory of truth which it has more traditionally been. Formulas in CL stand for (interactive) computational problems, understood as games between a machine and its environment; logical operators represent operations on such entities; and "truth" is understood as the existence of an effective solution, i.e., of an algorithmic winning strategy. The formalism of CL is open-ended, and may undergo a series of extensions as the study of the subject advances. The main groups of operators on which CL has focused so far are the parallel, choice, branching, and blind operators. The present paper introduces a new important group of operators, called sequential. The latter come in the form of sequential conjunction and disjunction, sequential quantifiers, and sequential recurrences. As the name may suggest, the algorithmic ...
Energy Technology Data Exchange (ETDEWEB)
Anthemidis, Aristidis N., E-mail: anthemid@chem.auth.gr [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University Thessaloniki, Thessaloniki 54124 (Greece); Ioannou, Kallirroy-Ioanna G. [Laboratory of Analytical Chemistry, Department of Chemistry, Aristotle University Thessaloniki, Thessaloniki 54124 (Greece)
2010-05-23
A novel on-line sequential injection (SI) dispersive liquid-liquid microextraction (DLLME) system coupled to electrothermal atomic absorption spectrometry (ETAAS) was developed for metal preconcentration at the micro-scale, eliminating the laborious and time-consuming phase separation by centrifugation. The potential of the system was demonstrated for trace lead and cadmium determination in water samples. A disperser solution containing the extraction solvent (xylene) and the chelating agent (ammonium pyrrolidine dithiocarbamate) in methanol is mixed on-line with the sample solution (aqueous phase), producing a cloudy solution of fine xylene droplets dispersed throughout the aqueous phase. Three processes take place simultaneously: formation of the cloudy solution, analyte complex formation, and extraction from the aqueous phase into the fine xylene droplets. The droplets are subsequently retained on the hydrophobic surface of PTFE turnings in the column. A 30 µL portion of the eluent (methyl isobutyl ketone) was injected into the graphite furnace for analyte atomization and quantification. The sampling frequency was 10 h⁻¹, and the enrichment factor was 80 for lead and 34 for cadmium. The detection limits were 10 ng L⁻¹ and 2 ng L⁻¹, while the precision expressed as relative standard deviation (RSD) was 3.8% (at 0.5 µg L⁻¹) and 4.1% (at 0.03 µg L⁻¹) for lead and cadmium, respectively. The proposed method was evaluated by analyzing certified reference materials and was applied to the analysis of natural waters.
Qiao, Jixin; Hou, Xiaolin; Roos, Per; Miró, Manuel
2011-01-01
This paper reports an automated analytical method for rapid and simultaneous determination of plutonium and neptunium in soil, sediment, and seaweed, with detection via inductively coupled plasma mass spectrometry (ICP-MS). A chromatographic column packed with a macroporous anion exchanger (AG MP-1 M) was incorporated in a sequential injection (SI) system for the efficient retrieval of plutonium, along with neptunium, from matrix elements and potential interfering nuclides. The sorption and elution behavior of plutonium and neptunium on AG MP-1 M resin was compared with that on a commonly utilized AG 1 gel-type anion exchanger. Experimental results reveal that the pore structure of the anion exchanger plays a pivotal role in ensuring similar separation behavior of plutonium and neptunium along the separation protocol. It is proven that plutonium-242 ((242)Pu) performs well as a tracer for monitoring the chemical yield of neptunium when using AG MP-1 M resin, whereby the difficulties in obtaining a reliable and practicable isotopic neptunium tracer are overcome. An important asset of the SI setup is the feasibility of processing up to 100 g of solid substrate using a small (ca. 2 mL) column, with chemical yields of neptunium and plutonium being ≥79%. Analytical results for three certified/standard reference materials and two solid samples from intercomparison exercises are in good agreement with the reference values at the 0.05 significance level. The overall on-column separation can be completed within 3.5 h for 10 g soil samples. Most importantly, the anion-exchange mini-column can be reused up to 10 times with satisfactory chemical yields (>70%), as demanded in environmental monitoring and emergency scenarios, making the proposed automated assembly well suited for unattended and high-throughput analysis.
Duan, Taicheng; Song, Xuejie; Jin, Dan; Li, Hongfei; Xu, Jingwei; Chen, Hangting
2005-10-31
In this work, a method was developed for the determination of ultra-trace levels of Cd in tea samples by atomic fluorescence spectrometry (AFS). A flow injection solid phase extraction (FI-SPE) separation and preconcentration technique, coupled on-line with a sequential injection hydride generation (SI-HG) technique, is employed. Cd was preconcentrated on the SPE column, made from the neutral extractant Cyanex 923, while other matrix or interfering ions were completely or mostly separated off. Conditions for the SPE separation and preconcentration, as well as for the HG technique, were studied. Owing to the separation of interfering elements, the Cd hydride generation efficiency could be greatly enhanced with the sole presence of Co(2+) at a concentration of 200 µg L⁻¹, which is much lower than those previously reported. Interferences with both the Cd separation and preconcentration and the Cd hydride generation (HG) were investigated; both systems showed strong resistance to interference. The SPE column could be reused at least 400 times; an R.S.D. of 0.97% was obtained for 6 measurements of Cd at 0.2 µg L⁻¹, and a correlation coefficient of 1.0000 was obtained for a series of solutions with Cd concentrations from 0.1 to 2 µg L⁻¹. The method has a low detection limit of 10.8 ng L⁻¹ for a 25 mL solution and was successfully validated using two tea standard reference materials (GBW08513 and GBW07605).
The economics of bootstrapping space industries - Development of an analytic computer model
Goldberg, A. H.; Criswell, D. R.
1982-01-01
A simple economic model of 'bootstrapping' industrial growth in space and on the Moon is presented. An initial space manufacturing facility (SMF) is assumed to consume lunar materials to enlarge the productive capacity in space. After reaching a predetermined throughput, the enlarged SMF is devoted to products which generate revenue continuously in proportion to the accumulated output mass (such as space solar power stations). Present discounted value and physical estimates for the general factors of production (transport, capital efficiency, labor, etc.) are combined to explore optimum growth in terms of maximized discounted revenues. It is found that 'bootstrapping' reduces the fractional cost to a space industry of transport off-Earth and permits more efficient use of a given transport fleet. It is concluded that more attention should be given to structuring 'bootstrapping' scenarios in which 'learning while doing' can be more fully incorporated in program analysis.
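The growth trade-off described in the abstract (enlarge capacity first, then switch to revenue products, and maximize discounted revenue) can be caricatured in a few lines. All parameters below are illustrative assumptions, not the paper's estimates:

```python
# Toy version of a "bootstrapping" growth model: an initial space
# manufacturing facility (SMF) spends its first T years enlarging its own
# capacity (exponential growth), then devotes all capacity to products
# whose annual revenue is proportional to accumulated output mass.
# All numbers are illustrative assumptions, not the paper's values.

def discounted_revenue(growth_years, g=0.3, rate=0.08, horizon=30):
    """Present value of revenues when capacity grows for `growth_years`,
    after which all capacity goes to revenue products."""
    capacity = 1.0
    accumulated = 0.0   # accumulated revenue-product mass
    pv = 0.0
    for year in range(1, horizon + 1):
        if year <= growth_years:
            capacity *= (1 + g)        # bootstrap phase: grow capacity
        else:
            accumulated += capacity    # production phase: build output mass
            pv += accumulated / (1 + rate) ** year  # revenue proportional to mass
    return pv

best = max(range(0, 31), key=discounted_revenue)
print(best, round(discounted_revenue(best), 2))
```

Even this caricature reproduces the paper's qualitative point: spending some initial period enlarging capacity yields a higher discounted revenue than producing revenue products from day one, and the optimum growth period is interior.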
Closure of the Operator Product Expansion in the Non-Unitary Bootstrap
Esterlis, Ilya; Ramirez, David
2016-01-01
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method. In the supplementary material, we provide a Mathematica notebook that automates the calculation of the crossing matrices and OPE coefficients for degenerate operators using the formulae of Dotsenko and Fateev.
Closure of the operator product expansion in the non-unitary bootstrap
Esterlis, Ilya; Fitzpatrick, A. Liam; Ramirez, David M.
2016-11-01
We use the numerical conformal bootstrap in two dimensions to search for finite, closed sub-algebras of the operator product expansion (OPE), without assuming unitarity. We find the minimal models as special cases, as well as additional lines of solutions that can be understood in the Coulomb gas formalism. All the solutions we find that contain the vacuum in the operator algebra are cases where the external operators of the bootstrap equation are degenerate operators, and we argue that this follows analytically from the expressions in arXiv:1202.4698 for the crossing matrices of Virasoro conformal blocks. Our numerical analysis is a special case of the "Gliozzi" bootstrap method, and provides a simpler setting in which to study technical challenges with the method.
The S-matrix Bootstrap I: QFT in AdS
Paulos, Miguel F; Toledo, Jonathan; van Rees, Balt C; Vieira, Pedro
2016-01-01
We propose a strategy to study massive Quantum Field Theory (QFT) using conformal bootstrap methods. The idea is to consider QFT in hyperbolic space and study correlation functions of its boundary operators. We show that these are solutions of the crossing equations in one lower dimension. By sending the curvature radius of the background hyperbolic space to infinity we expect to recover flat-space physics. We explain that this regime corresponds to large scaling dimensions of the boundary operators, and discuss how to obtain the flat-space scattering amplitudes from the corresponding limit of the boundary correlators. We implement this strategy to obtain universal bounds on the strength of cubic couplings in 2D flat-space QFTs using 1D conformal bootstrap techniques. Our numerical results match precisely the analytic bounds obtained in our companion paper using S-matrix bootstrap techniques.
Sequentially pulsed traveling wave accelerator
Caporaso, George J.; Nelson, Scott D.; Poole, Brian R.
2009-08-18
A sequentially pulsed traveling wave compact accelerator having two or more pulse forming lines each with a switch for producing a short acceleration pulse along a short length of a beam tube, and a trigger mechanism for sequentially triggering the switches so that a traveling axial electric field is produced along the beam tube in synchronism with an axially traversing pulsed beam of charged particles to serially impart energy to the particle beam.
Complementary sequential measurements generate entanglement
Coles, Patrick J.; Piani, Marco
2013-01-01
We present a new paradigm for capturing the complementarity of two observables. It is based on the entanglement created by the interaction between the system observed and the two measurement devices used to measure the observables sequentially. Our main result is a lower bound on this entanglement and resembles well-known entropic uncertainty relations. Besides its fundamental interest, this result directly bounds the effectiveness of sequential bipartite operations---corresponding to the mea...
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things.
Garcia-Carrillo, Dan; Marin-Lopez, Rafael
2016-03-11
The Internet of Things (IoT) is becoming increasingly important in several fields of industrial and personal applications, such as medical e-health, smart cities, etc. Research into protocols and security aspects related to this area is continuously advancing to make these networks more reliable and secure, taking these aspects into account by design. Bootstrapping is a procedure by which a user obtains key material and configuration information, among other parameters, to operate as an authenticated party in a security domain. Until now, solutions have focused on re-using security protocols that were not developed for IoT constraints. For this reason, in this work we propose a design and implementation of a lightweight bootstrapping service for IoT networks that leverages one of the application protocols used in IoT: the Constrained Application Protocol (CoAP). Additionally, in order to provide flexibility, scalability, support for large-scale deployment, accountability, and identity federation, our design uses technologies such as the Extensible Authentication Protocol (EAP) and Authentication, Authorization and Accounting (AAA). We have named this service CoAP-EAP. First, we review the state of the art in the field of bootstrapping, specifically for IoT. Second, we detail the bootstrapping service: the architecture with entities and interfaces, and the flow of operation. Third, we obtain performance measurements of CoAP-EAP (bootstrapping time, memory footprint, message processing time, message length and energy consumption) and compare them with those of PANATIKI, the most significant and constrained representative of the bootstrapping solutions related to CoAP-EAP. As we will show, our solution provides significant improvements, mainly due to an important reduction in message length.
Lightweight CoAP-Based Bootstrapping Service for the Internet of Things
Directory of Open Access Journals (Sweden)
Dan Garcia-Carrillo
2016-03-01
View Dependent Sequential Point Trees
Institute of Scientific and Technical Information of China (English)
Wen-Cheng Wang; Feng Wei; En-Hua Wu
2006-01-01
Sequential point trees provide the state-of-the-art technique for rendering point models, re-arranging hierarchical points into a sequential list ordered by geometric error that runs on the GPU for fast rendering. This paper presents a view-dependent method that augments sequential point trees by embedding the hierarchical tree structures in the sequential list of hierarchical points. With this method, two kinds of indices are constructed to facilitate rendering the points in an order that is mostly near-to-far and coarse-to-fine. As a result, invisible points can be culled view-dependently with high efficiency for hardware acceleration, while the advantages of sequential point trees are still fully retained. The new method can therefore run much faster than conventional sequential point trees, and the acceleration is especially pronounced when objects have complex occlusion relationships and are viewed closely, because invisible points then account for a high percentage of the points at finer levels.
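The underlying selection rule of plain sequential point trees, which the paper augments with view-dependent indices, can be sketched as follows; the point data and error intervals are hypothetical, and the paper's index structures are not reproduced:

```python
# Minimal sketch of the plain sequential-point-tree selection rule the
# paper builds on: hierarchical points are flattened into a list, and a
# point is rendered when the current error threshold falls inside its
# [error_min, error_max) interval. Data below are hypothetical.
from dataclasses import dataclass

@dataclass
class SPoint:
    position: tuple          # 3D position (placeholder)
    error_min: float         # below this threshold, finer children are used
    error_max: float         # at/above this threshold, a coarser ancestor is used

def select_points(points, tau):
    """Return the points to render for error threshold tau. On a GPU the
    list is sorted by descending error_max so the scan can stop early;
    here we simply filter the whole list for clarity."""
    return [p for p in points if p.error_min <= tau < p.error_max]

# Tiny two-level hierarchy: one coarse point that splits into two finer ones.
pts = [
    SPoint((0, 0, 0), error_min=0.5, error_max=2.0),   # coarse parent
    SPoint((-1, 0, 0), error_min=0.0, error_max=0.5),  # fine child
    SPoint((1, 0, 0), error_min=0.0, error_max=0.5),   # fine child
]

print(len(select_points(pts, 1.0)))  # coarse level selected
print(len(select_points(pts, 0.2)))  # fine level selected
```

Because selection is a per-point interval test on a flat list, it maps well to GPU processing; the paper's contribution is to add indices so that points failing a visibility test can also be skipped.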
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2002-01-01
A sequential injection (SI) on-line matrix removal and trace metal preconcentration procedure using a novel microcolumn packed with PTFE beads is described, and demonstrated for trace cadmium analysis with detection by electrothermal atomic absorption spectrometry (ETAAS). The analyte is initially ...
Improving Web Learning through model Optimization using Bootstrap for a Tour-Guide Robot
Directory of Open Access Journals (Sweden)
Rafael León
2012-09-01
We review Web mining techniques and describe a bootstrap statistics methodology applied to pattern-model classifier optimization and verification for supervised learning in a tour-guide robot's knowledge repository management. It is virtually impossible to test Web page classifiers and many other Internet applications thoroughly with purely empirical data, owing to the need for human intervention to generate training sets and test sets. We propose using the computer-based bootstrap paradigm to design a test environment in which classifiers can be checked with better reliability.
A New Regime for Studying the High Bootstrap Current Fraction Plasma
Institute of Scientific and Technical Information of China (English)
A. Isayama; Y. Kamada; K. Ushigusa; T. Fujita; T. Suzuki; X. Gao
2001-01-01
A new experimental regime has recently been studied for achieving a high bootstrap current fraction in JT-60U hydrogen discharges. A high poloidal beta (βp ～ 3.61) plasma was obtained by high-power neutral beam injection heating in a region of very high edge safety factor (Ip = 0.3 MA, Bt = 3.65 T, qeff = 25 - 35), and the bootstrap current fraction (fBS) was correspondingly about 40% according to ACCOME code calculations. No magnetohydrodynamic instabilities were observed to retard the increase of βp and fBS in the new regime.
Excitons in solids with time-dependent density-functional theory: the bootstrap kernel and beyond
Byun, Young-Moo; Yang, Zeng-Hui; Ullrich, Carsten
Time-dependent density-functional theory (TDDFT) is an efficient method to describe the optical properties of solids. Lately, a series of bootstrap-type exchange-correlation (xc) kernels have been reported to produce accurate excitons in solids, but different bootstrap-type kernels exist in the literature, with mixed results. In this presentation, we reveal the origin of the confusion and show a new empirical TDDFT xc kernel to compute excitonic properties of semiconductors and insulators efficiently and accurately. Our method can be used for high-throughput screening calculations and large unit cell calculations. Work supported by NSF Grant DMR-1408904.
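For reference, the original bootstrap kernel to which these bootstrap-type kernels trace back is, as commonly quoted from Sharma et al. (Phys. Rev. Lett. 107, 186401, 2011), the head element of

```latex
% Bootstrap exchange-correlation kernel (head element), evaluated
% self-consistently at zero frequency from the inverse dielectric
% function and the Kohn-Sham response function:
f_{\mathrm{xc}}^{\mathrm{boot}}(\mathbf{q})
  = \frac{\varepsilon^{-1}(\mathbf{q},\omega=0)}{\chi_{0}(\mathbf{q},\omega=0)}
```

Variants reported in the literature differ mainly in how this ratio is constructed and iterated to self-consistency, which appears to be the source of the mixed results the abstract refers to.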
Pareto, Deborah; Aguiar, Pablo; Pavía, Javier; Gispert, Juan Domingo; Cot, Albert; Falcón, Carles; Benabarre, Antoni; Lomeña, Francisco; Vieta, Eduard; Ros, Domènec
2008-07-01
Statistical parametric mapping (SPM) has become the technique of choice to statistically evaluate positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and single photon emission computed tomography (SPECT) functional brain studies. Nevertheless, only a few methodological studies have been carried out to assess the performance of SPM in SPECT. The aim of this paper was to study the performance of SPM in detecting changes in regional cerebral blood flow (rCBF) in hypo- and hyperperfused areas in brain SPECT studies. The paper seeks to determine the relationship between the group size and the rCBF changes, and the influence of the correction for degradations. The assessment was carried out using simulated brain SPECT studies. Projections were obtained with Monte Carlo techniques, and a fan-beam collimator was considered in the simulation process. Reconstruction was performed by using the ordered subsets expectation maximization (OSEM) algorithm with and without compensation for attenuation, scattering, and spatial variant collimator response. Significance probability maps were obtained with SPM2 by using a one-tailed two-sample t-test. A bootstrap resampling approach was used to determine the sample size for SPM to detect the between-group differences. Our findings show that the correction for degradations results in a diminution of the sample size, which is more significant for small regions and low-activation factors. Differences in sample size were found between hypo- and hyperperfusion. These differences were larger for small regions and low-activation factors, and when no corrections were included in the reconstruction algorithm.
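The bootstrap approach to fixing a detectable group size can be sketched in a few lines. The toy data, the 10% "hyperperfusion" effect, and the one-tailed z-approximation to the two-sample t-test below are illustrative assumptions, not the paper's Monte Carlo setup:

```python
import random
from statistics import NormalDist, mean, stdev

def detect(group_a, group_b, alpha=0.05):
    """One-tailed two-sample z-approximation to the t-test (illustrative)."""
    na, nb = len(group_a), len(group_b)
    se = (stdev(group_a) ** 2 / na + stdev(group_b) ** 2 / nb) ** 0.5
    if se == 0.0:
        return False
    z = (mean(group_b) - mean(group_a)) / se
    return z > NormalDist().inv_cdf(1 - alpha)

def bootstrap_power(pool_a, pool_b, n, n_boot=500, rng=None):
    """Estimate the chance of detecting the group difference at group size n
    by drawing bootstrap resamples of size n from each pool."""
    rng = rng or random.Random(0)
    hits = sum(detect(rng.choices(pool_a, k=n), rng.choices(pool_b, k=n))
               for _ in range(n_boot))
    return hits / n_boot

rng = random.Random(1)
controls = [rng.gauss(100.0, 10.0) for _ in range(60)]  # baseline rCBF (a.u.)
patients = [rng.gauss(110.0, 10.0) for _ in range(60)]  # 10% hyperperfusion
for n in (5, 10, 20):
    print(n, round(bootstrap_power(controls, patients, n), 2))
```

The smallest n at which the estimated power clears a chosen target (say 0.8) is the sample size the resampling approach would report.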
Energy Technology Data Exchange (ETDEWEB)
Mervartova, Katerina [Department of Analytical Chemistry, Faculty of Pharmacy, Charles University, Heyrovskeho 1203, CZ-500 05 Hradec Kralove (Czech Republic)], E-mail: Katerina.Mervartova@faf.cuni.cz; Polasek, Miroslav [Department of Analytical Chemistry, Faculty of Pharmacy, Charles University, Heyrovskeho 1203, CZ-500 05 Hradec Kralove (Czech Republic); Calatayud, Jose Martinez [Department of Analytical Chemistry, Faculty of Chemistry, University of Valencia, Valencia (Spain)
2007-09-26
An automated sequential injection (SIA) method for the chemiluminescence (CL) determination of the nonsteroidal anti-inflammatory drug indomethacin (I) was devised. The CL radiation was emitted in the reaction of I (dissolved in aqueous 50% v/v ethanol) with the intermediate reagent tris(2,2'-bipyridyl)ruthenium(III) (Ru(bipy)₃³⁺) in the presence of acetate. The Ru(bipy)₃³⁺ was generated on-line in the SIA system by oxidation of 0.5 mM tris(2,2'-bipyridyl)ruthenium(II) (Ru(bipy)₃²⁺) with Ce(IV) ammonium sulphate in dilute sulphuric acid. The optimum sequence, concentrations, and aspirated volumes of the reactant zones were: 15 mM Ce(IV) in 50 mM sulphuric acid, 41 µL; 0.5 mM Ru(bipy)₃²⁺, 30 µL; 0.4 M Na acetate, 16 µL; and I sample, 15 µL; the flow rates were 60 µL s⁻¹ for aspiration into the holding coil and 100 µL s⁻¹ for detection. The calibration curve relating CL intensity (peak height of the transient CL signal) to the concentration of I was curvilinear (second-order polynomial) for 0.1-50 µM I (r = 0.9997; n = 9), with a rectilinear section in the range 0.1-10 µM I (r = 0.9995; n = 5). The limit of detection (3σ) was 0.05 µM I. Repeatability of peak heights (R.S.D., n = 10) ranged between 2.4% (0.5 µM I) and 2.0% (7 µM I). Sample throughput was 180 h⁻¹. The method was applied to the determination of 1 to 5% of I in semisolid dosage forms (gels and ointments). The results compared well with those of a UV spectrophotometric method.
Khabova, Z S; Smirenin, S A; Fetisov, V A; Tamberg, D K
2015-01-01
The objective of the present study was to determine the diagnostic coefficients (DC) for the injuries to the upper and lower extremities of vehicle drivers inflicted inside the passenger compartment in the case of a traffic accident. We analysed the archival expert documents collected from 45 regional bureaus of forensic medical expertise during the period from 1995 to 2014, containing the results of examination of 200 corpses and 300 survivors who had suffered injuries in traffic accidents. The statistical and mathematical treatment of these materials, using sequential mathematical analysis based on the Bayes and Wald formulas, yielded diagnostic coefficients that make it possible to elucidate the most informative features characterizing the driver of a vehicle. In the case of a lethal outcome, the most significant injuries include bleeding from the posterior left elbow region (DC +7.6), skin scratches on the palm surface of the right wrist (DC +7.6), bleeding from the posterior region of the left lower leg (DC +7.6), wounds on the dorsal surface of the left wrist (DC +6.3), bruises on the anterior surface of the left knee (DC +6.3), etc. The most informative features in the survivors of traffic accidents are bone fractures (DC +7.0), tension of ligaments and dislocation of the right talocrural joint (DC +6.5), fractures of the left kneecap and left tibial epiphysis (DC +5.4), hemorrhage and bruises in the anterior right knee region (DC +5.4 each), and skin scratches in the right posterior carpal region (DC +5.1). It is concluded that the use of the diagnostic coefficients makes it possible to draw the attention of the experts to the above features and to objectively determine the driver's seat position inside the car passenger compartment in the case of a traffic accident. Moreover, such an approach contributes to the improvement of the quality of expert conclusions and the results of forensic medical expertise of the circumstance of traffic
Application of a New Resampling Method to SEM: A Comparison of S-SMART with the Bootstrap
Bai, Haiyan; Sivo, Stephen A.; Pan, Wei; Fan, Xitao
2016-01-01
Among the commonly used resampling methods of dealing with small-sample problems, the bootstrap enjoys the widest applications because it often outperforms its counterparts. However, the bootstrap still has limitations when its operations are contemplated. Therefore, the purpose of this study is to examine an alternative, new resampling method…
Kim, Se-Kang
2010-01-01
The aim of the current study is to validate the invariance of major profile patterns derived from multidimensional scaling (MDS) by bootstrapping. Profile Analysis via Multidimensional Scaling (PAMS) was employed to obtain profiles and bootstrapping was used to construct the sampling distributions of the profile coordinates and the empirical…
Directory of Open Access Journals (Sweden)
Thawatchai Onjun
2012-02-01
The formation of bootstrap current in ITER is investigated using the BALDUR integrated predictive modeling code. The combination of the Mixed B/gB anomalous transport model and the NCLASS module, together with the pedestal model, is used in the BALDUR code to simulate the time evolution of temperature, density, and plasma current profiles. It was found in the simulations that without the presence of an ITB, a minimal fraction of bootstrap current (as well as low fusion performance) was achieved. The enhancement due to the ITB depends sensitively on the strength of the toroidal velocity. A sensitivity study was also carried out to optimize the bootstrap current fraction and plasma performance. It was found that the bootstrap current fraction improved slightly, while the plasma performance improved greatly with increasing NBI power or pedestal temperature. On the other hand, higher impurity concentration resulted in a significant degradation of fusion performance, but a smaller degradation in bootstrap current.
The use of vector bootstrapping to improve variable selection precision in Lasso models.
Laurin, Charles; Boomsma, Dorret; Lubke, Gitta
2016-08-01
The Lasso is a shrinkage regression method that is widely used for variable selection in statistical genetics. Commonly, K-fold cross-validation is used to fit a Lasso model. This is sometimes followed by using bootstrap confidence intervals to improve precision in the resulting variable selections. Nesting cross-validation within bootstrapping could provide further improvements in precision, but this has not been investigated systematically. We performed simulation studies of Lasso variable selection precision (VSP) with and without nesting cross-validation within bootstrapping. Data were simulated to represent genomic data under a polygenic model as well as under a model with effect sizes representative of typical GWAS results. We compared these approaches to each other as well as to software defaults for the Lasso. Nested cross-validation had the most precise variable selection at small effect sizes. At larger effect sizes, there was no advantage to nesting. We illustrated the nested approach with empirical data comprising SNPs and SNP-SNP interactions from the most significant SNPs in a GWAS of borderline personality symptoms. In the empirical example, we found that the default Lasso selected low-reliability SNPs and interactions which were excluded by bootstrapping.
Parametric bootstrap methods for testing multiplicative terms in GGE and AMMI models.
Forkman, Johannes; Piepho, Hans-Peter
2014-09-01
The genotype main effects and genotype-by-environment interaction effects (GGE) model and the additive main effects and multiplicative interaction (AMMI) model are two common models for analysis of genotype-by-environment data. These models are frequently used by agronomists, plant breeders, geneticists and statisticians for analysis of multi-environment trials. In such trials, a set of genotypes, for example, crop cultivars, are compared across a range of environments, for example, locations. The GGE and AMMI models use singular value decomposition to partition genotype-by-environment interaction into an ordered sum of multiplicative terms. This article deals with the problem of testing the significance of these multiplicative terms in order to decide how many terms to retain in the final model. We propose parametric bootstrap methods for this problem. Models with fixed main effects, fixed multiplicative terms and random normally distributed errors are considered. Two methods are derived: a full and a simple parametric bootstrap method. These are compared with the alternatives of using approximate F-tests and cross-validation. In a simulation study based on four multi-environment trials, both bootstrap methods performed well with regard to Type I error rate and power. The simple parametric bootstrap method is particularly easy to use, since it only involves repeated sampling of standard normally distributed values. This method is recommended for selecting the number of multiplicative terms in GGE and AMMI models. The proposed methods can also be used for testing components in principal component analysis.
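The simple parametric bootstrap can be sketched directly from its description: repeatedly draw matrices of standard normal values and compare their largest singular value with the observed one. The power-iteration helper and the toy genotype-by-environment data are illustrative assumptions, and the paper's full procedure involves more than this bare comparison:

```python
import math
import random

def largest_singular_value(A, iters=50):
    """Largest singular value via power iteration on A^T A (plain Python)."""
    n, p = len(A), len(A[0])
    v = [1.0] * p
    for _ in range(iters):
        u = [sum(A[i][j] * v[j] for j in range(p)) for i in range(n)]  # A v
        w = [sum(A[i][j] * u[i] for i in range(n)) for j in range(p)]  # A^T u
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    u = [sum(A[i][j] * v[j] for j in range(p)) for i in range(n)]
    return math.sqrt(sum(x * x for x in u))

def simple_parametric_bootstrap_p(residual_matrix, n_boot=200, rng=None):
    """P-value for the first multiplicative term: compare its singular value
    against matrices of standard normal noise of the same shape."""
    rng = rng or random.Random(0)
    n, p = len(residual_matrix), len(residual_matrix[0])
    observed = largest_singular_value(residual_matrix)
    exceed = sum(
        largest_singular_value([[rng.gauss(0, 1) for _ in range(p)]
                                for _ in range(n)]) >= observed
        for _ in range(n_boot))
    return (exceed + 1) / (n_boot + 1)

rng = random.Random(7)
# interaction matrix with one strong multiplicative (rank-1) term plus noise
signal = [[2.0 * gi * ej for ej in (1, -1, 1, -1, 1)] for gi in (1, 2, -1, -2)]
noisy = [[s + rng.gauss(0, 1) for s in row] for row in signal]
print(simple_parametric_bootstrap_p(noisy))
```

A small p-value says the first multiplicative term stands out from pure noise; the test would then be repeated on the deflated matrix for the next term.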
Bootstrap-estimated land-to-water coefficients from the CBTN_v4 SPARROW model
Ator, Scott; Brakebill, John W.; Blomquist, Joel D.
2017-01-01
This file contains 200 sets of bootstrap-estimated land-to-water coefficients from the CBTN_v4 SPARROW model, which is documented in USGS Scientific Investigations Report 2011-5167. The coefficients were produced as part of CBTN_v4 model calibration to provide information about the uncertainty in model estimates.
Inferences of Coordinates in Multidimensional Scaling by a Bootstrapping Procedure in R
Kim, Donghoh; Kim, Se-Kang; Park, Soyeon
2015-01-01
Recently, MDS has been utilized to identify and evaluate cognitive ability latent profiles in a population. However, dimension coordinates do not carry any statistical properties. To cope with this statistical limitation of MDS, we investigated the common aspects of various studies utilizing bootstrapping, and provided an R function for its…
Genetic divergence among cupuaçu accessions by multiscale bootstrap resampling
Directory of Open Access Journals (Sweden)
Vinicius Silva dos Santos
2015-06-01
This study investigated the genetic divergence of eighteen accessions of cupuaçu trees based on fruit morphometric traits, comparing usual methods of cluster analysis with the proposed multiscale bootstrap resampling methodology. The data were obtained from an experiment conducted in Tomé-Açu city (PA, Brazil), arranged in a completely randomized design with eighteen cupuaçu accessions and 10 repetitions, from 2004 to 2011. Genetic parameters were estimated by restricted maximum likelihood/best linear unbiased prediction (REML/BLUP) methodology. The predicted breeding values were used in the study of genetic divergence through Unweighted Pair Group Method with Arithmetic Mean (UPGMA) hierarchical clustering and Tocher's optimization method based on standardized Euclidean distance. Clustering consistency and the optimal number of clusters in the UPGMA method were verified by the cophenetic correlation coefficient (CCC) and Mojena's criterion, respectively, besides the multiscale bootstrap resampling technique. The UPGMA clustering method with and without multiscale bootstrap resulted in four and five clusters, respectively, while Tocher's method resulted in seven clusters. The multiscale bootstrap resampling technique proves efficient for assessing the consistency of clustering in hierarchical methods and, consequently, the optimal number of clusters.
Schick, Simon; Rössler, Ole; Weingartner, Rolf
2016-10-01
Based on a hindcast experiment for the period 1982-2013 in 66 sub-catchments of the Swiss Rhine, the present study compares two approaches of building a regression model for seasonal streamflow forecasting. The first approach selects a single "best guess" model, which is tested by leave-one-out cross-validation. The second approach implements the idea of bootstrap aggregating, where bootstrap replicates are employed to select several models, and out-of-bag predictions provide model testing. The target value is mean streamflow for durations of 30, 60 and 90 days, starting with the 1st and 16th day of every month. Compared to the best guess model, bootstrap aggregating reduces the mean squared error of the streamflow forecast by seven percent on average. Thus, if resampling is anyway part of the model building procedure, bootstrap aggregating seems to be a useful strategy in statistical seasonal streamflow forecasting. Since the improved accuracy comes at the cost of a less interpretable model, the approach might be best suited for pure prediction tasks, e.g. as in operational applications.
Seol, Hyunsoo
2016-06-01
The purpose of this study was to apply the bootstrap procedure to evaluate how the bootstrapped confidence intervals (CIs) for polytomous Rasch fit statistics might differ according to sample sizes and test lengths in comparison with the rule-of-thumb critical value of misfit. A total of 25 simulated data sets were generated to fit the Rasch measurement and then a total of 1,000 replications were conducted to compute the bootstrapped CIs under each of 25 testing conditions. The results showed that rule-of-thumb critical values for assessing the magnitude of misfit were not applicable because the infit and outfit mean square error statistics showed different magnitudes of variability over testing conditions and the standardized fit statistics did not exactly follow the standard normal distribution. Further, they also do not share the same critical range for the item and person misfit. Based on the results of the study, the bootstrapped CIs can be used to identify misfitting items or persons as they offer a reasonable alternative solution, especially when the distributions of the infit and outfit statistics are not well known and depend on sample size.
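The percentile bootstrap interval underlying such a procedure can be sketched generically. In the study the fit statistics would come from refitting the Rasch model to resampled data; the infit values below are simulated stand-ins:

```python
import random
from statistics import mean

def bootstrap_ci(values, stat=mean, n_boot=1000, alpha=0.05, rng=None):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = rng or random.Random(0)
    reps = sorted(stat(rng.choices(values, k=len(values)))
                  for _ in range(n_boot))
    return (reps[int(n_boot * alpha / 2)],
            reps[int(n_boot * (1 - alpha / 2)) - 1])

# Simulated infit mean-square values for 30 well-fitting items (expectation 1);
# an item falling outside the bootstrapped interval would be flagged as misfit,
# replacing a fixed rule-of-thumb cutoff.
rng = random.Random(5)
infit = [rng.gauss(1.0, 0.15) for _ in range(30)]
lo, hi = bootstrap_ci(infit)
print(round(lo, 2), round(hi, 2))
```

Because the interval is read off the empirical bootstrap distribution, it adapts to sample size and test length instead of assuming normality.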
A bootstrap procedure to select hyperspectral wavebands related to tannin content
Ferwerda, J.G.; Skidmore, A.K.; Stein, A.
2006-01-01
Detection of hydrocarbons in plants with hyperspectral remote sensing is hampered by overlapping absorption pits, while the `optimal' wavebands for detecting some surface characteristics (e.g. chlorophyll, lignin, tannin) may shift. We combined a phased regression with a bootstrap procedure to find
Bootsie: estimation of coefficient of variation of AFLP data by bootstrap analysis
Bootsie is an English-native replacement for ASG Coelho’s “DBOOT” utility for estimating coefficient of variation of a population of AFLP marker data using bootstrapping. Bootsie improves on DBOOT by supporting batch processing, time-to-completion estimation, built-in graphs, and a suite of export t...
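The bootstrap estimation of a coefficient of variation that Bootsie performs can be sketched as follows; the simulated band intensities are illustrative, not real AFLP data:

```python
import random
from statistics import mean, stdev

def cv(values):
    """Coefficient of variation: standard deviation as a fraction of the mean."""
    return stdev(values) / mean(values)

def bootstrap_cv(values, n_boot=1000, rng=None):
    """Bootstrap estimate of the CV and its standard error."""
    rng = rng or random.Random(0)
    reps = [cv(rng.choices(values, k=len(values))) for _ in range(n_boot)]
    return mean(reps), stdev(reps)

rng = random.Random(11)
intensities = [rng.gauss(100.0, 12.0) for _ in range(50)]  # simulated marker data
estimate, std_err = bootstrap_cv(intensities)
print(round(estimate, 3), round(std_err, 3))
```

Batch processing, as in Bootsie, amounts to running this once per marker population and collecting the pairs.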
DEFF Research Database (Denmark)
Long, Xiangbao; Chomchoei, Roongrat; Hansen, Elo Harald
2004-01-01
The operational characteristics of a novel poly(tetrafluoroethylene) (PTFE) bead material, granular Algoflon®, used for separation and preconcentration of metal ions via adsorption of on-line generated non-charged metal complexes, were evaluated in a sequential injection (SI) system furnished...... material, Aldrich PTFE, which had demonstrated that PTFE was the most promising for solid-state pretreatments. By comparing the two materials, the Algoflon® beads exhibited much higher sensitivity (1.6107 versus 0.2956 μg l-1 per integrated absorbance (s)), and better retention efficiency (82% versus 74...
DEFF Research Database (Denmark)
Linnet, Kristian
2005-01-01
Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors.
Energy Technology Data Exchange (ETDEWEB)
Silva, Cleomacio Miguel da; Amaral, Romilton dos Santos; Santos Junior, Jose Araujo dos; Vieira, Jose Wilson; Leoterio, Dilmo Marques da Silva [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Radioecologia (RAE)], E-mail: cleomaciomiguel@yahoo.com.br; Amaral, Ademir [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Energia Nuclear. Grupo de Estudos em Radioprotecao e Radioecologia
2007-07-01
The distribution of natural radionuclides in samples from typically anomalous environments generally shows great asymmetry, as a result of outliers. To diminish statistical fluctuation, researchers in radioecology commonly use the geometric mean or the median, since the average is not stable under the effect of outliers. As the median is not affected by anomalous values, this parameter of central tendency is the most frequently employed for evaluation of a set of data containing discrepant values. On the other hand, Efron presented a non-parametric method, the so-called bootstrap, that can be used to decrease the dispersion around the central-tendency value. Generally, in radioecology, statistical procedures are used to reduce the effect of anomalous values on averages. In this context, the present study aimed to evaluate the application of the non-parametric bootstrap method (BM) for determining the average concentration of ²²⁶Ra in forage palms (Opuntia spp.) cultivated in soils with a uranium anomaly on dairy farms located in the cities of Pedra and Venturosa, Pernambuco, Brazil, as well as to discuss the utilization of this method in radioecology. The results for ²²⁶Ra in forage palm samples varied from 1,300 to 25,000 mBq kg⁻¹ (dry matter), with an arithmetic mean of 5,965.86 ± 5,903.05 mBq kg⁻¹. The mean obtained using the BM was 5,963.82 ± 1,202.96 mBq kg⁻¹ (dry matter). The use of the BM allowed an automatic filtration of the experimental data, without the elimination of outliers, leading to a reduction of the dispersion around the mean. As a result, the BM permitted reaching an arithmetic mean that is stable against the effects of the outliers. (author)
Sequential Divestiture and Firm Asymmetry
Directory of Open Access Journals (Sweden)
Wen Zhou
2013-01-01
Simple Cournot models of divestiture tend to generate incentives to divest which are too strong, predicting that firms will break up into an infinite number of divisions, resulting in perfect competition. This paper shows that if the order of divestitures is endogenized, firms will always choose sequential, and hence very limited, divestitures. Divestitures favor the larger firm and the follower in a sequential game. Divestitures in which the larger firm is the follower generate greater industry profit and social welfare, but a smaller consumer surplus.
Complementary sequential measurements generate entanglement
Coles, Patrick J.; Piani, Marco
2014-01-01
We present a paradigm for capturing the complementarity of two observables. It is based on the entanglement created by the interaction between the system observed and the two measurement devices used to measure the observables sequentially. Our main result is a lower bound on this entanglement and resembles well-known entropic uncertainty relations. Besides its fundamental interest, this result directly bounds the effectiveness of sequential bipartite operations—corresponding to the measurement interactions—for entanglement generation. We further discuss the intimate connection of our result with two primitives of information processing, namely, decoupling and coherent teleportation.
Transistor switching and sequential circuits
Sparkes, John J
1969-01-01
Transistor Switching and Sequential Circuits presents the basic ideas involved in the construction of computers, instrumentation, pulse communication systems, and automation. This book discusses the design procedure for sequential circuits. Organized into two parts encompassing eight chapters, the book begins with an overview of ways of generating the types of waveforms needed in digital circuits, principally ramps, square waves, and delays. It then considers the behavior of some simple circuits, including the inverter, the emitter follower, and the long-tailed pair. Other cha
Sequential triangulation of orbital photography
Rajan, M.; Junkins, J. L.; Turner, J. D.
1979-01-01
The feasibility of structuring the satellite photogrammetric triangulation as an iterative extended Kalman estimation algorithm is demonstrated. Comparative numerical results of the sequential against the batch estimation algorithm are presented. The difficulty of accurately modeling the attitude motion is overcome by utilizing the on-board angular rate measurements. Solutions of the differential equations and evaluation of the state transition matrix are carried out numerically.
Bootstrap-based confidence estimation in PCA and multivariate statistical process control
DEFF Research Database (Denmark)
Babamoradi, Hamid
Traditional/asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based...... on assumptions that do not always hold in practice. The aim of this thesis was to illustrate the concept of bootstrap-based confidence estimation in PCA and MSPC. It particularly shows how to build bootstrap-based confidence limits in these areas to be used as alternatives to the traditional/asymptotic limits....... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. The bootstrapping algorithm to build confidence limits was illustrated in a case-study format (Paper I). The main steps in the algorithm were discussed, where a set of sensible choices (plus
Langel, Steven E.; Khanafseh, Samer M.; Pervan, Boris
2016-06-01
Differential carrier phase applications that utilize cycle resolution need the probability density function of the baseline estimate to quantify its region of concentration. For the integer bootstrap estimator, the density function has an analytical definition that enables probability calculations given perfect statistical knowledge of measurement and process noise. This paper derives a method to upper bound the tail probability of the integer bootstrapped GNSS baseline when the measurement and process noise correlation functions are unknown, but can be upper and lower bounded. The tail probability is shown to be a non-convex function of a vector of conditional variances, whose feasible region is a convex polytope. We show how to solve the non-convex optimization problem globally by discretizing the polytope into small hyper-rectangular elements, and demonstrate the method for a static baseline estimation problem.
Langel, Steven E.; Khanafseh, Samer M.; Pervan, Boris
2016-11-01
Differential carrier phase applications that utilize cycle resolution need the probability density function of the baseline estimate to quantify its region of concentration. For the integer bootstrap estimator, the density function has an analytical definition that enables probability calculations given perfect statistical knowledge of measurement and process noise. This paper derives a method to upper bound the tail probability of the integer bootstrapped GNSS baseline when the measurement and process noise correlation functions are unknown, but can be upper and lower bounded. The tail probability is shown to be a non-convex function of a vector of conditional variances, whose feasible region is a convex polytope. We show how to solve the non-convex optimization problem globally by discretizing the polytope into small hyper-rectangular elements, and demonstrate the method for a static baseline estimation problem.
2007-01-01
A new chelating resin using chitosan as a base material was synthesized. The functional moiety of 2-amino-5-hydroxybenzoic acid (AHBA) was chemically bonded to the amino group of cross-linked chitosan (CCTS) through the arm of chloromethyloxirane (CCTS-AHBA resin). Several elements, such as Ag, Be, Cd, Co, Cu, Ni, Pb, U, V, and rare earth elements (REEs), could be adsorbed on the resin. To use the resin for on-line pretreatment, it was packed in a mini-column and installed into a sequenti...
Gap bootstrap methods for massive data sets with an application to transportation engineering
Lahiri, S.N.; Spiegelman, C.; Appiah, J.; Rilett, L.
2013-01-01
In this paper we describe two bootstrap methods for massive data sets. Naive applications of common resampling methodology are often impractical for massive data sets due to computational burden and due to complex patterns of inhomogeneity. In contrast, the proposed methods exploit certain structural properties of a large class of massive data sets to break up the original problem into a set of simpler subproblems, solve each subproblem separately where the data exhibit a...
A Simple Voltage Controlled Oscillator Using Bootstrap Circuits and NOR-RS Flip Flop
Chaikla, Amphawan; Pongswatd, Sawai; Sasaki, Hirofumi; Fujimoto, Kuniaki; Yahara, Mitsutoshi
This paper presents a simple and successful design for a voltage controlled oscillator. The proposed circuit is based on the use of two identical bootstrap circuits and a NOR-RS Flip Flop to generate wide-tunable sawtooth and square waves. Increasing control voltage linearly increases the output oscillation frequency. Experimental results verifying the performances of the proposed circuit are in agreement with the calculated values.
DEFF Research Database (Denmark)
Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio
2014-01-01
Industrial applications of computed tomography (CT) for dimensional metrology on various components are fast increasing, owing to a number of favorable properties such as capability of non-destructive internal measurements. Uncertainty evaluation is however more complex than in conventional...... the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components as we show by tests on a hollow cylinder workpiece....
Amplitudes and Correlators to Ten Loops Using Simple, Graphical Bootstraps
Bourjaily, Jacob L; Tran, Vuong-Viet
2016-01-01
We introduce two new graphical-level relations among possible contributions to the four-point correlation function and scattering amplitude in planar, maximally supersymmetric Yang-Mills theory. When combined with the rung rule, these prove powerful enough to fully determine both functions through ten loops. This then also yields the full five-point amplitude to eight loops and the parity-even part to nine loops. We derive these rules, illustrate their applications, compare their relative strengths for fixing coefficients, and survey some of the features of the previously unknown nine and ten loop expressions. Explicit formulae for amplitudes and correlators through ten loops are available at: http://goo.gl/JH0yEc.
Amplitudes and correlators to ten loops using simple, graphical bootstraps
Bourjaily, Jacob L.; Heslop, Paul; Tran, Vuong-Viet
2016-11-01
We introduce two new graphical-level relations among possible contributions to the four-point correlation function and scattering amplitude in planar, maximally supersymmetric Yang-Mills theory. When combined with the rung rule, these prove powerful enough to fully determine both functions through ten loops. This then also yields the full five-point amplitude to eight loops and the parity-even part to nine loops. We derive these rules, illustrate their applications, compare their relative strengths for fixing coefficients, and survey some of the features of the previously unknown nine and ten loop expressions. Explicit formulae for amplitudes and correlators through ten loops are available at: http://goo.gl/JH0yEc.
Sequential Detection of Digital Watermarking
Institute of Scientific and Technical Information of China (English)
LI Li; YU Yu-lian; WANG Pei
2005-01-01
The paper analyzed a new watermarking detection paradigm including double detection thresholds based on sequential hypothesis testing. A joint design of watermarking encoding and detection was proposed. The paradigm had good immunity to noisy signal attacks and high detection probability. Many experiments proved that the above algorithm can detect watermarks about 66% faster than popular detectors, which could have significant impact on many applications such as video watermarking detection and watermark-searching in a large database of digital contents.
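A sequential test with double decision thresholds is classically Wald's SPRT. A minimal sketch under an assumed Gaussian correlation model for the detector statistic (not the paper's exact encoder/detector) is:

```python
import math
import random

def sprt_detect(samples, mu0=0.0, mu1=0.5, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test with two decision thresholds.
    Returns ('present'|'absent'|'undecided', number of samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # crossing it accepts "watermark present"
    lower = math.log(beta / (1 - alpha))   # crossing it accepts "watermark absent"
    llr, t = 0.0, 0
    for t, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma)
        llr += (x * (mu1 - mu0) - (mu1 ** 2 - mu0 ** 2) / 2) / sigma ** 2
        if llr >= upper:
            return "present", t
        if llr <= lower:
            return "absent", t
    return "undecided", t

rng = random.Random(4)
marked = [rng.gauss(0.5, 1.0) for _ in range(5000)]    # detector correlations
unmarked = [rng.gauss(0.0, 1.0) for _ in range(5000)]
print(sprt_detect(marked))
print(sprt_detect(unmarked))
```

The speed-up claimed over fixed-sample detectors comes from the early exits: the test usually stops after a few dozen samples instead of scanning the whole signal.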
Economic growth and energy consumption causal nexus viewed through a bootstrap rolling window
Energy Technology Data Exchange (ETDEWEB)
Balcilar, Mehmet [Department of Economics, Eastern Mediterranean University, Famagusta, Turkish Republic of Northern Cyprus, via Mersin 10 (Turkey); Ozdemir, Zeynel Abidin [Department of Economics, Gazi University, Besevler, 06500, Ankara (Turkey); Arslanturk, Yalcin [Gazi University, Teknikokullar, 06500, Ankara (Turkey)
2010-11-15
One puzzling result in the literature on energy consumption-economic growth causality is the variability of findings, particularly across sample periods, sample sizes, and model specifications. In order to overcome these issues, this paper analyzes the causal links between energy consumption and economic growth for G-7 countries using bootstrap Granger non-causality tests with fixed-size rolling subsamples. The data used include annual total energy consumption and real Gross Domestic Product (GDP) series from 1960 to 2006 for the G-7 countries, excluding Germany, for which the sample period starts in 1971. Using the full-sample bootstrap Granger causality test, we find that there is predictive power from energy consumption to economic growth only for Canada. However, parameter instability tests show that none of the estimated models have constant parameters, and hence the full-sample results are not reliable. Analogous to the full-sample results, the results obtained from the bootstrap rolling-window estimation indicate no consistent causal links between energy consumption and economic growth. We do, however, find that causal links are present between the series in various subsamples. Furthermore, these subsample periods correspond to significant economic events, indicating that the findings are not statistical artefacts but correspond to real economic changes. Our results encompass previous findings and offer an explanation for the varying findings. (author)
Y-90 PET imaging for radiation theragnosis using bootstrap event resampling
Energy Technology Data Exchange (ETDEWEB)
Nam, Taewon; Woo, Sangkeun; Min, Gyungju; Kim, Jimin; Kang, Joohyun; Lim, Sangmoo; Kim, Kyeongmin [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)
2013-05-15
Surgical resection is the most effective method for recovering liver function. However, since most treatment of unresectable-stage hepatocellular carcinoma (HCC) is palliative, Yttrium-90 (Y-90) has come into use as a new treatment because it can be delivered to the tumors and results in greater radiation exposure to the tumors than external radiation. Recently, Y-90 has received much interest and has been studied by many researchers. Imaging of Y-90 has most commonly been conducted using a gamma camera, but PET imaging is required because of the gamma camera's low sensitivity and resolution. The purpose of this study was to assess statistical characteristics and to improve the count rate of the image, enhancing image quality, by using a nonparametric bootstrap method. PET data were improved using the non-parametric bootstrap method, as verified by improved uniformity and SNR. Uniformity showed more improvement under low-count-rate conditions, i.e., Y-90, in the phantom case, and uniformity and SNR showed improvements of 15.6% and 33.8%, respectively, in the mouse case. The bootstrap method performed in this study increased the count rate of the PET image, and consequently the acquisition time can be reduced. It is expected to improve diagnostic performance.
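The core idea, nonparametric bootstrap resampling of detected events, can be illustrated independently of the authors' reconstruction chain. The following sketch is purely hypothetical (a 2D pixel "image" stands in for real PET list-mode/sinogram data, and the phantom, seed, and sizes are invented): events are resampled with replacement to generate replicate images, from which per-pixel statistical characteristics are estimated.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical low-count acquisition: Poisson counts on a 32x32 uniform phantom
image = rng.poisson(5.0, size=(32, 32))

# list-mode representation: one entry per detected event (its pixel index)
events = np.repeat(np.arange(image.size), image.ravel())

def bootstrap_replicate(events, shape, rng):
    """Resample the event list with replacement into one replicate image."""
    sample = rng.choice(events, size=len(events), replace=True)
    return np.bincount(sample, minlength=shape[0] * shape[1]).reshape(shape)

B = 100
reps = np.stack([bootstrap_replicate(events, image.shape, rng) for _ in range(B)])

# statistical characteristics of the counts, pixel by pixel
pixel_mean = reps.mean(axis=0)   # converges to the observed image
pixel_sd = reps.std(axis=0)      # bootstrap estimate of count noise
uniformity = (image.max() - image.min()) / (image.max() + image.min())
```

Because each replicate redistributes exactly the observed number of events, the replicate mean matches the observed image on average, while the replicate spread quantifies the count-limited noise.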
The sound symbolism bootstrapping hypothesis for language acquisition and language evolution.
Imai, Mutsumi; Kita, Sotaro
2014-09-19
Sound symbolism is a non-arbitrary relationship between speech sounds and meaning. We review evidence that, contrary to the traditional view in linguistics, sound symbolism is an important design feature of language, which affects online processing of language, and most importantly, language acquisition. We propose the sound symbolism bootstrapping hypothesis, claiming that (i) pre-verbal infants are sensitive to sound symbolism, due to a biologically endowed ability to map and integrate multi-modal input, (ii) sound symbolism helps infants gain referential insight for speech sounds, (iii) sound symbolism helps infants and toddlers associate speech sounds with their referents to establish a lexical representation and (iv) sound symbolism helps toddlers learn words by allowing them to focus on referents embedded in a complex scene, alleviating Quine's problem. We further explore the possibility that sound symbolism is deeply related to language evolution, drawing the parallel between historical development of language across generations and ontogenetic development within individuals. Finally, we suggest that sound symbolism bootstrapping is a part of a more general phenomenon of bootstrapping by means of iconic representations, drawing on similarities and close behavioural links between sound symbolism and speech-accompanying iconic gesture.
Simulation of bootstrap current in 2D and 3D ideal magnetic fields in tokamaks
Raghunathan, M.; Graves, J. P.; Cooper, W. A.; Pedro, M.; Sauter, O.
2016-09-01
We aim to simulate the bootstrap current for a MAST-like spherical tokamak using two approaches for magnetic equilibria, including externally caused 3D effects such as resonant magnetic perturbations (RMPs) and toroidal ripple, and intrinsic 3D effects such as non-resonant internal kink modes. The first approach relies on known neoclassical coefficients in ideal MHD equilibria, using the Sauter (Sauter et al 1999 Phys. Plasmas 6 2834) expression valid for all collisionalities in axisymmetry, and the second on the quasi-analytic Shaing-Callen (Shaing and Callen 1983 Phys. Fluids 26 3315) model in the collisionless regime for 3D. Using the ideal free-boundary magnetohydrodynamic code VMEC, we compute the flux-surface-averaged bootstrap current density, with the Sauter and Shaing-Callen expressions, for 2D and 3D ideal MHD equilibria including an edge pressure barrier with the application of resonant magnetic perturbations, and for equilibria possessing a saturated non-resonant 1/1 internal kink mode with a weak internal pressure barrier. We compare the applicability of the self-consistent iterative model on the 3D applications and discuss the limitations and advantages of each bootstrap current model for each type of equilibrium.
Finding confidence limits on population growth rates: bootstrap and analytic methods.
Picard, Nicolas; Chagneau, Pierrette; Mortier, Frédéric; Bar-Hen, Avner
2009-05-01
When predicting population dynamics, the value of the prediction alone is not enough and should be accompanied by a confidence interval that integrates the whole chain of errors, from observations to predictions via the estimates of the parameters of the model. Matrix models are often used to predict the dynamics of age- or size-structured populations. Their parameters are vital rates. This study aims (1) at assessing the impact of the variability of observations on vital rates, and then on the model's predictions, and (2) at comparing three methods for computing confidence intervals for values predicted from the models. The first method is the bootstrap. The second method is analytic and approximates the standard error of predictions by their asymptotic variance as the sample size tends to infinity. The third method combines use of the bootstrap to estimate the standard errors of vital rates with the analytic method to then estimate the errors of predictions from the model. Computations are done for an Usher matrix model that predicts the asymptotic (as time goes to infinity) stock recovery rate for three timber species in French Guiana. Little difference is found between the hybrid and the analytic method. Their estimates of bias and standard error converge towards the bootstrap estimates when the error on vital rates becomes small enough, which corresponds in the present case to a number of observations greater than 5000 trees.
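The bootstrap route described here, resampling individual observations, re-estimating vital rates, and recomputing the matrix-model prediction, can be sketched as follows. This is not the authors' Usher model for French Guiana; the 2x2 matrix, fecundity value, sample size, and rates are all invented for illustration, with the asymptotic growth rate (dominant eigenvalue) as the predicted quantity:

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical data: per-tree fates over one census, two size classes
n = 500
fate0 = rng.choice([0, 1, 2], size=n, p=[0.7, 0.2, 0.1])  # stay / grow / die
surv1 = rng.random(n) < 0.9                               # class-2 survival
fecund = 0.4                                              # recruits per class-2 tree (taken as known)

def growth_rate(fate0, surv1):
    """Dominant eigenvalue of the 2x2 Usher-type matrix built from observed rates."""
    p_stay = np.mean(fate0 == 0)
    p_grow = np.mean(fate0 == 1)
    p_surv = np.mean(surv1)
    A = np.array([[p_stay, fecund],
                  [p_grow, p_surv]])
    return np.max(np.real(np.linalg.eigvals(A)))

lam_hat = growth_rate(fate0, surv1)

# bootstrap CI: resample individuals, re-estimate vital rates, recompute the eigenvalue
B = 999
lams = np.array([growth_rate(rng.choice(fate0, n, replace=True),
                             rng.choice(surv1, n, replace=True))
                 for _ in range(B)])
lo, hi = np.percentile(lams, [2.5, 97.5])
```

The analytic alternative mentioned in the abstract would instead propagate the sampling variances of the vital rates through the eigenvalue's sensitivities (a delta-method calculation).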
Stable bootstrap-current driven equilibria for low aspect ratio tokamaks
Energy Technology Data Exchange (ETDEWEB)
Miller, R.L.; Lin-Liu, Y.R.; Turnbull, A.D.; Chan, V.S. [General Atomics, San Diego, CA (United States); Pearlstein, L.D. [Lawrence Livermore National Lab., CA (United States); Sauter, O.; Villard, L. [Ecole Polytechnique Federale, Lausanne (Switzerland). Centre de Recherche en Physique des Plasma (CRPP)
1996-09-01
Low aspect ratio tokamaks can potentially provide a high ratio of plasma pressure to magnetic pressure β and high plasma current I at a modest size, ultimately leading to a high power density compact fusion power plant. For the concept to be economically feasible, bootstrap current must be a major component of the plasma current. A high value of the Troyon factor β_N and strong shaping are required to allow simultaneous operation at high β and high bootstrap current fraction. Ideal magnetohydrodynamic stability of a range of equilibria at aspect ratio 1.4 is systematically explored by varying the pressure profile and shape. The pressure and current profiles are constrained in such a way as to assure complete bootstrap current alignment. Both β_N and β are defined in terms of the vacuum toroidal field. Equilibria with β_N ≥ 8 and β ≈ 35% to 55% exist which are stable to n = ∞ ballooning modes, and stable to n = 0, 1, 2, 3 kink modes with a conducting wall. The dependence of β and β_N with respect to aspect ratio is also considered. (author) 9 figs., 14 refs.
Directory of Open Access Journals (Sweden)
NI PUTU AYU DINITA TRISNAYANTI
2015-06-01
Full Text Available In this research, bootstrap methods are used to determine the inference points of biplot figures in AMMI analysis. If the environmental factors are assumed to be random factors, then Mixed AMMI is used as the model of analysis. In the stability analysis, the interaction principal component scores used are KUI1 and KUI2. The purpose of this study is to determine the biplot figures based on the two KUI scores with the greatest diversity under the Mixed AMMI model, and to determine the inference points using the bootstrap method. The stable genotypes obtained from the AMMI2 biplot are G1, G5, and G6. Based on the inference points of each genotype, G1 and G5 can be regarded as the most stable genotypes, because the distributions of G1 and G5 are the closest to the center point (0,0) and both have a small radius.
On sequential countably compact topological semigroups
Gutik, Oleg V.; Repovš, Dušan
2008-01-01
We study topological and algebraic properties of sequential countably compact topological semigroups similar to compact topological semigroups. We prove that a sequential countably compact topological semigroup does not contain the bicyclic semigroup. Also we show that the closure of a subgroup in a sequential countably compact topological semigroup is a topological group, that the inversion in a Clifford sequential countably compact topological semigroup is continuous and we prove the analogue of the Rees-Suschkewitsch Theorem for simple regular sequential countably compact topological semigroups.
Bootstrap current for the edge pedestal plasma in a diverted tokamak geometry
Energy Technology Data Exchange (ETDEWEB)
Koh, S.; Choe, W. [Korea Advanced Institute of Science and Technology, Department of Physics, Daejeon 305-701 (Korea, Republic of); Chang, C. S.; Ku, S.; Menard, J. E. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Weitzner, H. [Courant Institute of Mathematical Sciences, New York University, New York, New York 10012 (United States)
2012-07-15
The edge bootstrap current plays a critical role in the equilibrium and stability of the steep edge pedestal plasma. The pedestal plasma has an unconventional and difficult neoclassical property, as compared with the core plasma. It has a narrow passing particle region in velocity space that can be easily modified or destroyed by Coulomb collisions. At the same time, the edge pedestal plasma has steep pressure and electrostatic potential gradients whose scale-lengths are comparable with the ion banana width, and includes a magnetic separatrix surface, across which the topological properties of the magnetic field and particle orbits change abruptly. A drift-kinetic particle code XGC0, equipped with a mass-momentum-energy conserving collision operator, is used to study the edge bootstrap current in a realistic diverted magnetic field geometry with a self-consistent radial electric field. When the edge electrons are in the weakly collisional banana regime, surprisingly, the present kinetic simulation confirms that the existing analytic expressions [represented by O. Sauter et al., Phys. Plasmas 6, 2834 (1999)] are still valid in this unconventional region, except in a thin radial layer in contact with the magnetic separatrix. The agreement arises from the dominance of the electron contribution to the bootstrap current compared with the ion contribution, and from a reasonable separation of the trapped-passing dynamics without a strong collisional mixing. However, when the pedestal electrons are in the plateau-collisional regime, there is significant deviation of numerical results from the existing analytic formulas, mainly due to the large effective collisionality of the passing and the boundary-layer trapped particles in the edge region. In a conventional aspect ratio tokamak, the edge bootstrap current from kinetic simulation can be significantly less than that from the Sauter formula if the electron collisionality is high. On the other hand, when the aspect ratio is close to unity
Shen, Meiyu; Machado, Stella G
2016-12-01
Bioequivalence studies are an essential part of the evaluation of generic drugs. The most common in vivo bioequivalence study design is the two-period two-treatment crossover design. The observed drug concentration-time profile for each subject from each treatment under each sequence can be obtained. AUC (the area under the concentration-time curve) and C_max (the maximum concentration) are obtained from the observed drug concentration-time profiles for each subject from each treatment under each sequence. However, such a drug concentration-time profile for each subject cannot be obtained during the development of generic ophthalmic products, since the drug concentration in aqueous humor is measured at only one time point for each eye. Instead, many subjects are assigned to each of several prespecified sampling times. The mean concentration at each sampling time can then be obtained by simple averaging of these subjects' observed concentrations. One profile of the mean concentration vs. time is obtained for each product (test or reference), and one AUC value for each product can be calculated from the mean concentration-time profile using the trapezoidal rule. This article develops a novel nonparametric method for obtaining the 90% confidence interval for the ratio of AUC_T to AUC_R (or C_T,max/C_R,max) in crossover studies by bootstrapping subjects at each time point with replacement, or bootstrapping subjects at all sampling time points with replacement. Here T represents the test product and R the reference product. It also develops a novel nonparametric method for estimating the standard errors (SEs) of AUC_h and C_h,max in parallel studies by bootstrapping subjects treated with the hth product at each time point with replacement, or bootstrapping subjects treated with the hth product at all sampling time points with replacement, h = T, R. Then, 90% confidence intervals for AUC_T/AUC_R and C_T
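The sparse-sampling bootstrap described in this record can be sketched compactly. The code below is an illustration under invented assumptions (sampling times, elimination constants, noise model, and group sizes are all hypothetical, and the simple one-exponential profiles are not from the article): subjects are resampled with replacement at each time point, the mean profile's AUC is recomputed by the trapezoidal rule, and a percentile 90% CI for the AUC ratio is formed.

```python
import numpy as np

rng = np.random.default_rng(7)
times = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # hypothetical sampling times (h)

def sparse_profile(scale, n_per_time, rng):
    """One concentration observation per subject; n_per_time subjects per time."""
    return {t: scale * np.exp(-0.3 * t) * rng.lognormal(0, 0.2, n_per_time)
            for t in times}

def auc_from_means(profile):
    """Trapezoidal AUC of the mean concentration-time profile."""
    means = np.array([profile[t].mean() for t in times])
    return np.sum((means[1:] + means[:-1]) / 2 * np.diff(times))

test_p = sparse_profile(1.05, 20, rng)   # test product, slightly higher exposure
ref_p = sparse_profile(1.00, 20, rng)    # reference product

def boot_ratio(rng):
    # resample subjects with replacement independently at each time point
    tb = {t: rng.choice(v, len(v), replace=True) for t, v in test_p.items()}
    rb = {t: rng.choice(v, len(v), replace=True) for t, v in ref_p.items()}
    return auc_from_means(tb) / auc_from_means(rb)

ratios = np.array([boot_ratio(rng) for _ in range(2000)])
ci90 = np.percentile(ratios, [5, 95])    # 90% CI for AUC_T / AUC_R
```

The article's second variant, bootstrapping subjects across all sampling time points jointly, would resample whole subject indices instead of resampling each time group independently.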
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be of the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables. (2) The resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters, and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC
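A bootstrap CI for a Clayton copula parameter, one of the ingredients this study compares against MCMC, can be sketched as follows. This is not the study's procedure: the sample size, seed, and the simple Kendall-tau inversion estimator are illustrative assumptions, and a parametric bootstrap is used (refitting on samples drawn from the fitted copula).

```python
import numpy as np

def clayton_sample(theta, n, rng):
    """Draw n pairs from a Clayton copula via conditional inversion."""
    u = rng.random(n)
    w = rng.random(n)
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def kendall_tau(x, y):
    """O(n^2) Kendall rank correlation (concordant minus discordant pairs)."""
    n = len(x)
    s = 0
    for i in range(n):
        s += np.sum(np.sign(x[i+1:] - x[i]) * np.sign(y[i+1:] - y[i]))
    return 2.0 * s / (n * (n - 1))

def theta_hat(u, v):
    """Clayton parameter by inverting tau = theta / (theta + 2)."""
    tau = kendall_tau(u, v)
    return 2.0 * tau / (1.0 - tau)

rng = np.random.default_rng(3)
u, v = clayton_sample(theta=2.0, n=200, rng=rng)   # stand-in "observed" drought pairs
th = theta_hat(u, v)

# parametric bootstrap: refit on samples drawn from the fitted copula
boots = np.array([theta_hat(*clayton_sample(th, 200, rng)) for _ in range(200)])
ci = np.percentile(boots, [2.5, 97.5])
```

A nonparametric variant would instead resample the observed (duration, severity) pairs with replacement before refitting; the MCMC alternative in the study samples the posterior of theta directly, optionally with an informative prior.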
Institute of Scientific and Technical Information of China (English)
CHAN Kung-Sik; TONG Howell; STENSETH Nils Chr
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important light on the structure of the prey-predator system. We propose to assess the uncertainty of the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjarvi, Northern Finland.
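The NBB itself is intricate, but its simpler ingredient, the moving block bootstrap, is easy to sketch. The example below is an illustration under invented assumptions (an AR(1) series, block length 10, and the lag-1 autocorrelation as the statistic of interest); it shows how resampling overlapping blocks preserves short-range dependence that an i.i.d. bootstrap would destroy:

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    """One bootstrap replicate: concatenate overlapping blocks drawn at random."""
    n = len(x)
    starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

rng = np.random.default_rng(0)
# hypothetical AR(1) "abundance" series
n, phi = 120, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t-1] + rng.normal()

def acf1(z):
    """Sample lag-1 autocorrelation."""
    z = z - z.mean()
    return np.dot(z[:-1], z[1:]) / np.dot(z, z)

# bootstrap distribution of the lag-1 autocorrelation
reps = np.array([acf1(moving_block_bootstrap(x, 10, rng)) for _ in range(500)])
lo, hi = np.percentile(reps, [2.5, 97.5])
```

The NBB replaces the uniform choice of block starts with a draw from a nonparametric estimate of the future distribution given the realized past of the bootstrap series, and uses random block sizes.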
Confidence limits for the mean of exponential distribution in any time-sequential samples
Institute of Scientific and Technical Information of China (English)
CHEN Jiading; FANG Xiangzhong
2005-01-01
We present the general results determining confidence limits for the mean of exponential distribution in any time-sequential samples, which are obtained in any sequential life tests with replacement or without replacement. Especially, we give the best lower confidence limits in the case of no failure data.
DEFF Research Database (Denmark)
Wang, Jianhua; Hansen, Elo Harald
2002-01-01
An automated sequential injection (SI) on-line solvent extraction-back extraction separation/preconcentration procedure is described. Demonstrated for the assay of cadmium by electrothermal atomic absorption spectrometry (ETAAS), the analyte is initially complexed with ammonium...... pyrrolidinedithiocarbamate (APDC) in citrate buffer and the chelate is extracted into isobutyl methyl ketone (IBMK), which is separated from the aqueous phase by means of a newly designed dual-conical gravitational phase separator. A metered amount of the organic eluate is aspirated and stored in the PTFE holding coil (HC......) of the SI-system. Afterwards, it is dispensed and mixed with an aqueous back extractant of dilute nitric acid containing Hg(II) ions as stripping agent, thereby facilitating a rapid metal-exchange reaction with the APDC ligand and transfer of the Cd into the aqueous phase. The aqueous phase is separated...
Sabarudin, Akhmad; Lenghor, Narong; Oshima, Mitsuko; Hakim, Lukman; Takayanagi, Toshio; Gao, Yun-Hua; Motomizu, Shoji
2007-07-31
A new chelating resin using chitosan as a base material was synthesized. A functional moiety of 2-amino-5-hydroxy benzoic acid (AHBA) was chemically bonded to the amino group of cross-linked chitosan (CCTS) through the arm of chloromethyloxirane (CCTS-AHBA resin). Several elements, such as Ag, Be, Cd, Co, Cu, Ni, Pb, U, V, and rare earth elements (REEs), could be adsorbed on the resin. To use the resin for on-line pretreatment, it was packed in a mini-column and installed into a sequential-injection/automated pretreatment system (Auto-Pret System) coupled with inductively coupled plasma-atomic emission spectrometry (ICP-AES). The sequential-injection/automated pretreatment system was laboratory-assembled, and its control program was written in Visual Basic. This system provides easy operation, low reagent consumption, and little waste production. Experimental variables considered as effective factors in the improvement of sensitivity, such as the eluent concentration, the sample and eluent flow rates, the pH of samples, and the air-sandwiched eluent, were carefully optimized. The proposed system provides excellent on-line collection efficiency, as well as high concentration factors for analytes in water samples, which results in highly sensitive ultra-trace and trace analysis. Under the optimal conditions, the detection limits of the 24 elements examined are in the range from ppt to sub-ppb levels. The proposed method was validated using the standard reference material of a river water, SLRS-4, and its applicability was further demonstrated for the on-line collection/concentration of trace elements, such as Ag, Be, Cd, Co, Cu, Ni, Pb, U, V, and REEs, in water samples.
Sequential Beamforming Synthetic Aperture Imaging
DEFF Research Database (Denmark)
Kortbek, Jacob; Jensen, Jørgen Arendt; Gammelmark, Kim Løkke
2013-01-01
Synthetic aperture sequential beamforming (SASB) is a novel technique which allows implementation of synthetic aperture beamforming on a system with restricted complexity, and without storing RF data. The objective is to improve lateral resolution and obtain a more depth independent resolution...... and a range independent lateral resolution is obtained. The SASB method has been investigated using simulations in Field II and by off-line processing of data acquired with a commercial scanner. The lateral resolution increases with a decreasing F#. Grating lobes appear if F# ≤ 2 for a linear array with λ-pitch...
AlHakeem, Donna Ibrahim
This thesis focuses on short-term photovoltaic forecasting (STPVF) of the power generation of a solar PV system using probabilistic and deterministic forecasts. Uncertainty estimation, in the form of a probabilistic forecast, is emphasized in this thesis to quantify the uncertainties of the deterministic forecasts. Two hybrid intelligent models are proposed in two separate chapters to perform the STPVF. In Chapter 4, the framework of the proposed deterministic hybrid intelligent model is presented, which combines the wavelet transform (WT), a data-filtering technique, with a soft computing model (SCM), the generalized regression neural network (GRNN). The combined WT+GRNN model is used to forecast 1-hour-ahead power generation for two random days in each season. The forecasts are analyzed using accuracy measures to determine the model performance and compared with another SCM. In Chapter 5, the framework of the proposed model is presented, which combines the WT, an SCM based on the radial basis function neural network (RBFNN), and population-based stochastic particle swarm optimization (PSO). Chapter 5 proposes the deterministic WT+RBFNN+PSO approach, and then a probabilistic forecast is conducted using bootstrap confidence intervals to quantify the uncertainty in the output of WT+RBFNN+PSO. In Chapter 5, the forecasts extend the tests done in Chapter 4: the power generation of two random days in each season is forecast 1-hour-ahead, 3-hour-ahead, and 6-hour-ahead. Additionally, different types of days are forecast in each season, such as a sunny day (SD), a cloudy day (CD), and a rainy day (RD). These forecasts are further analyzed using accuracy measures, variance, and uncertainty estimation. The literature that is provided supports that the proposed
Kiselev, V V
2012-01-01
A huge value of the cosmological constant characteristic of particle physics and the inflation of the early Universe are inherently related to each other: one can construct a fine-tuned superpotential which produces a flat inflaton potential with a constant energy density V=\Lambda^4 after taking into account leading effects due to supergravity, so that introducing small quantum loop corrections to the parameters of this superpotential naturally results in a dynamical instability that relaxes the primary cosmological constant by means of an inflationary regime. The model phenomenologically agrees with observational data on the large-scale structure of the Universe at \Lambda ~ 10^{16} GeV.
Shimada, Hirohiko; Hikami, Shinobu
2016-12-01
The fractal dimensions of polymer chains and of high-temperature graphs in the Ising model, both in three dimensions, are determined using the conformal bootstrap applied to the continuation of the O(N) models from N=1 (Ising model) to N=0 (polymer). Even for non-integer N, the O(N) sum rule allows one to study the unitarity bound formally defined from positivity, which may be violated in a non-unitary CFT. This unitarity bound of the scaling dimension for the O(N)-symmetric tensor develops a kink as a function of that of the fundamental field, as in the case of the energy operator dimension in the Z_2 (Ising) sum rule. Although this kink structure becomes less pronounced as N tends to zero, we find instead an emerging asymmetric minimum in the current central charge C_J. Despite the non-unitarity of the O(N) model at non-integer N, we find that the C_J kink along the unitarity bound lies very close to the location of the infrared (IR) O(N) CFT estimated by other methods. It is pointed out that certain level degeneracies at the IR CFT should induce these singular shapes of the unitarity bounds. As an application to quantum and classical spin systems, we also predict critical exponents associated with the N=1 supersymmetry, which could be relevant for locating the corresponding fixed point in the phase diagram.
Structural features of sequential weak measurements
Diósi, Lajos
2016-07-01
We discuss the abstract structure of sequential weak measurement (WM) of general observables. In all orders, the sequential WM correlations without postselection yield the corresponding correlations of the Wigner function, offering direct quantum tomography through the moments of the canonical variables. Correlations in spin-1/2 sequential weak measurements coincide with those in strong measurements; they are constrained kinematically, and they are equivalent to single measurements. In sequential WMs with postselection, an anomaly occurs, different from the weak value anomaly of single WMs. In particular, the spread of the polarization σ̂ as measured in double WMs of σ̂ will diverge for certain orthogonal pre- and postselected states.
Takemoto, Seiji; Yamaoka, Kiyoshi; Nishikawa, Makiya; Takakura, Yoshinobu
2006-12-01
A bootstrap method is proposed for assessing statistical histograms of pharmacokinetic parameters (AUC, MRT, CL and V(ss)) from one-point sampling data in animal experiments. A computer program, MOMENT(BS), written in Visual Basic on Microsoft Excel, was developed for the bootstrap calculation and the construction of histograms. MOMENT(BS) was applied to one-point sampling data of the blood concentration of three physiologically active proteins ((111)In-labeled Hsp70, Suc(20)-BSA and Suc(40)-BSA) administered at different doses to mice. The histograms of AUC, MRT, CL and V(ss) were close to a normal (Gaussian) distribution for a bootstrap resampling number of 200 or more, considering the skewness and kurtosis of the histograms. Good agreement of the means and SDs was obtained between the bootstrap and Bailer's approaches. The hypothesis test based on the normal distribution clearly demonstrated that the disposition of (111)In-Hsp70 and Suc(20)-BSA was almost independent of dose, whereas that of (111)In-Suc(40)-BSA was definitely dose-dependent. In conclusion, the bootstrap method was found to be an efficient method for assessing the histograms of pharmacokinetic parameters from one-point sampling data of blood or tissue disposition.
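The one-point-sampling bootstrap can be sketched without the MOMENT(BS) program itself. In the illustration below (not the authors' code; the times, dose, group sizes, and exponential profile are invented), each animal contributes a single concentration at one time point; animals are resampled with replacement within each time group, and AUC, MRT, and CL are recomputed from the moments of the mean profile to build their histograms:

```python
import numpy as np

rng = np.random.default_rng(11)
times = np.array([5, 15, 30, 60, 120])   # hypothetical minutes post-dose
dose = 100.0
# one-point sampling: each animal contributes one concentration at one time
groups = {t: 2.0 * np.exp(-0.02 * t) * rng.lognormal(0, 0.25, 8) for t in times}

def moments(groups):
    """AUC, MRT, CL from the mean concentration-time profile (trapezoidal)."""
    c = np.array([groups[t].mean() for t in times])
    auc = np.sum((c[1:] + c[:-1]) / 2 * np.diff(times))
    aumc = np.sum((times[1:] * c[1:] + times[:-1] * c[:-1]) / 2 * np.diff(times))
    return auc, aumc / auc, dose / auc   # AUC, MRT, CL

# bootstrap: resample animals within each time group, recompute the moments
B = 1000
boot = np.array([moments({t: rng.choice(v, len(v), replace=True)
                          for t, v in groups.items()}) for _ in range(B)])
auc_hist, mrt_hist, cl_hist = boot.T

# rough normality check via the skewness of the AUC histogram
skew = np.mean(((auc_hist - auc_hist.mean()) / auc_hist.std()) ** 3)
```

Near-zero skewness (and kurtosis near 3) of the bootstrap histograms is what justifies the normal-theory hypothesis tests mentioned in the abstract.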
Immediate Sequential Bilateral Cataract Surgery
DEFF Research Database (Denmark)
Kessel, Line; Andresen, Jens; Erngaard, Ditte
2015-01-01
The aim of the present systematic review was to examine the benefits and harms associated with immediate sequential bilateral cataract surgery (ISBCS) with specific emphasis on the rate of complications, postoperative anisometropia, and subjective visual function in order to formulate evidence......-based national Danish guidelines for cataract surgery. A systematic literature review in PubMed, Embase, and Cochrane central databases identified three randomized controlled trials that compared outcome in patients randomized to ISBCS or bilateral cataract surgery on two different dates. Meta-analyses were...... performed using the Cochrane Review Manager software. The quality of the evidence was assessed using the GRADE method (Grading of Recommendation, Assessment, Development, and Evaluation). We did not find any difference in the risk of complications or visual outcome in patients randomized to ISBCS or surgery...
Institute of Scientific and Technical Information of China (English)
郑蓉建; 周林成; 潘丰
2012-01-01
Fault monitoring of a bioprocess is important to ensure the safety of the reactor and to maintain high product quality. It is difficult to build an accurate mechanistic model for a bioprocess, so fault monitoring based on a rich historical or online database is an effective alternative. A group of data can be resampled stochastically with the bootstrap method, improving the generalization capability of a model. In this paper, online fault monitoring with generalized additive models (GAMs) combined with the bootstrap is proposed for the glutamate fermentation process. GAMs and the bootstrap are first used to decide confidence intervals based on the online and off-line normal sample data from glutamate fermentation experiments. Then GAMs are used for online fault monitoring of time, dissolved oxygen, oxygen uptake rate, and carbon dioxide evolution rate. The method can provide accurate fault alarms online and is helpful for providing useful information for removing faults and abnormal phenomena in the fermentation.
Analytic Bounds and Emergence of $\\textrm{AdS}_2$ Physics from the Conformal Bootstrap
Mazac, Dalimil
2016-01-01
We study analytically the constraints of the conformal bootstrap on the low-lying spectrum of operators in field theories with global conformal symmetry in one and two spacetime dimensions. We introduce a new class of linear functionals acting on the conformal bootstrap equation. In 1D, we use the new basis to construct extremal functionals leading to the optimal upper bound on the gap above identity in the OPE of two identical primary operators of integer or half-integer scaling dimension. We also prove an upper bound on the twist gap in 2D theories with global conformal symmetry. When the external scaling dimensions are large, our functionals provide a direct point of contact between crossing in a 1D CFT and scattering of massive particles in large $\\textrm{AdS}_2$. In particular, CFT crossing can be shown to imply that appropriate OPE coefficients exhibit an exponential suppression characteristic of massive bound states, and that the 2D flat-space S-matrix should be analytic away from the real axis.
Bootstrapping Mixed Correlators in the Five Dimensional Critical O(N) Models
Li, Zhijin
2016-01-01
We use the conformal bootstrap approach to explore $5D$ CFTs with $O(N)$ global symmetry, which contain $N$ scalars $\phi_i$ transforming as an $O(N)$ vector. Specifically, we study multiple four-point correlators of the leading $O(N)$ vector $\phi_i$ and the $O(N)$ singlet $\sigma$. The crossing symmetry of the four-point functions and the unitarity condition provide nontrivial constraints on the scaling dimensions ($\Delta_\phi$, $\Delta_\sigma$) of $\phi_i$ and $\sigma$. With reasonable assumptions on the gaps between the scaling dimensions of $\phi_i$ ($\sigma$) and the next $O(N)$ vector (singlet) scalar, we are able to isolate the scaling dimensions $(\Delta_\phi$, $\Delta_\sigma)$ in small islands. In particular, for large $N=500$, the isolated region is highly consistent with the result obtained from the large $N$ expansion. We also study the interacting $O(N)$ CFTs for $1\leqslant N\leqslant 100$. Isolated regions on the $(\Delta_\phi,\Delta_\sigma)$ plane are obtained using the conformal bootstrap program with lower or...
Impurities in a non-axisymmetric plasma: Transport and effect on bootstrap current
Energy Technology Data Exchange (ETDEWEB)
Mollén, A., E-mail: albertm@chalmers.se [Department of Applied Physics, Chalmers University of Technology, Göteborg (Sweden); Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Landreman, M. [Institute for Research in Electronics and Applied Physics, University of Maryland, College Park, Maryland 20742 (United States); Smith, H. M.; Helander, P. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); Braun, S. [Max-Planck-Institut für Plasmaphysik, 17491 Greifswald (Germany); German Aerospace Center, Institute of Engineering Thermodynamics, Pfaffenwaldring 38-40, D-70569 Stuttgart (Germany)
2015-11-15
Impurities cause radiation losses and plasma dilution, and in stellarator plasmas the neoclassical ambipolar radial electric field is often unfavorable for avoiding strong impurity peaking. In this work we use a new continuum drift-kinetic solver, the SFINCS code (the Stellarator Fokker-Planck Iterative Neoclassical Conservative Solver) [M. Landreman et al., Phys. Plasmas 21, 042503 (2014)], which employs the full linearized Fokker-Planck-Landau operator, to calculate neoclassical impurity transport coefficients for a Wendelstein 7-X (W7-X) magnetic configuration. We compare SFINCS calculations with theoretical asymptotes in the high collisionality limit. We observe and explain a 1/ν scaling of the inter-species radial transport coefficient at low collisionality, which arises from the field term in the inter-species collision operator and is not found with simplified collision models even when momentum correction is applied. However, this type of scaling disappears if a radial electric field is present. We also use SFINCS to analyze how the impurity content affects the neoclassical impurity dynamics and the bootstrap current. We show that a change in the plasma effective charge Z_eff of order unity can affect the bootstrap current enough to cause a deviation in the divertor strike point locations.
Directory of Open Access Journals (Sweden)
Tásia Hickmann
2015-11-01
In this paper, an iterative forecasting methodology for time series prediction that integrates wavelet de-noising and decomposition with Artificial Neural Network (ANN) and Bootstrap methods is put forward. Basically, a given time series to be forecasted is initially decomposed into trend and noise (wavelet) components by using a wavelet de-noising algorithm. Both trend and noise components are then further decomposed by means of a wavelet decomposition method, producing orthonormal Wavelet Components (WCs) for each one. Each WC is separately modelled through an ANN in order to provide both in-sample and out-of-sample forecasts. At each time t, the respective forecasts of the WCs of the trend and noise components are simply added to produce the in-sample and out-of-sample forecasts of the underlying time series. Finally, out-of-sample predictive densities are empirically simulated by the Bootstrap sampler, and confidence intervals are then yielded for a given level of credibility. The proposed methodology, when applied to the well-known Canadian lynx data, which exhibit non-linearity and non-Gaussian properties, outperformed other methods traditionally used to forecast them.
Gray bootstrap method for estimating frequency-varying random vibration signals with small samples
Directory of Open Access Journals (Sweden)
Wang Yanqing
2014-04-01
During environment testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating RVS from a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
Directory of Open Access Journals (Sweden)
Gu Xun
2007-03-01
Background: Phylogenetically related miRNAs (miRNA families) convey important information about the function and evolution of miRNAs. Due to the special sequence features of miRNAs, pair-wise sequence identity between miRNA precursors alone is often inadequate for unequivocally judging the phylogenetic relationships between miRNAs. Most of the current methods for miRNA classification rely heavily on manual inspection and lack measurements of the reliability of the results. Results: In this study, we designed an analysis pipeline, the Phylogeny-Bootstrap-Cluster (PBC) pipeline, to identify miRNA families based on branch stability in the bootstrap trees derived from overlapping genome-wide miRNA sequence sets. We tested the PBC analysis pipeline with the miRNAs from six animal species: H. sapiens, M. musculus, G. gallus, D. rerio, D. melanogaster, and C. elegans. The resulting classification was compared with the miRNA families defined in miRBase. The two classifications were largely consistent. Conclusion: The PBC analysis pipeline is an efficient method for classifying large numbers of heterogeneous miRNA sequences. It requires minimal human involvement and provides measurements of the reliability of the classification results.
Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.
2010-01-01
Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
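The null-model bootstrap described above can be sketched in a few lines. This is a simplified illustration rather than the authors' exact procedure: it ignores undetected species, and both the multinomial null (stationary relative abundances with period-specific totals) and the slope-variance heterogeneity statistic are assumptions made for the sketch.

```python
import numpy as np

def trend_heterogeneity(counts):
    """Variance across species of the slope of log(count+1) vs. time."""
    t = np.arange(counts.shape[1], dtype=float)
    t -= t.mean()
    y = np.log(counts + 1.0)
    slopes = (y * t).sum(axis=1) / (t ** 2).sum()
    return slopes.var()

def bootstrap_null_test(counts, n_boot=999, seed=0):
    """P-value against the null of random sampling from a stationary
    species abundance distribution with period-specific totals.
    counts: species x time matrix of individual counts."""
    rng = np.random.default_rng(seed)
    totals = counts.sum(axis=0)               # individuals per period
    p = counts.sum(axis=1) / counts.sum()     # pooled relative abundances
    obs = trend_heterogeneity(counts)
    null = np.array([
        trend_heterogeneity(
            np.column_stack([rng.multinomial(n, p) for n in totals]))
        for _ in range(n_boot)])
    return (1 + (null >= obs).sum()) / (n_boot + 1)
```

A small observed p-value indicates that species-specific trends are more heterogeneous than sampling variation under a stationary community would produce.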
A bootstrapped switch employing a new clock feed-through compensation technique
Institute of Scientific and Technical Information of China (English)
Wu Xiaofeng; Liu Hongxia; Su Li; Hao Yue; Li Di; Hu Shigang
2009-01-01
Nonlinearity caused by the clock feed-through of a bootstrapped switch and its compensation techniques are analyzed. All kinds of clock feed-through compensation configurations and their drawbacks are also investigated. It is pointed out that the delay path match of the clock boosting circuit is the critical factor that affects the effectiveness of clock feed-through compensation. Based on that, a new clock feed-through compensation configuration and a corresponding bootstrapped switch are presented and designed optimally in the UMC mixed-mode/RF 0.18 μm 1P6M P-sub twin-well CMOS process by orienting and elaborately designing the switch MOSFETs that influence the delay path match of the clock boosting circuit. HSPICE simulation results show that the proposed clock feed-through compensation configuration not only enhances the sampling accuracy under variations of process, power supply voltage, temperature and capacitance, but also effectively decreases the even harmonics, the high-order odd harmonics and the THD as a whole.
Lenardic, A.; Hoink, T.
2008-12-01
Several studies have highlighted the role of a low viscosity asthenosphere in promoting plate-like behavior in mantle convection models. It has also been argued that the asthenosphere is fed by mantle plumes (Phipps Morgan et al. 1993; Deffeyes 1972) and that the existence of the specific plume types required for this depends on plate subduction (Lenardic and Kaula 1995; Jellinek et al. 2002). Independent of plumes, plate subduction can generate a non-adiabatic temperature gradient which, together with temperature dependent mantle viscosity, leads to a low viscosity near-surface region. The above suggests a conceptual model in which the asthenosphere cannot be defined solely in terms of material properties but must also be defined in terms of an active process, plate tectonics, which both maintains it and is maintained by it. The bootstrap aspect of the model is its circular causality between plates and the asthenosphere, neither being more fundamental than the other and the existence of each depending on the other. Several of the feedbacks key to the conceptual model will be quantified. The implications for modeling mantle convection in a plate-tectonic mode will also be discussed: 1) a key is to get numerical simulations into the bootstrap mode of operation, which depends on the assumed initial conditions; 2) the model implies potentially strong hysteresis effects (e.g., transitions between convection states, associated with a variable yield stress, will occur at different values depending on whether the yield stress is systematically lowered or raised between successive models).
Salleh, Mad Ithnin; Ismail, Shariffah Nur Illiana Syed; Habidin, Nurul Fadly; Latip, Nor Azrin Md; Ishak, Salomawati
2014-12-01
The Malaysian government has put greater attention on enhancing productivity in the Technical and Vocational Education and Training (TVET) sector to increase the development of a skilled workforce by the year 2020. The implementation of the National Higher Education Strategic Plan (NHESP) in 2007 led to changes in the Malaysian polytechnics sector. The study of efficiency and productivity thus makes it possible to identify scope for improvement so that institutions can perform in a more efficient and effective manner. This paper aims to assess the efficiency and productivity of the 24 polytechnic main campuses existing as of 2007. It applies bootstrapped Malmquist indices to investigate the effects of NHESP on technical efficiency and productivity changes in the individual Malaysian polytechnics over the years 2007-2010. This method enables a more robust analysis of technical efficiency and productivity changes among polytechnics: the bootstrap simulation method can identify whether or not the computed productivity changes are statistically significant. The paper finds that the overall mean efficiency score demonstrates significant growth. In addition, the sector as a whole underwent positive productivity growth at the frontier during the post-NHESP period, except in 2009-2010. The increase in productivity growth during the post-NHESP period was mainly led by technological growth. The empirical results indicate that during the post-NHESP period all polytechnics showed significant TFP growth. This finding shows NHESP's contribution to the positive growth in the overall performance of Malaysia's polytechnics sector.
Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.
Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena
2016-12-08
Temporal changes in the magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes, so that a meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ². In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
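As a rough sketch of the idea (not the authors' statistic, which must also handle the dependencies induced by estimating the heterogeneity parameter τ²), a retrospective CUSUM-type test with bootstrap critical values might look like this:

```python
import numpy as np

def cusum_stat(x):
    """Max absolute standardized cumulative sum of centered values."""
    n = len(x)
    c = np.cumsum(x - x.mean())
    return np.abs(c).max() / (x.std(ddof=1) * np.sqrt(n))

def cusum_change_test(effects, n_boot=999, seed=0):
    """Retrospective CUSUM-type test for a shift in mean effect size.
    Critical values come from i.i.d. bootstrap resampling, which
    destroys any change-point structure (the no-change null)."""
    rng = np.random.default_rng(seed)
    effects = np.asarray(effects, dtype=float)
    obs = cusum_stat(effects)
    boot = np.array([
        cusum_stat(rng.choice(effects, size=len(effects), replace=True))
        for _ in range(n_boot)])
    p = (1 + (boot >= obs).sum()) / (n_boot + 1)
    return obs, p
```

Plotting the cumulative sums themselves against study order gives the CUSUM-style chart the abstract mentions; the test above just summarizes that trajectory in a single statistic.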
Analyzing Sequential Patterns in Retail Databases
Institute of Scientific and Technical Information of China (English)
Unil Yun
2007-01-01
Finding correlated sequential patterns in large sequence databases is one of the essential tasks in data mining, since a huge number of sequential patterns are usually mined, but it is hard to find sequential patterns with the correlation. According to the requirements of real applications, the needed data analysis should be different. In previous mining approaches, after mining the sequential patterns, sequential patterns with weak affinity are found even with a high minimum support. In this paper, a new framework is suggested for mining weighted support affinity patterns, in which an objective measure, sequential ws-confidence, is developed to detect correlated sequential patterns with weighted support affinity. To efficiently prune the weak affinity patterns, it is proved that the ws-confidence measure satisfies the anti-monotone and cross-weighted support properties, which can be applied to eliminate sequential patterns with dissimilar weighted support levels. Based on this framework, a weighted support affinity pattern mining algorithm (WSMiner) is suggested. The performance study shows that WSMiner is efficient and scalable for mining weighted support affinity patterns.
Sequential auctions, price trends, and risk preferences
Hu, A.; Zou, L.
2015-01-01
We analyze sequential auctions in a general environment where bidders are heterogeneous in risk exposures and exhibit non-quasilinear utilities. We derive a pure strategy symmetric equilibrium for the sequential Dutch and Vickrey auctions respectively, with an arbitrary number of identical objects f
An automatic, refrigerated, sequential precipitation sampler
Coscio, M. R.; Pratt, G. C.; Krupa, S. V.
The design and characteristics of an automated, refrigerated, sequential precipitation sampler are described. This sampler can collect rainfall on an event basis or as sequential segments within a rain event. Samples are sealed upon collection to prevent gas exchange and are refrigerated in situ at 4 ± 2 °C. This sampler is commercially available.
Sequential association rules in atonal music
A. Honingh; T. Weyde; D. Conklin
2009-01-01
This paper describes a preliminary study on the structure of atonal music. In the same way as sequential association rules of chords can be found in tonal music, sequential association rules of pitch class set categories can be found in atonal music. It has been noted before that certain pitch class
Trial Sequential Methods for Meta-Analysis
Kulinskaya, Elena; Wood, John
2014-01-01
Statistical methods for sequential meta-analysis also have applications for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…
Lattice Sequential Decoder for Coded MIMO Channel: Performance and Complexity Analysis
Abediseid, Walid
2011-01-01
In this paper, the performance limit of the lattice sequential decoder for the lattice space-time coded MIMO channel is analysed. We determine the rates achievable by lattice coding and sequential decoding applied to such a channel. The diversity-multiplexing tradeoff (DMT) under lattice sequential decoding is derived as a function of its parameter, the bias term, which is critical for controlling the amount of computation required at the decoding stage. Achieving low decoding complexity requires increasing the value of the bias term; however, this comes at the expense of losing the optimal tradeoff of the channel. We show how such a decoder can bridge the gap between the lattice decoder and low complexity decoders. Moreover, the computational complexity of the lattice sequential decoder is analysed. Specifically, we derive the tail distribution of the decoder's computational complexity in the high signal-to-noise ratio regime. Similar to the conventional sequential decoder used in the discrete memoryless channel,...
Institute of Scientific and Technical Information of China (English)
CHAN; Kung-Sik; TONG; Howell; STENSETH; Nils; Chr
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty on the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjärvi, Northern Finland.
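The moving block bootstrap that the NBB builds on is simple to sketch. The NBB itself additionally draws each block from a nonparametric estimate of the conditional future distribution given the realized past; that refinement is omitted here, so this is only the simpler building block:

```python
import numpy as np

def moving_block_bootstrap(x, block_len, rng):
    """One moving-block bootstrap replicate of a time series:
    draw overlapping blocks uniformly with replacement, concatenate,
    and truncate to the original length. Serial dependence is
    preserved within blocks (but broken at block joins)."""
    x = np.asarray(x)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]
```

Applying this many times to a statistic such as the lag-1 autocorrelation yields a bootstrap distribution that respects short-range dependence, which the plain i.i.d. bootstrap would destroy.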
SPATIAL DISTRIBUTION AND SEQUENTIAL SAMPLING OF Brevipalpus phoenicis IN CITRUS
Directory of Open Access Journals (Sweden)
WALTER MALDONADO JR
Among the pests of citrus, one of the most important is the red and black flat mite Brevipalpus phoenicis (Geijskes), which transmits the Citrus leprosis virus C (CiLV-C). When a rational pest control plan is adopted, it is important to determine the correct timing for carrying out the control plan. Making this decision demands constant follow-up of the crop through periodic sampling, where knowledge of the spatial distribution of the pest is fundamental to improving sampling and control decisions. The objective of this work was to study the spatial distribution pattern and build a sequential sampling plan for the pest. The data used were gathered from two blocks of Valencia sweet orange on a farm in São Paulo State, Brazil, by 40 inspectors trained for the data collection. The following aggregation indices were calculated: the variance/mean ratio, Morisita's index, Green's coefficient, and the k parameter of the negative binomial distribution. The data were tested for fit with the Poisson and negative binomial distributions using the chi-square goodness-of-fit test. The sequential sampling plan was developed using Wald's Sequential Probability Ratio Test and validated through simulations. We conclude that the spatial distribution of B. phoenicis is aggregated, its behavior is best fitted by the negative binomial distribution, and we built and validated a sequential sampling plan for control decision-making.
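Wald's Sequential Probability Ratio Test, on which such sampling plans rest, is easy to sketch. The plan above fitted a negative binomial distribution; the version below uses a Poisson count model for brevity, and the two density hypotheses and error rates are illustrative assumptions, not values from the study:

```python
import numpy as np

def sprt_poisson(counts, m0, m1, alpha=0.05, beta=0.1):
    """Wald's SPRT deciding between mean pest densities m0 (no action)
    and m1 (treat), from per-sample counts under a Poisson model.
    Returns 'accept_m0', 'accept_m1', or 'continue' (keep sampling)."""
    A = np.log((1 - beta) / alpha)   # upper stopping boundary (treat)
    B = np.log(beta / (1 - alpha))   # lower stopping boundary (no action)
    llr = 0.0
    for x in counts:
        # Poisson log-likelihood-ratio increment for one sample
        llr += x * np.log(m1 / m0) - (m1 - m0)
        if llr >= A:
            return "accept_m1"
        if llr <= B:
            return "accept_m0"
    return "continue"
```

In field use the boundaries are usually redrawn as two parallel lines of cumulative count versus sample number, so an inspector can stop sampling as soon as the running total crosses either line.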
Finding Sequential Patterns from Large Sequence Data
Esmaeili, Mahdi
2010-01-01
Data mining is the task of discovering interesting patterns from large amounts of data. There are many data mining tasks, such as classification, clustering, association rule mining, and sequential pattern mining. Sequential pattern mining finds sets of data items that occur together frequently in some sequences. Sequential pattern mining, which extracts frequent subsequences from a sequence database, has attracted a great deal of interest in recent data mining research because it is the basis of many applications, such as web user analysis, stock trend prediction, DNA sequence analysis, finding language or linguistic patterns from natural language texts, and using the history of symptoms to predict a certain kind of disease. Given the diversity of the applications, it may not be possible to apply a single sequential pattern model to all these problems. Each application may require a unique model and solution. A number of research projects were established in recent years to develop meaningful sequential pattern...
Multi-agent sequential hypothesis testing
Kim, Kwang-Ki K.
2014-12-15
This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
Braun, M
1995-01-01
The bootstrap condition is generalized to n reggeized gluons. As a result it is demonstrated that the intercept generated by n reggeized gluons cannot be lower than the one for n=2. Arguments are presented that in the limit $N_{c}\rightarrow\infty$ the bootstrap condition reduces the n-gluon chain with interacting neighbours to a single BFKL pomeron. In this limit the leading contribution from n gluons corresponds to n/2 non-interacting BFKL pomerons (the n/2-pomeron cut). The sum over n leads to a unitary $\gamma^{\ast}\gamma$ amplitude of the eikonal form.
Energy Technology Data Exchange (ETDEWEB)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.; Baddeley, Robert L.; Riensche, Roderick M.; Jensen, Russell S.; Verhagen, Marc; Pustejovsky, James
2011-02-18
Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level $h_L$. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels $\infty > h_0 > h_1 > \cdots > h_L$. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, relative to exact sampling and Monte Carlo for the distribution at the finest level $h_L$. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
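The underlying MLMC telescoping identity $E[P_L] = E[P_0] + \sum_{l=1}^{L} E[P_l - P_{l-1}]$ can be illustrated on a toy problem. The sketch below uses Euler-Maruyama for geometric Brownian motion, which is my choice of example rather than the paper's PDE setting, and it is plain MLMC, not the SMC version the article develops. The coupling (fine and coarse paths sharing Brownian increments) is what makes the correction terms low-variance:

```python
import numpy as np

def mlmc_estimate(levels, n_samples, T=1.0, s0=1.0, mu=0.05, sig=0.2,
                  seed=0):
    """Multilevel Monte Carlo estimate of E[S_T] for geometric Brownian
    motion via Euler-Maruyama. Level l uses 2**l time steps; levels
    must start at 0 (the base term of the telescoping sum)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for l, n in zip(levels, n_samples):
        m = 2 ** l
        dt = T / m
        dw = rng.normal(0.0, np.sqrt(dt), size=(n, m))
        s_f = np.full(n, s0)
        for k in range(m):                    # fine path, step size dt
            s_f = s_f * (1 + mu * dt + sig * dw[:, k])
        if l == 0:
            total += s_f.mean()               # base level E[P_0]
        else:
            s_c = np.full(n, s0)
            dw_c = dw[:, ::2] + dw[:, 1::2]   # shared coarse increments
            for k in range(m // 2):           # coupled coarse path, 2*dt
                s_c = s_c * (1 + mu * 2 * dt + sig * dw_c[:, k])
            total += (s_f - s_c).mean()       # correction E[P_l - P_{l-1}]
    return total
```

Because the corrections shrink with level, most samples can be spent on the cheap coarse levels, which is the source of the computational savings the abstract describes.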
Institute of Scientific and Technical Information of China (English)
HALL Peter
2009-01-01
This is a very attractive article. It combines fascinating new methodology with a most interesting dataset, and a highly motivating presentation. However, despite the many opportunities for discussion, I am going to confine attention to the issue of the block bootstrap, ingeniously developed in this paper into the nearest block bootstrap.
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
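A minimal sketch of the resampling step, assuming an AR(1) model for log flows; the paper's actual stochastic streamflow model is not specified here, so the model form and its fitting are illustrative assumptions:

```python
import numpy as np

def bootstrap_position_analysis(log_flows, start, horizon, n_seq, seed=0):
    """Generate flow sequences by fitting an AR(1) to log flows and
    resampling its empirical innovations with replacement, so no
    normality assumption is needed. Sequences are conditioned on the
    current state 'start' (a log flow)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(log_flows, dtype=float)
    mu = x.mean()
    z = x - mu
    phi = (z[1:] @ z[:-1]) / (z[:-1] @ z[:-1])  # lag-1 AR coefficient
    resid = z[1:] - phi * z[:-1]                # empirical innovations
    seqs = np.empty((n_seq, horizon))
    for i in range(n_seq):
        state = start - mu
        for t in range(horizon):
            state = phi * state + rng.choice(resid)
            seqs[i, t] = np.exp(state + mu)
    return seqs
```

Feeding each generated sequence through the storage-and-delivery simulation and counting how often reservoir levels or flows fall below the critical thresholds gives the conditional drought-risk forecast described above.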
A bootstrapped, low-noise, and high-gain photodetector for shot noise measurement
Energy Technology Data Exchange (ETDEWEB)
Zhou, Haijun; Yang, Wenhai; Li, Zhixiu; Li, Xuefeng; Zheng, Yaohui, E-mail: yhzheng@sxu.edu.cn [State Key Laboratory of Quantum Optics and Quantum Optics Devices, Institute of Opto-Electronics, Shanxi University, Taiyuan 030006 (China)
2014-01-15
We present a low-noise, high-gain photodetector based on the bootstrap structure and an L-C (inductance and capacitance) combination. Electronic characteristics of the photodetector, including electronic noise, gain, frequency response, and dynamic range, were verified with a single-frequency Nd:YVO4 laser at 1064 nm with coherent output. The measured shot noise of a 50 μW laser beam was 13 dB above the electronic noise at an analysis frequency of 2 MHz, and 10 dB at 3 MHz; a maximum clearance of 28 dB at 2 MHz was achieved when 1.52 mW of laser light was illuminated. In addition, the photodetector showed excellent linearity for both DC and AC amplification over the laser power range between 12.5 μW and 1.52 mW.
Karian, Zaven A
2000-01-01
Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from (all with their own formulas, tables, diagrams, and general properties) continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well? Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances, covering bivariate as well as univariate distributions, and including situations where moments do...
Dynamic Assessment of Vibration of Tooth Modification Gearbox Using Grey Bootstrap Method
Directory of Open Access Journals (Sweden)
Hui-liang Wang
2015-01-01
The correlation between gear modification and the vibration characteristics of a transmission system is difficult to quantify; a novel small-sample method for predicting gearbox vibration, based on grey system theory and bootstrap theory, is presented. The method characterizes the baseline vibration features of a tooth-modified gearbox through the dynamic uncertainty, the estimated true value, and a systematic error measure, and these parameters can indirectly and dynamically evaluate the effect of tooth modification. The method can evaluate the vibration signal of the gearbox with no tooth modification and with topological tooth modification installed, respectively, taking 100% reliability as the constraint condition and minimum average uncertainty as the target value. Computer simulation and experimental results showed that the vibration amplitude of the gearbox was partly decreased by topological tooth modification, and each value of the average dynamic uncertainty, mean true value, and systematic error measure was smaller than the corresponding value without tooth modification. The study provides an important guide for tooth modification and dynamic performance optimization.
Solid oxide fuel cell power plant having a bootstrap start-up system
Lines, Michael T
2016-10-04
The bootstrap start-up system (42) achieves an efficient start-up of the power plant (10) that minimizes formation of soot within a reformed hydrogen rich fuel. A burner (48) receives un-reformed fuel directly from the fuel supply (30) and combusts the fuel to heat cathode air which then heats an electrolyte (24) within the fuel cell (12). A dilute hydrogen forming gas (68) cycles through a sealed heat-cycling loop (66) to transfer heat and generated steam from an anode side (32) of the electrolyte (24) through fuel processing system (36) components (38, 40) and back to an anode flow field (26) until fuel processing system components (38, 40) achieve predetermined optimal temperatures and steam content. Then, the heat-cycling loop (66) is unsealed and the un-reformed fuel is admitted into the fuel processing system (36) and anode flow (26) field to commence ordinary operation of the power plant (10).
Directory of Open Access Journals (Sweden)
Xintao Xia
2013-07-01
This study proposes a bootstrap maximum-entropy method to evaluate the uncertainty of the starting torque of a slewing bearing. Addressing the variation coefficient of the starting torque under load, the probability density function, estimated true value, and variation domain are obtained through experimental investigation of the starting torque under various loads. The probability density function is found to vary in shape, scale, and location. In addition, the estimated true value and the variation domain shrink with increasing load, indicating improving stability and reliability of the starting friction torque. Finally, a sensitive spot exists where the estimated true value and the variation domain rise abnormally, showing a fluctuation in the immunity and a degradation in the stability and reliability of the starting friction torque.
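The record above reports an "estimated true value" and a "variation domain" obtained by bootstrapping. A minimal sketch of that idea, using a plain percentile bootstrap rather than the authors' maximum-entropy variant (the torque data here are simulated, not from the paper):

```python
import numpy as np

def bootstrap_interval(samples, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap: the 'estimated true value' is the mean of
    the bootstrap means; the 'variation domain' is the central
    (1 - alpha) percentile interval of the bootstrap means."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    boot_means = np.array([
        rng.choice(samples, size=samples.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    estimated_true_value = boot_means.mean()
    lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
    return estimated_true_value, (lo, hi)

# Hypothetical starting-torque readings (arbitrary units), mean 5.0
rng = np.random.default_rng(1)
torque = 5.0 + 0.3 * rng.standard_normal(50)
est, (lo, hi) = bootstrap_interval(torque)
```

A narrowing of `(lo, hi)` across load levels would correspond to the improving stability the abstract describes.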
Application of Robust Regression and Bootstrap in Productivity Analysis of GERD Variable in EU27
Directory of Open Access Journals (Sweden)
Dagmar Blatná
2014-06-01
The GERD is one of the Europe 2020 headline indicators tracked within the Europe 2020 strategy; the headline target is for GERD to reach 3% within the EU by 2020. Eurostat defines GERD as total gross domestic expenditure on research and experimental development, expressed as a percentage of GDP. GERD depends on numerous factors of the general economic background, namely employment, innovation and research, and science and technology. The values of these indicators vary among the European countries, so the occurrence of outliers can be anticipated in the corresponding analyses. In such a case, the classical statistical approach, the least squares method, can be highly unreliable, while robust regression methods represent an acceptable and useful tool. The aim of the present paper is to demonstrate the advantages of robust regression and the applicability of the bootstrap approach in regression based on both classical and robust methods.
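The combination described above, a robust regression estimate with bootstrap resampling, can be sketched generically. This is not the paper's estimator: Theil-Sen (via `scipy.stats.theilslopes`) stands in for a robust fit, and the data are synthetic with one gross outlier:

```python
import numpy as np
from scipy.stats import theilslopes

def bootstrap_robust_slope(x, y, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a robust
    (Theil-Sen) regression slope: resample (x, y) pairs, refit,
    take quantiles of the bootstrap slopes."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    slopes = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        slopes.append(theilslopes(y[idx], x[idx])[0])
    point = theilslopes(y, x)[0]
    lo, hi = np.quantile(np.array(slopes), [alpha / 2, 1 - alpha / 2])
    return point, (lo, hi)

# Synthetic GERD-style data, true slope 1.5, one outlier that would
# badly distort an ordinary least squares fit
rng = np.random.default_rng(2)
x = np.linspace(0, 10, 40)
y = 1.5 * x + rng.normal(0, 0.5, size=x.size)
y[5] = 40.0
slope, (lo, hi) = bootstrap_robust_slope(x, y)
```

The robust slope stays near the true value despite the outlier, which is the point the paper makes against least squares in the presence of outlying countries.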
A spatial bootstrap technique for parameter estimation of rainfall annual maxima distribution
Directory of Open Access Journals (Sweden)
F. Uboldi
2013-09-01
Estimation of extreme-event distributions and depth-duration-frequency (DDF) curves is achieved at any target site by repeated sampling among all available raingauge data in the surrounding area. The estimate is computed over a gridded domain in Northern Italy, using precipitation time series from 1929 to 2011, including data from historical analog stations and from the present-day automatic observational network. The presented local regionalisation naturally overcomes traditional station-point methods, with their demand for long historical series and their sensitivity to very rare events occurring at very few stations, which can cause unrealistic spatial gradients in DDF relations. At the same time, the presented approach allows for spatial dependence, necessary in a geographical domain such as Lombardy, complex in both its topography and its climatology. The bootstrap technique enables the evaluation of uncertainty maps for all estimated parameters and for rainfall depths at assigned return periods.
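The resampling step described above, pool annual maxima from surrounding stations, bootstrap, and refit an extreme-value model, can be illustrated with a simple Gumbel fit (the paper's actual distribution and fitting method may differ; the method-of-moments fit and simulated data below are assumptions):

```python
import numpy as np

def gumbel_fit_moments(maxima):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi,
    loc = mean - 0.5772*scale (Euler-Mascheroni constant)."""
    m, s = np.mean(maxima), np.std(maxima, ddof=1)
    scale = s * np.sqrt(6) / np.pi
    return m - 0.5772 * scale, scale

def return_level(loc, scale, T):
    """Rainfall depth with return period T years under a Gumbel model."""
    return loc - scale * np.log(-np.log(1 - 1 / T))

def bootstrap_return_level(pooled_maxima, T=100, n_boot=1000, seed=0):
    """Point estimate and bootstrap standard error of the T-year level."""
    rng = np.random.default_rng(seed)
    x = np.asarray(pooled_maxima, float)
    levels = np.array([
        return_level(*gumbel_fit_moments(
            rng.choice(x, size=x.size, replace=True)), T)
        for _ in range(n_boot)
    ])
    return return_level(*gumbel_fit_moments(x), T), levels.std()

# Hypothetical pooled annual maxima (mm) around one grid point
rng = np.random.default_rng(3)
maxima = rng.gumbel(loc=40.0, scale=10.0, size=200)
level100, uncertainty = bootstrap_return_level(maxima)
```

Mapping `uncertainty` over the grid would give the uncertainty maps mentioned at the end of the abstract.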
A bootstrapped, low-noise, and high-gain photodetector for shot noise measurement.
Zhou, Haijun; Yang, Wenhai; Li, Zhixiu; Li, Xuefeng; Zheng, Yaohui
2014-01-01
We present a low-noise, high-gain photodetector based on a bootstrap structure and an L-C (inductance and capacitance) combination. The electronic characteristics of the photodetector, including electronic noise, gain, frequency response, and dynamic range, were verified with a single-frequency Nd:YVO4 laser emitting coherent output at 1064 nm. The measured shot noise of a 50 μW beam was 13 dB above the electronic noise at an analysis frequency of 2 MHz, and 10 dB above it at 3 MHz; a maximum clearance of 28 dB at 2 MHz was achieved with 1.52 mW of incident laser power. In addition, the photodetector showed excellent linearity for both DC and AC amplification over the laser power range from 12.5 μW to 1.52 mW.
Integrating Multiple Microarray Data for Cancer Pathway Analysis Using Bootstrapping K-S Test
Directory of Open Access Journals (Sweden)
Bing Han
2009-01-01
Previous applications of microarray technology to cancer research have mostly focused on identifying genes that are differentially expressed between a particular cancer and normal cells. In a biological system, genes perform different molecular functions and regulate various biological processes via interactions with other genes, thus forming a variety of complex networks. It is therefore critical to understand the relationships (e.g., interactions) between genes across different types of cancer in order to gain insight into the molecular mechanisms of cancer. Here we propose an integrative method, based on the bootstrapping Kolmogorov-Smirnov test and a large set of microarray data produced from various types of cancer, to discover common molecular changes in cells as they pass from the normal to the cancerous state. We evaluate our method on three key cancer-related pathways and demonstrate that it is capable of finding meaningful alterations in gene relations.
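A bootstrapped two-sample Kolmogorov-Smirnov comparison, the statistical core named above, can be sketched as follows. The expression values are simulated and the resampling scheme is a generic one, not necessarily the paper's exact integration procedure:

```python
import numpy as np
from scipy.stats import ks_2samp

def bootstrap_ks(expr_a, expr_b, n_boot=500, seed=0):
    """Bootstrap the two-sample K-S statistic: resample each group
    with replacement and collect the distribution of D statistics,
    giving a variability band around the observed D."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(expr_a, float), np.asarray(expr_b, float)
    stats = []
    for _ in range(n_boot):
        ra = rng.choice(a, size=a.size, replace=True)
        rb = rng.choice(b, size=b.size, replace=True)
        stats.append(ks_2samp(ra, rb).statistic)
    d_obs = ks_2samp(a, b).statistic
    return d_obs, np.quantile(stats, [0.025, 0.975])

# Simulated expression values for one gene: normal vs cancer samples
rng = np.random.default_rng(4)
normal = rng.normal(0.0, 1.0, 80)
cancer = rng.normal(0.8, 1.0, 80)
d_obs, (d_lo, d_hi) = bootstrap_ks(normal, cancer)
```

A shift between the two distributions shows up as a D statistic well away from zero, with the bootstrap quantiles quantifying its stability across resamples.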
Bootstrap Methods for the Empirical Study of Decision-Making and Information Flows in Social Systems
DeDeo, Simon; Klingenstein, Sara; Hitchcock, Tim
2013-01-01
We characterize the statistical bootstrap for the estimation of information-theoretic quantities from data, with particular reference to its use in the study of large-scale social phenomena. Our methods allow one to preserve, approximately, the underlying axiomatic relationships of information theory---in particular, consistency under arbitrary coarse-graining---that motivate use of these quantities in the first place, while providing reliability comparable to the state of the art for Bayesian estimators. We show how information-theoretic quantities allow for rigorous empirical study of the decision-making capacities of rational agents, and the time-asymmetric flows of information in distributed systems. We provide illustrative examples by reference to ongoing collaborative work on the semantic structure of the British Criminal Court system and the conflict dynamics of the contemporary Afghanistan insurgency.
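The quantity being bootstrapped in studies like the one above is typically a plug-in information-theoretic estimate. A minimal sketch for Shannon entropy (the estimator and categorical data below are generic illustrations, not the authors' Bayesian-comparable method):

```python
import numpy as np
from collections import Counter

def plug_in_entropy(labels):
    """Plug-in (maximum-likelihood) Shannon entropy in bits."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def bootstrap_entropy(labels, n_boot=1000, seed=0):
    """Bootstrap spread of the plug-in entropy estimate. The plug-in
    estimator is biased downward for small samples; the bootstrap
    standard deviation provides the reliability measure."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    ents = np.array([
        plug_in_entropy(rng.choice(labels, size=labels.size, replace=True))
        for _ in range(n_boot)
    ])
    return plug_in_entropy(labels), ents.std()

# Hypothetical categorical outcomes (e.g. coarse-grained verdict types)
rng = np.random.default_rng(5)
data = rng.choice(["a", "b", "c", "d"], size=400, p=[0.4, 0.3, 0.2, 0.1])
h_hat, h_sd = bootstrap_entropy(data)
```

Coarse-graining consistency, the axiomatic property the abstract emphasizes, means that merging categories "c" and "d" can only decrease the estimated entropy.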
Bootstrapping in a language of thought: a formal model of numerical concept learning.
Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D
2012-05-01
In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning.
Financial Development and Economic Growth in European Countries: Bootstrap Causality Analysis
Directory of Open Access Journals (Sweden)
Fuat Lebe
2016-01-01
In the present study, we investigate whether there is a causal relationship between financial development and economic growth for sixteen European countries. Data for the period 1988-2012 were analyzed using the bootstrap panel causality test, which takes cross-section dependence and heterogeneity into account. The test results showed a strong causal relationship between financial development and economic growth in European countries: there is causality running from economic growth to financial development and from financial development to economic growth. These results support both the supply-leading and the demand-following hypotheses, so the feedback hypothesis can be said to be valid for European countries.
A Bootstrapping Based Approach for Open Geo-entity Relation Extraction
Directory of Open Access Journals (Sweden)
YU Li
2016-05-01
Extracting the spatial and semantic relations between two geo-entities from Web texts requires robust and effective solutions. This paper puts forward a novel approach: first, the characteristics of terms (part of speech, position, and distance) are analyzed by means of bootstrapping; second, the weight of each term is calculated and the keyword is picked out as the clue to the geo-entity relation; third, the geo-entity pairs and their keywords are organized into structured information. Finally, an experiment is conducted with Baidubaike and Stanford CoreNLP. The study shows that the presented method can automatically explore some of the lexical features and find additional relational terms, without requiring domain expert knowledge or large-scale corpora. Moreover, compared with three classical frequency-statistics methods, namely Frequency, TF-IDF, and PPMI, precision and recall are improved by about 5% and 23%, respectively.
Multi-level bootstrap analysis of stable clusters in resting-state fMRI.
Bellec, Pierre; Rosa-Neto, Pedro; Lyttelton, Oliver C; Benali, Habib; Evans, Alan C
2010-07-01
A variety of methods have been developed to identify brain networks with spontaneous, coherent activity in resting-state functional magnetic resonance imaging (fMRI). We propose here a generic statistical framework to quantify the stability of such resting-state networks (RSNs), which was implemented with k-means clustering. The core of the method consists in bootstrapping the available datasets to replicate the clustering process a large number of times and quantify the stable features across all replications. This bootstrap analysis of stable clusters (BASC) has several benefits: (1) it can be implemented in a multi-level fashion to investigate stable RSNs at the level of individual subjects and at the level of a group; (2) it provides a principled measure of RSN stability; and (3) the maximization of the stability measure can be used as a natural criterion to select the number of RSNs. A simulation study validated the good performance of the multi-level BASC on purely synthetic data. Stable networks were also derived from a real resting-state study for 43 subjects. At the group level, seven RSNs were identified which exhibited a good agreement with the previous findings from the literature. The comparison between the individual and group-level stability maps demonstrated the capacity of BASC to establish successful correspondences between these two levels of analysis and at the same time retain some interesting subject-specific characteristics, e.g. the specific involvement of subcortical regions in the visual and fronto-parietal networks for some subjects.
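The core BASC idea, replicate a clustering over bootstrap samples and quantify which co-assignments are stable, can be sketched with `scipy.cluster.vq.kmeans2` (the `seed` keyword assumes SciPy ≥ 1.7; the two-blob data and the nearest-centroid reassignment step are illustrative simplifications, not the paper's fMRI pipeline):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def bootstrap_stability(data, k, n_boot=50, seed=0):
    """Average co-clustering (stability) matrix over bootstrap
    replications of k-means. Entry (i, j) is the fraction of
    replications placing points i and j in the same cluster."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    coassign = np.zeros((n, n))
    for _ in range(n_boot):
        # Resample rows, fit k-means on the resample, then assign ALL
        # points to the nearest bootstrap centroid so the stability
        # matrix stays full-size.
        idx = rng.integers(0, n, size=n)
        centroids, _ = kmeans2(data[idx], k, minit='++',
                               seed=int(rng.integers(1_000_000)))
        d = ((data[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        coassign += labels[:, None] == labels[None, :]
    return coassign / n_boot

# Two well-separated synthetic "networks" of 20 points each
rng = np.random.default_rng(6)
data = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
S = bootstrap_stability(data, k=2)
```

Maximizing the average within-cluster stability over k is the natural criterion the abstract mentions for selecting the number of networks.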
BOBA FRET: bootstrap-based analysis of single-molecule FRET data.
Directory of Open Access Journals (Sweden)
Sebastian L B König
Time-binned single-molecule Förster resonance energy transfer (smFRET) experiments with surface-tethered nucleic acids or proteins make it possible to follow folding and catalysis of single molecules in real time. Due to the intrinsically low signal-to-noise ratio (SNR) in smFRET time traces, research over the past years has focused on the development of new methods to extract discrete states (conformations) from noisy data. However, limited observation time typically leads to pronounced cross-sample variability, i.e., single molecules display differences in the relative population of states and in the corresponding conversion rates. Quantification of cross-sample variability is necessary to perform statistical testing of whether changes observed in response to an experimental parameter (metal ion concentration, the presence of a ligand, etc.) are significant. However, such hypothesis testing has been disregarded to date, precluding robust biological interpretation. Here, we address this problem with a bootstrap-based approach to estimating the experimental variability. Simulated time traces are presented to assess the robustness of the algorithm in conjunction with approaches commonly used in thermodynamic and kinetic analysis of time-binned smFRET data. Furthermore, a pair of functionally important sequences derived from the self-cleaving group II intron Sc.ai5γ (d3'EBS1/IBS1) is used as a model system. Through statistical hypothesis testing, divalent metal ions are shown to have a statistically significant effect on both thermodynamic and kinetic aspects of their interaction. The Matlab source code used for the analysis (bootstrap-based analysis of smFRET data, BOBA FRET), as well as a graphical user interface, is available via http://www.aci.uzh.ch/rna/.
The bootstrapped model--Lessons for the acceptance of intellectual technology.
Lovie, A D
1987-09-01
This paper is intended as a non-technical introduction to a growing aspect of what has been termed 'intellectual technology'. The particular area chosen is the use of simple linear additive models for judgement and decision-making purposes. Such models are said to perform at least as well as, and often better than, the human judges on which they are based; hence they are said to 'bootstrap' such human inputs. Although the paper provides a fairly comprehensive list of recent applications of such models, from postgraduate selection to judgements of marital happiness, it concentrates on credit scoring as an exemplar, that is, the assignment of credit by means of a simple additive rule. The paper also presents a simple system, due to Dawes, for classifying such models according to the form and source of their weights. It further discusses the reasons for bootstrapping, and the other major phenomenon of such models: that one can rarely distinguish between the prescriptions of such models, however their weights have been arrived at. It is argued that this 'principle of the flat maximum' allows us to develop a technology of judgement. The paper continues with a brief historical survey of the reactions of human experts to such models and their superiority, and with suggestions for a better mix of expert and model along human-engineering lines. Finally, after a brief comparison between expert systems and linear additive models, the paper concludes with a brief survey of possible future developments. A short appendix describes two applications of such models.
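The 'flat maximum' claim above, that regression-weighted and equal-weighted additive models are hard to tell apart, is easy to demonstrate. The judgement task below is synthetic (the cue weights and noise level are assumptions for illustration):

```python
import numpy as np

def bootstrapped_vs_unit_weights(cues, criterion):
    """Compare a regression-weighted ('bootstrapped') composite with
    an equal (unit) weight composite, in the sense of Dawes: both are
    correlated with the criterion and the gap is typically small."""
    X = np.column_stack([cues, np.ones(len(criterion))])
    beta, *_ = np.linalg.lstsq(X, criterion, rcond=None)
    regression_pred = X @ beta
    unit_pred = cues.mean(axis=1)   # equal weights on standardized cues
    r = lambda a, b: np.corrcoef(a, b)[0, 1]
    return r(regression_pred, criterion), r(unit_pred, criterion)

# Synthetic judgement task: criterion is a noisy additive mix of 3 cues
rng = np.random.default_rng(7)
cues = rng.standard_normal((200, 3))
criterion = cues @ np.array([0.5, 0.3, 0.4]) + rng.normal(0, 0.8, 200)
r_reg, r_unit = bootstrapped_vs_unit_weights(cues, criterion)
```

In-sample the regression composite is necessarily at least as good, but the unit-weight composite comes within a few hundredths of it, which is the flat maximum in action.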
On the Model-Based Bootstrap with Missing Data: Obtaining a "P"-Value for a Test of Exact Fit
Savalei, Victoria; Yuan, Ke-Hai
2009-01-01
Evaluating the fit of a structural equation model via bootstrap requires a transformation of the data so that the null hypothesis holds exactly in the sample. For complete data, such a transformation was proposed by Beran and Srivastava (1985) for general covariance structure models and applied to structural equation modeling by Bollen and Stine…
Enders, Craig K.
2002-01-01
Proposed a method for extending the Bollen-Stine bootstrap model fit (K. Bollen and R. Stine, 1992) to structural equation models with missing data. Developed a Statistical Analysis System macro program to implement this procedure and assessed its usefulness in a simulation. The new method yielded model rejection rates close to the nominal 5%…
Burg, van der Eeke; Leeuw, de Jan
1988-01-01
In this paper we discuss the estimation of means and standard errors of the eigenvalues and category quantifications in generalized non-linear canonical correlation analysis (OVERALS). The starting points are the delta method equations, but the jack-knife and bootstrap are used to provide finite differen…
Mafuba, Kay; Gates, Bob
2012-12-01
This paper explores and advocates the use of sequential multiple methods as a contemporary strategy for undertaking research. Sequential multiple methods involve using the results obtained through one data collection method to determine the direction and implementation of subsequent stages of a research project (Morse, 1991; Morgan, 1998). The paper also explores how triangulating research at the epistemological, theoretical, and methodological levels could enhance research. Finally, it evaluates the significance of sequential multiple methods in learning disability nursing research practice.
Sequential monitoring with conditional randomization tests
Plamadeala, Victoria; 10.1214/11-AOS941
2012-01-01
Sequential monitoring in clinical trials is often employed to allow for early stopping and other interim decisions, while maintaining the type I error rate. However, sequential monitoring is typically described only in the context of a population model. We describe a computational method to implement sequential monitoring in a randomization-based context. In particular, we discuss a new technique for the computation of approximate conditional tests following restricted randomization procedures and then apply this technique to approximate the joint distribution of sequentially computed conditional randomization tests. We also describe the computation of a randomization-based analog of the information fraction. We apply these techniques to a restricted randomization procedure, Efron's [Biometrika 58 (1971) 403--417] biased coin design. These techniques require derivation of certain conditional probabilities and conditional covariances of the randomization procedure. We employ combinatoric techniques to derive t...
Delayed Sequential Coding of Correlated Sources
Ma, Nan; Ishwar, Prakash
2007-01-01
Motivated by video coding applications, we study the problem of sequential coding of correlated sources with (noncausal) encoding and/or decoding frame-delays. The fundamental tradeoffs between individual frame rates, individual frame distortions, and encoding/decoding frame-delays are derived in terms of a single-letter information-theoretic characterization of the rate-distortion region for general inter-frame source correlations and certain types of (potentially frame-specific and coupled) single-letter fidelity criteria. For video sources which are spatially stationary memoryless and temporally Gauss--Markov, MSE frame distortions, and a sum-rate constraint, our results expose the optimality of differential predictive coding among all causal sequential coders. Somewhat surprisingly, causal sequential encoding with one-step delayed noncausal sequential decoding can exactly match the sum-rate-MSE performance of joint coding for all nontrivial MSE-tuples satisfying certain positive semi-definiteness conditio...
Mining Sequential Patterns In Multidimensional Data
Plantevit, Marc
2008-01-01
Sequential pattern mining is a key technique of data mining with broad applications (user behavior analysis, bioinformatic, security, music, etc.). Sequential pattern mining aims at discovering correlations among events through time. There exist many algorithms to discover such patterns. However, these approaches only take one dimension into account (e.g. product dimension in customer market basket problem analysis) whereas data are multidimensional in nature. In this thesis, we define multid...
Exact Group Sequential Methods for Estimating a Binomial Proportion
Directory of Open Access Journals (Sweden)
Zhengjia Chen
2013-01-01
We first review existing sequential methods for estimating a binomial proportion. We then propose a new family of group sequential sampling schemes for estimating a binomial proportion with a prescribed margin of error and confidence level. In particular, we establish the uniform controllability of the coverage probability and the asymptotic optimality of this family of sampling schemes. Our theoretical results establish that the parameters of this family of sampling schemes can be determined so that the prescribed level of confidence is guaranteed with little waste of samples. Analytic bounds for the cumulative distribution functions and expectations of the sample numbers are derived. Moreover, we discuss the inherent connections among the various sampling schemes. Numerical issues are addressed to improve the accuracy and efficiency of computation. Computational experiments are conducted to compare the sampling schemes, and illustrative examples are given for applications in clinical trials.
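A group sequential scheme of the general shape described above can be sketched with a simple stopping rule: sample in groups until the interval half-width falls below the prescribed margin. This uses a plain Wald interval, not the paper's exactly-controlled construction:

```python
import numpy as np

def group_sequential_proportion(draw_group, margin=0.05, z=1.96,
                                group_size=50, max_groups=200):
    """Sample in groups of `group_size` until the Wald half-width
    z * sqrt(p(1-p)/n) for the proportion drops to `margin` or below.
    Returns the final estimate and the total sample size used."""
    successes = trials = 0
    for _ in range(max_groups):
        successes += draw_group(group_size)
        trials += group_size
        p = successes / trials
        if z * np.sqrt(p * (1 - p) / trials) <= margin:
            break
    return successes / trials, trials

# Hypothetical trial with true response proportion 0.3
rng = np.random.default_rng(8)
p_hat, n_used = group_sequential_proportion(
    lambda g: rng.binomial(g, 0.3))
```

Because the required n scales with p(1-p), the scheme stops earlier when the proportion is far from 0.5, which is the "little waste of samples" property the abstract aims to guarantee rigorously.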
Markov sequential pattern recognition : dependency and the unknown class.
Energy Technology Data Exchange (ETDEWEB)
Malone, Kevin Thomas; Haschke, Greg Benjamin; Koch, Mark William
2004-10-01
The sequential probability ratio test (SPRT) minimizes the expected number of observations needed to reach a decision and can solve problems in sequential pattern recognition. Some problems have dependencies between the observations, and Markov chains can model dependencies where the state occupancy probability is geometric. For a non-geometric process we show how to use the effective amount of independent information to modify the decision process so that we can account for the remaining dependencies. Along with dependencies between observations, a successful system needs to handle the unknown class in unconstrained environments. For example, in an acoustic pattern recognition problem, any sound source not belonging to the target set is in the unknown class. We show how to incorporate goodness-of-fit (GOF) classifiers into the Markov SPRT and determine the worst-case nontarget model. We also develop a multiclass Markov SPRT using the GOF concept.
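The classical SPRT that the report builds on can be sketched for independent Bernoulli observations; the Markov-dependence and unknown-class extensions described above are not included, and the hypotheses p0/p1 below are illustrative:

```python
import numpy as np

def sprt_bernoulli(observations, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's SPRT: accumulate the log-likelihood ratio over the
    observations and stop at the first crossing of the Wald
    boundaries log((1-beta)/alpha) and log(beta/(1-alpha))."""
    upper = np.log((1 - beta) / alpha)   # accept H1 (p = p1) above this
    lower = np.log(beta / (1 - alpha))   # accept H0 (p = p0) below this
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(observations)

# Observations strongly favoring H1 (true rate 0.9)
rng = np.random.default_rng(9)
decision, n_obs = sprt_bernoulli(rng.binomial(1, 0.9, size=200))
```

The expected stopping time here is far below any comparable fixed-sample test, which is the optimality property the abstract invokes.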
Finding Sequential Patterns from Large Sequence Data
Directory of Open Access Journals (Sweden)
Fazekas Gabor
2010-01-01
Data mining is the task of discovering interesting patterns from large amounts of data. There are many data mining tasks, such as classification, clustering, association rule mining, and sequential pattern mining. Sequential pattern mining finds sets of data items that occur together frequently in some sequences. Sequential pattern mining, which extracts frequent subsequences from a sequence database, has attracted a great deal of interest in recent data mining research because it is the basis of many applications, such as web user analysis, stock trend prediction, DNA sequence analysis, finding language or linguistic patterns in natural language texts, and using the history of symptoms to predict certain kinds of disease. Given the diversity of these applications, it may not be possible to apply a single sequential pattern model to all of them; each application may require a unique model and solution. A number of research projects have been established in recent years to develop meaningful sequential pattern models and efficient algorithms for mining these patterns. In this paper, we provide a brief theoretical overview of three types of sequential pattern models.
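The basic notion underlying all the models surveyed above is the support of a pattern: the fraction of database sequences that contain it as an (ordered, not necessarily contiguous) subsequence. A minimal sketch with a toy database of hypothetical page-visit histories:

```python
def is_subsequence(pattern, sequence):
    """True if pattern's items appear in sequence in order
    (not necessarily contiguously)."""
    it = iter(sequence)
    # `item in it` advances the iterator, enforcing the ordering
    return all(item in it for item in pattern)

def support(pattern, database):
    """Fraction of sequences in the database containing the pattern."""
    return sum(is_subsequence(pattern, s) for s in database) / len(database)

# Toy sequence database (e.g. page-visit histories)
db = [
    ["a", "b", "c", "d"],
    ["a", "c", "b", "d"],
    ["b", "a", "d"],
    ["a", "d"],
]
freq = support(["a", "d"], db)   # "a" followed later by "d"
```

A mining algorithm such as the classical Apriori-style approaches then searches the pattern lattice for all patterns whose support exceeds a user-chosen threshold.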
Nyasulu, Frazier; Barlag, Rebecca
2011-01-01
The well-known colorimetric determination of the equilibrium constant of the iron(III)-thiocyanate complex is simplified by preparing solutions in a cuvette. For the calibration plot, 0.10 mL increments of 0.00100 M KSCN are added to 4.00 mL of 0.200 M Fe(NO[subscript 3])[subscript 3], and for the equilibrium solutions, 0.50 mL increments of…
The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements
Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry; Watkins, Michael; Yuan, Dah-Ning
2013-01-01
The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise, and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides, and the role of orbit determination in the bootstrapping strategy. Simulation results are presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.
Sequential analysis of dimethyl sulfur compounds in seawater
Institute of Scientific and Technical Information of China (English)
Meng Li; Dong Xing Yuan; Quan Long Li; Xiao Ying Jin
2007-01-01
A sequential method for the determination of dimethyl sulfur compounds, including dimethylsulfide (DMS), dimethylsulfoniopropionate (DMSP) and dimethylsulfoxide (DMSO), in seawater samples has been developed. Detection limit of 2.5 pmol of DMS in 25 mL sample, corresponding to 0.10 nmol/L, was achieved. Recoveries for dimethyl sulfur compounds were in the range of 68.6-78.3%. The relative standard deviations (R.S.D.s) for DMS, DMSP and DMSO determination were 3.0, 5.4 and 7.4%, respectively.
Shimada, Makoto K; Nishida, Tsunetoshi
2017-02-20
Felsenstein's PHYLIP package of molecular phylogeny tools has been used globally since 1980. The programs are receiving renewed attention because of their character-based user interface, which has the advantage of being scriptable for use in large-scale data studies on supercomputers or massively parallel computing clusters. However, we occasionally found that the PHYLIP Consense program's output text file displays two or more divided bootstrap values for the same cluster in its result table; when this happens, the output Newick tree file incorrectly assigns only the last value to that cluster, which disturbs correct estimation of a consensus tree. We ascertained the cause of this aberrant behavior in the bootstrapping calculation. Our rewrite of the Consense program source code outputs bootstrap values, without redundancy, in its result table, together with a Newick tree file carrying the appropriate, corresponding bootstrap values. Furthermore, we developed an add-on program and shell script, add_bootstrap.pl and fasta2tre_bs.bsh, to generate a Newick tree containing the topology and branch lengths inferred from the original data along with valid bootstrap values, and to automate the inference of such a phylogenetic tree from multiple unaligned sequences, respectively. These programs can be downloaded at: https://github.com/ShimadaMK/PHYLIP_enhance/.
Modern Sequential Analysis and Its Applications to Computerized Adaptive Testing
Bartroff, Jay; Finkelman, Matthew; Lai, Tze Leung
2008-01-01
After a brief review of recent advances in sequential analysis involving sequential generalized likelihood ratio tests, we discuss their use in psychometric testing and extend the asymptotic optimality theory of these sequential tests to the case of sequentially generated experiments, of particular interest in computerized adaptive testing. We…
Enzyme-catalyzed Sequential Reduction of Carbon Dioxide to Formaldehyde
Institute of Scientific and Technical Information of China (English)
Wenfang Liu; Yanhui Hou; Benxiang Hou; Zhiping Zhao
2014-01-01
It has been reported that enzyme-catalyzed reduction of CO2 is feasible. Most of the literature focuses on the conversion of CO2 to methanol; here we put the emphasis on the sequential conversion of CO2 to formaldehyde and on its single reactions. CO2 pressure appears to play a critical role, with higher pressure greatly helping to form more HCOOH as well as HCHO. The reverse reaction became severe in the reduction of CO2 to formaldehyde after 10 h, decreasing HCHO production. Increasing the mass ratio of formate dehydrogenase to formaldehyde dehydrogenase could promote the sequential reaction. At nicotinamide adenine dinucleotide concentrations below 100 mmol·L−1, the reduction of CO2 was accelerated by increasing the cofactor concentration. The optimum pH value and phosphate buffer concentration for the overall reaction were determined to be 6.0 and 0.05 mol·L−1, respectively. It seems that a thermodynamic factor such as pH restricts the sequential reaction, owing to the distinct divergence between the appropriate pH ranges of its single reactions.
Sequential biological process for molybdenum extraction from hydrodesulphurization spent catalyst.
Vyas, Shruti; Ting, Yen-Peng
2016-10-01
Spent catalyst bioleaching with Acidithiobacillus ferrooxidans has been widely studied, and low Mo leaching has often been reported. This work describes enhanced extraction of Mo via a two-stage sequential process for the bioleaching of hydrodesulphurization spent catalyst containing molybdenum, nickel, and aluminium. In the first stage, two-step bioleaching with Acidithiobacillus ferrooxidans achieved 89.4% Ni, 20.9% Mo, and 12.7% Al extraction in 15 days. To increase Mo extraction, the bioleached catalyst was subjected to a second bioleaching stage using Escherichia coli, during which 99% of the remaining Mo was extracted in 25 days. This sequential bioleaching strategy selectively extracts Ni in the first stage and Mo in the second, and is a more environmentally friendly alternative to sequential chemical leaching with alkaline reagents for improved Mo extraction. Kinetic modelling to establish the rate-determining step in both stages showed that first-stage Mo extraction was chemical-reaction controlled, whereas in the second stage a product-layer diffusion model provided the best fit.
Modern Sequential Analysis and its Applications to Computerized Adaptive Testing
Bartroff, Jay; Lai, Tze Leung
2011-01-01
After a brief review of recent advances in sequential analysis involving sequential generalized likelihood ratio tests, we discuss their use in psychometric testing and extend the asymptotic optimality theory of these sequential tests to the case of sequentially generated experiments, of particular interest in computerized adaptive testing. We then show how these methods can be used to design adaptive mastery tests, which are asymptotically optimal and are also shown to provide substantial improvements over currently used sequential and fixed length tests.
Learning Biological Networks via Bootstrapping with Optimized GO-based Gene Similarity
Energy Technology Data Exchange (ETDEWEB)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.; Baddeley, Robert L.; Riensche, Roderick M.; Jensen, Russell S.; Verhagen, Marc
2010-08-02
Microarray gene expression data provide a unique information resource for learning biological networks using "reverse engineering" methods. However, there are a variety of cases in which we know which genes are involved in a given pathology of interest, but we do not have enough experimental evidence to support the use of fully-supervised/reverse-engineering learning methods. In this paper, we explore a novel semi-supervised approach in which biological networks are learned from a reference list of genes and a partial set of links for these genes extracted automatically from PubMed abstracts, using a knowledge-driven bootstrapping algorithm. We show how new relevant links across genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. We describe an application of this approach to the TGFB pathway as a case study and show how the ensuing results prove the feasibility of the approach as an alternate or complementary technique to fully supervised methods.
Gul, Sehrish; Zou, Xiang; Hassan, Che Hashim; Azam, Muhammad; Zaman, Khalid
2015-12-01
This study investigates the relationship between energy consumption and carbon dioxide emissions in a causal framework, as the direction of causality has significant policy implications for developed and developing countries. The study employed the maximum entropy bootstrap (Meboot) approach to examine the causal nexus between energy consumption and carbon dioxide emissions in both bivariate and multivariate frameworks for Malaysia over the period 1975-2013. This is a unified approach that does not require conventional techniques based on asymptotic theory, such as testing for possible unit roots and cointegration. In addition, it can be applied in the presence of non-stationarity of any type, including structural breaks, without any data transformation to achieve stationarity. Thus, it provides more reliable and robust inferences which are insensitive to the time span as well as the lag length used. The empirical results show that there is a unidirectional causality running from energy consumption to carbon emissions both in the bivariate model and in the multivariate framework, while controlling for broad money supply and population density. The results indicate that Malaysia is an energy-dependent country and hence energy is a stimulus to carbon emissions.
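The core idea of replacing asymptotic critical values with resampled ones can be sketched in miniature. The toy example below uses a permutation variant for simplicity (not Vinod's maximum entropy bootstrap used in the study); the statistic, data, and dependence structure are all invented for illustration:

```python
import random
import statistics

def leadlag_stat(x, y):
    """Toy causality-style statistic: |corr(x[t-1], y[t])|."""
    xs, ys = x[:-1], y[1:]
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    cov = statistics.fmean((a - mx) * (b - my) for a, b in zip(xs, ys))
    return abs(cov / (sx * sy))

def resampled_pvalue(x, y, n_boot=499, seed=1):
    """Resample x under the null of no lead-lag link to get a p-value."""
    rng = random.Random(seed)
    observed = leadlag_stat(x, y)
    exceed = 0
    for _ in range(n_boot):
        xb = x[:]
        rng.shuffle(xb)                 # break any lead-lag dependence
        if leadlag_stat(xb, y) >= observed:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)

rng = random.Random(42)
x = [rng.gauss(0, 1) for _ in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.8 * x[t - 1] + rng.gauss(0, 0.5))   # y is driven by lagged x
p = resampled_pvalue(x, y)
print(p)
```

With the strong built-in dependence, the resampled p-value comes out far below conventional significance levels.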
BoCluSt: Bootstrap Clustering Stability Algorithm for Community Detection.
Garcia, Carlos
2016-01-01
The identification of modules or communities in sets of related variables is a key step in the analysis and modeling of biological systems. Procedures for this identification are usually designed to allow fast analyses of very large datasets and may produce suboptimal results when these sets are of a small to moderate size. This article introduces BoCluSt, a new, somewhat more computationally intensive, community detection procedure that is based on combining a clustering algorithm with a measure of stability under bootstrap resampling. Both computer simulation and analyses of experimental data showed that BoCluSt can outperform current procedures in the identification of multiple modules in data sets with a moderate number of variables. In addition, the procedure provides users with a null distribution of results to evaluate the support for the existence of community structure in the data. BoCluSt takes individual measures for a set of variables as input, and may be a valuable and robust exploratory tool of network analysis, as it provides 1) an estimation of the best partition of variables into modules, 2) a measure of the support for the existence of modular structures, and 3) an overall description of the whole structure, which may reveal hierarchical modular situations, in which modules are composed of smaller sub-modules.
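Bootstrap clustering stability of the kind described above can be sketched with a deliberately trivial clustering rule. This is an illustrative toy, not the BoCluSt algorithm: the two-module data and the sign-of-correlation rule are invented for the example.

```python
import random
import statistics

def corr(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    sa, sb = statistics.pstdev(a), statistics.pstdev(b)
    return statistics.fmean((x - ma) * (y - mb) for x, y in zip(a, b)) / (sa * sb)

def cluster(data):
    """Trivial 2-module rule: group variables by sign of correlation with variable 0."""
    ref = [row[0] for row in data]
    return [1 if corr([row[j] for row in data], ref) >= 0 else 0
            for j in range(len(data[0]))]

def stability(data, n_boot=100, seed=3):
    """Mean agreement between the full-data partition and bootstrap partitions."""
    rng = random.Random(seed)
    base = cluster(data)
    agree = []
    for _ in range(n_boot):
        sample = [data[rng.randrange(len(data))] for _ in range(len(data))]
        boot = cluster(sample)
        agree.append(sum(b == c for b, c in zip(base, boot)) / len(base))
    return statistics.fmean(agree)

rng = random.Random(7)
rows = []
for _ in range(60):
    f = rng.gauss(0, 1)                  # shared latent factor
    rows.append([f + rng.gauss(0, 0.3) for _ in range(3)] +
                [-f + rng.gauss(0, 0.3) for _ in range(3)])
s = stability(rows)
print(round(s, 2))
```

A partition that survives resampling nearly unchanged (stability near 1) is the kind of evidence for modular structure the procedure reports; comparing against a null distribution from unstructured data completes the picture.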
Kantar, E.; Deviren, B.; Keskin, M.
2011-11-01
We present a study, within the scope of econophysics, of the hierarchical structure of 98 of the largest international companies, including 18 of the largest Turkish companies, spanning the Banks, Automobile, Software-Hardware, Telecommunication Services, Energy and Oil-Gas sectors, viewed as a network of interacting companies. We analyze the daily time series data of the Boerse-Frankfurt and the Istanbul Stock Exchange. We examine the topological properties among the companies over the period 2006-2010 by using hierarchical structure methods (the minimal spanning tree (MST) and the hierarchical tree (HT)). The period is divided into three subperiods, namely 2006-2007, 2008 (the year of the global economic crisis), and 2009-2010, in order to test various time windows and observe temporal evolution. We carry out bootstrap analyses to associate a value of statistical reliability to the links of the MSTs and HTs. We also use average linkage clustering analysis (ALCA) in order to better observe the cluster structure. From these studies, we find that the interactions between the Banks/Energy sectors and the other sectors were reduced after the global economic crisis; hence the effects of the Banks and Energy sectors on the correlations of all companies decreased. Telecommunication Services was also greatly affected by the crisis. We also observe that the Automobile and Banks sectors, including Turkish companies as well as some companies from the USA, Japan and Germany, were strongly correlated with each other in all periods.
Lee, Taesam
2017-02-01
The outputs from general circulation models (GCMs) provide useful information about the rate and magnitude of future climate change. The temperature variable is more reliable than other variables in GCM outputs. However, hydrological variables (e.g., precipitation) from GCM outputs for future climate change possess an uncertainty that is too high for practical use. Therefore, a method called intentionally biased bootstrapping (IBB), which simulates the increase of the temperature variable by a certain level as ascertained from observed global warming data, is proposed. In addition, precipitation data were resampled by employing a block-wise sampling technique associated with the temperature simulation. In summary, a warming temperature scenario is simulated, along with the corresponding precipitation values whose time indices are the same as those of the simulated warming temperature scenario. The proposed method was validated with annual precipitation data by truncating the recent years of the record. The proposed model was also employed to assess the future changes in seasonal precipitation in South Korea within a global warming scenario as well as in weekly timescales. The results illustrate that the proposed method is a good alternative for assessing the variation of hydrological variables such as precipitation under the warming condition.
Economic policy uncertainty and housing returns in Germany: Evidence from a bootstrap rolling window
Directory of Open Access Journals (Sweden)
David Su
2016-06-01
Full Text Available The purpose of this investigation is to examine the causal link between economic policy uncertainty (EPU) and housing returns (HR) in Germany. In the estimated vector autoregressive models, we test stability and find that the short-run relationship between HR and EPU is unstable. As a result, a time-varying approach (a bootstrap rolling-window causality test) is utilized to revisit the dynamic causal link, and we find that EPU has no impact on HR due to the stability of the real estate market in Germany. HR does not have significant effects on EPU in most time periods. However, significant feedback (both positive and negative) is found from HR to EPU in several sub-periods, which indicates that the causal link from HR to EPU varies over time. The empirical results do not support the general equilibrium model of government policy choices, indicating that EPU does not play a role in the real estate market. The basic conclusion is that the real estate market shows its stability due to the social welfare nature and the rational institutional arrangement of real estate in Germany, and it also shows its importance in that it has a significant effect on economic policy choices in some periods when negative external shocks occur.
Progress toward steady-state tokamak operation exploiting the high bootstrap current fraction regime
Ren, Q. L.; Garofalo, A. M.; Gong, X. Z.; Holcomb, C. T.; Lao, L. L.; McKee, G. R.; Meneghini, O.; Staebler, G. M.; Grierson, B. A.; Qian, J. P.; Solomon, W. M.; Turnbull, A. D.; Holland, C.; Guo, W. F.; Ding, S. Y.; Pan, C. K.; Xu, G. S.; Wan, B. N.
2016-06-01
Recent DIII-D experiments have increased the normalized fusion performance of the high bootstrap current fraction tokamak regime toward reactor-relevant steady-state operation. The experiments, conducted by a joint team of researchers from the DIII-D and EAST tokamaks, developed a fully noninductive scenario that could be extended on EAST to a demonstration of long-pulse steady-state tokamak operation. Improved understanding of scenario stability has led to the achievement of very high values of βp and βN, despite strong internal transport barriers. Good confinement has been achieved with reduced toroidal rotation. These high βp plasmas challenge the energy transport understanding, especially in the electron energy channel. A new turbulent transport model, named TGLF-SAT1, has been developed which improves the transport prediction. Experiments extending these results to long pulses on EAST, based on the physics basis developed at DIII-D, have been conducted. More investigations will be carried out on EAST as additional auxiliary power comes online in the near term.
Directory of Open Access Journals (Sweden)
Łukasz Lach
2010-06-01
Full Text Available This paper examines the size performance of the Toda-Yamamoto test for Granger causality in the case of trivariate integrated and cointegrated VAR systems. The standard asymptotic distribution theory and the residual-based bootstrap approach are applied. A variety of types of distribution of the error term is considered. The impact of misspecification of initial parameters as well as the influence of an increase in sample size and number of bootstrap replications on the size performance of Toda-Yamamoto test statistics is also examined. The results of the conducted simulation study confirm that standard asymptotic distribution theory may often cause significant over-rejection. Application of bootstrap methods usually leads to improvement of the size performance of the Toda-Yamamoto test. However, in some cases the considered bootstrap method also leads to serious size distortion and performs worse than the traditional approach based on the χ2 distribution.
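The residual-based bootstrap referred to above can be sketched for the simplest case, the slope of an AR(1) model (the paper's trivariate VAR setting is more involved; data and parameters below are invented): fit the model, resample centered residuals with replacement, rebuild pseudo-series, and re-estimate to obtain the statistic's bootstrap distribution.

```python
import random
import statistics

def fit_ar1(y):
    """OLS slope of y[t] on y[t-1] (no intercept, for brevity)."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

def residual_bootstrap(y, n_boot=500, seed=11):
    """Bootstrap distribution of the AR(1) slope via residual resampling."""
    rng = random.Random(seed)
    phi = fit_ar1(y)
    resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
    mean_r = statistics.fmean(resid)
    centered = [r - mean_r for r in resid]      # recentre the residuals
    slopes = []
    for _ in range(n_boot):
        yb = [y[0]]
        for _ in range(len(y) - 1):
            yb.append(phi * yb[-1] + rng.choice(centered))
        slopes.append(fit_ar1(yb))
    return phi, slopes

rng = random.Random(5)
y = [0.0]
for _ in range(300):
    y.append(0.6 * y[-1] + rng.gauss(0, 1))      # true slope 0.6

phi_hat, slopes = residual_bootstrap(y)
qs = sorted(slopes)
lo, hi = qs[int(0.025 * len(qs))], qs[int(0.975 * len(qs)) - 1]
print(round(phi_hat, 2), round(lo, 2), round(hi, 2))
```

The percentile band of the resampled slopes then replaces the asymptotic critical values, which is the substitution whose size performance the paper studies.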
Sequential dependencies in magnitude scaling of loudness
DEFF Research Database (Denmark)
Joshi, Suyash Narendra; Jesteadt, Walt
2013-01-01
B were used to program the sone-potentiometer. The knob settings systematically influenced the form of the loudness function. Time series analysis was used to assess the sequential dependencies in the data, which increased with increasing exponent and were greatest for the log-law. It would be possible......, therefore, to choose knob properties that minimized these dependencies. When the sequential dependencies were removed from the data, the slope of the loudness functions did not change, but the variability decreased. Sequential dependencies were only present when the level of the tone on the previous trial...... was higher than on the current trial. According to the attention band hypothesis[Green and Luce, 1974, Perception & Psychophysics] these dependencies arise from a process similar to selective attention, but observations of rapid adaptation of neurons in the inferior colliculus based on stimulus level...
Correlation and Sequential Filtering with Doppler Measurements
Institute of Scientific and Technical Information of China (English)
WANG Jianguo; HE Peikun; HAN Yueqiu; WU Siliang
2004-01-01
Two sequential filters are developed for Doppler radar measurements in the presence of correlation between range and range-rate measurement errors. Two ideal linear measurement equations with the pseudo measurements are constructed via block-partitioned Cholesky factorization, and the practical measurement equations with the pseudo measurements are obtained through direction cosine estimation and error compensation. The resulting sequential filters allow the position measurement to be processed before the pseudo measurement, so a more accurate direction cosine estimate can be obtained from the filtered position estimate rather than from the predicted state estimate. Numerical simulations with different range-range rate correlation coefficients show that the two proposed sequential filters are almost equivalent in performance but both superior to the conventional extended Kalman filter for different correlation coefficients.
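The pseudo-measurement construction via Cholesky factorization can be illustrated for a single range/range-rate pair. This is a schematic numeric sketch with hypothetical noise values, not the filters of the paper: conditioning the range-rate measurement on the range measurement yields an uncorrelated pseudo measurement with reduced variance.

```python
# hypothetical measurement-noise statistics for one radar dwell
srr, sdd, rho = 25.0, 4.0, 0.6      # range var (m^2), range-rate var ((m/s)^2), correlation
srd = rho * (srr * sdd) ** 0.5      # covariance between the two errors
gain = srd / srr                    # first step of the block Cholesky factorization

r, d = 1005.0, 12.3                 # measured range (m) and range rate (m/s)
d_pseudo = d - gain * r             # pseudo measurement, uncorrelated with the range error
var_pseudo = sdd - srd ** 2 / srr   # its (smaller) error variance

print(round(var_pseudo, 2))         # 2.56, down from 4.0
```

Because the pseudo measurement's error is uncorrelated with the range error, the two can be processed sequentially by a standard Kalman update, with the range processed first exactly as the abstract describes.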
Sequential Pattern Mining Using Formal Language Tools
Directory of Open Access Journals (Sweden)
R. S. Jadon
2012-09-01
Full Text Available In the present scenario almost every system is computerized, and all information and data are stored in computers, so huge collections of data are emerging. Retrieving untouched, hidden and important information from such huge data is quite tedious work. Data mining is a technological solution which extracts hidden and important information from vast databases to investigate noteworthy knowledge in the data warehouse. An important problem in data mining is to discover patterns in various fields such as medical science, the world wide web, and telecommunication. Sequential pattern mining is a data mining method in which we retrieve hidden patterns linked with instants or other sequences. In sequential pattern mining we extract those sequential patterns whose support count is greater than or equal to a given minimum support threshold. In the current scenario users are interested only in specific and interesting patterns instead of the entire set of probable sequential patterns. To control the exploration space, users can apply many heuristics, which can be represented as constraints. Many algorithms have been developed in the field of constraint mining which generate patterns as per user expectation. In the present work we explore and enhance regular expression constraints. The regular expression is one kind of constraint, and a number of algorithms have been developed for sequential pattern mining that use a regular expression as a constraint. Some constraints are neither regular nor context free, like the cross-serial pattern a^n b^m c^n d^m found in Swiss German data; we cannot construct an equivalent deterministic finite automaton (DFA) or pushdown automaton (PDA) for such patterns. We have proposed a new algorithm, PMFLT (Pattern Mining using Formal Language Tools), for sequential pattern mining using formal language tools as constraints. The proposed algorithm finds only user-specified frequent sequences in an efficient manner.
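Regular-expression-constrained sequential pattern mining of the kind discussed can be sketched in a few lines: count the support of candidate subsequences and keep only those that both meet the minimum support and match the constraint. The database, candidates, and constraint below are invented for illustration; this is not the PMFLT algorithm.

```python
import re

def support(db, pattern_items):
    """Fraction of sequences in db that contain pattern_items as a subsequence."""
    def contains(seq):
        it = iter(seq)
        return all(item in it for item in pattern_items)  # order-preserving scan
    return sum(contains(s) for s in db) / len(db)

db = ["abcd", "acbd", "abd", "bcd", "abdd"]      # toy sequence database
candidates = ["ab", "ad", "abd", "bd"]           # candidate sequential patterns
constraint = re.compile(r"^ab")                  # keep only patterns starting a, b

frequent = [p for p in candidates
            if constraint.match(p) and support(db, p) >= 0.6]
print(frequent)                                   # ['ab', 'abd']
```

Applying the constraint before the support scan is what prunes the exploration space; a cross-serial pattern like a^n b^m c^n d^m cannot be expressed as such a regex, which is the gap the proposed formal-language tools address.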
DJANSENA, Alradix; 田中, 宏明; 工藤, 亮
2015-01-01
CFRP has been used in aircraft structures for decades. Although CFRP is light, its lamination is its main weakness. We have developed a new method to increase the probability of detecting delamination in carbon fiber reinforced plastic (CFRP) by narrowing the confidence interval of the changes in natural frequency. The changes in the natural frequency in delaminated CFRP are tiny compared with measurement errors. We use the bootstrap method, a statistical technique that increases the estimation ac...
Directory of Open Access Journals (Sweden)
Campbell Michael J
2004-12-01
Full Text Available Abstract Health-Related Quality of Life (HRQoL) measures are becoming increasingly used in clinical trials as primary outcome measures. Investigators are now asking statisticians for advice on how to analyse studies that have used HRQoL outcomes. HRQoL outcomes, like the SF-36, are usually measured on an ordinal scale. However, most investigators assume that there exists an underlying continuous latent variable that measures HRQoL, and that the actual measured outcomes (the ordered categories) reflect contiguous intervals along this continuum. The ordinal scaling of HRQoL measures means they tend to generate data that have discrete, bounded and skewed distributions. Thus, standard methods of analysis such as the t-test and linear regression that assume Normality and constant variance may not be appropriate. For this reason, conventional statistical advice would suggest that non-parametric methods be used to analyse HRQoL data. The bootstrap is one such computer-intensive non-parametric method for analysing data. We used the bootstrap for hypothesis testing and the estimation of standard errors and confidence intervals for parameters, in four datasets (which illustrate the different aspects of study design). We then compared and contrasted the bootstrap with standard methods of analysing HRQoL outcomes. The standard methods included t-tests, linear regression, summary measures and General Linear Models. Overall, in the datasets we studied, using the SF-36 outcome, bootstrap methods produce results similar to conventional statistical methods. This is likely because the t-test and linear regression are robust to the violations of assumptions that HRQoL data are likely to cause (i.e. non-Normality). While particular to our datasets, these findings are likely to generalise to other HRQoL outcomes, which have discrete, bounded and skewed distributions. Future research with other HRQoL outcome measures, interventions and populations, is required to
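The bootstrap estimation of confidence intervals described above can be sketched for the simplest case, a percentile interval for the mean of bounded, skewed scores. The scores below are invented, not the authors' SF-36 data:

```python
import random
import statistics

def bootstrap_ci_mean(data, n_boot=2000, level=0.95, seed=2):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = sorted(statistics.fmean(rng.choices(data, k=len(data)))
                   for _ in range(n_boot))
    tail = (1 - level) / 2
    return means[int(tail * n_boot)], means[int((1 - tail) * n_boot) - 1]

# hypothetical bounded, skewed quality-of-life scores on a 0-100 scale
scores = [0, 25, 25, 50, 50, 50, 75, 75, 75, 75, 100, 100]
lo, hi = bootstrap_ci_mean(scores)
print(round(lo, 1), round(hi, 1))
```

Because the interval is read off the resampled distribution itself, no Normality or constant-variance assumption is needed, which is exactly why the bootstrap is attractive for discrete, bounded, skewed outcomes.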
Directory of Open Access Journals (Sweden)
Enrico Zio
2008-01-01
Full Text Available In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to the order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in a RBMK-1500 nuclear reactor.
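The order-statistics component of such frameworks is commonly implemented via Wilks' criterion: to claim with confidence γ that the maximum of n code runs bounds at least a fraction β of the output distribution, n must satisfy 1 − β^n ≥ γ. A minimal sketch for the classical 95%/95% case follows; the temperature values are invented, not RBMK-1500 results.

```python
import random

def wilks_n(beta=0.95, gamma=0.95):
    """Smallest n with 1 - beta**n >= gamma (first-order, one-sided Wilks)."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

n_runs = wilks_n()                                   # 59 for beta = gamma = 0.95
rng = random.Random(1)
temps = [1000.0 + 80.0 * rng.random() for _ in range(n_runs)]  # hypothetical peak temps (°C)
bound = max(temps)     # one-sided 95%/95% tolerance bound on the peak temperature
print(n_runs)
```

The safety margin is then the distance from this bound to the acceptance limit; bootstrapping the runs, as in the paper, additionally attaches a confidence interval to that margin.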
Asynchronous sequential machine design and analysis
Tinder, Richard
2009-01-01
Asynchronous Sequential Machine Design and Analysis provides a lucid, in-depth treatment of asynchronous state machine design and analysis presented in two parts: Part I on the background fundamentals related to asynchronous sequential logic circuits generally, and Part II on self-timed systems, high-performance asynchronous programmable sequencers, and arbiters.Part I provides a detailed review of the background fundamentals for the design and analysis of asynchronous finite state machines (FSMs). Included are the basic models, use of fully documented state diagrams, and the design and charac
Text Classification: A Sequential Reading Approach
Dulac-Arnold, Gabriel; Gallinari, Patrick
2011-01-01
We propose to model the text classification process as a sequential decision process. In this process, an agent learns to classify documents into topics while reading the document sentences sequentially, and learns to stop as soon as enough information has been read to make a decision. The proposed algorithm is based on modeling text classification as a Markov Decision Process and learns using Reinforcement Learning. Experiments on four different classical mono-label corpora show that the proposed approach performs comparably to classical SVM approaches for large training sets, and better for small training sets. In addition, the model automatically adapts its reading process to the quantity of training information provided.
Li, Hao; Dong, Siping
2015-01-01
China has long applied traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of efficiency scores. In this article, we introduce the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and try to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research has helped shorten the gap between China and the international world in relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals so as to better serve efficiency improvement and related decision making.
Directory of Open Access Journals (Sweden)
Rohin Anhal
2013-10-01
Full Text Available The aim of this paper is to examine the direction of causality between real GDP on the one hand and final energy and coal consumption on the other in India, for the period from 1970 to 2011. The methodology adopted is the non-parametric bootstrap procedure, which is used to construct the critical values for the hypothesis of causality. The results of the bootstrap tests show that for total energy consumption, there exists no causal relationship in either direction with GDP of India. However, if coal consumption is considered, we find evidence in support of unidirectional causality running from coal consumption to GDP. This clearly has important implications for the Indian economy. The most important implication is that curbing coal consumption in order to reduce carbon emissions would in turn have a limiting effect on economic growth. Our analysis contributes to the literature in three distinct ways. First, this is the first paper to use the bootstrap method to examine the growth-energy connection for the Indian economy. Second, we analyze data for the time period 1970 to 2011, thereby utilizing recently available data that has not been used by others. Finally, in contrast to the recently done studies, we adopt a disaggregated approach for the analysis of the growth-energy nexus by considering not only aggregate energy consumption, but coal consumption as well.
Sequential protein NMR assignments in the liquid state via sequential data acquisition
Wiedemann, Christoph; Bellstedt, Peter; Kirschstein, Anika; Häfner, Sabine; Herbst, Christian; Görlach, Matthias; Ramachandran, Ramadurai
2014-02-01
Two different NMR pulse schemes involving sequential 1H data acquisition are presented for achieving protein backbone sequential resonance assignments: (i) acquisition of 3D {HCCNH and HNCACONH} and (ii) collection of 3D {HNCOCANH and HNCACONH} chemical shift correlation spectra using uniformly 13C,15N labelled proteins. The sequential acquisition of these spectra reduces the overall experimental time by a factor of ≈2 as compared to individual acquisitions. The suitability of this approach is experimentally demonstrated for the C-terminal winged helix (WH) domain of the minichromosome maintenance (MCM) complex of Sulfolobus solfataricus.
Directory of Open Access Journals (Sweden)
PEREIRA JOSÉ ERIVALDO
2000-01-01
Full Text Available Bootstrap estimates of the arithmetic mean for the soybean genotypes 'Pickett', 'Peking', PI88788 and PI90763, together with confidence intervals obtained from normal theory and from the bootstrap distribution of this estimator (the bootstrap percentile and the BCa, bias-corrected and accelerated, intervals) for the differentiation parameter of the susceptibility standard cultivar Lee, are used to classify races of the soybean cyst nematode. The confidence intervals obtained from the bootstrap distribution had smaller width and were very similar; accordingly, the lower limit of the percentile bootstrap confidence interval was taken as the reference level in the bootstrap distributions of the estimator of the arithmetic mean of the differentiating genotypes, allowing the empirical probability of a positive or negative reaction to be estimated and, consequently, the most likely race under a given test to be identified.
A Bayesian sequential design with binary outcome.
Zhu, Han; Yu, Qingzhao; Mercante, Donald E
2017-03-02
Several researchers have proposed solutions to control type I error rate in sequential designs. The use of Bayesian sequential design becomes more common; however, these designs are subject to inflation of the type I error rate. We propose a Bayesian sequential design for binary outcome using an alpha-spending function to control the overall type I error rate. Algorithms are presented for calculating critical values and power for the proposed designs. We also propose a new stopping rule for futility. Sensitivity analysis is implemented for assessing the effects of varying the parameters of the prior distribution and maximum total sample size on critical values. Alpha-spending functions are compared using power and actual sample size through simulations. Further simulations show that, when total sample size is fixed, the proposed design has greater power than the traditional Bayesian sequential design, which sets equal stopping bounds at all interim analyses. We also find that the proposed design with the new stopping for futility rule results in greater power and can stop earlier with a smaller actual sample size, compared with the traditional stopping rule for futility when all other conditions are held constant. Finally, we apply the proposed method to a real data set and compare the results with traditional designs.
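An alpha-spending function allocates the overall type I error across the interim looks of a sequential design. As a hedged sketch, here is the classical Lan-DeMets O'Brien-Fleming-type spending function evaluated at three equally spaced looks (a standard textbook function, not the authors' Bayesian critical-value algorithm):

```python
import math

def phi_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def obf_spending(t, z=1.959963984540054):
    """Lan-DeMets O'Brien-Fleming-type spending at information fraction t,
    with z = Phi^{-1}(1 - alpha/2) hard-coded for overall alpha = 0.05."""
    return 2.0 * (1.0 - phi_cdf(z / math.sqrt(t)))

looks = [1 / 3, 2 / 3, 1.0]                  # three equally spaced interim looks
spent = [obf_spending(t) for t in looks]     # cumulative alpha spent at each look
increments = [spent[0]] + [b - a for a, b in zip(spent, spent[1:])]
print([round(s, 4) for s in spent])
```

The function spends almost no alpha early and the full 0.05 by the final look, which is why designs built on it stop early only for very strong evidence.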
Terminating Sequential Delphi Survey Data Collection
Kalaian, Sema A.; Kasim, Rafa M.
2012-01-01
The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through well-designed and systematic multiple sequential rounds of survey administration. Each of the multiple rounds of the Delphi survey…
On Sequentially Co-Cohen-Macaulay Modules
Institute of Scientific and Technical Information of China (English)
Nguyen Thi Dung
2007-01-01
In this paper, we define the notion of the dimension filtration of an Artinian module and study a class of Artinian modules, called sequentially co-Cohen-Macaulay modules, which contains strictly all co-Cohen-Macaulay modules. Some characterizations of co-Cohen-Macaulayness in terms of the Matlis duality and of local homology are also given.
A Parallel Programming Model With Sequential Semantics
1996-01-01
Parallel programming is more difficult than sequential programming in part because of the complexity of reasoning, testing, and debugging in the context of concurrency. In the thesis, we present and investigate a parallel programming model that provides direct control of parallelism in a notation
Sequential motor skill: cognition, perception and action
Ruitenberg, M.F.L.
2013-01-01
Discrete movement sequences are assumed to be the building blocks of more complex sequential actions that are present in our everyday behavior. The studies presented in this dissertation address the (neuro)cognitive underpinnings of such movement sequences, in particular in relationship to the role
Comprehensive sequential interventional therapy for hepatocellular carcinoma
Institute of Scientific and Technical Information of China (English)
ZHANG Liang; FAN Wei-jun; HUANG Jin-hua; LI Chuan-xing; ZHAO Ming; WANG Li-gang; TANG Tian
2009-01-01
Background: Since the 1980s, various approaches to interventional therapy have been developed, alongside advances in medical imaging technology. This study aimed to evaluate the effectiveness of comprehensive sequential interventional therapy, in particular personalized therapeutic plans, in 53 patients with hepatocellular carcinoma (HCC) who achieved radical cure. Methods: From January 2003 to January 2005, a total of 203 patients with HCC received sequential interventional treatment in our hospital; 53 patients achieved radical cure outcomes. Those patients were treated with transcatheter arterial chemoembolization (TACE), radiofrequency ablation (RFA), percutaneous ethanol injection (PEI), or high intensity focused ultrasound (HIFU), sequentially and in combination depending on their clinical and pathological features. PET-CT was used to evaluate, assess, and guide treatment. Results: Based on the imaging and serological data, all the patients had a personalized therapeutic plan. The longest follow-up time was 24 months, the shortest was 6 months, and the mean survival time was 16.5 months. Conclusion: Comprehensive sequential interventional therapy, particularly a personalized therapeutic plan, plays a role in the interventional treatment of middle- or advanced-stage HCC.
Mathematical Problem Solving through Sequential Process Analysis
Codina, A.; Cañadas, M. C.; Castro, E.
2015-01-01
Introduction: The macroscopic perspective is one of the frameworks for research on problem solving in mathematics education. Coming from this perspective, our study addresses the stages of thought in mathematical problem solving, offering an innovative approach because we apply sequential relations and global interrelations between the different…
Sequentiality versus simultaneity: Interrelated factor demand
Asphjell, M.K.; Letterie, W.A.; Nilsen, O.A.; Pfann, G.A.
2014-01-01
Firms may adjust capital and labor sequentially or simultaneously. In this paper, we develop a structural model of interrelated factor demand subject to nonconvex adjustment costs and estimated by simulated method of moments. Based on Norwegian manufacturing industry plant-level data, parameter esti
Sequential Tests for Large Scale Learning
Korattikara, A.; Chen, Y.; Welling, M.
2016-01-01
We argue that when faced with big data sets, learning and inference algorithms should compute updates using only subsets of data items. We introduce algorithms that use sequential hypothesis tests to adaptively select such a subset of data points. The statistical properties of this subsampling proce
Stability of response characteristics of a Delphi panel: application of bootstrap data expansion
Directory of Open Access Journals (Sweden)
Cole Bryan R
2005-12-01
Full Text Available Background: Delphi surveys with panels of experts in a particular area of interest have been widely utilized in the fields of clinical medicine, nursing practice, medical education and healthcare services. Despite this wide applicability of the Delphi methodology, there is no clear identification of what constitutes a sufficient number of Delphi survey participants to ensure stability of results. Methods: The study analyzed the response characteristics from the first round of a Delphi survey conducted with 23 experts in healthcare quality and patient safety. The panel members had similar training and subject matter understanding of the Malcolm Baldrige Criteria for Performance Excellence in Healthcare. The raw data from the first round sampling, which usually contains the largest diversity of responses, were augmented via bootstrap sampling to obtain computer-generated results for two larger samples obtained by sampling with replacement. Response characteristics (mean, trimmed mean, standard deviation and 95% confidence intervals) for 54 survey items were compared for the responses of the 23 actual study participants and two computer-generated samples of 1000 and 2000 resampling iterations. Results: The results from this study indicate that the response characteristics of a small expert panel in a well-defined knowledge area are stable in light of augmented sampling. Conclusion: Panels of similarly trained experts (who possess a general understanding in the field of interest) provide effective and reliable utilization of a small sample from a limited number of experts in a field of study to develop reliable criteria that inform judgment and support effective decision-making.
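The bootstrap data expansion used in the study can be sketched as follows. The Likert-style panel ratings below are invented, and the summary simply compares the original 23-member panel statistics with bootstrap-expanded pools of 1000 and 2000 draws:

```python
import random
import statistics

def expand_responses(responses, n_iter, seed=9):
    """Bootstrap-expand a small panel: draw n_iter responses with replacement
    and summarize the expanded pool."""
    rng = random.Random(seed)
    pool = [rng.choice(responses) for _ in range(n_iter)]
    return statistics.fmean(pool), statistics.stdev(pool)

# hypothetical 7-point ratings of one survey item by a 23-member panel
panel = [5, 6, 6, 7, 5, 6, 7, 6, 5, 6, 7, 7, 6, 5, 6, 6, 7, 6, 5, 6, 7, 6, 6]
m0, s0 = statistics.fmean(panel), statistics.stdev(panel)
m1, s1 = expand_responses(panel, 1000)
m2, s2 = expand_responses(panel, 2000)
print(round(m0, 2), round(m1, 2), round(m2, 2))
```

Stability in the sense of the paper means the expanded pools reproduce the original panel's mean and spread closely, so adding more (resampled) respondents would not change the conclusions.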
Tsai, Chia-Ling; Li, Chun-Yi; Yang, Gehua
2008-03-01
Red-free (RF) fundus retinal images and fluorescein angiogram (FA) sequences are often captured from an eye for diagnosis and treatment of abnormalities of the retina. With the aid of multimodal image registration, physicians can combine information to make accurate surgical planning and quantitative judgment of the progression of a disease. The goal of our work is to jointly align the RF images with the FA sequence of the same eye in a common reference space. Our work is inspired by Generalized Dual-Bootstrap Iterative Closest Point (GDB-ICP), which is a fully-automatic, feature-based method using structural similarity. GDB-ICP rank-orders Lowe keypoint matches and refines the transformation computed from each keypoint match in succession. Although GDB-ICP has been shown to be robust to image pairs with illumination differences, its performance is not satisfactory for multimodal and some FA pairs which exhibit substantial non-linear illumination changes. Our algorithm, named Edge-Driven DBICP, modifies the generation of keypoint matches for initialization by extracting the Lowe keypoints from the gradient magnitude image, and enriching the keypoint descriptor with global-shape context using the edge points. Our dataset consists of 61 randomly selected pathological sequences, each on average having two RF and 13 FA images. There are a total of 4985 image pairs, out of which 1323 are multimodal pairs. Edge-Driven DBICP successfully registered 93% of all pairs, and 82% of multimodal pairs, whereas GDB-ICP registered 80% and 40%, respectively. Regarding registration of the whole image sequence in a common reference space, Edge-Driven DBICP succeeded in 60 sequences, which is a 26% improvement over GDB-ICP.
Dexter, Troy A; Kowalewski, Michał
2013-12-01
Quantitative estimates of growth rates can augment ecological and paleontological applications of body-size data. However, in contrast to body-size estimates, assessing growth rates is often time-consuming, expensive, or unattainable. Here we use an indirect approach, a jackknife-corrected parametric bootstrap, for efficient approximation of growth rates using nearest living relatives with known age-size relationships. The estimate is developed by (1) collecting a sample of published growth rates of closely related species, (2) calculating the average growth curve using those published age-size relationships, (3) resampling iteratively these empirically known growth curves to estimate the standard errors and confidence bands around the average growth curve, and (4) applying the resulting estimate of uncertainty to bracket age-size relationships of the species of interest. This approach was applied to three monophyletic families (Donacidae, Mactridae, and Semelidae) of mollusk bivalves, a group characterized by indeterminate shell growth and widely used in ecological, paleontological, and geochemical research. The resulting indirect estimates were tested against two previously published geochemical studies and, in both cases, yielded highly congruent age estimates. In addition, a case study in applied fisheries was used to illustrate the potential of the proposed approach for augmenting aquaculture management practices. The resulting estimates of growth rates place body-size data in a constrained temporal context, and confidence intervals associated with resampling estimates allow for assessing the statistical uncertainty around derived temporal ranges. The indirect approach should allow for improved evaluation of diverse research questions, from sustainability of industrial shellfish harvesting to climatic interpretations of stable isotope proxies extracted from fossil skeletons.
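Steps (1)-(4) can be sketched as follows. The von Bertalanffy growth model and all parameter values for the "published" relatives are illustrative assumptions, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical published von Bertalanffy parameters (L_inf in mm, k per year)
# for close relatives of a target species; illustrative values only.
published = np.array([(42.0, 0.55), (38.5, 0.62), (45.1, 0.48),
                      (40.2, 0.58), (43.7, 0.51)])

ages = np.linspace(0.25, 6.0, 24)

def vb_length(linf, k, t):
    """von Bertalanffy length-at-age curve."""
    return linf * (1.0 - np.exp(-k * t))

def bootstrap_curves(params, n_boot=2000):
    """Resample species-level curves with replacement (steps 1-3)."""
    curves = np.empty((n_boot, ages.size))
    for b in range(n_boot):
        idx = rng.integers(0, len(params), size=len(params))
        sub = params[idx]
        curves[b] = np.mean([vb_length(li, k, ages) for li, k in sub], axis=0)
    return curves

curves = bootstrap_curves(published)
mean_curve = curves.mean(axis=0)
lo, hi = np.percentile(curves, [2.5, 97.5], axis=0)

# Bracket the age of a specimen of given shell length (step 4):
length = 30.0
age_lo = np.interp(length, hi, ages)   # fast-growth band -> younger bound
age_hi = np.interp(length, lo, ages)   # slow-growth band -> older bound
print(f"length {length} mm -> age between {age_lo:.2f} and {age_hi:.2f} yr")
```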
Adams, Jenny; Cheng, Dunlei; Lee, John; Shock, Tiffany; Kennedy, Kathleen; Pate, Scotty
2014-07-01
Physical fitness testing is a common tool for motivating employees with strenuous occupations to reach and maintain a minimum level of fitness. Nevertheless, the use of such tests can be hampered by several factors, including required compliance with US antidiscrimination laws. The Highland Park (Texas) Department of Public Safety implemented testing in 1991, but no single test adequately evaluated its sworn employees, who are cross-trained and serve as police officers and firefighters. In 2010, the department's fitness experts worked with exercise physiologists from Baylor Heart and Vascular Hospital to develop and evaluate a single test that would be equitable regardless of race/ethnicity, disability, sex, or age >50 years. The new test comprised a series of exercises to assess overall fitness, followed by two sequences of job-specific tasks related to firefighting and police work, respectively. The study group of 50 public safety officers took the test; raw data (e.g., the number of repetitions performed or the time required to complete a task) were collected during three quarterly testing sessions. The statistical bootstrap method was then used to determine the levels of performance that would correlate with 0, 1, 2, or 3 points for each task. A sensitivity analysis was done to determine the overall minimum passing score of 17 points. The new physical fitness test and scoring system have been incorporated into the department's policies and procedures as part of the town's overall employee fitness program.
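A minimal sketch of using the bootstrap to set score cut points might look like this; the repetition counts and the quartile-based banding are invented for illustration and are not the department's actual scoring rules:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical raw repetition counts for one task, pooled across three
# quarterly sessions; the real departmental data are not shown.
reps = rng.normal(loc=30, scale=6, size=150).round().clip(5)

def bootstrap_cutpoints(data, n_boot=5000):
    """Bootstrap the quartiles used as 0/1/2/3-point score boundaries."""
    qs = np.array([
        np.percentile(rng.choice(data, size=len(data), replace=True),
                      [25, 50, 75])
        for _ in range(n_boot)
    ])
    return qs.mean(axis=0)          # stable estimates of the three cuts

def score(x, cuts):
    """Award 0-3 points depending on which band the raw result falls in."""
    return int(np.searchsorted(cuts, x, side="right"))

cuts = bootstrap_cutpoints(reps)
print("cut points:", np.round(cuts, 1))
print("score for 35 reps:", score(35, cuts))
```

Summing such per-task scores and sweeping a threshold is one way to run the kind of sensitivity analysis that produced the overall passing score of 17 points.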
Directory of Open Access Journals (Sweden)
Bahram Andarzian
2015-06-01
Wheat production in the south of Khuzestan, Iran is constrained by heat stress for late sowing dates. For optimization of yield, sowing at the appropriate time to fit the cultivar maturity length and growing season is critical. Crop models could be used to determine the optimum sowing window for a locality. The objectives of this study were to evaluate the Cropping System Model (CSM-CERES-Wheat) for its ability to simulate growth, development, and grain yield of wheat in the tropical regions of Iran, and to study the impact of different sowing dates on wheat performance. The genetic coefficients of cultivar Chamran were calibrated for the CSM-CERES-Wheat model and crop model performance was evaluated with experimental data. Wheat cultivar Chamran was sown on different dates, ranging from 5 November to 9 January, during 5 years of field experiments that were conducted in the Khuzestan province, Iran, under full and deficit irrigation conditions. The model was run for 8 sowing dates starting on 25 October and repeated every 10 days until 5 January, using long-term historical weather data from the Ahvaz, Behbehan, Dezful and Izeh locations. The seasonal analysis program of DSSAT was used to determine the optimum sowing window for the different locations as well. Evaluation with the experimental data showed that performance of the model was reasonable, as indicated by fairly accurate simulation of crop phenology, biomass accumulation and grain yield against measured data. The normalized RMSE values were 3%, 2%, 11.8%, and 3.4% for anthesis date, maturity date, grain yield and biomass, respectively. The optimum sowing window differed among locations. It opened and closed on 5 November and 5 December for Ahvaz; 5 November and 15 December for Behbehan and Dezful; and 1 November and 15 December for Izeh, respectively. The CERES-Wheat model could be used as a tool to evaluate the effect of sowing date on wheat performance in Khuzestan conditions. Further model evaluations
Mean-Variance-Validation Technique for Sequential Kriging Metamodels
Energy Technology Data Exchange (ETDEWEB)
Lee, Tae Hee; Kim, Ho Sung [Hanyang University, Seoul (Korea, Republic of)
2010-05-15
The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. Although the leave-k-out cross-validation technique involves a considerably high computational cost, it still cannot measure the fidelity of metamodels. Recently, the mean_0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, use of the mean_0 validation criterion may lead to premature termination of a sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is explicitly integrated to accurately evaluate the mean and variance of the response. The error in the proposed validation technique resembles a root mean squared error, and thus it can be used to determine a stopping criterion for sequential sampling of metamodels.
Physics-based, Bayesian sequential detection method and system for radioactive contraband
Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E
2014-03-18
A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing, with a radionuclide represented as a decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval condition-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to check two threshold conditions signifying that the EMS is either identified as the target radionuclide or not; if neither threshold is crossed, the process repeats for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
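The sequential likelihood ratio test at the core of the patent can be illustrated with a simplified single-channel sketch on exponential photon interarrival times; the rates, error probabilities, and the omission of the energy-amplitude channel are all simplifying assumptions:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Decide between a background rate and a (hypothetical) source rate from
# exponential photon interarrival times; the patent's full EMS/channel
# machinery is omitted.
RATE_BG, RATE_SRC = 5.0, 12.0          # counts per second (assumed)
ALPHA, BETA = 0.01, 0.01               # error probabilities (assumed)
UPPER = math.log((1 - BETA) / ALPHA)   # accept "source present"
LOWER = math.log(BETA / (1 - ALPHA))   # accept "background only"

def sprt(interarrivals):
    """Update the log-likelihood ratio photon by photon until one of the
    two thresholds is crossed; returns (decision, n_events_used)."""
    llr = 0.0
    for n, dt in enumerate(interarrivals, start=1):
        # log of exponential density ratio f_src(dt) / f_bg(dt)
        llr += math.log(RATE_SRC / RATE_BG) - (RATE_SRC - RATE_BG) * dt
        if llr >= UPPER:
            return "source", n
        if llr <= LOWER:
            return "background", n
    return "undecided", len(interarrivals)

# Simulate a stream that really does come from the source
events = rng.exponential(1.0 / RATE_SRC, size=500)
decision, n = sprt(events)
print(decision, "after", n, "events")
```

The attraction of the sequential form is visible here: the test typically terminates after a handful of events rather than a fixed-size batch.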
Sequential Analysis in High Dimensional Multiple Testing and Sparse Recovery
Malloy, Matthew; Nowak, Robert
2011-01-01
This paper studies the problem of high-dimensional multiple testing and sparse recovery from the perspective of sequential analysis. In this setting, the probability of error is a function of the dimension of the problem. A simple sequential testing procedure is proposed. We derive necessary conditions for reliable recovery in the non-sequential setting and contrast them with sufficient conditions for reliable recovery using the proposed sequential testing procedure. Applications of the main ...
Sequential motif profile of natural visibility graphs
Iacovacci, Jacopo
2016-01-01
The concept of sequential visibility graph motifs (subgraphs appearing with characteristic frequencies in the visibility graphs associated with time series) has been advanced recently, along with a theoretical framework to compute analytically the motif profiles associated with Horizontal Visibility Graphs (HVGs). Here we develop a theory to compute the profile of sequential visibility graph motifs in the context of Natural Visibility Graphs (VGs). This theory gives exact results for deterministic aperiodic processes with a smooth invariant density, or for stochastic processes that fulfil the Markov property and have a continuous marginal distribution. The framework also allows for a linear-time numerical estimation in the case of empirical time series. A comparison between the HVG and the VG case (including an evaluation of their robustness for short series polluted with measurement noise) is also presented.
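For empirical series, the numerical estimation can be sketched directly from the natural visibility criterion; the window size of four nodes and the edge-set labelling of motif classes below are one simple way to set this up, not necessarily the paper's exact formulation:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

def visible(y, i, j):
    """Natural visibility criterion between samples i < j."""
    return all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
               for k in range(i + 1, j))

def sequential_motif_profile(y, size=4):
    """Slide a window of `size` consecutive nodes along the series and
    classify the induced visibility subgraph by its edge set."""
    profile = Counter()
    for start in range(len(y) - size + 1):
        edges = frozenset(
            (a, b)
            for a in range(size) for b in range(a + 1, size)
            if visible(y, start + a, start + b)
        )
        profile[edges] += 1
    total = sum(profile.values())
    return {m: c / total for m, c in profile.items()}

series = rng.random(500)                 # white-noise test series
profile = sequential_motif_profile(series)
for motif, freq in sorted(profile.items(), key=lambda kv: -kv[1]):
    print(sorted(motif), round(freq, 3))
```

Note that consecutive samples are always mutually visible, so every motif class contains the three adjacent edges; only the longer-range edges vary.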
Sequential pivotal mechanisms for public project problems
Apt, Krzysztof R
2008-01-01
It is well-known that for several natural decision problems no budget-balanced Groves mechanisms exist. This motivated recent research on designing variants of feasible Groves mechanisms (termed 'redistribution of VCG (Vickrey-Clarke-Groves) payments') that generate a reduced deficit. With this in mind, we study sequential Groves mechanisms and consider optimal strategies that can lower the taxes the players would have to pay under the simultaneous mechanism. We show that such strategies do exist in the sequential pivotal mechanism for several variants of public project problems. These strategies differ from truth-telling. In particular, we exhibit an optimal strategy with the property that when each player follows it, maximal social welfare is generated.
Sequential shrink photolithography for plastic microlens arrays.
Dyer, David; Shreim, Samir; Jayadev, Shreshta; Lew, Valerie; Botvinick, Elliot; Khine, Michelle
2011-07-18
Endeavoring to push the boundaries of microfabrication with shrinkable polymers, we have developed a sequential shrink photolithography process. We demonstrate the utility of this approach by rapidly fabricating plastic microlens arrays. First, we create a mask out of the children's toy Shrinky Dinks by simply printing dots using a standard desktop printer. Upon retraction of this pre-stressed thermoplastic sheet, the dots shrink to a fraction of their original size, which we then lithographically transfer onto photoresist-coated commodity shrink wrap film. This shrink film reduces in area by 95% when briefly heated, creating smooth convex photoresist bumps down to 30 µm. Taken together, this sequential shrink process provides a complete method for creating microlenses, with an almost 99% reduction in area from the original pattern size. Finally, with a lithographic molding step, we emboss these bumps into optical-grade plastics such as cyclic olefin copolymer for functional microlens arrays.
Verifying interpretive criteria for bioaerosol data using (bootstrap) Monte Carlo techniques.
Spicer, R Christopher; Gangloff, Harry
2008-02-01
A number of interpretive descriptors have been proposed for bioaerosol data due to the lack of health-based numerical standards, but very few have been verified as to their ability to describe a suspect indoor environment. Culturable and nonculturable (spore trap) sampling analyzed with the bootstrap version of Monte Carlo simulation (BMC) at several sites during 2003-2006 served as a source of indoor and outdoor data to test various criteria with regard to their variability in characterizing an indoor or outdoor environment. The purpose was to gain some insight into the reliability of some of the interpretive criteria in use, as well as to demonstrate the utility of BMC methods as a generalized technique for validation of various interpretive criteria for bioaerosols. The ratio of nonphylloplane (NP) fungi (total of Aspergillus and Penicillium) to phylloplane (P) fungi (total of Cladosporium, Alternaria, and Epicoccum), or NP/P, is a descriptor that has been used to identify "dominance" of nonphylloplane fungi (NP/P > 1.0), assumed to be indicative of a problematic indoor environment. However, BMC analysis of spore trap and culturable bioaerosol data using the NP/P ratio identified frequent dominance by nonphylloplane fungi in outdoor air. Similarly, the NP/P descriptor indicated dominance of nonphylloplane fungi in buildings with visible mold growth and/or known water intrusion with a frequency often in the range of 0.5. Fixed numerical criteria for spore trap data of 900 and 1300 spores/m³ for total spores and 750 Aspergillus/Penicillium spores/m³ exhibited similar variability, as did ratios of nonphylloplane to total fungi, phylloplane to total fungi, and indoor/outdoor ratios for total fungal spores. Analysis of bioaerosol data by BMC indicates that numerical levels or descriptors based on dominance of certain fungi are unreliable as criteria for characterizing a given environment. The utility of BMC analysis lies in its generalized application to test mathematically
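A BMC-style check of the NP/P descriptor can be sketched as follows; the spore counts are invented for illustration, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical outdoor spore-trap counts per sample (spores/m^3) for the
# taxa entering the NP/P ratio; the study's field data are not shown.
asp_pen = np.array([120, 80, 310, 40, 95, 210, 60, 150])        # nonphylloplane
clad_alt_epi = np.array([900, 150, 220, 75, 400, 130, 60, 85])  # phylloplane

def np_p_dominance(np_counts, p_counts, n_boot=10000):
    """BMC-style estimate of how often resampled data give NP/P > 1."""
    n = len(np_counts)
    hits = 0
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample whole samples
        ratio = np_counts[idx].sum() / max(p_counts[idx].sum(), 1)
        hits += ratio > 1.0
    return hits / n_boot

freq = np_p_dominance(asp_pen, clad_alt_epi)
print(f"fraction of bootstrap replicates with NP/P > 1: {freq:.2f}")
```

Even when the pooled ratio is well below 1, resampling variability can make "dominance" appear in a sizeable fraction of replicates, which is exactly the unreliability the authors report.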
Directory of Open Access Journals (Sweden)
Gogarten J Peter
2002-02-01
Abstract Background: Horizontal gene transfer (HGT) played an important role in shaping microbial genomes. In addition to genes under sporadic selection, HGT also affects housekeeping genes and those involved in information processing, even ribosomal RNA encoding genes. Here we describe tools that provide an assessment and graphic illustration of the mosaic nature of microbial genomes. Results: We adapted Maximum Likelihood (ML) mapping to the analyses of all detected quartets of orthologous genes found in four genomes. We have automated the assembly and analyses of these quartets of orthologs given the selection of four genomes. We compared the ML-mapping approach to more rigorous Bayesian probability and Bootstrap mapping techniques. The latter two approaches appear to be more conservative than the ML-mapping approach, but qualitatively all three approaches give equivalent results. All three tools were tested on mitochondrial genomes, which presumably were inherited as a single linkage group. Conclusions: In some instances of interphylum relationships we find nearly equal numbers of quartets strongly supporting the three possible topologies. In contrast, our analyses of genome quartets containing the cyanobacterium Synechocystis sp. indicate that a large part of the cyanobacterial genome is related to that of low-GC Gram positives. Other groups that had been suggested as sister groups to the cyanobacteria contain many fewer genes that group with the Synechocystis orthologs. Interdomain comparisons of genome quartets containing the archaeon Halobacterium sp. revealed that Halobacterium sp. shares more genes with Bacteria that live in the same environment than with Bacteria that are more closely related based on rRNA phylogeny. Many of these genes encode proteins involved in substrate transport and metabolism and in information storage and processing. The performed analyses demonstrate that relationships among prokaryotes cannot be accurately
Sequential decision analysis for nonstationary stochastic processes
Schaefer, B.
1974-01-01
A formulation of the problem of making decisions concerning the state of nonstationary stochastic processes is given. An optimal decision rule, for the case in which the stochastic process is independent of the decisions made, is derived. It is shown that this rule is a generalization of the Bayesian likelihood ratio test; and an analog to Wald's sequential likelihood ratio test is given, in which the optimal thresholds may vary with time.
Compressive Sequential Learning for Action Similarity Labeling.
Qin, Jie; Liu, Li; Zhang, Zhaoxiang; Wang, Yunhong; Shao, Ling
2016-02-01
Human action recognition in videos has been extensively studied in recent years due to its wide range of applications. Instead of classifying video sequences into a number of action categories, in this paper, we focus on a particular problem of action similarity labeling (ASLAN), which aims at verifying whether a pair of videos contain the same type of action or not. To address this challenge, a novel approach called compressive sequential learning (CSL) is proposed by leveraging the compressive sensing theory and sequential learning. We first project data points to a low-dimensional space by effectively exploring an important property in compressive sensing: the restricted isometry property. In particular, a very sparse measurement matrix is adopted to reduce the dimensionality efficiently. We then learn an ensemble classifier for measuring similarities between pairwise videos by iteratively minimizing its empirical risk with the AdaBoost strategy on the training set. Unlike conventional AdaBoost, the weak learner for each iteration is not explicitly defined and its parameters are learned through greedy optimization. Furthermore, an alternative of CSL named compressive sequential encoding is developed as an encoding technique and followed by a linear classifier to address the similarity-labeling problem. Our method has been systematically evaluated on four action data sets: ASLAN, KTH, HMDB51, and Hollywood2, and the results show the effectiveness and superiority of our method for ASLAN.
Directory of Open Access Journals (Sweden)
Dropkin Greg
2009-12-01
Abstract Background: The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods: Cancers with over 100 deaths in the 0-20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected accelerated (BCa) methods and simulation of the likelihood ratio test lead to confidence intervals for excess relative risk (ERR) and tests against the linear model. Results: The linear model shows significant large, positive values of ERR for liver and urinary cancers at latencies from 37-43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86), and across broad latency ranges. Confidence intervals for ERR are comparable using bootstrap and likelihood ratio test methods, and BCa 95% confidence intervals are strictly positive across latency ranges for all 5 cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0-20 mSv and 5-500 mSv data for the stomach, liver, lung and leukaemia. Dose response for the latter 3 cancers is significantly non-linear in the 5-500 mSv range. Conclusion: Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and likelihood-based confidence intervals are broadly comparable and ERR is strictly positive by bootstrap methods for all 5 cancers. Except for the pancreas, similar estimates of
Pismensky, Artem L
2015-01-01
A method of calculating the $\varepsilon$-expansion in the model of a scalar field with $\varphi^3$ interaction, based on conformal bootstrap equations, is proposed. This technique is based on self-consistent skeleton equations involving the full propagator and the full triple vertex. Analytical computations of the Fisher index $\eta$ are performed in the four-loop approximation. The three-loop result coincides with the one obtained previously by the renormalization group equations technique, which is based on the calculation of a larger number of Feynman diagrams. The four-loop result agrees with its numerical value obtained by other authors.
Energy Technology Data Exchange (ETDEWEB)
Narayan, Paresh Kumar [Department of Accounting, Finance and Economics, Griffith University, Gold Coast (Australia); Prasad, Arti [School of Economics, University of the South Pacific, Suva (Fiji)
2008-02-15
The goal of this paper is to examine any causal effects between electricity consumption and real GDP for 30 OECD countries. We use a bootstrapped causality testing approach and unravel evidence in favour of electricity consumption causing real GDP in Australia, Iceland, Italy, the Slovak Republic, the Czech Republic, Korea, Portugal, and the UK. The implication is that electricity conservation policies will negatively impact real GDP in these countries. However, for the remaining 22 countries our findings suggest that electricity conservation policies will not affect real GDP. (author)
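A bootstrapped causality test in this spirit can be sketched with a residual bootstrap of a single Granger regression; the toy series, lag order, and the simple fixed-design resampling scheme are assumptions for illustration rather than the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(5)

def lagmat(z, p):
    """Matrix of p lags of z, aligned with z[p:]."""
    return np.column_stack([z[p - k - 1:len(z) - k - 1] for k in range(p)])

def ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

def granger_stat(x, y, p=2):
    """LM-type statistic for 'x does not Granger-cause y' with p lags."""
    Y = y[p:]
    X0 = np.column_stack([np.ones(len(Y)), lagmat(y, p)])   # restricted
    X1 = np.column_stack([X0, lagmat(x, p)])                # unrestricted
    _, e0 = ols(X0, Y)
    _, e1 = ols(X1, Y)
    return len(Y) * (e0 @ e0 - e1 @ e1) / (e1 @ e1)

def bootstrap_pvalue(x, y, p=2, n_boot=499):
    """Regenerate y under the own-lag null model (fixed-design residual
    bootstrap) and recompute the test statistic each time."""
    stat = granger_stat(x, y, p)
    Y = y[p:]
    X0 = np.column_stack([np.ones(len(Y)), lagmat(y, p)])
    b0, e0 = ols(X0, Y)
    count = 0
    for _ in range(n_boot):
        e_star = rng.choice(e0, size=len(e0), replace=True)
        y_star = np.concatenate([y[:p], X0 @ b0 + e_star])
        count += granger_stat(x, y_star, p) >= stat
    return stat, (count + 1) / (n_boot + 1)

# Toy data where x really drives y (illustrative, not OECD data)
T = 300
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

stat, pval = bootstrap_pvalue(x, y)
print(f"statistic={stat:.1f}, bootstrap p-value={pval:.3f}")
```

The bootstrap reference distribution replaces the asymptotic chi-squared one, which is the point of the approach in small or non-standard samples.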
Online Sequential Projection Vector Machine with Adaptive Data Mean Update
Directory of Open Access Journals (Sweden)
Lin Chen
2016-01-01
We propose a simple online learning algorithm especially suited for high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes it easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms including the budgeted stochastic gradient descent (BSGD) approach, the adaptive multihyperplane machine (AMM), the primal estimated subgradient solver (Pegasos), the online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD performed before OSELM). The results obtained demonstrate the superior generalization performance and efficiency of OSPVM.
The Methodology of Testability Prediction for Sequential Circuits
Institute of Scientific and Technical Information of China (English)
徐拾义; 陈斯
1996-01-01
Test generation algorithms are being developed continually as increasingly sophisticated computing systems are created. Of all the problems in testable and reliable design for computing systems, test generation for sequential circuits is usually viewed as one of the hardest to solve because of its complexity and time consumption. Although dozens of algorithms have been proposed to cope with this issue, much remains to be desired in determining: 1) which of the existing test generation algorithms is the most efficient for a particular circuit (by efficiency, we mean the fault coverage the algorithm offers, the CPU time when executing, the number of test patterns to be applied, etc.), since different algorithms are preferable for different circuits; and 2) which parameters (such as the number of gates, flip-flops and loops in the circuit) have the most or least influence on test generation, so that circuit designers can have a global understanding during the design-for-testability stage. A testability forecasting methodology for sequential circuits using regression models is presented, which a user needs for analyzing his own circuits and selecting the most suitable test generation algorithm from all the algorithms available. Examples and experimental results are also provided to show how helpful and practical the method is.
Sequential Multiple Response Optimization for Manufacturing Flexible Printed Circuit
Directory of Open Access Journals (Sweden)
Pichpimon Kanchanasuttisang
2012-01-01
Problem statement: The Flexible Printed Circuit, or FPC, one of the automotive electronic parts, has been developed for automotive lighting by assembly with LEDs. The quality performances, or responses, of the lighting depend on the circuit width of an FPC and the etched rate of the acid solution. Under the current operating condition of an FPC company, the capability of the manufacturing process falls short of the company requirement: the standard deviation of FPC circuit widths is at higher levels and the mean is also worse than specifications. Approach: This process improvement followed four sequential steps based on designed experiments, steepest descent, and interchangeable linear constrained response surface optimization (IC-LCRSOM). The investigation aims to determine the preferable levels of significant process variables affecting multiple responses. Results: The new settings from the IC-LCRSOM improved all performance measures in terms of both the mean and the standard deviation on all process patterns. Conclusion: In this sequential optimization the developed mathematical model was tested for adequacy using analysis of variance and other adequacy measures. In the actual investigation, the new operating conditions led to higher levels of the etched rate and process capability, including lower levels of the standard deviation of the circuit widths and etched rate, compared with the previous settings.
Sequential design approaches for bioequivalence studies with crossover designs.
Potvin, Diane; DiLiberti, Charles E; Hauck, Walter W; Parr, Alan F; Schuirmann, Donald J; Smith, Robert A
2008-01-01
The planning of bioequivalence (BE) studies, as for any clinical trial, requires a priori specification of an effect size for the determination of power and an assumption about the variance. The specified effect size may be overly optimistic, leading to an underpowered study. The assumed variance can be either too small or too large, leading, respectively, to studies that are underpowered or overly large. There has been much work in the clinical trials field on various types of sequential designs that include sample size reestimation after the trial is started, but these have seen only little use in BE studies. The purpose of this work was to validate at least one such method for crossover design BE studies. Specifically, we considered sample size reestimation for a two-stage trial based on the variance estimated from the first stage. We identified two methods based on Pocock's method for group sequential trials that met our requirement for at most negligible increase in type I error rate.
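The two-stage logic can be sketched as below. This uses a crude normal approximation to TOST power and an assumed Pocock-adjusted alpha of 0.0294; the published method relies on exact noncentral-t power calculations and more detailed decision rules:

```python
import math

def required_n(cv, gmr=0.95, alpha_z=1.888, power_z=0.842):
    """Very rough total n for a 2x2 crossover BE study, via a normal
    approximation to TOST power (alpha_z is an approximate z quantile
    for alpha = 0.0294, power_z the z quantile for 80% power)."""
    s = math.sqrt(math.log(cv ** 2 + 1))            # log-scale SD from CV
    delta = math.log(1.25) - abs(math.log(gmr))     # margin minus assumed GMR
    n = 2 * (s * (alpha_z + power_z) / delta) ** 2
    return max(12, math.ceil(n / 2) * 2)            # round up to even, floor 12

def two_stage(n1, cv_stage1):
    """Decide whether a second stage is needed, and how large, from the
    variance (CV) estimated at the end of stage 1."""
    n_needed = required_n(cv_stage1)
    if n_needed <= n1:
        return ("stop at stage 1", 0)
    return ("continue to stage 2", n_needed - n1)

print(two_stage(n1=24, cv_stage1=0.18))   # low variability: stage 1 suffices
print(two_stage(n1=24, cv_stage1=0.40))   # high variability: add subjects
```

The appeal of the design is visible here: an optimistic variance assumption no longer sinks the study, because the stage-1 estimate drives the final sample size.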
eSeeTrack--visualizing sequential fixation patterns.
Tsang, Hoi Ying; Tory, Melanie; Swindells, Colin
2010-01-01
We introduce eSeeTrack, an eye-tracking visualization prototype that facilitates exploration and comparison of sequential gaze orderings in a static or a dynamic scene. It extends current eye-tracking data visualizations by extracting patterns of sequential gaze orderings, displaying these patterns in a way that does not depend on the number of fixations on a scene, and enabling users to compare patterns from two or more sets of eye-gaze data. Extracting such patterns was very difficult with previous visualization techniques. eSeeTrack combines a timeline and a tree-structured visual representation to embody three aspects of eye-tracking data that users are interested in: duration, frequency and orderings of fixations. We demonstrate the usefulness of eSeeTrack via two case studies on surgical simulation and retail store chain data. We found that eSeeTrack allows ordering of fixations to be rapidly queried, explored and compared. Furthermore, our tool provides an effective and efficient mechanism to determine pattern outliers. This approach can be effective for behavior analysis in a variety of domains that are described at the end of this paper.
Simultaneous computation within a sequential process simulation tool
Directory of Open Access Journals (Sweden)
G. Endrestøl
1989-10-01
The paper describes an equation solver superstructure developed for a sequential modular dynamic process simulation system as part of a Eureka project with Norwegian and British participation. The purpose of the development was combining some of the advantages of equation based and purely sequential systems, enabling implicit treatment of key variables independent of module boundaries, and use of numerical integration techniques suitable for each individual type of variable. For training simulator applications the main advantages are gains in speed due to increased stability limits on time steps and improved consistency of simulation results. The system is split into an off-line analysis phase and an on-line equation solver. The off-line processing consists of automatic determination of the topological structure of the system connectivity from standard process description files and derivation of an optimized sparse matrix solution procedure for the resulting set of equations. The on-line routine collects equation coefficients from involved modules, solves the combined sets of structured equations, and stores the results appropriately. This method minimizes the processing cost during the actual simulation. The solver has been applied in the Veslefrikk training simulator project.
Learning Semantic Lexicons Using Graph Mutual Reinforcement Based Bootstrapping
Institute of Scientific and Technical Information of China (English)
张奇; 邱锡鹏; 黄萱菁; 吴立德
2008-01-01
This paper presents a method to learn semantic lexicons using a new bootstrapping method based on graph mutual reinforcement (GMR). The approach uses only unlabeled data and a few seed words to learn new words for each semantic category. Unlike other bootstrapping methods, we use GMR-based bootstrapping to sort the candidate words and patterns. Experimental results show that the GMR-based bootstrapping approach outperforms the existing algorithms on both in-domain and out-of-domain data. Furthermore, the results show that performance depends not only on the size of the corpus but also on its quality.
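The mutual-reinforcement idea can be sketched as a HITS-like iteration over a word-pattern co-occurrence graph; the tiny vocabulary, patterns, and counts below are made up, and the exact GMR formulation in the paper may differ:

```python
import numpy as np

# Tiny illustrative corpus statistics: co-occurrence counts between
# candidate words (rows) and extraction patterns (columns). Both the
# vocabulary and the counts are invented for this sketch.
words = ["lion", "tiger", "table", "sparrow", "chair"]
patterns = ["<X> roared", "saw a wild <X>", "sat on the <X>"]
counts = np.array([[4, 3, 0],
                   [3, 5, 0],
                   [0, 1, 3],
                   [1, 2, 0],
                   [0, 0, 2]], dtype=float)

seeds = {"lion"}                     # seed word for the ANIMAL category

def gmr_scores(counts, seed_idx, n_iter=20):
    """Mutually reinforce word and pattern scores over the bipartite
    co-occurrence graph, starting from the seed words."""
    w = np.zeros(counts.shape[0])
    w[seed_idx] = 1.0
    for _ in range(n_iter):
        p = counts.T @ w            # patterns scored by the words they extract
        p /= p.max() or 1.0
        w = counts @ p              # words scored by the patterns extracting them
        w /= w.max() or 1.0
        w[seed_idx] = 1.0           # seeds stay fully trusted
    return w, p

seed_idx = [words.index(s) for s in seeds]
w, p = gmr_scores(counts, seed_idx)
ranking = sorted(zip(words, w), key=lambda t: -t[1])
print(ranking)
```

Candidate words that share high-scoring patterns with the seed rise to the top of the ranking, which is the sorting behaviour the abstract describes.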
Institute of Scientific and Technical Information of China (English)
王焱; 汪震; 黄民翔; 蔡祯祺; 杨濛濛
2014-01-01
An ultra-short-term wind power prediction method based on an online sequential extreme learning machine (OS-ELM) is proposed. Firstly, the OS-ELM is utilized to correct the predicted wind speed sequence so as to improve the accuracy of the numerical weather prediction wind speed. Then, by combining batch processing with successive iteration, continually updating the training data and network structure, real-time prediction of wind turbine power output is accomplished by exploiting OS-ELM's fast learning speed and strong generalization ability. Finally, a Bootstrap method is adopted to construct pseudo-samples by resampling and to estimate the confidence intervals of the predicted power. Case studies show that, compared with the back propagation (BP) network and support vector machine (SVM) methods, this prediction method better meets the demands of online application while its forecasting accuracy is comparable, so it has good application prospects. This work is supported by the National High Technology Research and Development Program of China (863 Program) (No. 2011AA050204) and the National Natural Science Foundation of China (No. 51277160).
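The OS-ELM core (an initial batch solve followed by recursive least-squares chunk updates) can be sketched as follows; the toy regression stream and network sizes are assumptions, and the wind-speed correction and Bootstrap intervals of the paper are omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

class OSELM:
    """Minimal online sequential extreme learning machine: random fixed
    hidden layer, batch initialization, then recursive least-squares
    updates chunk by chunk."""

    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(size=(n_in, n_hidden))   # fixed random input weights
        self.b = rng.normal(size=n_hidden)           # fixed random biases

    def _h(self, X):
        return np.tanh(X @ self.W + self.b)          # hidden-layer output

    def fit_initial(self, X, y):
        H = self._h(X)
        self.P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ y
        return self

    def update(self, X, y):
        """Incorporate a new chunk without retraining from scratch."""
        H = self._h(X)
        K = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        self.P -= self.P @ H.T @ K @ H @ self.P
        self.beta += self.P @ H.T @ (y - H @ self.beta)
        return self

    def predict(self, X):
        return self._h(X) @ self.beta

# Toy regression stream standing in for wind-power samples
X = rng.uniform(-1, 1, size=(400, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=400)

model = OSELM(3, 40).fit_initial(X[:100], y[:100])
for start in range(100, 400, 50):                    # 50-sample chunks
    model.update(X[start:start + 50], y[start:start + 50])
err = np.mean((model.predict(X) - y) ** 2)
print(f"in-sample MSE after sequential updates: {err:.4f}")
```

Each `update` costs only a small matrix inversion of chunk size, which is what makes the method fast enough for online use.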
Sequential nonlinear tracking filter without requirement of measurement decorrelation
Institute of Scientific and Technical Information of China (English)
Taifan Quan
2015-01-01
Sequential measurement processing benefits both estimation accuracy and computational efficiency. When the noises are correlated across the measurement components, previous methods require decorrelation based on covariance matrix factorization in order to perform sequential updates properly. A new sequential processing method, which carries out the sequential updates directly using the correlated measurement components, is proposed. A typical sequential processing example is investigated, in which the converted position measurements are used to estimate target states by standard Kalman filtering equations, and the converted Doppler measurements are then incorporated into a minimum mean squared error (MMSE) estimator with the updated cross-covariance included to account for the correlated errors. Numerical simulations demonstrate that the proposed sequential processing achieves better accuracy and consistency than the conventional sequential filter based on measurement decorrelation.
Sequential monitoring of response-adaptive randomized clinical trials
Zhu, Hongjian; 10.1214/10-AOS796
2010-01-01
Clinical trials are complex and usually involve multiple objectives, such as controlling the type I error rate, increasing power to detect treatment differences, and assigning more patients to the better treatment. In the literature, both response-adaptive randomization (RAR) procedures (which change the randomization procedure sequentially) and sequential monitoring (which changes the analysis procedure sequentially) have been proposed to achieve these objectives to some degree. In this paper, we propose to sequentially monitor response-adaptive randomized clinical trials and study their properties. We prove that the sequential test statistics of the new procedure converge to a Brownian motion in distribution. Further, we show that the sequential test statistics asymptotically satisfy the canonical joint distribution defined in Jennison and Turnbull (2000). Therefore, type I error and other objectives can be achieved theoretically by selecting appropriate boundaries. These results open a door to sequentially monitor res...
Miller, Ronald Mellado; Capaldi, E. John
2006-01-01
Sequential theory's memory model of learning has been successfully applied in response contingent instrumental conditioning experiments (Capaldi, 1966, 1967, 1994; Capaldi & Miller, 2003). However, it has not been systematically tested in nonresponse contingent Pavlovian conditioning experiments. The present experiments attempted to determine if…
Transition from non-sequential to sequential double ionisation in many-electron systems
Pullen, Michael G; Wang, Xu; Tong, Xiao-Min; Sclafani, Michele; Baudisch, Matthias; Pires, Hugo; Schröter, Claus Dieter; Ullrich, Joachim; Pfeifer, Thomas; Moshammer, Robert; Eberly, J H; Biegert, Jens
2016-01-01
Obtaining a detailed understanding of strong-field double ionisation of many-electron systems (heavy atoms and molecules) remains a challenging task. By comparing experimental and theoretical results in the mid-IR regime, we have unambiguously identified the transition from non-sequential (e,2e) to sequential double ionisation in Xe and shown that it occurs at an intensity below $10^{14}$ W cm$^{-2}$. In addition, our data demonstrate that ionisation from the Xe 5s orbital is decisive at low intensities. Moreover, using the acetylene molecule, we propose how sequential double ionisation in the mid-IR can be used to study molecular dynamics and fragmentation on unprecedented few-femtosecond timescales.
Fu, Rao; Liu, Lina; Guo, Yingna; Guo, Liping; Yang, Li
2014-02-28
A novel method for online monitoring racemization reaction of alanine (Ala) enantiomers was developed, by combining sequential sample injection and micellar electrokinetic chromatography (MEKC) technique. Various conditions were investigated to optimize the sequential injection, Ala derivatization and MEKC chiral separation of d-/l-Ala. High reproducibility of the sequential MEKC analysis was demonstrated by analyzing the standard Ala samples, with relative standard deviation values (n=20) of 1.35%, 1.98%, and 1.09% for peak height, peak area and migration time, respectively. Ala racemization was automatically monitored every 40s from the beginning to the end of the reaction, by simultaneous detection of the consumption of the substrate enantiomer and the formation of the product enantiomer. The Michaelis constants of the racemization reaction were obtained by the sequential MEKC method, and were in good agreement with those obtained by traditional off-line enzyme assay. Our study indicated that the present sequential MEKC method can perform fast, efficient, accurate and reproducible analysis of racemization reaction of amino acids, which is of great importance for the determination of the activity of racemase and thus understanding its metabolic functions.
Hasan, Mahmudul; Tycner, Christopher; Sigut, Aaron; Zavala, Robert T.
2017-01-01
We describe a modified bootstrap Monte Carlo method that was developed to assess quantitatively the impact of systematic residual errors on calibrated optical interferometry data from the Navy Precision Optical Interferometer. A variety of atmospheric and instrumental effects represent the sources of residual systematic errors that remain in the data after calibration, for example when atmospheric fluctuations occur on shorter time scales than the time between observations of calibrator-target pairs. The modified bootstrap Monte Carlo method retains the inherent structure of how the underlying data set was acquired by accounting for the fact that groups of data points are obtained simultaneously rather than individually. When telescope pairs (baselines) and spectral channels corresponding to a specific output beam from a beam combiner are treated as groups, this method provides more realistic (and typically larger) uncertainties on the fitted model parameters, such as angular diameters of resolved stars, than the standard method based solely on formal errors. This work has been supported by NSF grant AST-1614983.
An Approach to Characterizing the Complicated Sequential Metabolism of Salidroside in Rats
Directory of Open Access Journals (Sweden)
Zhiqiang Luo
2016-05-01
Full Text Available Metabolic study of bioactive compounds that undergo a dynamic and sequential process of metabolism is still a great challenge. Salidroside, one of the most active ingredients of Rhodiola crenulata, can be metabolized in different sites before being absorbed into the systemic blood stream. This study proposed an approach for describing the sequential biotransformation process of salidroside based on comparative analysis. In vitro incubation, in situ closed-loop and in vivo blood sampling were used to determine the relative contribution of each site to the total metabolism of salidroside. The results showed that salidroside was stable in digestive juice, and it was metabolized primarily by the liver and the intestinal flora and to a lesser extent by the gut wall. The sequential metabolism method described in this study could be a general approach to characterizing the metabolic routes in the digestive system for natural products.
Poage, J. L.
1975-01-01
A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
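The underlying Wald SPRT can be sketched in its textbook form with known densities; the paper's contribution is replacing these with nonparametric density estimates. The Gaussian two-class example and thresholds below are the standard setup, not the paper's estimated version:

```python
import math

def sprt(samples, pdf0, pdf1, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test: accumulate the log-likelihood
    ratio until it crosses one of Wald's two boundaries."""
    a = math.log(beta / (1 - alpha))   # accept H0 at or below this
    b = math.log((1 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log(pdf1(x) / pdf0(x))
        if llr >= b:
            return "H1", n
        if llr <= a:
            return "H0", n
    return "undecided", len(samples)

def gauss(mu):
    # Unit-variance Gaussian density with mean mu.
    return lambda x: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

# Hypothetical observations clustered near 1.0, so H1 (mean 1) wins early.
data = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3]
decision, n_used = sprt(data, gauss(0.0), gauss(1.0))
# → ("H1", 5): the upper boundary log(19) ≈ 2.94 is crossed at the fifth sample
```

The appeal of the sequential form is visible here: a decision is reached after five observations instead of a fixed, conservatively sized sample.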
Generalised sequential crossover of words and languages
Jeganathan, L; Sengupta, Ritabrata
2009-01-01
In this paper, we propose a new operation, Generalised Sequential Crossover (GSCO) of words, which is in some sense an abstract model of the crossing over of chromosomes in living organisms. We extend GSCO over a language $L$ iteratively ($GSCO^*(L)$), as well as iterated GSCO over two languages ($GSCO^*(L_1, L_2)$). Our study reveals that $GSCO^*(L)$ is a subclass of the regular languages for any $L$. We compare the different classes of GSCO languages with the prominent sub-regular classes.
Sequential cooling insert for turbine stator vane
Jones, Russell B; Krueger, Judson J; Plank, William L
2014-04-01
A sequential impingement cooling insert for a turbine stator vane that forms a double impingement for the pressure and suction sides of the vane or a triple impingement. The insert is formed from a sheet metal formed in a zigzag shape that forms a series of alternating impingement cooling channels with return air channels, where pressure side and suction side impingement cooling plates are secured over the zigzag shaped main piece. Another embodiment includes the insert formed from one or two blocks of material in which the impingement channels and return air channels are machined into each block.
Automatic differentiation for reduced sequential quadratic programming
Institute of Scientific and Technical Information of China (English)
Liao Liangcai; Li Jin; Tan Yuejin
2007-01-01
In order to solve large-scale nonlinear programming (NLP) problems efficiently, an optimization algorithm based on reduced sequential quadratic programming (rSQP) and automatic differentiation (AD) is presented in this paper. Exploiting sparseness, relatively low degrees of freedom, and equality constraints, the nonlinear programming problem is solved by an improved rSQP solver. In the solving process, AD technology is used to obtain accurate gradient information. The numerical results show that the combined algorithm, which is suitable for large-scale process optimization problems, calculates more efficiently than rSQP alone.
A Sequential Algorithm for Training Text Classifiers
Lewis, D D; Lewis, David D.; Gale, William A.
1994-01-01
The ability to cheaply train text classifiers is critical to their use in information retrieval, content analysis, natural language processing, and other tasks involving data which is partly or fully textual. An algorithm for sequential sampling during machine learning of statistical classifiers was developed and tested on a newswire text categorization task. This method, which we call uncertainty sampling, reduced by as much as 500-fold the amount of training data that would have to be manually classified to achieve a given level of effectiveness.
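The core of uncertainty sampling is querying the pool items whose predicted probability sits closest to the decision boundary. A minimal sketch, with a stand-in scorer rather than the paper's text classifier:

```python
def uncertainty_sample(pool, predict_proba, batch_size=2):
    """Select the unlabeled examples whose predicted class probability is
    closest to 0.5: the ones the current classifier is least sure about."""
    return sorted(pool, key=lambda x: abs(predict_proba(x) - 0.5))[:batch_size]

# Stand-in scorer: probability of the positive class grows with the feature.
proba = lambda x: min(1.0, max(0.0, 0.1 * x))
pool = [1, 3, 5, 7, 9]                    # probabilities 0.1 .. 0.9
picked = uncertainty_sample(pool, proba)
# → [5, 3]: items nearest the decision boundary are queried first
```

Labeling these boundary cases moves the decision surface the most per labeled example, which is where the reported 500-fold reduction in labeling effort comes from.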
Nonlinear sequential laminates reproducing hollow sphere assemblages
Idiart, Martín I.
2007-07-01
A special class of nonlinear porous materials with isotropic 'sequentially laminated' microstructures is found to reproduce exactly the hydrostatic behavior of 'hollow sphere assemblages'. It is then argued that this result supports the conjecture that Gurson's approximate criterion for plastic porous materials, and its viscoplastic extension of Leblond et al. (1994), may actually yield rigorous upper bounds for the hydrostatic flow stress of porous materials containing an isotropic, but otherwise arbitrary, distribution of porosity. To cite this article: M.I. Idiart, C. R. Mecanique 335 (2007).
Meyer, Michael T.; Lee, Edward A.; Scribner, Elisabeth A.
2007-01-01
An analytical method for the determination of isoxaflutole and its sequential degradation products, diketonitrile and a benzoic acid analogue, in filtered water with varying matrices was developed by the U.S. Geological Survey Organic Geochemistry Research Group in Lawrence, Kansas. Four different water-sample matrices fortified at 0.02 and 0.10 ug/L (micrograms per liter) are extracted by vacuum manifold solid-phase extraction and analyzed by liquid chromatography/tandem mass spectrometry using electrospray ionization in negative-ion mode with multiple-reaction monitoring (MRM). Analytical conditions for mass spectrometry detection are optimized, and quantitation is carried out using the following MRM molecular-hydrogen (precursor) ion and product (p) ion transition pairs: 357.9 (precursor), 78.9 (p), and 277.6 (p) for isoxaflutole and diketonitrile, and 267.0 (precursor), 159.0 (p), and 223.1 (p) for benzoic acid. 2,4-dichlorophenoxyacetic acid-d3 is used as the internal standard, and alachlor ethanesulfonic acid-d5 is used as the surrogate standard. Compound detection limits and reporting levels are calculated using U.S. Environmental Protection Agency procedures. The mean solid-phase extraction recovery values ranged from 104 to 108 percent with relative standard deviation percentages ranging from 4.0 to 10.6 percent. The combined mean percentage concentration normalized to the theoretical spiked concentration of four water matrices analyzed eight times at 0.02 and 0.10 ug/L (seven times for the reagent-water matrix at 0.02 ug/L) ranged from approximately 75 to 101 percent with relative standard deviation percentages ranging from approximately 3 to 26 percent for isoxaflutole, diketonitrile, and benzoic acid. The method detection limit (MDL) for isoxaflutole and diketonitrile is 0.003 ug/L and 0.004 ug/L for benzoic acid. Method reporting levels (MRLs) are 0.011, 0.010, and 0.012 ug/L for isoxaflutole, diketonitrile, and benzoic acid, respectively. On the basis
Sequential Antibiotic Therapy: Effective Cost Management and Patient Care
Directory of Open Access Journals (Sweden)
Lionel A Mandell
1995-01-01
Full Text Available The escalating costs associated with antimicrobial chemotherapy have become of increasing concern to physicians, pharmacists and patients alike. A number of strategies have been developed to address this problem. This article focuses specifically on sequential antibiotic therapy (SAT), the strategy of converting patients from intravenous to oral medication regardless of whether the same or a different class of drug is used. Advantages of SAT include economic benefits, patient benefits and benefits to the health care provider. Potential disadvantages are cost to the consumer and the risk of therapeutic failure. A critical review of the published literature shows that evidence from randomized controlled trials supports the role of SAT. However, it is also clear that further studies are necessary to determine the optimal time for intravenous-to-oral changeover and to identify the variables that may interfere with the use of oral drugs. Procedures necessary for the implementation of a SAT program in the hospital setting are also discussed.
Sequential evidence accumulation in decision making
Directory of Open Access Journals (Sweden)
Daniel Hausmann
2008-03-01
Full Text Available Judgments and decisions under uncertainty are frequently linked to a prior sequential search for relevant information. In such cases, the subject has to decide when to stop the search for information. Evidence accumulation models from social and cognitive psychology assume an active and sequential information search until enough evidence has been accumulated to pass a decision threshold. In line with such theories, we conceptualize the evidence threshold as the "desired level of confidence" (DLC) of a person. This model is tested against a fixed stopping rule (one-reason decision making) and against the class of multi-attribute information integrating models. A series of experiments using an information board for horse race betting demonstrates an advantage of the proposed model by measuring the individual DLC of each subject and confirming its correctness in two separate stages. In addition to a better understanding of the stopping rule (within the narrow framework of simple heuristics), the results indicate that individual aspiration levels might be a relevant factor when modelling decision making by task analysis of statistical environments.
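The DLC stopping rule can be sketched as log-odds evidence accumulation with a confidence threshold; the cue values and the Bayesian update below are illustrative assumptions, not the paper's experimental design:

```python
import math

def accumulate_until_confident(cue_log_odds, dlc=0.8):
    """Add up cue evidence (log-odds for option A over option B) and stop as
    soon as the implied confidence reaches the desired level of confidence."""
    logit, p = 0.0, 0.5
    for n, lo in enumerate(cue_log_odds, 1):
        logit += lo
        p = 1.0 / (1.0 + math.exp(-logit))   # current confidence in A
        if p >= dlc:
            return "A", n, p
        if p <= 1.0 - dlc:
            return "B", n, p
    # Cues exhausted before the threshold was reached: pick the leader.
    return ("A" if logit >= 0 else "B"), len(cue_log_odds), p

# Hypothetical cues mostly favouring A; the search stops at the third cue.
choice, n_cues, conf_p = accumulate_until_confident([0.6, 0.4, 0.7, -0.2])
# → ("A", 3, ≈0.85)
```

A higher individual DLC means more cues are inspected before choosing, which is exactly the between-subject variation the experiments measure.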
Sensitivity validation technique for sequential kriging metamodel
Energy Technology Data Exchange (ETDEWEB)
Huh, Seung Kyun; Lee, Jin Min; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of)
2012-08-15
Metamodels have been developed with a variety of design optimization techniques in the field of structural engineering over the last decade because they are efficient, show excellent prediction performance, and provide easy interconnection with design frameworks. To construct a metamodel, a sequential procedure involving steps such as the design of experiments, metamodeling techniques, and validation techniques is performed. Because validation techniques measure the accuracy of the metamodel, the number of presampled points for an accurate kriging metamodel is decided by the validation technique in the sequential kriging metamodel. Because an interpolation model such as the kriging metamodel based on computer experiments passes through the responses at presampled points, additional analyses or reconstructions of the metamodel are required to measure its accuracy if existing validation techniques are applied. In this study, we suggest a sensitivity validation that does not require additional analyses or reconstructions of the metamodel. Fourteen two-dimensional mathematical problems and an engineering problem are illustrated to show the feasibility of the suggested method.
Modifications of sequential designs in bioequivalence trials.
Zheng, Cheng; Zhao, Lihui; Wang, Jixian
2015-01-01
Bioequivalence (BE) studies are designed to show that two formulations of one drug are equivalent, and they play an important role in drug development. At the design stage, there may be a high degree of uncertainty about the variability of the formulations and the actual performance of the test versus the reference formulation. An interim look may therefore be desirable to stop the study if there is no chance of claiming BE at the end (futility), to claim BE if evidence is sufficient (efficacy), or to adjust the sample size. Sequential design approaches specifically for BE studies have previously been proposed in the literature. We apply modifications to the existing methods, focusing on simplified multiplicity adjustment and futility stopping. We name our method modified sequential design for BE studies (MSDBE). Simulation results demonstrate comparable performance between MSDBE and the original published methods, while MSDBE offers more transparency and better applicability. The R package MSDBE is available at https://sites.google.com/site/modsdbe/.
Information Geometry and Sequential Monte Carlo
Sim, Aaron; Stumpf, Michael P H
2012-01-01
This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...
Sequential algorithm for fast clique percolation.
Kumpula, Jussi M; Kivelä, Mikko; Kaski, Kimmo; Saramäki, Jari
2008-08-01
In complex network research, clique percolation, introduced by Palla, Derényi, and Vicsek [Nature (London) 435, 814 (2005)], is a deterministic community detection method which allows for overlapping communities and is based purely on local topological properties of a network. Here we present a sequential clique percolation algorithm (SCP) for fast community detection in weighted and unweighted networks, for cliques of a chosen size. The method is based on sequentially inserting the constituent links into the network while keeping track of the emerging community structure. Unlike existing algorithms, the SCP method allows for detecting k-clique communities at multiple weight thresholds in a single run, and can simultaneously produce a dendrogram representation of the hierarchical community structure. In sparse weighted networks, the SCP algorithm can also be used to implement the weighted clique percolation method recently introduced by Farkas [New J. Phys. 9, 180 (2007)]. The computational time of the SCP algorithm scales linearly with the number of k-cliques in the network. As an example, the method is applied to a product association network, revealing its nested community structure.
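Plain (non-sequential) k-clique percolation for k = 3 can be sketched as follows: triangles that share an edge belong to the same community. The SCP paper's contribution is doing this incrementally as links are inserted, which this brute-force sketch does not attempt; the example graph is made up:

```python
from itertools import combinations

def triangle_communities(edges):
    """k-clique percolation for k = 3: find all triangles, then merge
    triangles that share an edge (two nodes) into one community."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    triangles = [frozenset(t) for t in combinations(adj, 3)
                 if all(b in adj[a] for a, b in combinations(t, 2))]
    # Union-find over triangles, merging those that share an edge.
    parent = {t: t for t in triangles}
    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]
            t = parent[t]
        return t
    for t1, t2 in combinations(triangles, 2):
        if len(t1 & t2) == 2:
            parent[find(t1)] = find(t2)
    comms = {}
    for t in triangles:
        comms.setdefault(find(t), set()).update(t)
    return list(comms.values())

# Two triangles joined only by the single bridge edge (3, 4): the bridge
# carries no triangle, so two separate communities result.
edges = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6)]
comms = triangle_communities(edges)
# → two communities: {1, 2, 3} and {4, 5, 6}
```

SCP obtains the same communities without enumerating all node triples, by updating the union-find structure as each link arrives.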
G-sequentially connectedness for topological groups with operations
Mucuk, Osman; Cakalli, Huseyin
2016-08-01
It is a well-known fact that for a Hausdorff topological group X, the limits of convergent sequences in X define a function denoted by lim from the set of all convergent sequences in X to X. This notion has been modified by Connor and Grosse-Erdmann for real functions by replacing lim with an arbitrary linear functional G defined on a linear subspace of the vector space of all real sequences. Recently some authors have extended the concept to the topological group setting and introduced the concepts of G-sequential continuity, G-sequential compactness and G-sequential connectedness. In this work, we present some results about G-sequentially closures, G-sequentially connectedness and fundamental system of G-sequentially open neighbourhoods for topological group with operations which include topological groups, topological rings without identity, R-modules, Lie algebras, Jordan algebras, and many others.
Institute of Scientific and Technical Information of China (English)
徐红鹃; 王锋; 艾宪芸; 邵晖; 魏星
2014-01-01
The characteristics of airborne gamma-ray spectra and the basic principles of the bootstrap method are described in this paper. The 40K-window gamma spectrum data measured by the GR-820 airborne multichannel gamma spectrometer at a height of 300 m above the ground are taken as a small sample. The bootstrap method is used to infer the distribution parameters of this small sample and to resample it to obtain calculated spectra. The calculated spectra are compared with spectra measured at a height of 60 m above the ground, and satisfactory agreement is obtained. To some extent this overcomes the hardware limitations of the spectrometer, and it provides a feasible technique for rapid airborne gamma-spectrum measurement at low activity levels.
Method of sequential mesh on Koopman-Darmois distributions
Institute of Scientific and Technical Information of China (English)
(no author listed)
2010-01-01
For costly and/or destructive tests, a sequential method with a proper maximum sample size is needed. Based on Koopman-Darmois distributions, this paper proposes the method of sequential mesh, which has an acceptable maximum sample size. In comparison with the popular truncated sequential probability ratio test, our method has the advantage of a smaller maximum sample size and is especially applicable to costly and/or destructive tests.
Wiegmann, Daniel D; Seubert, Steven M; Wade, Gordon A
2010-02-21
The behavior of a female in search of a mate determines the likelihood that she encounters a high-quality male in the search process. The fixed sample (best-of-n) search strategy and the sequential search (fixed threshold) strategy are two prominent models of search behavior. The sequential search strategy dominates the former strategy (it yields an equal or higher expected net fitness return to searchers) when search costs are nontrivial and the distribution of quality among prospective mates is uniform or truncated normal. In this paper, our objective is to determine whether there are any search costs or distributions of male quality for which the sequential search strategy is inferior to the fixed sample search strategy. The two search strategies are derived under general conditions in which females evaluate encountered males by inspection of an indicator character that has some functional relationship to male quality. The solutions are identical to the original models when the inspected male attribute is itself male quality. The sequential search strategy is shown to dominate the fixed sample search strategy for all search costs and distributions of male quality. Low search costs have been invoked to explain empirical observations that are consistent with the use of a fixed sample search strategy, but under the conditions in which the original models were derived there is no search cost or distribution of male quality that favors the fixed sample search strategy. Plausible alternative explanations for the apparent use of this search strategy are discussed.
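A small simulation contrasting the two strategies under uniform quality and a nontrivial search cost illustrates the dominance result. The reservation threshold uses the standard search-theory indifference condition; all numbers (cost, n = 3, trial count) are illustrative assumptions:

```python
import random

def best_of_n(rng, n, cost, draw):
    """Fixed sample (best-of-n) search: inspect n candidates, keep the best."""
    return max(draw(rng) for _ in range(n)) - n * cost

def sequential(rng, threshold, cost, draw, max_tries=10_000):
    """Sequential (fixed threshold) search: accept the first candidate whose
    quality is at or above the reservation threshold."""
    q = 0.0
    for n in range(1, max_tries + 1):
        q = draw(rng)
        if q >= threshold:
            break
    return q - n * cost

# Uniform(0, 1) quality; the threshold t solves the indifference condition
# (1 - t)^2 / 2 = c, so one more draw exactly pays for itself.
cost = 0.02
threshold = 1 - (2 * cost) ** 0.5          # = 0.8 here
uniform = lambda rng: rng.random()
rng = random.Random(42)
trials = 4000
seq = sum(sequential(rng, threshold, cost, uniform) for _ in range(trials)) / trials
fix = sum(best_of_n(rng, 3, cost, uniform) for _ in range(trials)) / trials
# seq averages near 0.80 while best-of-3 averages near 0.75 - 3*0.02 = 0.69
```

The sequential searcher stops paying search costs as soon as an acceptable male is found, while the fixed-sample searcher pays for all n inspections regardless, which is the intuition behind the dominance proof.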
STATE OF THE ART - MODERN SEQUENTIAL RULE MINING TECHNIQUES
Directory of Open Access Journals (Sweden)
Anjali Paliwal
2015-10-01
Full Text Available This paper is a state of the art of existing sequential rule mining algorithms. Extracting sequential rules is a very popular and computationally expensive task. We also explain the fundamentals of sequential rule mining and describe today's approaches to it. From the broad variety of efficient algorithms that have been developed, we compare the most important ones. We systematize the algorithms and analyze their performance based on both their run time performance and theoretical considerations. Their strengths and weaknesses are also investigated.
Reynaud-Bouret, Patricia; Laurent, Béatrice
2012-01-01
Considering two independent Poisson processes, we address the question of testing the equality of their respective intensities. We construct multiple testing procedures from the aggregation of single tests whose testing statistics come from model selection, thresholding and/or kernel estimation methods. The corresponding critical values are computed through a non-asymptotic wild bootstrap approach. The obtained tests are proved to be exactly of level $\alpha$ and to satisfy non-asymptotic oracle-type inequalities. From these oracle-type inequalities, we deduce that our tests are adaptive in the minimax sense over a large variety of classes of alternatives based on classical and weak Besov bodies in the univariate case, as well as Sobolev and anisotropic Nikol'skii-Besov balls in the multivariate case. A simulation study furthermore shows that they perform strongly in practice.
Hanson, Sonya M.; Ekins, Sean; Chodera, John D.
2015-12-01
All experimental assay data contains error, but the magnitude, type, and primary origin of this error is often not obvious. Here, we describe a simple set of assay modeling techniques based on the bootstrap principle that allow sources of error and bias to be simulated and propagated into assay results. We demonstrate how deceptively simple operations—such as the creation of a dilution series with a robotic liquid handler—can significantly amplify imprecision and even contribute substantially to bias. To illustrate these techniques, we review an example of how the choice of dispensing technology can impact assay measurements, and show how large contributions to discrepancies between assays can be easily understood and potentially corrected for. These simple modeling techniques—illustrated with an accompanying IPython notebook—can allow modelers to understand the expected error and bias in experimental datasets, and even help experimentalists design assays to more effectively reach accuracy and imprecision goals.
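The bootstrap-style error-propagation idea can be sketched by simulating a serial dilution with per-transfer imprecision; the 5% CV, the 1:2 scheme, and the step count are illustrative assumptions, not measured values:

```python
import random
import statistics

def simulate_dilution(rng, n_steps, transfer_cv=0.05):
    """Propagate per-transfer pipetting imprecision through a serial 1:2
    dilution; each step multiplies the concentration by a noisy factor."""
    conc = 1.0
    for _ in range(n_steps):
        conc *= 0.5 * (1.0 + rng.gauss(0.0, transfer_cv))
    return conc

rng = random.Random(7)
final = [simulate_dilution(rng, n_steps=8) for _ in range(2000)]
nominal = 0.5 ** 8
spread = statistics.stdev(final) / statistics.mean(final)
# Relative spread after 8 steps grows to roughly sqrt(8) * 5% ≈ 14%:
# imprecision accumulates far beyond the 5% of any single transfer.
```

Running many such simulated realizations of the assay protocol, rather than assuming independent errors at each concentration, is the bootstrap-principle step the abstract describes.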
DEFF Research Database (Denmark)
Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.
2012-01-01
A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable to all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach can be applied to Multiple Factor Analysis results regardless of the origin of the data and the nature of the original variables. The approach is suitable for getting an overview of product confidence intervals and is also applicable to data obtained from 'one repetition' evaluations. Furthermore, it is a convenient way to get an overview of variations in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results.
Alonso-Prieto, Esther; Pancaroglu, Raika; Dalrymple, Kirsten A; Handy, Todd; Barton, Jason J S; Oruc, Ipek
2015-01-01
Prior event-related potential studies using group statistics within a priori selected time windows have yielded conflicting results about familiarity effects in face processing. Our goal was to evaluate the temporal dynamics of the familiarity effect at all time points at the single-subject level. Ten subjects were shown faces of anonymous people or celebrities. Individual results were analysed using a point-by-point bootstrap analysis. While familiarity effects were less consistent at later epochs, all subjects showed them between 130 and 195 ms in occipitotemporal electrodes. However, the relation between the time course of familiarity effects and the peak latency of the N170 was variable. We concluded that familiarity effects between 130 and 195 ms are robust and can be shown in single subjects. The variability of their relation to the timing of the N170 potential may lead to underestimation of familiarity effects in studies that use group-based statistics.
Indian Academy of Sciences (India)
SHATAKSHEE CHATTERJEE; PARTHA P. MAJUMDER; PRIYANKA PANDEY
2016-09-01
Study of the temporal trajectory of gene expression is important. RNA sequencing is popular in genome-scale studies of transcription. Because of the high expenses involved, many time-course RNA sequencing studies are challenged by inadequacy of sample sizes. This poses difficulties in conducting formal statistical tests of significance of null hypotheses. We propose a bootstrap algorithm to identify ‘cognizable’ ‘time-trends’ of gene expression. Properties of the proposed algorithm are derived using a simulation study. The proposed algorithm captured known ‘time-trends’ in the simulated data with a high probability of success, even when sample sizes were small (n < 10). The proposed statistical method is efficient and robust in capturing ‘cognizable’ ‘time-trends’ in RNA sequencing data.
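One simple way to realize such a resampling test of a time-trend with few replicates (a sketch of the general idea, not the authors' exact algorithm) is to compare the observed slope of mean expression over time against a null distribution obtained by shuffling the time labels within each replicate:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_trend_pvalue(expr, n_boot=2000, rng=rng):
    """
    expr: 2-D array, rows = replicates (small n), cols = ordered time points.
    Statistic: slope of mean expression vs. time. Null distribution: time
    labels permuted within each replicate, destroying any genuine trend.
    (Illustrative statistic; the published algorithm may differ in detail.)
    """
    n_rep, n_time = expr.shape
    t = np.arange(n_time)
    def slope(mat):
        return np.polyfit(t, mat.mean(axis=0), 1)[0]
    observed = slope(expr)
    null = np.empty(n_boot)
    for b in range(n_boot):
        shuffled = np.array([row[rng.permutation(n_time)] for row in expr])
        null[b] = slope(shuffled)
    # two-sided p-value with the +1 continuity correction
    return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_boot + 1)

# Toy data: 4 replicates, 5 time points; one gene trends upward, one is flat.
trend_gene = np.array([[1, 2, 3, 4, 5]] * 4) + rng.normal(0, 0.3, (4, 5))
flat_gene = rng.normal(3, 0.3, (4, 5))

p_trend = bootstrap_trend_pvalue(trend_gene)
p_flat = bootstrap_trend_pvalue(flat_gene)
print(p_trend, p_flat)
```

Even with only four replicates, the trending gene produces a small p-value because the permutation null has no systematic slope.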
Pang, Yi; Rong, Junchen; Su, Ning
2016-12-01
We consider ϕ³ theory in 6 − 2ε dimensions with F4 global symmetry. The beta function is calculated up to 3 loops, and a stable unitary IR fixed point is observed. The anomalous dimensions of operators quadratic or cubic in ϕ are also computed. We then employ the conformal bootstrap technique to study the fixed point predicted by the perturbative approach. For each putative scaling dimension of ϕ (Δϕ), we obtain the corresponding upper bound on the scaling dimension of the second-lowest scalar primary in the 26 representation (Δ26^(2nd)) which appears in the OPE of ϕ × ϕ. In D = 5.95, we observe a sharp peak on the upper bound curve located at Δϕ equal to the value predicted by the 3-loop computation. In D = 5, we observe a weak kink on the upper bound curve at (Δϕ, Δ26^(2nd)) = (1.6, 4).
Tailored sequential drug release from bilayered calcium sulfate composites
Energy Technology Data Exchange (ETDEWEB)
Orellana, Bryan R.; Puleo, David A., E-mail: puleo@uky.edu
2014-10-01
The current standard for treating infected bony defects, such as those caused by periodontal disease, requires multiple time-consuming steps and often multiple procedures to fight the infection and recover lost tissue. Releasing an antibiotic followed by an osteogenic agent from a synthetic bone graft substitute could allow for a streamlined treatment, reducing the need for multiple surgeries and thereby shortening recovery time. Tailorable bilayered calcium sulfate (CS) bone graft substitutes were developed with the ability to sequentially release multiple therapeutic agents. Bilayered composite samples having a shell and core geometry were fabricated with varying amounts (1 or 10 wt.%) of metronidazole-loaded poly(lactic-co-glycolic acid) (PLGA) particles embedded in the shell and simvastatin directly loaded into either the shell, core, or both. Microcomputed tomography showed the overall layered geometry as well as the uniform distribution of PLGA within the shells. Dissolution studies demonstrated that the amount of PLGA particles (i.e., 1 vs. 10 wt.%) had a small but significant effect on the erosion rate (3% vs. 3.4%/d). Mechanical testing determined that introducing a layered geometry had a significant effect on the compressive strength, with an average reduction of 35%, but properties were comparable to those of mandibular trabecular bone. Sustained release of simvastatin directly loaded into CS demonstrated that changing the shell to core volume ratio dictates the duration of drug release from each layer. When loaded together in the shell or in separate layers, sequential release of metronidazole and simvastatin was achieved. By introducing a tunable, layered geometry capable of releasing multiple drugs, CS-based bone graft substitutes could be tailored in order to help streamline the multiple steps needed to regenerate tissue in infected defects. - Highlights: • Bilayered CS composites were fabricated as potential bone graft substitutes. • The shell
Irurtzun, Aritz
2015-01-01
In recent research, Boeckx and Benítez-Burraco (2014a,b) have advanced the hypothesis that our species-specific language-ready brain should be understood as the outcome of developmental changes that occurred in our species after the split from Neanderthals-Denisovans, which resulted in a more globular braincase configuration in comparison to our closest relatives, who had elongated endocasts. According to these authors, the development of a globular brain is an essential ingredient for the language faculty and in particular, it is the centrality occupied by the thalamus in a globular brain that allows its modulatory or regulatory role, essential for syntactico-semantic computations. Their hypothesis is that the syntactico-semantic capacities arise in humans as a consequence of a process of globularization, which significantly takes place postnatally (cf. Neubauer et al., 2010). In this paper, I show that Boeckx and Benítez-Burraco's hypothesis makes an interesting developmental prediction regarding the path of language acquisition: it teases apart the onset of phonological acquisition and the onset of syntactic acquisition (the latter starting significantly later, after globularization). I argue that this hypothesis provides a developmental rationale for the prosodic bootstrapping hypothesis of language acquisition (cf. i.a. Gleitman and Wanner, 1982; Mehler et al., 1988, et seq.; Gervain and Werker, 2013), which claims that prosodic cues are employed for syntactic parsing. The literature converges on the observation that a large number of such prosodic cues (in particular, rhythmic cues) are already acquired before the completion of the globularization phase, which paves the way for the premises of the prosodic bootstrapping hypothesis, allowing babies to have a rich knowledge of the prosody of their target language before they can start parsing the primary linguistic data syntactically.
Directory of Open Access Journals (Sweden)
Aritz Irurtzun
2015-12-01
Full Text Available In recent research, Boeckx & Benítez-Burraco (2014a,b) have advanced the hypothesis that our species-specific language-ready brain should be understood as the outcome of developmental changes that occurred in our species after the split from Neanderthals-Denisovans, which resulted in a more globular braincase configuration in comparison to our closest relatives, who had elongated endocasts. According to these authors, the development of a globular brain is an essential ingredient for the language faculty and in particular, it is the centrality occupied by the thalamus in a globular brain that allows its modulatory or regulatory role, essential for syntactico-semantic computations. Their hypothesis is that the syntactico-semantic capacities arise in humans as a consequence of a process of globularization, which significantly takes place postnatally (cf. Neubauer et al., 2010). In this paper, I show that Boeckx & Benítez-Burraco's hypothesis makes an interesting developmental prediction regarding the path of language acquisition: it teases apart the onset of phonological acquisition and the onset of syntactic acquisition (the latter starting significantly later, after globularization). I argue that this hypothesis provides a developmental rationale for the prosodic bootstrapping hypothesis of language acquisition (cf. i.a. Gleitman & Wanner, 1982; Mehler et al., 1988, et seq.; Gervain & Werker, 2013), which claims that prosodic cues are employed for syntactic parsing. The literature converges on the observation that a large number of such prosodic cues (in particular, rhythmic cues) are already acquired before the completion of the globularization phase, which paves the way for the premises of the prosodic bootstrapping hypothesis, allowing babies to have a rich knowledge of the prosody of their target language before they can start parsing the primary linguistic data syntactically.
Mechanistic studies on a sequential PDT protocol
Kessel, David
2016-03-01
A low (~LD15) PDT dose resulting in selective lysosomal photodamage can markedly promote photokilling by subsequent photodamage targeted to mitochondria. Experimental data are consistent with the proposal that cleavage of the autophagy-associated protein ATG5 to a pro-apoptotic fragment is responsible for this effect. This process is known to be dependent on the proteolytic activity of calpain. We have proposed that Ca2+ released from photodamaged lysosomes is the trigger for ATG5 cleavage. We can now document the conversion of ATG5 to the truncated form after lysosomal photodamage. Photofrin, a photosensitizer that targets both mitochondria and lysosomes, can be used for either phase of the sequential PDT process. The ability of Photofrin to target both loci may explain the well-documented efficacy of this agent.
Lipid peroxidation in experimental uveitis: sequential studies.
Goto, H; Wu, G S; Chen, F; Kristeva, M; Sevanian, A; Rao, N A
1992-06-01
Previously we detected the occurrence of retinal lipid peroxidation initiated by phagocyte-derived oxygen radicals in experimental autoimmune uveitis (EAU). In the current studies, the confirmation of inflammation-mediated lipid peroxidation was extended to include measurement of multiple parameters, including conjugated dienes, ketodienes, thiobarbituric acid reactive substances, and fluorescent chromolipids. An assay for myeloperoxidase, a measure of the number of polymorphonuclear leukocytes in the inflammatory sites, was also carried out. The levels of all these parameters were followed through the course of EAU development. A sequential evaluation of histologic changes using both light and electron microscopy was also carried out, and the results were correlated with lipid peroxidation indices. These data suggest that retinal lipid peroxidation plays a causative role in the subsequent retinal degeneration.
Dihydroazulene photoswitch operating in sequential tunneling regime
DEFF Research Database (Denmark)
Broman, Søren Lindbæk; Lara-Avila, Samuel; Thisted, Christine Lindbjerg;
2012-01-01
Molecular switches play a central role for the development of molecular electronics. In this work it is demonstrated that the reproducibility and robustness of a single-molecule dihydroazulene (DHA)/vinylheptafulvene (VHF) switch can be remarkably enhanced if the switching kernel is weakly coupled...... previously reported bromination-elimination-cross-coupling protocol for functionalization of the DHA core. For all new derivatives the kinetics of DHA/VHF transition has been thoroughly studied in solution. The kinetics reveals the effect of sulfur end-groups on the thermal ring-closure of VHF. One derivative......, incorporating a p-MeSC6H4 anchoring group in one end, has been placed in a silver nanogap. Conductance measurements justify that transport through both DHA (high resistivity) and VHF (low resistivity) forms goes by sequential tunneling. The switching is fairly reversible and reenterable; after more than 20 ON...
Sequential pattern formation governed by signaling gradients
Jörg, David J.; Oates, Andrew C.; Jülicher, Frank
2016-10-01
Rhythmic and sequential segmentation of the embryonic body plan is a vital developmental patterning process in all vertebrate species. However, a theoretical framework capturing the emergence of dynamic patterns of gene expression from the interplay of cell oscillations with tissue elongation and shortening and with signaling gradients is still missing. Here we show that a set of coupled genetic oscillators in an elongating tissue that is regulated by diffusing and advected signaling molecules can account for segmentation as a self-organized patterning process. This system can form a finite number of segments and the dynamics of segmentation and the total number of segments formed depend strongly on kinetic parameters describing tissue elongation and signaling molecules. The model accounts for existing experimental perturbations to signaling gradients, and makes testable predictions about novel perturbations. The variety of different patterns formed in our model can account for the variability of segmentation between different animal species.
Compressed Inference for Probabilistic Sequential Models
Polatkan, Gungor
2012-01-01
Hidden Markov models (HMMs) and conditional random fields (CRFs) are two popular techniques for modeling sequential data. Inference algorithms designed over CRFs and HMMs allow estimation of the state sequence given the observations. In several applications, estimation of the state sequence is not the end goal; instead the goal is to compute some function of it. In such scenarios, estimating the state sequence by conventional inference techniques, followed by computing the functional mapping from the estimate is not necessarily optimal. A more formal approach is to directly infer the final outcome from the observations. In particular, we consider the specific instantiation of the problem where the goal is to find the state trajectories without exact transition points and derive a novel polynomial time inference algorithm that outperforms vanilla inference techniques. We show that this particular problem arises commonly in many disparate applications and present experiments on three of them: (1) Toy robot trac...
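The distinction this abstract draws can be made concrete with a toy HMM: when the quantity of interest is a function of the state sequence, such as the total time spent in a given state, computing its posterior expectation directly from forward-backward marginals is not the same as plugging in the Viterbi path. All parameters below are invented for illustration:

```python
import numpy as np

# Toy 2-state HMM (all parameter values are illustrative assumptions).
pi = np.array([0.6, 0.4])                  # initial state distribution
A = np.array([[0.9, 0.1], [0.2, 0.8]])     # transition matrix
B = np.array([[0.7, 0.3], [0.3, 0.7]])     # emission: P(obs | state), obs in {0,1}

obs = [0, 0, 1, 1, 1, 0, 1, 1]

def forward_backward(obs):
    """Posterior state marginals gamma[t, k] = P(state_t = k | obs)."""
    T = len(obs)
    alpha = np.zeros((T, 2)); beta = np.zeros((T, 2))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def viterbi(obs):
    """Most probable state path (MAP state sequence)."""
    T = len(obs)
    delta = np.log(pi * B[:, obs[0]])
    back = np.zeros((T, 2), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)   # scores[i, j]: from state i to j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

gamma = forward_backward(obs)
# Function of the state sequence: time spent in state 1.
expected_occupancy = gamma[:, 1].sum()   # direct posterior expectation
viterbi_occupancy = sum(viterbi(obs))    # plug-in estimate from the MAP path
print(expected_occupancy, viterbi_occupancy)
```

The plug-in estimate is an integer read off a single path, while the direct posterior expectation averages over all paths; inferring the target quantity directly, as the abstract advocates, avoids committing to one state trajectory.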
Sequential pattern formation governed by signaling gradients
Jörg, David J; Jülicher, Frank
2016-01-01
Rhythmic and sequential segmentation of the embryonic body plan is a vital developmental patterning process in all vertebrate species. However, a theoretical framework capturing the emergence of dynamic patterns of gene expression from the interplay of cell oscillations with tissue elongation and shortening and with signaling gradients, is still missing. Here we show that a set of coupled genetic oscillators in an elongating tissue that is regulated by diffusing and advected signaling molecules can account for segmentation as a self-organized patterning process. This system can form a finite number of segments and the dynamics of segmentation and the total number of segments formed depend strongly on kinetic parameters describing tissue elongation and signaling molecules. The model accounts for existing experimental perturbations to signaling gradients, and makes testable predictions about novel perturbations. The variety of different patterns formed in our model can account for the variability of segmentatio...
Steganography Based on Baseline Sequential JPEG Compression
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Information hiding in Joint Photographic Experts Group (JPEG) compressed images is investigated in this paper. Quantization is the source of information loss in the JPEG compression process. Therefore, information hidden in images is probably destroyed by JPEG compression. This paper presents an algorithm to reliably embed information into JPEG bit streams in the process of JPEG encoding. Information extraction is performed in the process of JPEG decoding. The basic idea of our algorithm is to modify the quantized direct current (DC) coefficients and non-zero alternating current (AC) coefficients to represent one bit of information (0 or 1). Experimental results on gray images using baseline sequential JPEG encoding show that the cover images (images without secret information) and the stego-images (images with secret information) are perceptually indiscernible.
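A toy version of the embedding idea (the parity of the quantized DC coefficient and of non-zero AC coefficients carries the hidden bits) can be sketched directly on a block of quantized coefficients. This is a simplified illustration of the general parity-coding scheme, not the paper's exact embedding rule:

```python
import numpy as np

def embed_bits(coeffs, bits):
    """
    Hide one bit per usable quantized coefficient (the DC coefficient plus
    non-zero AC coefficients) by forcing its parity to match the bit.
    'coeffs' is a flat array of quantized DCT coefficients for one block
    (zig-zag order assumed). Simplified sketch, not the authors' exact rule.
    """
    out = coeffs.copy()
    positions = [0] + [i for i in range(1, len(coeffs)) if coeffs[i] != 0]
    assert len(bits) <= len(positions), "not enough capacity in this block"
    for bit, i in zip(bits, positions):
        if out[i] % 2 != bit:
            # nudge toward zero magnitude, unless that would zero out an AC
            # coefficient (which would make it unreadable at extraction)
            if abs(out[i]) > 1 or i == 0:
                out[i] -= np.sign(out[i]) if out[i] != 0 else -1
            else:
                out[i] += np.sign(out[i])
    return out

def extract_bits(coeffs, n_bits):
    """Read the hidden bits back as coefficient parities."""
    positions = [0] + [i for i in range(1, len(coeffs)) if coeffs[i] != 0]
    return [int(coeffs[i] % 2) for i in positions[:n_bits]]

# One block of (made-up) quantized coefficients in zig-zag order.
block = np.array([-26, 3, 0, -2, 1, 0, 0, 5, 0, 0, 0, 0, 0, 0, 0, 0])
secret = [1, 0, 1, 1, 0]
stego = embed_bits(block, secret)
print(extract_bits(stego, len(secret)))  # → [1, 0, 1, 1, 0]
```

Because the embedding never turns a non-zero AC coefficient into zero, the set of usable positions is identical at embedding and extraction time, which is what makes the scheme reliable after entropy coding.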
Isoscaling in Statistical Sequential Decay Model
Institute of Scientific and Technical Information of China (English)
TIAN Wen-Dong; SU Qian-Min; WANG Hong-Wei; WANG Kun; YAN Ting-Zhi; MA Yu-Gang; CAI Xiang-Zhou; FANG De-Qing; GUO Wei; MA Chun-Wang; LIU Gui-Hua; SHEN Wen-Qing; SHI Yu
2007-01-01
A sequential decay model is used to study isoscaling, i.e., the factorization of the isotope ratios from sources of different isospins and sizes, over a broad range of excitation energies, into fugacity terms of proton and neutron number: R21(N, Z) = Y2(N, Z)/Y1(N, Z) = C exp(αN + βZ). It is found that the isoscaling parameters α and β have a strong dependence on the isospin difference of the equilibrated source and on the excitation energy; no significant influence of the source size on α and β has been observed. It is found that α and β decrease with the excitation energy and are linear functions of 1/T and Δ(Z/A)² or Δ(N/A)² of the sources. The symmetry energy coefficient Csym is constrained from the relationships between α and the source Δ(Z/A)² and between β and the source Δ(N/A)².
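The isoscaling relation R21(N, Z) = C exp(αN + βZ) is linear in N and Z after taking logarithms, so α and β can be recovered from measured yield ratios with an ordinary least-squares fit. A sketch on synthetic data (all yields and parameter values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic isotope yields from two "sources" obeying the isoscaling form
# R21(N, Z) = Y2/Y1 = C * exp(alpha*N + beta*Z); parameter values invented.
alpha_true, beta_true, C_true = 0.35, -0.25, 1.1
N, Z = np.meshgrid(np.arange(1, 9), np.arange(1, 7))
N, Z = N.ravel(), Z.ravel()

Y1 = np.exp(-(0.1 * N + 0.12 * Z))                  # arbitrary source-1 yields
Y2 = Y1 * C_true * np.exp(alpha_true * N + beta_true * Z)
R21 = (Y2 / Y1) * rng.normal(1.0, 0.01, N.size)     # 1% measurement noise

# Log-linear least-squares fit: ln R21 = ln C + alpha*N + beta*Z
X = np.column_stack([np.ones_like(N), N, Z])
coef, *_ = np.linalg.lstsq(X, np.log(R21), rcond=None)
lnC, alpha_fit, beta_fit = coef
print(f"alpha = {alpha_fit:.3f}, beta = {beta_fit:.3f}, C = {np.exp(lnC):.3f}")
```

The fitted α and β recover the input values to within the noise level; in the paper's analysis it is the excitation-energy and isospin dependence of these fitted parameters that constrains Csym.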
Sequential cooling insert for turbine stator vane
Energy Technology Data Exchange (ETDEWEB)
Jones, Russel B
2017-04-04
A sequential flow cooling insert for a turbine stator vane of a small gas turbine engine, where the impingement cooling insert is formed as a single piece from a metal additive manufacturing process such as 3D metal printing, and where the insert includes a plurality of rows of radial extending impingement cooling air holes alternating with rows of radial extending return air holes on a pressure side wall, and where the insert includes a plurality of rows of chordwise extending second impingement cooling air holes on a suction side wall. The insert includes alternating rows of radial extending cooling air supply channels and return air channels that form a series of impingement cooling on the pressure side followed by the suction side of the insert.
Sequential hadronization and the opportunities it presents
Bellwied, R.
2016-08-01
Continuum extrapolated lattice QCD calculations of quantum number specific susceptibilities and the most recent RHIC and LHC data on produced particle yields, as well as their higher moment fluctuations, can be interpreted using a scenario of sequential flavor dependent hadronization during the QCD crossover transition. I will present the latest data from lattice QCD and experiment and confront the question whether the separation of strangeness and light quark chemical freeze-out could have consequences beyond a simple strangeness enhancement, firstly in the production of exotic strange states and secondly in the flavor dependent evolution of dynamic quantities such as in-medium energy loss and anisotropic flow, which are generated predominantly during the collective partonic phase.
The sequential propensity household projection model
Directory of Open Access Journals (Sweden)
Tom Wilson
2013-04-01
Full Text Available BACKGROUND The standard method of projecting living arrangements and households in Australia and New Zealand is the 'propensity model', a type of extended headship rate model. Unfortunately it possesses a number of serious shortcomings, including internal inconsistencies, difficulties in setting living arrangement assumptions, and very limited scenario creation capabilities. Data allowing the application of more sophisticated dynamic household projection models are unavailable in Australia. OBJECTIVE The aim was to create a projection model to overcome these shortcomings whilst minimising input data requirements and costs, and retaining the projection outputs users are familiar with. METHODS The sequential propensity household projection model is proposed. Living arrangement projections take place in a sequence of calculations, with progressively more detailed living arrangement categories calculated in each step. In doing so the model largely overcomes the three serious deficiencies of the standard propensity model noted above. RESULTS The model is illustrated by three scenarios produced for one case study State, Queensland. They are: a baseline scenario in which all propensities are held constant to demonstrate the effects of population growth and ageing, a housing crisis scenario where housing affordability declines, and a prosperity scenario where families and individuals enjoy greater real incomes. A sensitivity analysis in which assumptions are varied one by one is also presented. CONCLUSIONS The sequential propensity model offers a more effective method of producing household and living arrangement projections than the standard propensity model, and is a practical alternative to dynamic projection models for countries and regions where the data and resources to apply such models are unavailable.
Discriminative predation: Simultaneous and sequential encounter experiments
Institute of Scientific and Technical Information of China (English)
C.D.BEATTY; D.W.FRANKS
2012-01-01
There are many situations in which the ability of animals to distinguish between two similar looking objects can have significant selective consequences. For example, the objects that require discrimination may be edible versus defended prey, predators versus non-predators, or mates of varying quality. Working from the premise that there are situations in which discrimination may be more or less successful, we hypothesized that individuals find it more difficult to distinguish between stimuli when they encounter them sequentially rather than simultaneously. Our study has wide biological and psychological implications from the perspective of signal perception, signal evolution, and discrimination, and could apply to any system where individuals are making relative judgments or choices between two or more stimuli or signals. While this is a general principle that might seem intuitive, it has not been experimentally tested in this context, and is often not considered in the design of models or experiments, or in the interpretation of a wide range of studies. Our study is different from previous studies in psychology in that a) the level of similarity of stimuli is gradually varied to obtain selection gradients, and b) we discuss the implications of our study for specific areas in ecology, such as the level of perfection of mimicry in predator-prey systems. Our experiments provide evidence that it is indeed more difficult to distinguish between stimuli - and to learn to distinguish between stimuli - when they are encountered sequentially rather than simultaneously, even if the intervening time interval is short.
Choi, Sae Il
2009-01-01
This study used simulation (a) to compare the kernel equating method to traditional equipercentile equating methods under the equivalent-groups (EG) design and the nonequivalent-groups with anchor test (NEAT) design and (b) to apply the parametric bootstrap method for estimating standard errors of equating. A two-parameter logistic item response…
Biohydrogen production from beet molasses by sequential dark and photofermentation
Energy Technology Data Exchange (ETDEWEB)
Oezguer, Ebru; Eroglu, Inci [Middle East Technical University, Department of Chemical Engineering, 06531, Ankara (Turkey); Mars, Astrid E.; Louwerse, Annemarie; Claassen, Pieternel A.M. [Wageningen UR, Agrotechnology and Food Sciences Group, Wageningen UR, P.O. Box 17, 6700 AA Wageningen (Netherlands); Peksel, Beguem; Yuecel, Meral; Guenduez, Ufuk [Middle East Technical University, Department of Biology, 06531, Ankara (Turkey)
2010-01-15
Biological hydrogen production using renewable resources is a promising possibility to generate hydrogen in a sustainable way. In this study, a sequential dark and photofermentation has been employed for biohydrogen production using sugar beet molasses as a feedstock. An extreme thermophile Caldicellulosiruptor saccharolyticus was used for the dark fermentation, and several photosynthetic bacteria (Rhodobacter capsulatus wild type, R. capsulatus hup⁻ mutant, and Rhodopseudomonas palustris) were used for the photofermentation. C. saccharolyticus was grown in a pH-controlled bioreactor, in batch mode, on molasses with an initial sucrose concentration of 15 g/L. The influence of additions of NH₄⁺ and yeast extract on sucrose consumption and hydrogen production was determined. The highest hydrogen yield (4.2 mol H₂/mol sucrose) and maximum volumetric productivity (7.1 mmol H₂/Lc·h) were obtained in the absence of NH₄⁺. The effluent of the dark fermentation containing no NH₄⁺ was fed to a photobioreactor, and hydrogen production was monitored under continuous illumination, in batch mode. Productivity and yield were improved by dilution of the dark fermentor effluent (DFE) and the additions of buffer, iron-citrate and sodium molybdate. The highest hydrogen yield (58% of the theoretical hydrogen yield of the consumed organic acids) and productivity (1.37 mmol H₂/Lc·h) were attained using the hup⁻ mutant of R. capsulatus. The overall hydrogen yield from sucrose increased from the maximum of 4.2 mol H₂/mol sucrose in dark fermentation to 13.7 mol H₂/mol sucrose (corresponding to 57% of the theoretical yield of 24 mol H₂/mol sucrose) by sequential dark and photofermentation. (author)
Sequential injection methodology for carbon speciation in bathing waters.
Santos, Inês C; Mesquita, Raquel B R; Machado, Ana; Bordalo, Adriano A; Rangel, António O S S
2013-05-17
A sequential injection method (SIA) for carbon speciation in inland bathing waters was developed comprising, in a single manifold, the determination of dissolved inorganic carbon (DIC), free dissolved carbon dioxide (CO2), total carbon (TC), dissolved organic carbon and alkalinity. The determination of DIC, CO2 and TC was based on the colour change of bromothymol blue (660 nm) after CO2 diffusion through a hydrophobic membrane placed in a gas diffusion unit (GDU). For the DIC determination, an in-line acidification prior to the GDU was performed and, for the TC determination, an in-line UV photo-oxidation of the sample prior to the GDU ensured the conversion of all carbon forms into CO2. Dissolved organic carbon (DOC) was determined by subtracting the obtained DIC value from the obtained TC value. The determination of alkalinity was based on the spectrophotometric measurement of the bromocresol green colour change (611 nm) after reaction with acetic acid. The developed SIA method enabled the determination of DIC (0.24-3.5 mg C L⁻¹), CO2 (1.0-10 mg C L⁻¹), TC (0.50-4.0 mg C L⁻¹) and alkalinity (1.2-4.7 mg C L⁻¹ and 4.7-19 mg C L⁻¹), with limits of detection of 9.5 μg C L⁻¹, 20 μg C L⁻¹, 0.21 mg C L⁻¹ and 0.32 mg C L⁻¹, respectively. The SIA system was effectively applied to inland bathing waters and the results showed good agreement with reference procedures.
Sequential Dependencies in Categorical Judgments of Radiographic Images
Beckstead, Jason W.; Boutis, Kathy; Pecaric, Martin; Pusic, Martin V.
2017-01-01
Sequential context effects, the psychological interactions occurring between the events of successive trials when a sequence of similar stimuli are judged, have interested psychologists for decades. It has been well established that individuals exhibit sequential context effects in psychophysical experiments involving unidimensional stimuli.…
Adaptive Learning in Extensive Form Games and Sequential Equilibrium
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1999-01-01
This paper studies adaptive learning in extensive form games and provides conditions for convergence points of adaptive learning to be sequential equilibria. Precisely, we present a set of conditions on learning sequences such that an assessment is a sequential equilibrium if and only...... if there is a learning sequence fulfilling the conditions, which leads to the assessment...
Revenue Prediction in Budget-constrained Sequential Auctions with Complementarities
S. Verwer (Sicco); Y. Zhang (Yingqian)
2011-01-01
When multiple items are auctioned sequentially, the ordering of auctions plays an important role in the total revenue collected by the auctioneer. This is true especially with budget-constrained bidders and the presence of complementarities among items. In such a sequential auction setting
Accounting for Heterogeneous Returns in Sequential Schooling Decisions
Zamarro, G.
2006-01-01
This paper presents a method for estimating returns to schooling that takes into account that returns may be heterogeneous among agents and that educational decisions are made sequentially. A sequential decision model is interesting because it explicitly considers that the level of education of each
Interaction of Hypertext Forms and Global Versus Sequential Learning Styles
Dunser, Andreas; Jirasko, Marco
2005-01-01
In this study, the relevance of the distinction between sequential and global learners in the context of learning with hypertext was investigated. Learners with a global learning style were expected to produce better results when learning with hypertext, whereas learners with a sequential learning style should profit from a structural aid in the form of a…
The sequential price of anarchy for atomic congestion games
de Jong, Jasper; Uetz, Marc; Liu, Tie-Yan; Qi, Qi; Ye, Yinyu
2014-01-01
In situations without central coordination, the price of anarchy relates the quality of any Nash equilibrium to the quality of a global optimum. Instead of assuming that all players choose their actions simultaneously, we consider games where players choose their actions sequentially. The sequential
Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris
2014-06-17
Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications.
OptGS: An R Package for Finding Near-Optimal Group-Sequential Designs
Directory of Open Access Journals (Sweden)
James Wason
2015-08-01
Full Text Available A group-sequential clinical trial design is one in which interim analyses of the data are conducted after groups of patients are recruited. After each interim analysis, the trial may stop early if the evidence so far shows the new treatment is particularly effective or ineffective. Such designs are ethical and cost-effective, and so are of great interest in practice. An optimal group-sequential design is one which controls the type-I error rate and power at a specified level, but minimizes the expected sample size of the trial when the true treatment effect is equal to some specified value. Searching for an optimal group-sequential design is a significant computational challenge because of the high number of parameters. In this paper the R package OptGS is described. Package OptGS searches for near-optimal and balanced (i.e., those which balance more than one optimality criterion) group-sequential designs for randomized controlled trials with normally distributed outcomes. Package OptGS uses a two-parameter family of functions to determine the stopping boundaries, which improves the speed of the search process whilst still allowing flexibility in the possible shape of stopping boundaries. The resulting package allows optimal designs to be found in a matter of seconds, much faster than a previous approach.
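The defining property of a group-sequential design, type-I error control across interim looks, is easy to check by simulation. The sketch below uses a hypothetical two-stage design with a Pocock-style constant boundary and the commonly tabulated critical value c ≈ 2.178 for two looks at two-sided α = 0.05; it is a generic illustration of the concept, not OptGS's optimized boundaries:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo check of a two-stage design with the same critical value c at
# both looks (Pocock-style). c = 2.178 is the commonly tabulated constant
# for K = 2 looks and two-sided alpha = 0.05.
c, n_sim = 2.178, 200_000

# Under H0 the stage-1 z-statistic is N(0,1); the stage-2 statistic pools
# both halves of the data, so z2 = (z1 + z_new) / sqrt(2).
z1 = rng.standard_normal(n_sim)
z2 = (z1 + rng.standard_normal(n_sim)) / np.sqrt(2)

reject = (np.abs(z1) > c) | (np.abs(z2) > c)
stop_early = np.abs(z1) > c
print(f"type-I error:   {reject.mean():.4f}")     # close to 0.05
print(f"early stopping: {stop_early.mean():.4f}") # chance of stopping at look 1
```

Because the two z-statistics are correlated, the per-look critical value must exceed 1.96 to keep the overall error at 5%; searching over boundary shapes to also minimize expected sample size is exactly the optimization problem OptGS solves.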
The statistical decay of very hot nuclei: from sequential decay to multifragmentation
Energy Technology Data Exchange (ETDEWEB)
Carlson, B.V. [Centro Tecnico Aeroespacial (ITA/CTA), Sao Jose dos Campos, SP (Brazil). Inst. Tecnologico de Aeronautica; Donangelo, R. [Universidade Federal do Rio de Janeiro (IF/UFRJ), RJ (Brazil). Inst. de Fisica; Universidad de la Republica, Montevideo (Uruguay). Facultad de Ingenieria. Inst. de Fisica; Souza, S.R. [Universidade Federal do Rio de Janeiro (IF/UFRJ), RJ (Brazil). Inst. de Fisica; Universidade Federal do Rio Grande do Sul (IF/UFRGS), Porto Alegre, RS (Brazil). Inst. de Fisica; Lynch, W.G.; Steiner, A.W.; Tsang, M.B. [Michigan State University (NSCL/MSU), East Lansing, MI (United States). National Superconducting Cyclotron Lab.
2010-07-01
At low excitation energies, the compound nucleus typically decays through the sequential emission of light particles. As the energy increases, the emission probability of heavier fragments increases until, at sufficiently high energies, several heavy complex fragments are emitted during the decay. The extent to which this fragment emission is simultaneous or sequential has been a subject of theoretical and experimental study for almost 30 years. The Statistical Multifragmentation Model, an equilibrium model of simultaneous fragment emission, uses the configurations of a statistical ensemble to determine the distribution of primary fragments of a compound nucleus. The primary fragments are then assumed to decay by sequential compound emission or Fermi breakup. As the first step toward a more unified model of these processes, we demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical version of the Statistical Multifragmentation Model. We then establish a link between this unified Fermi breakup / statistical multifragmentation model and the well-known process of compound nucleus emission, which permits simultaneous and sequential emission to be considered on the same footing. Within this unified framework, we analyze the increasing importance of simultaneous, multifragment decay with increasing excitation energy and decreasing lifetime of the compound nucleus.
Canonico, Laura; Comitini, Francesca; Oro, Lucia; Ciani, Maurizio
2016-01-01
The average ethanol content of wine has increased over the last two decades. This increase was due to consumer preference, and also to climate change that resulted in increased grape maturity at harvest. In the present study, to reduce the ethanol content in wine, a microbiological approach was investigated, using immobilized selected strains of the non-Saccharomyces yeasts Starmerella bombicola, Metschnikowia pulcherrima, Hanseniaspora osmophila, and Hanseniaspora uvarum to start fermentation, followed by inoculation of free Saccharomyces cerevisiae cells. The immobilization procedures, which ensured high reaction rates, led to feasible sequential inoculation management, avoiding possible contamination under actual winemaking conditions. Under these conditions, the immobilized cells metabolized almost 50% of the sugar in 3 days, while S. cerevisiae inoculation completed the fermentation. The S. bombicola and M. pulcherrima initial fermentations showed the best reductions in the final ethanol content (1.6 and 1.4% v/v, respectively). The resulting wines did not have any negative fermentation products, with the exception of the H. uvarum sequential fermentation, which showed a significant amount of ethyl acetate. On the other hand, there were increases in desirable compounds: glycerol and succinic acid for S. bombicola, geraniol for M. pulcherrima, and isoamyl acetate and isoamyl alcohol for H. osmophila sequential fermentations. The overall results indicated that a promising ethanol reduction could be obtained using sequential fermentation with immobilized selected non-Saccharomyces strains. In this way, a suitable timing of the second inoculation and an enhancement of the analytical profile of the wine were obtained.
Directory of Open Access Journals (Sweden)
Laura Canonico
2016-03-01
The average ethanol content of wine has increased over the last two decades. This increase was due to consumer preference, and also to climate change that resulted in increased grape maturity at harvest. In the present study, to reduce the ethanol content in wine, a microbiological approach was investigated, using immobilized selected strains of the non-Saccharomyces yeasts Starmerella bombicola, Metschnikowia pulcherrima, Hanseniaspora osmophila, and Hanseniaspora uvarum to start fermentation, followed by inoculation of free Saccharomyces cerevisiae cells. The immobilization procedures, which ensured high reaction rates, led to feasible sequential inoculation management, avoiding possible contamination under actual winemaking conditions. Under these conditions, the immobilized cells metabolized almost 50% of the sugar in 3 days, while S. cerevisiae inoculation completed the fermentation. The S. bombicola and M. pulcherrima initial fermentations showed the best reductions in the final ethanol content (1.6% and 1.4% v/v, respectively). The resulting wines did not have any negative fermentation products, with the exception of the H. uvarum sequential fermentation, which showed a significant amount of ethyl acetate. On the other hand, there were increases in desirable compounds: glycerol and succinic acid for S. bombicola, geraniol for M. pulcherrima, and isoamyl acetate and isoamyl alcohol for H. osmophila sequential fermentations. The overall results indicated that a promising ethanol reduction could be obtained using sequential fermentation with immobilized selected non-Saccharomyces strains. In this way, a suitable timing of the second inoculation and an enhancement of the analytical profile of the wine were obtained.
Hand, Troy; Feron, Eric
2010-01-01
This paper discusses the effect of sequential conflict resolution maneuvers on an infinite aircraft flow through a finite control volume. Aircraft flow models are utilized to simulate traffic flows and determine stability. Pseudo-random flow geometry is considered in order to assess airspace stability in a more random setting, in which aircraft flows are spread over a given positive width. The use of this aircraft flow model generates a more realistic flow geometry. A set of upper bounds on the maximal aircraft deviation during conflict resolution is derived. It is also proven that, for this flow geometry, these bounds are not symmetric, unlike the symmetric bounds derived in previous papers for simpler flow configurations. Stability is preserved under sequential conflict resolution algorithms for all flow geometries discussed in this paper.
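The idea of bounded deviation under sequential resolution can be illustrated with a deliberately simplified one-dimensional sketch. This is an assumption-laden toy recast in time-separation terms, not the paper's flow model: aircraft cross a control point at nominal times, and each is delayed just enough to keep a minimum separation from the previously resolved aircraft.

```python
def sequential_resolution(nominal_times, min_sep=5.0):
    """Resolve conflicts one aircraft at a time: delay each just enough
    to restore the minimum separation from the previously resolved
    aircraft, and report the maximal deviation from the nominal plan."""
    resolved = []
    for t in nominal_times:
        if resolved and t - resolved[-1] < min_sep:
            resolved.append(resolved[-1] + min_sep)  # sequential pushback
        else:
            resolved.append(t)
    max_dev = max(r - t for r, t in zip(resolved, nominal_times))
    return resolved, max_dev

# A bunched stream: deviations accumulate through the bunch but decay
# once the nominal spacing again exceeds min_sep.
resolved, max_dev = sequential_resolution([0.0, 2.0, 4.0, 12.0], min_sep=5.0)
```

Here deviations propagate from one resolution to the next, which is exactly the feedback the stability analysis in the paper must control; the paper's bounds play the role of a guarantee that `max_dev` stays finite for an infinite flow.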
Cooperation induced by random sequential exclusion
Li, Kun; Cong, Rui; Wang, Long
2016-06-01
Social exclusion is a common and powerful tool for penalizing deviators in human societies, and thus for effectively elevating collaborative efforts. Current models of the evolution of exclusion behavior mostly assume that each peer excluder independently makes the decision to expel the defectors, with no idea of what others in the group will do or what the actual punishment effect will be. We therefore propose a more realistic model, random sequential exclusion. In this mechanism, each excluder pays an extra scheduling cost, and all the excluders are then arranged in a random order to implement their exclusion actions. If a free rider has already been excluded by one excluder, the remaining excluders do not participate in expelling that defector. We find that this mechanism can help stabilize cooperation under more unfavorable conditions than normal peer exclusion can, both in well-mixed populations and on social networks. However, too large a scheduling cost may undermine the advantage of this mechanism. Our work validates the fact that collaborative practice among punishers plays an important role in further boosting cooperation.
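A minimal one-round sketch of the mechanism described above may help fix ideas. The public-goods payoff structure, the multiplication factor, and all cost values are illustrative assumptions; the paper's actual evolutionary dynamics are not reproduced here:

```python
import random

def random_sequential_exclusion(cooperators, defectors, excluders,
                                r=3.0, cost=1.0, c_excl=0.5, c_sched=0.1):
    """One round of a public goods game with random sequential
    exclusion: every excluder pays the scheduling cost up front; the
    excluders then act in a random order, each expelling one remaining
    defector (paying the exclusion cost) until none are left."""
    order = list(range(excluders))
    random.shuffle(order)  # random scheduling of the excluders
    remaining_defectors = defectors
    extra_costs = []
    for _ in order:
        if remaining_defectors > 0:
            remaining_defectors -= 1          # this excluder expels one defector
            extra_costs.append(c_sched + c_excl)
        else:
            extra_costs.append(c_sched)       # scheduled, but nothing left to do
    contributors = cooperators + excluders    # excluders also contribute
    group_size = cooperators + excluders + remaining_defectors
    share = r * cost * contributors / group_size
    payoff_c = share - cost
    payoff_d = share if remaining_defectors > 0 else None  # survivors free-ride
    payoffs_e = [share - cost - ec for ec in extra_costs]
    excluded = defectors - remaining_defectors
    return excluded, payoff_c, payoff_d, payoffs_e

excluded, pc, pd, pe = random_sequential_exclusion(2, 3, 2)
```

Note the key property of the mechanism: the number of expelled defectors is `min(defectors, excluders)` regardless of the random order, while the order only redistributes who bears the exclusion cost.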