WorldWideScience

Sample records for high counting statistics

  1. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.
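The precision estimates such a report covers follow from the Poisson model, under which the variance of a count equals the count itself. A minimal stdlib sketch of the standard error-propagation rule for a net (background-subtracted) count; the function name and example numbers are mine:

```python
import math

def count_uncertainty(gross, background):
    """Net count and its standard deviation for one gross and one
    background measurement of equal duration (Poisson model)."""
    net = gross - background
    # Variances of independent Poisson counts add: sigma^2 = gross + background
    sigma = math.sqrt(gross + background)
    return net, sigma

net, sigma = count_uncertainty(gross=10_000, background=400)
rel = sigma / net   # relative uncertainty of the net count, about 1%
```

The relative uncertainty of a bare count N is 1/sqrt(N), which is the quantity the counting-time optimizations in these reports are built around.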

  2. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)

  3. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs

  4. Counting statistics in radioactivity measurements

    International Nuclear Information System (INIS)

    Martin, J.

    1975-01-01

The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; and statistical techniques in gamma spectrometry.

  5. Theory of photoelectron counting statistics

    International Nuclear Information System (INIS)

    Blake, J.

    1980-01-01

    The purpose of the present essay is to provide a detailed analysis of those theoretical aspects of photoelectron counting which are capable of experimental verification. Most of our interest is in the physical phenomena themselves, while part is in the mathematical techniques. Many of the mathematical methods used in the analysis of the photoelectron counting problem are generally unfamiliar to physicists interested in the subject. For this reason we have developed the essay in such a fashion that, although primary interest is focused on the physical phenomena, we have also taken pains to carry out enough of the analysis so that the reader can follow the main details. We have chosen to present a consistently quantum mechanical version of the subject, in that we follow the Glauber theory throughout. (orig./WL)

  6. Counting statistics of transport through Coulomb blockade nanostructures: High-order cumulants and non-Markovian effects

    DEFF Research Database (Denmark)

    Flindt, Christian; Novotny, Tomás; Braggio, Alessandro

    2010-01-01

    Recent experimental progress has made it possible to detect in real-time single electrons tunneling through Coulomb blockade nanostructures, thereby allowing for precise measurements of the statistical distribution of the number of transferred charges, the so-called full counting statistics...... interactions. Our recursive method can treat systems with many states as well as non-Markovian dynamics. We illustrate our approach with three examples of current experimental relevance: bunching transport through a two-level quantum dot, transport through a nanoelectromechanical system with dynamical Franck...

  7. Statistical data filtration in neutron coincidence counting

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    1992-11-01

We assessed the effectiveness of statistical data filtration to minimize the contribution of matrix materials in 200-ℓ drums to the nondestructive assay of plutonium. The following matrices were examined: polyethylene, concrete, aluminum, iron, cadmium, and lead. Statistical filtration of neutron coincidence data improved the low-end sensitivity of coincidence counters. Spurious data arising from electrical noise, matrix spallation, and geometric effects were smoothed in a predictable fashion by the statistical filter. The filter effectively lowers the minimum detectable mass limit that can be achieved for plutonium assay using passive neutron coincidence counting.

  8. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is not sufficient to decide whether this motif is significantly more exceptional in one sequence than in the other. A statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
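The exact binomial test described above rests on a standard fact: conditional on the total of two Poisson counts, one of them is binomially distributed. A hedged stdlib sketch of that core idea (function names mine; the paper's actual test additionally adjusts for sequence composition and overlapping motifs):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def compare_poisson_counts(k1, k2, expected_ratio=1.0):
    """Exact two-sided test that two Poisson counts share the same
    rate (up to expected_ratio = E[k1]/E[k2]). Conditional on the
    total n = k1 + k2, k1 ~ Binomial(n, r / (1 + r))."""
    n = k1 + k2
    p = expected_ratio / (1.0 + expected_ratio)
    obs = binom_pmf(k1, n, p)
    # Two-sided p-value: total probability of outcomes no more
    # likely than the one observed.
    return sum(binom_pmf(k, n, p) for k in range(n + 1)
               if binom_pmf(k, n, p) <= obs + 1e-12)

p_same = compare_poisson_counts(10, 12)   # similar counts: large p
p_diff = compare_poisson_counts(30, 5)    # very different counts: tiny p
```

Under the null of equal rates, `p_same` is large (the counts 10 and 12 are unremarkable), while 30 versus 5 is decisively rejected.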

  9. Statistical Methods for Unusual Count Data

    DEFF Research Database (Denmark)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

    microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative...... microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per...

  10. Counting statistics in low level radioactivity measurements fluctuating counting efficiency

    International Nuclear Information System (INIS)

    Pazdur, M.F.

    1976-01-01

A divergence between the probability distribution of the number of nuclear disintegrations and the number of observed counts, caused by counting efficiency fluctuation, is discussed. The negative binomial distribution is proposed to describe the probability distribution of the number of counts, instead of the Poisson distribution, which is assumed to hold for the number of nuclear disintegrations only. From actual measurements the r.m.s. amplitude of counting efficiency fluctuation is estimated. Some consequences of counting efficiency fluctuation are investigated and the corresponding formulae are derived: (1) for the detection limit as a function of the number of partial measurements and the relative amplitude of counting efficiency fluctuation, and (2) for the optimum allocation of the number of partial measurements between sample and background. (author)
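The overdispersion described above is easy to reproduce by simulation: when counts are Poisson conditional on an efficiency that fluctuates between measurements, the variance-to-mean ratio of the observed counts exceeds 1. A sketch under assumed numbers (Gaussian efficiency fluctuation, Knuth's Poisson sampler; all names mine):

```python
import math
import random
import statistics

random.seed(1)

def poisson(lam):
    """Poisson sampler (Knuth's algorithm); fine for moderate rates."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def dispersion_ratio(mu, eps_mean, eps_rsd, n=2000):
    """Variance-to-mean ratio of observed counts when the counting
    efficiency fluctuates with relative s.d. eps_rsd."""
    counts = []
    for _ in range(n):
        eps = min(max(random.gauss(eps_mean, eps_rsd * eps_mean), 0.0), 1.0)
        counts.append(poisson(mu * eps))
    return statistics.variance(counts) / statistics.fmean(counts)

stable = dispersion_ratio(1000, 0.3, 0.00)   # fixed efficiency: ~1 (Poisson)
fluct = dispersion_ratio(1000, 0.3, 0.10)    # fluctuating: well above 1
```

With a 10% r.m.s. efficiency fluctuation and ~300 mean counts, the extra variance term makes the ratio roughly 4, which is why a wider family such as the negative binomial is needed.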

  11. Radon counting statistics - a Monte Carlo investigation

    International Nuclear Information System (INIS)

    Scott, A.G.

    1996-01-01

Radioactive decay is a Poisson process, and so the coefficient of variation (COV) of n counts of a single nuclide is usually estimated as 1/√n. This is only true if the count duration is much shorter than the half-life of the nuclide. At longer count durations, the COV is smaller than the Poisson estimate. Most radon measurement methods count the alpha decays of ²²²Rn plus the progeny ²¹⁸Po and ²¹⁴Po, and estimate the ²²²Rn activity from the sum of the counts. At long count durations, the chain decay of these nuclides means that every ²²²Rn decay must be followed by two other alpha decays. The total number of decays is 3N, where N is the number of radon decays, and the true COV of the radon concentration estimate is 1/√N, a factor of √3 larger than the Poisson total-count estimate of 1/√(3N). Most count periods are comparable to the half-lives of the progeny, so the relationship between COV and count time is complex. A Monte Carlo estimate of the ratio of true COV to Poisson estimate was carried out for a range of count periods from 1 min to 16 h and three common radon measurement methods: liquid scintillation, scintillation cell, and electrostatic precipitation of progeny. The Poisson approximation underestimates COV by less than 20% for count durations of less than 60 min.
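The √3 factor claimed above for the long-count limit can be checked directly: if every radon decay contributes exactly three alpha counts, the total count is 3N with N Poisson, so the total is no longer Poisson itself. A simulation sketch (assumed mean decay number; names mine):

```python
import math
import random
import statistics

random.seed(2)

def poisson(lam):
    """Poisson sampler (Knuth's algorithm); fine for moderate rates."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

N_mean = 400   # assumed mean number of 222Rn decays per counting period
# Long-count limit: each radon decay is followed by exactly two progeny
# alpha decays, so the total alpha count is 3N.
totals = [3 * poisson(N_mean) for _ in range(4000)]

true_cov = statistics.stdev(totals) / statistics.fmean(totals)
poisson_cov = 1.0 / math.sqrt(statistics.fmean(totals))  # naive 1/sqrt(3N)
ratio = true_cov / poisson_cov   # approaches sqrt(3) ~ 1.73
```

At realistic count times the progeny are only partially correlated with the parent, so the true ratio lies between 1 and √3, which is what the paper's Monte Carlo maps out.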

  12. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided

  13. Counting in Lattices: Combinatorial Problems from Statistical Mechanics.

    Science.gov (United States)

    Randall, Dana Jill

In this thesis we consider two classical combinatorial problems arising in statistical mechanics: counting matchings and self-avoiding walks in lattice graphs. The first problem arises in the study of the thermodynamical properties of monomers and dimers (diatomic molecules) in crystals. Fisher, Kasteleyn and Temperley discovered an elegant technique to exactly count the number of perfect matchings in two dimensional lattices, but it is not applicable for matchings of arbitrary size, or in higher dimensional lattices. We present the first efficient approximation algorithm for computing the number of matchings of any size in any periodic lattice in arbitrary dimension. The algorithm is based on Monte Carlo simulation of a suitable Markov chain and has rigorously derived performance guarantees that do not rely on any assumptions. In addition, we show that these results generalize to counting matchings in any graph which is the Cayley graph of a finite group. The second problem is counting self-avoiding walks in lattices. This problem arises in the study of the thermodynamics of long polymer chains in dilute solution. While there are a number of Monte Carlo algorithms used to count self-avoiding walks in practice, these are heuristic and their correctness relies on unproven conjectures. In contrast, we present an efficient algorithm which relies on a single, widely-believed conjecture that is simpler than preceding assumptions and, more importantly, is one which the algorithm itself can test. Thus our algorithm is reliable, in the sense that it either outputs answers that are guaranteed, with high probability, to be correct, or finds a counterexample to the conjecture. In either case we know we can trust our results and the algorithm is guaranteed to run in polynomial time. This is the first algorithm for counting self-avoiding walks in which the error bounds are rigorously controlled. This work was supported in part by an AT&T graduate fellowship, a University of

  14. Gene coexpression measures in large heterogeneous samples using count statistics.

    Science.gov (United States)

    Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan

    2014-11-18

    With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.

  15. Counting statistics of many-particle quantum walks

    Science.gov (United States)

    Mayer, Klaus; Tichy, Malte C.; Mintert, Florian; Konrad, Thomas; Buchleitner, Andreas

    2011-06-01

    We study quantum walks of many noninteracting particles on a beam splitter array as a paradigmatic testing ground for the competition of single- and many-particle interference in a multimode system. We derive a general expression for multimode particle-number correlation functions, valid for bosons and fermions, and infer pronounced signatures of many-particle interferences in the counting statistics.

  16. Counting statistics of many-particle quantum walks

    International Nuclear Information System (INIS)

    Mayer, Klaus; Tichy, Malte C.; Buchleitner, Andreas; Mintert, Florian; Konrad, Thomas

    2011-01-01

    We study quantum walks of many noninteracting particles on a beam splitter array as a paradigmatic testing ground for the competition of single- and many-particle interference in a multimode system. We derive a general expression for multimode particle-number correlation functions, valid for bosons and fermions, and infer pronounced signatures of many-particle interferences in the counting statistics.

  17. Full counting statistics of multiple Andreev reflections in incoherent diffusive superconducting junctions

    International Nuclear Information System (INIS)

    Samuelsson, P.

    2007-01-01

    We present a theory for the full distribution of current fluctuations in incoherent diffusive superconducting junctions, subjected to a voltage bias. This theory of full counting statistics of incoherent multiple Andreev reflections is valid for an arbitrary applied voltage. We present a detailed discussion of the properties of the first four cumulants as well as the low and high voltage regimes of the full counting statistics. (orig.)

  18. Counting statistics and loss corrections for the APS

    International Nuclear Information System (INIS)

    Lee, W.K.; Mills, D.M.

    1992-01-01

    It has been suggested that for timing experiments, it might be advantageous to arrange the bunches in the storage ring in an asymmetrical mode. In this paper, we determine the counting losses from pulsed x-ray sources from basic probabilistic arguments and from Poisson statistics. In particular the impact on single-photon counting losses of a variety of possible filling modes for the Advanced Photon Source (APS) is examined. For bunches of equal current, a loss of 10% occurs whenever the count rate exceeds 21% of the bunch repetition rate. This changes slightly when bunches containing unequal numbers of particles are considered. The results are applied to several common detector/electronics systems
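The figures quoted above (a 10% loss once the count rate exceeds 21% of the bunch repetition rate) follow from assuming Poisson photon arrivals with at most one photon recorded per bunch. A quick check (function name mine):

```python
import math

def fractional_loss(rate_ratio):
    """Fraction of photons lost when a detector records at most one
    photon per bunch and mu = (count rate)/(bunch rate) photons
    arrive per bunch on average (Poisson arrivals)."""
    mu = rate_ratio
    detected_per_bunch = 1.0 - math.exp(-mu)   # P(at least one photon)
    return 1.0 - detected_per_bunch / mu

loss = fractional_loss(0.21)   # close to the 10% figure quoted above
```

For small mu the loss grows like mu/2, so keeping well below the bunch rate keeps pile-up losses negligible; unequal bunch currents shift these numbers slightly, as the abstract notes.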

  19. Counting statistics and loss corrections for the APS

    International Nuclear Information System (INIS)

    Lee, W.K.; Mills, D.M.

    1992-01-01

    It has been suggested that for timing experiments, it might be advantageous to arrange the bunches in the storage ring in an asymmetrical mode. In this paper, we determine the counting losses from pulsed x-ray sources from basic probabilistic arguments and from Poisson statistics. In particular the impact on single photon counting losses of a variety of possible filling modes for the Advanced Photon Source (APS) is examined. For bunches of equal current, a loss of 10% occurs whenever the count rate exceeds 21% of the bunch repetition rate. This changes slightly when bunches containing unequal numbers of particles are considered. The results are applied to several common detector/electronics systems

  20. Radiation measurement practice for understanding statistical fluctuation of radiation count using natural radiation sources

    International Nuclear Information System (INIS)

    Kawano, Takao

    2014-01-01

It is known that radiation is detected at random and that radiation counts fluctuate statistically. In the present study, a radiation measurement experiment was performed to understand the randomness and statistical fluctuation of radiation counts. In the measurement, three natural radiation sources were used. The sources were fabricated from potassium chloride chemicals, chemical fertilizers and kelps. These materials contain naturally occurring potassium-40, which is a radionuclide. Nine teachers from high schools, junior high schools and elementary schools participated in the radiation measurement experiment. Each participant measured the 1-min integrated counts of radiation five times using GM survey meters, and 45 sets of data were obtained for the respective natural radiation sources. It was found that the frequency of occurrence of radiation counts was distributed according to a Gaussian curve, although the obtained 45 data sets of radiation counts superficially appeared to fluctuate without pattern. (author)

  1. Reducing bias in the analysis of counting statistics data

    International Nuclear Information System (INIS)

    Hammersley, A.P.; Antoniadis, A.

    1997-01-01

    In the analysis of counting statistics data it is common practice to estimate the variance of the measured data points as the data points themselves. This practice introduces a bias into the results of further analysis which may be significant, and under certain circumstances lead to false conclusions. In the case of normal weighted least squares fitting this bias is quantified and methods to avoid it are proposed. (orig.)
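The bias described above is easy to demonstrate: weighting Poisson data by 1/yᵢ, i.e. using each measured count as its own variance, systematically favors downward fluctuations. A stdlib simulation sketch (names mine; Knuth's Poisson sampler):

```python
import math
import random
import statistics

random.seed(4)

def poisson(lam):
    """Poisson sampler (Knuth's algorithm); fine for moderate rates."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

true_rate = 100.0
data = [poisson(true_rate) for _ in range(5000)]

# Common practice: weight each point by 1/y_i, i.e. take the measured
# count as its own variance estimate (assumes no zero counts).
weights = [1.0 / y for y in data]
weighted = sum(w * y for w, y in zip(weights, data)) / sum(weights)
unweighted = statistics.fmean(data)
bias = unweighted - weighted   # ~1 count: low counts get too much weight
```

For a fitted constant this weighting reduces to the harmonic mean, which for Poisson data sits roughly one count below the true rate regardless of sample size, exactly the kind of systematic error the paper quantifies.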

  2. Experimental investigation of statistical models describing distribution of counts

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1992-01-01

The binomial, Poisson and modified Poisson models which are used for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are considered. The validity of the Poisson and the modified Poisson statistical distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of the significant radioactive decay were performed with ⁸⁹ᵐY (T½ = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, Poisson statistics describe the counting experiment for short measuring times (up to T = 0.5 T½) and its application is recommended. However, analysis of the data demonstrated, with confidence, that for long measurements (T ≥ T½) the Poisson distribution is not valid and the modified Poisson function is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. Differences between the standard deviations evaluated on the basis of the Poisson and binomial models are especially significant for experiments with long measuring time (T/T½ ≥ 2) and/or large detection efficiency (ε > 0.30). Optimization of the measuring time for paired observations yields the same solution for either the binomial or the Poisson distribution. (orig.)

  3. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates with fixed width and count the number of triggered sampling gates, which yields a photon counting probability from which the echo signal intensity can be estimated. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time position of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of triggered time positions. Then the minimum variance unbiased estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity because more detected information is acquired. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high-accuracy intensity image is acquired under low-light-level environments. (paper)
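The conventional baseline that the abstract improves upon, estimating intensity from the fraction of triggered gates, inverts P(trigger) = 1 − exp(−λ) for the mean photon number λ per gate. A sketch of that baseline estimator only (not the paper's MVUE refinement; names mine):

```python
import math

def intensity_from_gate_fraction(triggered, total_gates):
    """Baseline estimator: invert P(trigger) = 1 - exp(-lam) to get
    the mean photon number per sampling gate from the fraction of
    gates that fired."""
    frac = triggered / total_gates
    if not 0.0 <= frac < 1.0:
        raise ValueError("triggered fraction must be in [0, 1)")
    return -math.log(1.0 - frac)

lam_hat = intensity_from_gate_fraction(632, 1000)   # ~1 photon per gate
```

The estimator saturates as the triggered fraction approaches 1 (every gate fires regardless of intensity), which is one motivation for using the extra timing information the paper records.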

  4. Statistical analysis of nematode counts from interlaboratory proficiency tests

    NARCIS (Netherlands)

    Berg, van den W.; Hartsema, O.; Nijs, Den J.M.F.

    2014-01-01

    A series of proficiency tests on potato cyst nematode (PCN; n=29) and free-living stages of Meloidogyne and Pratylenchus (n=23) were investigated to determine the accuracy and precision of the nematode counts and to gain insights into possible trends and potential improvements. In each test, each

  5. Binomial distribution of Poisson statistics and tracks overlapping probability to estimate total tracks count with low uncertainty

    International Nuclear Information System (INIS)

    Khayat, Omid; Afarideh, Hossein; Mohammadnia, Meisam

    2015-01-01

In the solid state nuclear track detectors of chemically etched type, nuclear tracks with center-to-center neighborhood of distance shorter than two times the radius of tracks will emerge as overlapping tracks. Track overlapping in this type of detectors causes track count losses, and it becomes rather severe at high track densities. Therefore, track counting in this condition should include a correction factor for count losses of different track overlapping orders, since a number of overlapping tracks may be counted as one track. Another aspect of the problem is the cases where imaging the whole area of the detector and counting all tracks are not possible. In these conditions a statistical generalization method is desired, applicable to counting a segmented area of the detector so that the results can be generalized to the whole surface of the detector. There is also a challenge in counting densely overlapped tracks because sufficient geometrical or contextual information is not available. In this paper we present a statistical counting method which gives the user a relation between the track overlapping probabilities on a segmented area of the detector surface and the total number of tracks. To apply the proposed method one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions by approximating the average track area, the whole detector surface area and some orders of track overlapping probabilities. It will be shown that this method is applicable to high and ultra-high density track images and that the count loss error can be reduced using a statistical generalization approach. - Highlights: • A correction factor for count losses of different tracks overlapping orders. • For the cases imaging the whole area of the detector is not possible. • Presenting a statistical generalization method for segmented areas. • Giving a relation between the tracks overlapping probabilities and the total tracks

  6. Statistical analysis of the count and profitability of air conditioners.

    Science.gov (United States)

    Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A

    2018-08-01

This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. The Kruskal-Wallis test was used to check whether each categorical variable follows the same distribution.

  7. Optimization of counting time using count statistics on a diffraction beamline

    Energy Technology Data Exchange (ETDEWEB)

    Marais, D., E-mail: Deon.Marais@necsa.co.za [Research and Development Division, South African Nuclear Energy Corporation (Necsa) SOC Limited, PO Box 582, Pretoria 0001 (South Africa); School of Mechanical and Nuclear Engineering, North-West University, Potchefstroom 2520 (South Africa); Venter, A.M., E-mail: Andrew.Venter@necsa.co.za [Research and Development Division, South African Nuclear Energy Corporation (Necsa) SOC Limited, PO Box 582, Pretoria 0001 (South Africa); Faculty of Agriculture Science and Technology, North-West University, Mahikeng 2790 (South Africa); Markgraaff, J., E-mail: Johan.Markgraaff@nwu.ac.za [School of Mechanical and Nuclear Engineering, North-West University, Potchefstroom 2520 (South Africa)

    2016-05-11

The feasibility of an alternative data acquisition strategy to improve the efficiency of beam time usage with neutron strain scanner instruments is demonstrated. Performing strain measurements against set statistical criteria, rather than for a fixed time, not only substantially reduces sample investigation time but also renders data of similar quality throughout.
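Counting against a statistical criterion rather than a fixed time, as above, amounts to stopping once the Poisson relative uncertainty 1/√N drops below a target. A simulated sketch (assumed rate, target and time slice; names mine, Knuth's Poisson sampler):

```python
import math
import random

random.seed(6)

def poisson(lam):
    """Poisson sampler (Knuth's algorithm); fine for small rates."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def count_to_precision(rate, target_rsd, dt=0.1):
    """Accumulate simulated counts until the Poisson relative
    standard deviation 1/sqrt(N) falls below target_rsd.
    Returns the final count and the elapsed time."""
    n, t = 0, 0.0
    while n == 0 or 1.0 / math.sqrt(n) > target_rsd:
        n += poisson(rate * dt)   # counts arriving in one time slice
        t += dt
    return n, t

n, t = count_to_precision(rate=50.0, target_rsd=0.02)   # needs N >= 2500
```

Strong reflections reach the criterion quickly and weak ones simply run longer, which is how a fixed data quality is delivered across all measurement points.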

  8. Gaussian point count statistics for families of curves over a fixed finite field

    OpenAIRE

    Kurlberg, Par; Wigman, Igor

    2010-01-01

    We produce a collection of families of curves, whose point count statistics over F_p becomes Gaussian for p fixed. In particular, the average number of F_p points on curves in these families tends to infinity.

  9. Spatial statistics of pitting corrosion patterning: Quadrat counts and the non-homogeneous Poisson process

    International Nuclear Information System (INIS)

    Lopez de la Cruz, J.; Gutierrez, M.A.

    2008-01-01

This paper presents a stochastic analysis of spatial point patterns arising from localized pitting corrosion. The Quadrat Counts method is studied with two empirical pit patterns. The results depend on the quadrat size, and bias is introduced when empty quadrats are included in the analysis. The spatially inhomogeneous Poisson process is used to improve the performance of the Quadrat Counts method, combining Quadrat Counts with distance-based statistics in the analysis of pit patterns. The Inter-Event and the Nearest-Neighbour statistics are implemented here in order to compare their results. Further, the treatment of patterns in irregular domains is discussed
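The Quadrat Counts baseline used above can be sketched in a few lines: partition the domain into cells, count points per cell, and compare the variance-to-mean ratio with the value 1 expected under complete spatial randomness (simulated pit locations; names mine):

```python
import random

random.seed(5)

def quadrat_counts(points, grid):
    """Count points falling in each cell of a grid x grid partition
    of the unit square."""
    counts = [0] * (grid * grid)
    for x, y in points:
        i = min(int(x * grid), grid - 1)
        j = min(int(y * grid), grid - 1)
        counts[i * grid + j] += 1
    return counts

# Simulated pits under complete spatial randomness
pts = [(random.random(), random.random()) for _ in range(2000)]
counts = quadrat_counts(pts, 10)   # 100 quadrats, ~20 pits each
mean = sum(counts) / len(counts)
vmr = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1) / mean
# vmr ~ 1 for a random pattern; clustered pitting inflates it well above 1
```

The quadrat-size dependence noted in the abstract shows up directly here: changing `grid` changes `vmr` for any pattern that is not perfectly random, which motivates supplementing quadrats with distance-based statistics.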

  10. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.

  11. High counting rate resistive-plate chamber

    International Nuclear Information System (INIS)

    Peskov, V.; Anderson, D.F.; Kwan, S.

    1993-05-01

Parallel-plate avalanche chambers (PPAC) are widely used in physics experiments because they are fast, but their rate capability is limited to about 10⁵ counts/mm². A resistive-plate chamber (RPC) is similar to the PPAC in construction except that one or both of the electrodes are made from high-resistivity (≥10¹⁰ Ω·cm) materials. In practice RPCs are usually used in the spark mode. Resistive electrodes are charged by sparks, locally reducing the actual electric field in the gap. The size of the charged surface is about 10 mm², leaving the rest of the detector unaffected. Therefore, the rate capability of such detectors in the spark mode is considerably higher than that of conventional spark counters. Among the different glasses tested, the best results were obtained with electron-type conductive glasses, which obey Ohm's law. Most of the work with such glasses was done with high-pressure parallel-plate chambers (10 atm) for time-of-flight measurements. Resistive glasses have been expensive and produced only in small quantities. Now resistive glasses are commercially available, although they are still expensive in small-scale production. Building on the positive experience of different groups working with resistive glasses, it was decided to revisit the old idea of using this glass for the RPC. This work has investigated the possibility of using the RPC at 1 atm and in the avalanche mode. This has several advantages: simplicity of construction, high rate capability, low-voltage operation, and the ability to work with non-flammable gases

  12. Non-Poisson counting statistics of a hybrid G-M counter dead time model

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Jae, Moosung; Gardner, Robin P.

    2007-01-01

    The counting statistics of a G-M counter with a considerable dead-time event rate deviate from Poisson statistics. Important characteristics such as observed counting rates as a function of true counting rates, variances, and interval distributions were analyzed for three dead time models (non-paralyzable, paralyzable, and hybrid) with the help of GMSIM, a Monte Carlo dead time effect simulator. The simulation results showed good agreement with the models in observed counting rates and variances. It was found through GMSIM simulations that the interval distribution for the hybrid model shows three distinctive regions: a complete cutoff region for the duration of the total dead time, a degraded exponential region, and an enhanced exponential region. By measuring the cutoff and the duration of the degraded exponential from the pulse interval distribution, it is possible to evaluate the two dead times in the hybrid model.
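    The non-paralyzable and paralyzable models referenced above have textbook closed forms relating the true event rate n to the observed rate m (the hybrid model combines the two; its exact form is given in the paper and is not reproduced here). A minimal sketch of the two classical formulas:

```python
import math

def observed_rate_nonparalyzable(n, tau):
    """Non-paralyzable dead time: m = n / (1 + n*tau)."""
    return n / (1.0 + n * tau)

def observed_rate_paralyzable(n, tau):
    """Paralyzable dead time: m = n * exp(-n*tau)."""
    return n * math.exp(-n * tau)

# At low rates both models agree; at high rates the paralyzable
# detector loses more counts than the non-paralyzable one.
true_rate = 1000.0    # true event rate, 1/s
tau = 1e-4            # dead time, s
m_np = observed_rate_nonparalyzable(true_rate, tau)   # ≈ 909.1 counts/s
m_p = observed_rate_paralyzable(true_rate, tau)       # ≈ 904.8 counts/s
```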

  13. Full Counting Statistics for Interacting Fermions with Determinantal Quantum Monte Carlo Simulations.

    Science.gov (United States)

    Humeniuk, Stephan; Büchler, Hans Peter

    2017-12-08

    We present a method for computing the full probability distribution function of quadratic observables such as particle number or magnetization for the Fermi-Hubbard model within the framework of determinantal quantum Monte Carlo calculations. Especially in cold atom experiments with single-site resolution, such full counting statistics can be obtained from repeated projective measurements. We demonstrate that the full counting statistics can provide important information on the size of preformed pairs. Furthermore, we compute the full counting statistics of the staggered magnetization in the repulsive Hubbard model at half filling and find excellent agreement with recent experimental results. We show that current experiments are capable of probing the difference between the Hubbard model and the limiting Heisenberg model.

  14. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    Science.gov (United States)

    Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.

    2017-01-01

    Photon-count statistics have recently been proven to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs to the regime of so far undetected sources. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as on the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and capabilities of the 1pPDF method.

  15. Local box-counting dimensions of discrete quantum eigenvalue spectra: Analytical connection to quantum spectral statistics

    Science.gov (United States)

    Sakhr, Jamal; Nieminen, John M.

    2018-03-01

    Two decades ago, Wang and Ong, [Phys. Rev. A 55, 1522 (1997)], 10.1103/PhysRevA.55.1522 hypothesized that the local box-counting dimension of a discrete quantum spectrum should depend exclusively on the nearest-neighbor spacing distribution (NNSD) of the spectrum. In this Rapid Communication, we validate their hypothesis by deriving an explicit formula for the local box-counting dimension of a countably-infinite discrete quantum spectrum. This formula expresses the local box-counting dimension of a spectrum in terms of single and double integrals of the NNSD of the spectrum. As applications, we derive an analytical formula for Poisson spectra and closed-form approximations to the local box-counting dimension for spectra having Gaussian orthogonal ensemble (GOE), Gaussian unitary ensemble (GUE), and Gaussian symplectic ensemble (GSE) spacing statistics. In the Poisson and GOE cases, we compare our theoretical formulas with the published numerical data of Wang and Ong and observe excellent agreement between their data and our theory. We also study numerically the local box-counting dimensions of the Riemann zeta function zeros and the alternate levels of GOE spectra, which are often used as numerical models of spectra possessing GUE and GSE spacing statistics, respectively. In each case, the corresponding theoretical formula is found to accurately describe the numerically computed local box-counting dimension.

  16. Statistical method for resolving the photon-photoelectron-counting inversion problem

    International Nuclear Information System (INIS)

    Wu Jinlong; Li Tiejun; Peng, Xiang; Guo Hong

    2011-01-01

    A statistical inversion method is proposed for the photon-photoelectron-counting statistics in quantum key distribution experiments. From the statistical viewpoint, this problem is equivalent to parameter estimation for an infinite binomial mixture model. The coarse-graining idea and Bayesian methods are applied to deal with this ill-posed problem, which provides a good, simple example of the successful application of statistical methods to an inverse problem. Numerical results show the applicability of the proposed strategy. The coarse-graining idea for infinite mixture models should be general enough to be used in the future.
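    The inversion described above can be illustrated with a toy EM fit; the function names, the truncation at n_max, and the uniform starting distribution below are my own assumptions, not the paper's coarse-graining algorithm. Observed photoelectron counts m follow P(m) = Σ_n p_n · Binom(m; n, η) with known detection efficiency η, and EM recovers the photon-number distribution p_n:

```python
import numpy as np
from scipy.stats import binom

def em_photon_distribution(m_obs, eta, n_max=30, iters=300):
    """EM for a truncated binomial mixture: estimate the photon-number
    distribution p_n from photoelectron counts m_obs, given efficiency eta."""
    ns = np.arange(n_max + 1)
    p = np.full(n_max + 1, 1.0 / (n_max + 1))        # uniform start
    L = binom.pmf(m_obs[:, None], ns[None, :], eta)  # L[i, n] = B(m_i; n, eta)
    for _ in range(iters):
        w = L * p                                    # E-step: responsibilities
        w /= w.sum(axis=1, keepdims=True)
        p = w.mean(axis=0)                           # M-step: mixing weights
    return p

rng = np.random.default_rng(1)
n_true = rng.poisson(3.0, size=5000)      # hidden photon numbers
m_obs = rng.binomial(n_true, 0.6)         # detected photoelectrons
p_hat = em_photon_distribution(m_obs, eta=0.6)
mean_hat = (np.arange(len(p_hat)) * p_hat).sum()   # should be near 3.0
```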

  17. Farey Statistics in Time n^{2/3} and Counting Primitive Lattice Points in Polygons

    OpenAIRE

    Patrascu, Mihai

    2007-01-01

    We present algorithms for computing ranks and order statistics in the Farey sequence, taking time O(n^{2/3}). This improves on the recent algorithms of Pawlewicz [European Symp. Alg. 2007], running in time O(n^{3/4}). We also initiate the study of a more general algorithmic problem: counting primitive lattice points in planar shapes.

  18. Application of statistical methods to the testing of nuclear counting assemblies

    International Nuclear Information System (INIS)

    Gilbert, J.P.; Friedling, G.

    1965-01-01

    This report describes the application of the hypothesis test theory to the control of the 'statistical purity' and of the stability of the counting batteries used for measurements on activation detectors in research reactors. The principles involved and the experimental results obtained at Cadarache on batteries operating with the reactors PEGGY and AZUR are given. (authors) [fr

  19. High Reproducibility of ELISPOT Counts from Nine Different Laboratories

    Directory of Open Access Journals (Sweden)

    Srividya Sundararaman

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable background intensities. Due to the subjective nature of judging maximal and minimal spot sizes, different investigators come up with different numbers. This study aims to determine whether statistics-based, automated size-gating can harmonize the number of spot counts calculated between different laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate, and an image file of the plate, was counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™, relying on subjective counting parameters set by the respective investigators, and (B) SmartCount™, an automated counting protocol of the ImmunoSpot® Software that uses statistics-based spot-size auto-gating with spot-intensity auto-thresholding. The average coefficient of variation (CV) for the mean values between independent laboratories was 26.7% when counting with Basic Count™, and 6.7% when counting with SmartCount™. Our data indicate that SmartCount™ allows harmonization of ELISPOT counting results between different laboratories and investigators.

  20. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. The estimation of differential counting measurements of positive quantities with relatively large statistical errors

    International Nuclear Information System (INIS)

    Vincent, C.H.

    1982-01-01

    Bayes' principle is applied to the differential counting measurement of a positive quantity in which the statistical errors are not necessarily small in relation to the true value of the quantity. The methods of estimation derived are found to give consistent results and to avoid the anomalous negative estimates sometimes obtained by conventional methods. One of the methods given provides a simple means of deriving the required estimates from conventionally presented results and appears to have wide potential applications. Both methods provide the actual posterior probability distribution of the quantity to be measured. A particularly important potential application is the correction of counts on low-radioactivity samples for background. (orig.)
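    The anomalous negative estimates arise when gross counts fall below the expected background; restricting the prior to s ≥ 0 removes them. A numerical sketch of this idea (a grid posterior with a flat prior on s ≥ 0; an illustration of the principle, not the paper's exact derivation):

```python
import numpy as np
from scipy.stats import poisson

def posterior_net_rate(gross, background, s_max=50.0, n_grid=2001):
    """Posterior over the net (source) expectation s >= 0, given observed
    gross counts ~ Poisson(s + background) and a flat prior on s >= 0."""
    s = np.linspace(0.0, s_max, n_grid)
    like = poisson.pmf(gross, s + background)
    post = like / like.sum()                 # normalized grid posterior
    return s, post

# Gross counts below the expected background: the conventional estimate
# gross - background is negative, but the posterior mean stays positive.
s, post = posterior_net_rate(gross=3, background=5.0)
conventional = 3 - 5.0                        # -2.0
posterior_mean = (s * post).sum()             # > 0
```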

  2. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    Science.gov (United States)

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is of central interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distribution of seven of the 11 insect datasets than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate normal-distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. It is therefore recommended that statistical models appropriate for handling these data properties be selected using objective criteria, to ensure efficient statistical inference.
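    As an illustration of that model-selection recipe (synthetic data, not the paper's; the negative binomial parameters are set here by the method of moments rather than full maximum likelihood, a simplification), one can compare Poisson and negative binomial fits of an overdispersed count sample by AIC:

```python
import numpy as np
from scipy.stats import poisson, nbinom

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*log-likelihood."""
    return 2 * k - 2 * loglik

rng = np.random.default_rng(0)
counts = rng.negative_binomial(n=2, p=0.3, size=500)   # overdispersed sample

# Poisson fit: the MLE of the rate is the sample mean (1 parameter).
lam = counts.mean()
aic_pois = aic(poisson.logpmf(counts, lam).sum(), k=1)

# Negative binomial fit by the method of moments (2 parameters).
mu, var = counts.mean(), counts.var()
size = mu**2 / (var - mu)            # requires var > mu (overdispersion)
prob = size / (size + mu)
aic_nb = aic(nbinom.logpmf(counts, size, prob).sum(), k=2)

# For overdispersed data the negative binomial wins (lower AIC).
```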

  3. Fast pulse discriminator for photon counting at high photon densities

    International Nuclear Information System (INIS)

    Benoit, R.; Pedrini, A.

    1977-03-01

    A fast tunnel-diode discriminator for photon counting up to a 200 MHz count frequency is described. The tunnel diode is operated on the apparent I-V characteristics displayed when the diode is driven into its oscillating region. The pulse shaper-discriminator is completely DC-coupled in order to avoid baseline shift at high pulse rates.

  4. Full counting statistics of level renormalization in electron transport through double quantum dots

    International Nuclear Information System (INIS)

    Luo Junyan; Shen Yu; Cen Gang; He Xiaoling; Wang Changrong; Jiao Hujun

    2011-01-01

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  5. Full counting statistics in a serially coupled double quantum dot system with spin-orbit coupling

    Science.gov (United States)

    Wang, Qiang; Xue, Hai-Bin; Xie, Hai-Qing

    2018-04-01

    We study the full counting statistics of electron transport through a serially coupled double quantum dot (QD) system with spin-orbit coupling (SOC) weakly coupled to two electrodes. We demonstrate that the spin polarizations of the source and drain electrodes determine whether the shot noise maintains super-Poissonian distribution, and whether the sign transitions of the skewness from positive to negative values and of the kurtosis from negative to positive values take place. In particular, the interplay between the spin polarizations of the source and drain electrodes and the magnitude of the external magnetic field, can give rise to a gate-voltage-tunable strong negative differential conductance (NDC) and the shot noise in this NDC region is significantly enhanced. Importantly, for a given SOC parameter, the obvious variation of the high-order current cumulants as a function of the energy-level detuning in a certain range, especially the dip position of the Fano factor of the skewness can be used to qualitatively extract the information about the magnitude of the SOC.

  6. PREFACE: Counting Complexity: An international workshop on statistical mechanics and combinatorics

    Science.gov (United States)

    de Gier, Jan; Warnaar, Ole

    2006-07-01

    On 10-15 July 2005 the conference `Counting Complexity: An international workshop on statistical mechanics and combinatorics' was held on Dunk Island, Queensland, Australia in celebration of Tony Guttmann's 60th birthday. Dunk Island provided the perfect setting for engaging in almost all of Tony's life-long passions: swimming, running, food, wine and, of course, plenty of mathematics and physics. The conference was attended by many of Tony's close scientific friends from all over the world, and most talks were presented by his past and present collaborators. This volume contains the proceedings of the meeting and consists of 24 refereed research papers in the fields of statistical mechanics, condensed matter physics and combinatorics. These papers provide an excellent illustration of the breadth and scope of Tony's work. The very first contribution, written by Stu Whittington, contains an overview of the many scientific achievements of Tony over the past 40 years in mathematics and physics. The organizing committee, consisting of Richard Brak, Aleks Owczarek, Jan de Gier, Emma Lockwood, Andrew Rechnitzer and Ole Warnaar, gratefully acknowledges the Australian Mathematical Society (AustMS), the Australian Mathematical Sciences Institute (AMSI), the ARC Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS), the ARC Complex Open Systems Research Network (COSNet), the Institute of Physics (IOP) and the Department of Mathematics and Statistics of The University of Melbourne for financial support in organizing the conference. Tony, we hope that your future years in mathematics will be numerous. Count yourself lucky! Tony Guttmann

  7. On-line statistical processing of radiation detector pulse trains with time-varying count rates

    International Nuclear Information System (INIS)

    Apostolopoulos, G.

    2008-01-01

    Statistical analysis is of primary importance for the correct interpretation of nuclear measurements, due to the inherent random nature of radioactive decay processes. This paper discusses the application of statistical signal processing techniques to the random pulse trains generated by radiation detectors. The aims of the presented algorithms are: (i) continuous, on-line estimation of the underlying time-varying count rate θ(t) and its first-order derivative dθ/dt; and (ii) detection of abrupt changes in both of these quantities and estimation of their new values after the change point. Maximum-likelihood techniques, based on the Poisson probability distribution, are employed for the on-line estimation of θ and dθ/dt. Detection of abrupt changes is achieved on the basis of the generalized likelihood ratio (GLR) statistical test. The properties of the proposed algorithms are evaluated by extensive simulations, and possible applications for on-line radiation monitoring are discussed.
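    A minimal sketch of the generalized likelihood ratio step for a single change in a Poisson rate (per-interval counts, unknown change point; the function names below are mine, and this offline form ignores the on-line recursions of the paper): the statistic compares the best split of the record into two constant-rate segments against a single constant rate.

```python
import numpy as np

def poisson_loglik(counts, rate):
    """Poisson log-likelihood up to a counts-only constant."""
    rate = max(rate, 1e-12)
    return counts.sum() * np.log(rate) - len(counts) * rate

def glr_change_statistic(counts):
    """Generalized likelihood ratio statistic for a single change
    in the Poisson rate of a sequence of per-interval counts."""
    counts = np.asarray(counts, dtype=float)
    l0 = poisson_loglik(counts, counts.mean())
    best = -np.inf
    for k in range(1, len(counts)):
        l1 = poisson_loglik(counts[:k], counts[:k].mean())
        l2 = poisson_loglik(counts[k:], counts[k:].mean())
        best = max(best, l1 + l2)
    return 2.0 * (best - l0)

steady = np.array([5, 4, 6, 5, 5, 6, 4, 5])
jump = np.array([5, 4, 6, 5, 21, 19, 22, 20])
# The statistic stays near zero for the steady record and is large
# for the record with an abrupt rate change.
```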

  8. Homicides by Police: Comparing Counts From the National Violent Death Reporting System, Vital Statistics, and Supplementary Homicide Reports.

    Science.gov (United States)

    Barber, Catherine; Azrael, Deborah; Cohen, Amy; Miller, Matthew; Thymes, Deonza; Wang, David Enze; Hemenway, David

    2016-05-01

    To evaluate the National Violent Death Reporting System (NVDRS) as a surveillance system for homicides by law enforcement officers. We assessed sensitivity and positive predictive value of the NVDRS "type of death" variable against our study count of homicides by police, which we derived from NVDRS coded and narrative data for states participating in NVDRS 2005 to 2012. We compared state counts of police homicides from NVDRS, Vital Statistics, and Federal Bureau of Investigation Supplementary Homicide Reports. We identified 1552 police homicides in the 16 states. Positive predictive value and sensitivity of the NVDRS "type of death" variable for police homicides were high (98% and 90%, respectively). Counts from Vital Statistics and Supplementary Homicide Reports were 58% and 48%, respectively, of our study total; gaps varied widely by state. The annual rate of police homicide (0.24/100,000) varied 5-fold by state and 8-fold by race/ethnicity. NVDRS provides more complete data on police homicides than do existing systems. Expanding NVDRS to all 50 states and making 2 improvements we identify will be an efficient way to provide the nation with more accurate, detailed data on homicides by law enforcement.
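    The two validation measures used above have simple definitions. In the sketch below, the tallies are illustrative reconstructions chosen to be consistent with the reported figures (90% sensitivity and 98% positive predictive value against the 1552-case study count), not numbers taken from the paper:

```python
def sensitivity(true_pos, false_neg):
    """Share of actual cases that the surveillance variable captured."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Share of flagged cases that were actual cases."""
    return true_pos / (true_pos + false_pos)

# Hypothetical tallies: 1397 correctly flagged, 155 missed, 28 wrongly flagged.
sens = sensitivity(1397, 155)                 # ≈ 0.90
ppv = positive_predictive_value(1397, 28)     # ≈ 0.98
```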

  9. Full counting statistics of a charge pump in the Coulomb blockade regime

    Science.gov (United States)

    Andreev, A. V.; Mishchenko, E. G.

    2001-12-01

    We study the full charge counting statistics (FCCS) of a charge pump based on a nearly open single-electron transistor. The problem is mapped onto an exactly soluble problem of a nonequilibrium g=1/2 Luttinger liquid with an impurity. We obtain an analytic expression for the generating function of the transmitted charge for an arbitrary pumping strength. Although this model contains fractionally charged excitations, only integer transmitted charges can be observed. In the weak pumping limit the FCCS corresponds to a Poissonian transmission of particles with charge e*=e/2, from which all events with odd numbers of transferred particles are excluded.

  10. The statistical interpretations of counting data from measurements of low-level radioactivity

    International Nuclear Information System (INIS)

    Donn, J.J.; Wolke, R.L.

    1977-01-01

    The statistical model appropriate to measurements of low-level or background-dominated radioactivity is examined, and the derived relationships are applied to two practical problems involving hypothesis testing: 'Does the sample exhibit a net activity above background?' and 'Is the activity of the sample below some preselected limit?'. In each of these cases, the appropriate decision rule is formulated, procedures are developed for estimating the preset count which is necessary to achieve a desired probability of detection, and a specific sequence of operations is provided for the worker in the field. (author)
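    For the first hypothesis test, a widely used formulation (Currie-style decision levels, shown here as an illustration of the idea rather than this paper's exact derivation) fixes the critical net-count level from the background count and a chosen false-positive risk:

```python
import math

def critical_level(background_counts, k=1.645):
    """Currie-style critical level L_C for a paired-blank measurement:
    reject 'background only' when net counts exceed k*sqrt(2B).
    k = 1.645 corresponds to ~5% false-positive risk."""
    return k * math.sqrt(2.0 * background_counts)

def detection_limit(background_counts, k=1.645):
    """Currie-style detection limit L_D ≈ 2.71 + 2*L_C, the true net
    signal detectable with ~5% false-negative risk."""
    return 2.71 + 2.0 * critical_level(background_counts, k)

b = 100.0                        # background counts in the counting period
lc = critical_level(b)           # ≈ 23.3 net counts
ld = detection_limit(b)          # ≈ 49.2 net counts
```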

  11. Polychromatic Iterative Statistical Material Image Reconstruction for Photon-Counting Computed Tomography

    Directory of Open Access Journals (Sweden)

    Thomas Weidinger

    2016-01-01

    This work proposes a dedicated statistical algorithm to perform a direct reconstruction of material-decomposed images from data acquired with photon-counting detectors (PCDs) in computed tomography. It is based on local approximations (surrogates) of the negative logarithmic Poisson probability function. Exploiting the convexity of this function allows for parallel updates of all image pixels. Parallel updates can compensate for the rather slow convergence that is intrinsic to statistical algorithms. We investigate the accuracy of the algorithm for ideal photon-counting detectors. Complementarily, we apply the algorithm to simulation data of a realistic PCD with its spectral resolution limited by K-escape, charge sharing, and pulse pileup. For data from both an ideal and a realistic PCD, the proposed algorithm is able to correct beam-hardening artifacts and quantitatively determine the material fractions of the chosen basis materials. Via regularization we were able to achieve image noise for the realistic PCD that is up to 90% lower compared to material images from a linear, image-based material decomposition using FBP images. Additionally, we find a dependence of the algorithm's convergence speed on the threshold selection within the PCD.

  12. On temporal correlations in high-resolution frequency counting

    OpenAIRE

    Dunker, Tim; Hauglin, Harald; Rønningen, Ole Petter

    2016-01-01

    We analyze noise properties of time series of frequency data from different counting modes of a Keysight 53230A frequency counter. We use a 10 MHz reference signal from a passive hydrogen maser connected via phase-stable Huber+Suhner Sucoflex 104 cables to the reference and input connectors of the counter. We find that the high resolution gap-free (CONT) frequency counting process imposes long-term correlations in the output data, resulting in a modified Allan deviation that is characteristic...

  13. Bayesian Penalized Likelihood Image Reconstruction (Q.Clear) in 82Rb Cardiac PET: Impact of Count Statistics

    DEFF Research Database (Denmark)

    Christensen, Nana Louise; Tolbod, Lars Poulsen

    Aim: Q.Clear reconstruction is expected to improve detection of perfusion defects in cardiac PET due to the high degree of image convergence and effective noise suppression. However, 82Rb (T½ = 76 s) poses a special problem, since count statistics vary significantly not only between patients... statistics using a cardiac PET phantom as well as a selection of clinical patients referred for 82Rb cardiac PET. Methods: The study consisted of 3 parts: 1) A thorax-cardiac phantom was scanned for 10 minutes after injection of 1110 MBq 82Rb. Frames at 3 different times after infusion were reconstructed... PET scans. 3) Static and dynamic images from a set of 7 patients (BSA: 1.6-2.2 m2) referred for 82Rb cardiac PET were analyzed using a range of beta factors. Results were compared to the institution's standard clinical practice reconstruction protocol. All scans were performed on GE DMI Digital...

  14. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
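    The core of the statistical counting step is a one-dimensional mixture fit to the scattered column intensities, with the number of components chosen by an order-selection criterion. The sketch below is a toy version on synthetic data: BIC is used purely for illustration (the paper studies its own criteria), and the EM implementation and function names are mine.

```python
import numpy as np

def fit_gmm_1d(x, k, iters=300, var_floor=1e-4):
    """EM for a 1-D Gaussian mixture with k components and a
    deterministic quantile-based initialization."""
    n = len(x)
    means = np.quantile(x, (2 * np.arange(k) + 1) / (2 * k))
    var = np.full(k, x.var() / k + var_floor)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each intensity
        d = x[:, None] - means[None, :]
        logp = np.log(w) - 0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances (with a variance floor)
        nk = np.maximum(r.sum(axis=0), 1e-12)
        w = nk / n
        means = (r * x[:, None]).sum(axis=0) / nk
        d = x[:, None] - means[None, :]
        var = (r * d**2).sum(axis=0) / nk + var_floor
    comp = w * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
    loglik = np.log(comp.sum(axis=1)).sum()
    return means, loglik

def count_thickness_levels(x, k_max=5):
    """Pick the number of mixture components (candidate atom counts) by BIC."""
    best_k, best_bic, best_means = None, np.inf, None
    for k in range(1, k_max + 1):
        means, ll = fit_gmm_1d(x, k)
        bic = (3 * k - 1) * np.log(len(x)) - 2 * ll
        if bic < best_bic:
            best_k, best_bic, best_means = k, bic, np.sort(means)
    return best_k, best_means

# Synthetic 'scattered intensities' from columns 1, 2 and 3 atoms thick.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(mu, 0.03, 200) for mu in (1.0, 2.0, 3.0)])
k, means = count_thickness_levels(x)    # expect k == 3, means near 1, 2, 3
```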

  15. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    Science.gov (United States)

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
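    The time-removal idea mentioned above has a classic closed form in the two-interval case (a Moran/Zippin-style removal estimator; the counts below are invented for illustration): if n1 birds are detected in the first interval and n2 new birds in the second, equal-length interval, then p̂ = 1 − n2/n1 and N̂ = n1²/(n1 − n2).

```python
def removal_estimates(n1, n2):
    """Two-interval removal estimator: n1 = first-interval detections,
    n2 = new detections in the second (equal-length) interval.
    Returns (per-interval detection probability, estimated abundance)."""
    if n2 >= n1:
        raise ValueError("removal estimator needs n1 > n2")
    p_hat = 1.0 - n2 / n1
    n_hat = n1**2 / (n1 - n2)
    return p_hat, n_hat

# 60 birds detected in the first interval, 20 new ones in the second.
p_hat, n_hat = removal_estimates(60, 20)   # p ≈ 0.667, N ≈ 90
```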

  16. Unifying quantum heat transfer in a nonequilibrium spin-boson model with full counting statistics

    Science.gov (United States)

    Wang, Chen; Ren, Jie; Cao, Jianshu

    2017-02-01

    To study the full counting statistics of quantum heat transfer in a driven nonequilibrium spin-boson model, we develop a generalized nonequilibrium polaron-transformed Redfield equation with an auxiliary counting field. This enables us to study the impact of qubit-bath coupling ranging from weak to strong regimes. Without external modulations, we observe maximal values of both steady-state heat flux and noise power in moderate coupling regimes, below which we find that these two transport quantities are enhanced by the finite-qubit-energy bias. With external modulations, the geometric-phase-induced heat flux shows a monotonic decrease upon increasing the qubit-bath coupling at zero qubit energy bias (without bias). While under the finite-qubit-energy bias (with bias), the geometric-phase-induced heat flux exhibits an interesting reversal behavior in the strong coupling regime. Our results unify the seemingly contradictory results in weak and strong qubit-bath coupling regimes and provide detailed dissections for the quantum fluctuation of nonequilibrium heat transfer.

  17. Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY Ecole de physique Département de physique nucléaire et corpusculaire 24, quai Ernest-Ansermet 1211 Genève 4 Tél.: (022) 379 62 73 Fax: (022) 379 69 92   Lundi 17 octobre 2011 17h00 - Ecole de Physique, Auditoire Stueckelberg PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics » Prof. Jukka Pekola Low Temperature Laboratory, Aalto University Helsinki, Finland   First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is in developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single- electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...

  18. RCT: Module 2.03, Counting Errors and Statistics, Course 8768

    Energy Technology Data Exchange (ETDEWEB)

    Hillmer, Kurt T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-01

    Radiological sample analysis involves the observation of a random process that may or may not occur and an estimation of the amount of radioactive material present based on that observation. Across the country, radiological control personnel use activity measurements to make decisions that may affect the health and safety of workers at those facilities and their surrounding environments. This course will present an overview of measurement processes, a statistical evaluation of both measurements and equipment performance, and some actions to take to minimize the sources of error in count room operations. This course will prepare the student with the skills necessary for radiological control technician (RCT) qualification by passing quizzes, tests, and the RCT Comprehensive Phase 1, Unit 2 Examination (TEST 27566) and by providing in-the-field skills.
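    The statistical evaluation in such a course rests on the Poisson property that a gross count N carries a 1-sigma uncertainty of √N, so the fractional error is 1/√N. A short sketch of these generic formulas (illustrative, not course material):

```python
import math

def count_uncertainty(n):
    """1-sigma uncertainty of a gross count under Poisson statistics."""
    return math.sqrt(n)

def relative_error(n):
    """Fractional 1-sigma error of a count: sqrt(N)/N = 1/sqrt(N)."""
    return 1.0 / math.sqrt(n)

def counts_for_precision(rel_err):
    """Counts needed so that 1/sqrt(N) <= rel_err."""
    return math.ceil(1.0 / rel_err**2)

# A 1% (0.01) relative error requires 10,000 counts.
needed = counts_for_precision(0.01)
```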

  19. High Channel Count, High Density Microphone Arrays for Wind Tunnel Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The Interdisciplinary Consulting Corporation (IC2) proposes the development of high channel count, high density, reduced cost per channel, directional microphone...

  20. A unified statistical framework for material decomposition using multienergy photon counting x-ray detectors

    International Nuclear Information System (INIS)

    Choi, Jiyoung; Kang, Dong-Goo; Kang, Sunghoon; Sung, Younghun; Ye, Jong Chul

    2013-01-01

    Purpose: Material decomposition using multienergy photon counting x-ray detectors (PCXD) has been an active research area over the past few years. Even with some success, the problem of optimal energy selection and three-material decomposition including malignant tissue is still an ongoing research topic, and more systematic studies are required. This paper aims to address this in a unified statistical framework in a mammographic environment. Methods: A unified statistical framework for energy level optimization and decomposition of three materials is proposed. In particular, an energy level optimization algorithm is derived using the theory of the minimum variance unbiased estimator, and an iterative algorithm is proposed for material composition as well as system parameter estimation under the unified statistical estimation framework. To verify the performance of the proposed algorithm, the authors performed simulation studies as well as real experiments using a physical breast phantom and an ex vivo breast specimen. Quantitative comparisons using various performance measures were conducted, and qualitative performance evaluations for the ex vivo breast specimen were also performed by comparison against the ground-truth malignant tissue areas identified by radiologists. Results: Both simulation and real experiments confirmed that the energy bins optimized by the proposed method allow better material decomposition quality. Moreover, for specimen thickness estimation errors of up to 2 mm, the proposed method provides good reconstruction results in both simulation and real ex vivo breast phantom experiments compared to existing methods. Conclusions: The proposed statistical framework for PCXD has been successfully applied to the energy optimization and decomposition of three materials in a mammographic environment. Experimental results using the physical breast phantom and ex vivo specimen support the practicality of the proposed algorithm
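
The paper's estimator is statistical, but the idea underneath multienergy decomposition can be illustrated by the noiseless two-material, two-energy-bin case, which reduces to a 2×2 linear solve. All coefficients below are made-up illustrative numbers, not values from the paper:

```python
# Idealized two-material decomposition from two energy bins:
# measured line integrals L_E = mu1(E)*t1 + mu2(E)*t2, solved for t1, t2.
# The attenuation coefficients are hypothetical illustrative values.

def decompose(L_low, L_high, mu):
    # mu = [[mu1_low, mu2_low], [mu1_high, mu2_high]]
    (a, b), (c, d) = mu
    det = a * d - b * c
    t1 = (L_low * d - b * L_high) / det
    t2 = (a * L_high - L_low * c) / det
    return t1, t2

mu = [[0.50, 0.30],   # low-energy bin
      [0.25, 0.20]]   # high-energy bin
# forward-project known thicknesses t1=2.0, t2=1.0, then invert
L_low = 0.50 * 2.0 + 0.30 * 1.0
L_high = 0.25 * 2.0 + 0.20 * 1.0
print(decompose(L_low, L_high, mu))  # (2.0, 1.0) up to rounding
```

Real PCXD data add Poisson noise and a third material, which is why the paper resorts to minimum variance unbiased estimation rather than direct inversion.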

  1. Statistical Methods for Unusual Count Data: Examples From Studies of Microchimerism

    Science.gov (United States)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads; Tjønneland, Anne; Gadi, Vijayakrishna K.; Nelson, J. Lee; Leisenring, Wendy

    2016-01-01

    Natural acquisition of small amounts of foreign cells or DNA, referred to as microchimerism, occurs primarily through maternal-fetal exchange during pregnancy. Microchimerism can persist long-term and has been associated with both beneficial and adverse human health outcomes. Quantitative microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per total cell equivalents tested utilizes all available data and facilitates a comparison of rates between groups. We found that both the marginalized zero-inflated Poisson model and the negative binomial model can provide unbiased and consistent estimates of the overall association of exposure or study group with microchimerism detection rates. The negative binomial model remains the more accessible of these 2 approaches; thus, we conclude that the negative binomial model may be most appropriate for analyzing quantitative microchimerism data. PMID:27769989
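
The negative binomial model the authors recommend handles the overdispersion (variance greater than the mean) typical of microchimerism counts. A minimal sketch of the distribution, parameterized by mean mu and dispersion k so that the variance is mu + mu²/k; the parameter values are illustrative, not estimates from the study:

```python
import math

def nb_pmf(y, mu, k):
    """Negative binomial pmf with mean mu and dispersion k."""
    logp = (math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
            + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))
    return math.exp(logp)

mu, k = 2.0, 0.5                    # small k = heavy overdispersion
p = [nb_pmf(y, mu, k) for y in range(400)]
mean = sum(y * py for y, py in enumerate(p))
var = sum((y - mean) ** 2 * py for y, py in enumerate(p))
print(round(sum(p), 6), round(mean, 3), round(var, 3))  # 1.0 2.0 10.0
```

With mu = 2 and k = 0.5 the variance is 10, five times the Poisson value, which is exactly the kind of excess-zero, occasional-large-value behaviour the abstract describes.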

  2. Statistics for High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The lectures emphasize the frequentist approach used for the Dark Matter search and the Higgs search, discovery and measurements of its properties. An emphasis is put on hypothesis testing using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman Pearson, Feldman Cousins, Coverage, CLs, Nuisance Parameters Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: The CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: Eur. Phys. J. C 73...

  3. High rate 4π β-γ coincidence counting system

    International Nuclear Information System (INIS)

    Johnson, L.O.; Gehrke, R.J.

    1978-01-01

    A high count rate 4π β-γ coincidence counting system for the determination of absolute disintegration rates of short half-life radionuclides is described. With this system the dead time per pulse is minimized by not stretching any pulses beyond the width necessary to satisfy overlap coincidence requirements. The equations used to correct for the β, γ, and coincidence channel dead times and for accidental coincidences are presented but not rigorously developed. Experimental results are presented for a decaying source of ⁵⁶Mn initially at 2×10⁶ d/s and a set of ⁶⁰Co sources of accurately known source strengths varying from 10³ to 2×10⁶ d/s. A check of the accidental coincidence equation for the case of two independent sources with varying source strengths is presented
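
The paper's dead-time-corrected equations are not reproduced in the abstract, but the textbook 4π β-γ coincidence relation behind them is N₀ = Nβ·Nγ/Nc, with the accidental coincidence rate 2τ·Nβ·Nγ removed from the coincidence channel first. A sketch with assumed efficiencies and resolving time:

```python
# Textbook coincidence estimate (a sketch, not the paper's full dead-time
# treatment): N0 = Nb * Ng / Nc after subtracting accidental coincidences.

def source_activity(n_beta, n_gamma, n_coinc, tau):
    n_acc = 2.0 * tau * n_beta * n_gamma      # accidental coincidence rate
    return n_beta * n_gamma / (n_coinc - n_acc)

# assumed: efficiencies eb=0.9, eg=0.3, activity 1e5 d/s, resolving time 1 us
N0, eb, eg, tau = 1.0e5, 0.9, 0.3, 1.0e-6
nb, ng = eb * N0, eg * N0
nc = eb * eg * N0 + 2 * tau * nb * ng         # true + accidental coincidences
print(source_activity(nb, ng, nc, tau))       # recovers ~1e5 d/s
```

The detector efficiencies cancel in the ratio, which is what makes the method an absolute (primary) standardization technique.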

  4. Full-counting statistics of energy transport of molecular junctions in the polaronic regime

    International Nuclear Information System (INIS)

    Tang, Gaomin; Yu, Zhizhou; Wang, Jian

    2017-01-01

    We investigate the full-counting statistics (FCS) of energy transport carried by electrons in molecular junctions for the Anderson–Holstein model in the polaronic regime. Using the two-time quantum measurement scheme, the generating function (GF) for the energy transport is derived and expressed as a Fredholm determinant in terms of Keldysh nonequilibrium Green’s function in the time domain. Dressed tunneling approximation is used in decoupling the phonon cloud operator in the polaronic regime. This formalism enables us to analyze the time evolution of energy transport dynamics after a sudden switch-on of the coupling between the dot and the leads towards the stationary state. The steady state energy current cumulant GF in the long time limit is obtained in the energy domain as well. Universal relations for steady state energy current FCS are derived under a finite temperature gradient with zero bias and this enabled us to express the equilibrium energy current cumulant by a linear combination of lower order cumulants. The behaviors of energy current cumulants in steady state under temperature gradient and external bias are numerically studied and explained. The transient dynamics of energy current cumulants is numerically calculated and analyzed. Universal scaling of normalized transient energy cumulants is found under both temperature gradient and external bias. (paper)

  5. Metrology and statistical analysis for the precise standardisation of cobalt-60 by 4πβ-γ coincidence counting

    International Nuclear Information System (INIS)

    Buckman, S.M.

    1995-03-01

    The major part of the thesis is devoted to the theoretical development of a comprehensive PC-based statistical package for the analysis of data from coincidence-counting experiments. This analysis is applied to primary standardisations of ⁶⁰Co performed in Australia and Japan. The Australian standardisation, the accuracy of which is confirmed through international comparison, is used to re-calibrate the ionisation chamber. Both Australian and Japanese coincidence-counting systems are interfaced to personal computers to enable replicated sets of measurements to be made under computer control. Further research to confirm the validity of the statistical model includes an experimental investigation into the non-Poisson behaviour of radiation detectors due to the effect of deadtime. Experimental investigation is conducted to determine which areas are most likely to limit the ultimate accuracy achievable with coincidence counting. The thesis concludes by discussing the possibilities of digital coincidence counting and outlines the design of a prototype system presently under development. The accuracy of the Australian standardisation is confirmed by international comparison. From this result a more accurate ⁶⁰Co calibration is obtained for the Australian working standard. Based on the work of this thesis, uncertainties in coincidence counting experiments can be better handled, with resulting improvements in measurement reliability. The concept and benefits of digital coincidence counting are discussed and a proposed design is given for such a system. All of the data and software associated with this thesis are provided on computer discs. 237 refs., figs., tabs
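
The non-Poisson behaviour the thesis investigates stems from detector dead time: after each registered event the detector is blind for a time τ, so the measured rate m understates the true rate n. The standard non-extending (non-paralyzable) correction, shown here as a sketch with an assumed τ rather than the thesis's own analysis, is n = m/(1 − mτ):

```python
# Non-extending dead-time correction: recover true rate n from measured rate m.

def true_rate(m, tau):
    return m / (1.0 - m * tau)

m, tau = 9.0e4, 1.0e-6      # 90 kcps measured, assumed 1 us dead time
n = true_rate(m, tau)
print(round(n))             # 98901
```

At this rate roughly 9% of events are lost, and the losses also reduce the variance below the Poisson value, which is why the statistical model cannot simply assume Poisson counts.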

  6. Variability in faecal egg counts – a statistical model to achieve reliable determination of anthelmintic resistance in livestock

    DEFF Research Database (Denmark)

    Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret

    A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on the model-based 95% lower confidence limit (LCL...

  7. Scalable Intersample Interpolation Architecture for High-channel-count Beamformers

    DEFF Research Database (Denmark)

    Tomov, Borislav Gueorguiev; Nikolov, Svetoslav I; Jensen, Jørgen Arendt

    2011-01-01

    Modern ultrasound scanners utilize digital beamformers that operate on sampled and quantized echo signals. Timing precision is of the essence for achieving good focusing. The direct way to achieve it is through the use of high sampling rates, but that is not economical, so interpolation between echo samples is used. This paper presents a beamformer architecture that combines a band-pass filter-based interpolation algorithm with the dynamic delay-and-sum focusing of a digital beamformer. The reduction in the number of multiplications relative to a linear per-channel interpolation architecture and a band-pass per-channel interpolation architecture is 58% and 75%, respectively, for a 256-channel beamformer using 4-tap filters. The approach allows building high-channel-count beamformers while maintaining high image quality due to the use of sophisticated intersample interpolation.

  8. A statistical analysis of count normalization methods used in positron-emission tomography

    International Nuclear Information System (INIS)

    Holmes, T.J.; Ficke, D.C.; Snyder, D.L.

    1984-01-01

    As part of the Positron-Emission Tomography (PET) reconstruction process, annihilation counts are normalized for photon absorption, detector efficiency and detector-pair duty-cycle. Several normalization methods of time-of-flight and conventional systems are analyzed mathematically for count bias and variance. The results of the study have some implications on hardware and software complexity and on image noise and distortion

  9. Low power ion spectrometer for high counting rates

    International Nuclear Information System (INIS)

    Klein, J.W.; Dullenkopf, P.; Glasmachers, A.; Melbert, J.; Winkelnkemper, W.

    1980-01-01

    This report describes in detail the electronic concept for a time-of-flight (TOF) ion spectrometer for high counting rates and high dynamic range which can be used as a satellite instrument. The detection principle of the spectrometer is based on a time-of-flight and energy measurement for each incident ion. The ion mass is related to these two quantities by a simple equation. The described approach for the mass identification system uses an analog fast-slow concept: the fast TOF signal preselects the gain step in the much slower energy channel. The conversion time of the mass identifier is approximately 10⁻⁶ s and the dynamic range of the energy channel is better than 10³ (20 keV to 25 MeV). The purpose of this study was to demonstrate the feasibility of a TOF spectrometer capable of measuring the ion composition in planetary magnetospheres. (orig.)

  10. Assessment of the statistical uncertainty affecting a counting; Evaluation de l'incertitude statistique affectant un comptage

    Energy Technology Data Exchange (ETDEWEB)

    Cluchet, J.

    1960-07-01

    After recalling some aspects of the Gauss law and the Gauss curve, this note addresses the case in which a large number of measurements of a source activity are performed by means of a sensor (counter, scintillator, nuclear emulsion, etc.) at equal intervals, with a number of events which is not rigorously constant. It thus deals with measurements, and more particularly counting operations, in a random or statistical setting. It addresses in particular the case of a counting rate due to a source greater (and then lower) than twenty times the detector's own background rate. The validity of the curves is discussed.
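
When a background rate must be subtracted from the gross rate, the uncertainties add in quadrature. A sketch of the standard propagation (the counts and times below are illustrative, not from the note):

```python
import math

# Net count-rate uncertainty with background subtraction.

def net_rate_sigma(Ns, ts, Nb, tb):
    """Ns gross counts in ts seconds; Nb background counts in tb seconds."""
    net = Ns / ts - Nb / tb
    sigma = math.sqrt(Ns / ts**2 + Nb / tb**2)
    return net, sigma

net, sigma = net_rate_sigma(Ns=4400, ts=100.0, Nb=400, tb=100.0)
print(net, round(sigma, 3))   # 40.0 0.693
```

When the source rate is many times the background, as in the first case the note considers, the background term contributes little; near the background level it dominates the uncertainty.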

  11. A multiwire proportional counter for very high counting rates

    International Nuclear Information System (INIS)

    Barbosa, A.F.; Guedes, G.P.; Tamura, E.; Pepe, I.M.; Oliveira, N.B.

    1997-12-01

    Preliminary measurements in a proportional counter with two independently counting wires showed that counting rates up to 10⁶ counts/s per wire can be reached without critical loss in the true-versus-measured linearity relation. Results obtained with a detector containing 30 active wires (2 mm pitch) are presented. To each wire is associated a fast pre-amplifier and a discriminator channel. Global counting rates in excess of 10⁷ events/s are reported. Data acquisition systems are described for 1D (real time) and 2D (off-line) position-sensitive detection systems. (author)
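
A common way to model the true-versus-measured linearity of such a channel is the paralyzable dead-time model, m = n·exp(−nτ). The sketch below uses an assumed per-wire processing time, not a value from the paper, to show how losses stay small up to roughly 10⁶ counts/s per wire:

```python
import math

# Paralyzable dead-time model: measured rate m = n * exp(-n * tau).

def measured_rate(n, tau):
    return n * math.exp(-n * tau)

tau = 50e-9                     # assumed 50 ns per-wire processing time
for n in (1e5, 1e6, 1e7):
    m = measured_rate(n, tau)
    print(f"true {n:.0e} -> measured {m:.3e} ({100 * m / n:.1f}% of true)")
```

With this assumed τ the channel is ~95% linear at 10⁶ counts/s but loses ~40% of events at 10⁷ counts/s, illustrating why per-wire (rather than global) electronics extend the usable rate range.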

  12. A multiwire proportional counter for very high counting rates

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, A F; Guedes, G P [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Tamura, E [Laboratorio Nacional de Luz Sincrotron (LNLS), Campinas, SP (Brazil); Pepe, I M; Oliveira, N B [Bahia Univ., Salvador, BA (Brazil). Inst. de Fisica

    1997-12-01

    Preliminary measurements in a proportional counter with two independently counting wires showed that counting rates up to 10⁶ counts/s per wire can be reached without critical loss in the true-versus-measured linearity relation. Results obtained with a detector containing 30 active wires (2 mm pitch) are presented. To each wire is associated a fast pre-amplifier and a discriminator channel. Global counting rates in excess of 10⁷ events/s are reported. Data acquisition systems are described for 1D (real time) and 2D (off-line) position-sensitive detection systems. (author) 13 refs., 6 figs.

  13. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  14. Cerenkov counting and Cerenkov-scintillation counting with high refractive index organic liquids using a liquid scintillation counter

    Energy Technology Data Exchange (ETDEWEB)

    Wiebe, L I; Helus, F; Maier-Borst, W [Deutsches Krebsforschungszentrum, Heidelberg (Germany, F.R.). Inst. fuer Nuklearmedizin

    1978-06-01

    ¹⁸F and ¹⁴C radioactivity was measured in methyl salicylate (MS), a high refractive index hybrid Cherenkov-scintillation generating medium, using a liquid scintillation counter. At concentrations of up to 21.4% in MS, dimethyl sulfoxide (DMSO) quenched ¹⁴C fluorescence, and with a 10-fold excess of DMSO over MS, ¹⁸F count rates were reduced below that for DMSO alone, probably as a result of concentration-independent self-quenching due to 'dark-complex' formation. DMSO in lower concentrations did not reduce the counting efficiency of ¹⁸F in MS. Nitrobenzene was a concentration-dependent quencher for both ¹⁴C and ¹⁸F in MS. Chlorobenzene (CB) and DMSO were both found to be weak Cherenkov generators with ¹⁸F. Counting efficiencies for ¹⁸F in MS, CB, and DMSO were 50.3, 7.8 and 4.3% respectively in the coincidence counting mode, and 58.1, 13.0 and 6.8% in the singles mode. ¹⁴C efficiencies were 14.4 and 22.3% for coincidence and singles respectively, and 15.3 and 42.0% using a modern counter designed for coincidence and single photon counting. The high ¹⁴C and ¹⁸F counting efficiencies in MS are discussed with respect to excitation mechanism, on the basis of the quench and channels-ratio changes observed. It is proposed that MS functions as an efficient Cherenkov-scintillation generator for high-energy beta emitters such as ¹⁸F, and as a low-efficiency scintillator for weak beta emitting radionuclides such as ¹⁴C.

  15. Cerenkov counting and Cerenkov-scintillation counting with high refractive index organic liquids using a liquid scintillation counter

    International Nuclear Information System (INIS)

    Wiebe, L.I.; Helus, F.; Maier-Borst, W.

    1978-01-01

    ¹⁸F and ¹⁴C radioactivity was measured in methyl salicylate (MS), a high refractive index hybrid Cherenkov-scintillation generating medium, using a liquid scintillation counter. At concentrations of up to 21.4% in MS, dimethyl sulfoxide (DMSO) quenched ¹⁴C fluorescence, and with a 10-fold excess of DMSO over MS, ¹⁸F count rates were reduced below that for DMSO alone, probably as a result of concentration-independent self-quenching due to 'dark-complex' formation. DMSO in lower concentrations did not reduce the counting efficiency of ¹⁸F in MS. Nitrobenzene was a concentration-dependent quencher for both ¹⁴C and ¹⁸F in MS. Chlorobenzene (CB) and DMSO were both found to be weak Cherenkov generators with ¹⁸F. Counting efficiencies for ¹⁸F in MS, CB, and DMSO were 50.3, 7.8 and 4.3% respectively in the coincidence counting mode, and 58.1, 13.0 and 6.8% in the singles mode. ¹⁴C efficiencies were 14.4 and 22.3% for coincidence and singles respectively, and 15.3 and 42.0% using a modern counter designed for coincidence and single photon counting. The high ¹⁴C and ¹⁸F counting efficiencies in MS are discussed with respect to excitation mechanism, on the basis of the quench and channels-ratio changes observed. It is proposed that MS functions as an efficient Cherenkov-scintillation generator for high-energy beta emitters such as ¹⁸F, and as a low-efficiency scintillator for weak beta emitting radionuclides such as ¹⁴C. (author)
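
The counting efficiencies quoted above follow the usual definition: observed count rate divided by the known disintegration rate. A trivial sketch (the activity figure is illustrative, not from the paper):

```python
# Counting efficiency as a percentage: counts per second over disintegrations
# per second. Numbers below are illustrative.

def efficiency_pct(count_rate_cps, activity_dps):
    return 100.0 * count_rate_cps / activity_dps

# e.g. 503 c/s observed from an assumed 1000 d/s 18F source -> 50.3%
print(efficiency_pct(503.0, 1000.0))  # 50.3
```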

  16. Bulk tank somatic cell counts analyzed by statistical process control tools to identify and monitor subclinical mastitis incidence.

    Science.gov (United States)

    Lukas, J M; Hawkins, D M; Kinsel, M L; Reneau, J K

    2005-11-01

    The objective of this study was to examine the relationship between monthly Dairy Herd Improvement (DHI) subclinical mastitis and new infection rate estimates and daily bulk tank somatic cell count (SCC) summarized by statistical process control tools. Dairy Herd Improvement Association test-day subclinical mastitis and new infection rate estimates, along with daily or every-other-day bulk tank SCC data, were collected for 12 mo of 2003 from 275 Upper Midwest dairy herds. Herds were divided into 5 herd production categories. A linear score [LNS = ln(BTSCC/100,000)/0.693147 + 3] was calculated for each individual bulk tank SCC. For both the raw SCC and the transformed data, the mean and sigma were calculated using the statistical quality control individual measurement and moving range chart procedure of Statistical Analysis System. One hundred eighty-three of the 275 herds from the study data set were then randomly selected, and the raw (method 1) and transformed (method 2) bulk tank SCC mean and sigma were used to develop models for predicting subclinical mastitis and new infection rate estimates. Herd production category was also included in all models as 5 dummy variables. Models were validated by calculating estimates of subclinical mastitis and new infection rates for the remaining 92 herds and plotting them against observed values of each of the dependents. Only herd production category and bulk tank SCC mean were significant and remained in the final models. High R² values (0.83 and 0.81 for methods 1 and 2, respectively) indicated a strong correlation between the bulk tank SCC and a herd's subclinical mastitis prevalence. The standard errors of the estimate were 4.02 and 4.28% for methods 1 and 2, respectively, and decreased with increasing herd production. As a case study, Shewhart Individual Measurement Charts were plotted from the bulk tank SCC to identify shifts in mastitis incidence. Four of 5 charts examined signaled a change in bulk tank SCC before...
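
The linear score transform above is given explicitly in the abstract, and the individuals chart it feeds uses the standard Shewhart limits of mean ± 2.66 × (average moving range). A sketch combining the two (the SCC values and the 2.66 constant are illustrative of the standard procedure, not the study's data):

```python
import math

def linear_score(scc):
    """LNS = ln(BTSCC/100,000)/0.693147 + 3, as defined in the study."""
    return math.log(scc / 100_000) / 0.693147 + 3

def individuals_limits(values):
    """Shewhart individuals chart limits from the average moving range."""
    mean = sum(values) / len(values)
    mrbar = sum(abs(a - b) for a, b in zip(values, values[1:])) / (len(values) - 1)
    return mean - 2.66 * mrbar, mean + 2.66 * mrbar

sccs = [180_000, 210_000, 195_000, 240_000, 200_000]   # illustrative daily SCCs
lo, hi = individuals_limits([linear_score(s) for s in sccs])
print(round(linear_score(200_000), 3))  # 4.0 (each doubling of SCC adds 1 point)
```

The 0.693147 divisor is ln 2, so the score rises one unit per doubling of the bulk tank SCC, which is what makes the transformed chart's sigma roughly constant across herds.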

  17. Assessment of noise in a digital image using the join-count statistic and the Moran test

    International Nuclear Information System (INIS)

    Kehshih Chuang; Huang, H.K.

    1992-01-01

    It is assumed that data bits of a pixel in digital images can be divided into signal and noise bits. The signal bits occupy the most significant part of the pixel. The signal parts of each pixel are correlated while the noise parts are uncorrelated. Two statistical methods, the Moran test and the join-count statistic, are used to examine the noise parts. Images from computerized tomography, magnetic resonance and computed radiography are used for the evaluation of the noise bits. A residual image is formed by subtracting the original image from its smoothed version. The noise level in the residual image is then identical to that in the original image. Both statistical tests are then performed on the bit planes of the residual image. Results show that most digital images contain only 8-9 bits of correlated information. Both methods are easy to implement and fast to perform. (author)
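
The join-count statistic used above works on a binary lattice: for each pair of adjacent pixels it tallies black-black, white-white, and black-white joins, and an uncorrelated (noise) bit plane yields join proportions consistent with randomness. A sketch with rook adjacency (illustrative, not the paper's exact implementation):

```python
# Join-count statistic on a binary lattice (rook adjacency).

def join_counts(grid):
    rows, cols = len(grid), len(grid[0])
    bb = ww = bw = 0
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):       # right and down neighbours
                r2, c2 = r + dr, c + dc
                if r2 < rows and c2 < cols:
                    a, b = grid[r][c], grid[r2][c2]
                    if a == b == 1: bb += 1
                    elif a == b == 0: ww += 1
                    else: bw += 1
    return bb, ww, bw

checker = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]       # maximally anti-correlated
print(join_counts(checker))                        # (0, 0, 12)
```

A signal-bearing bit plane shows an excess of same-colour joins over the random expectation, while a pure-noise bit plane does not, which is how the test separates signal bits from noise bits.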

  18. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have proved convincing by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot

  19. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have proved convincing by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a

  20. Optimization of statistical methods for HpGe gamma-ray spectrometer used in wide count rate ranges

    Energy Technology Data Exchange (ETDEWEB)

    Gervino, G., E-mail: gervino@to.infn.it [UNITO - Università di Torino, Dipartimento di Fisica, Turin (Italy); INFN - Istituto Nazionale di Fisica Nucleare, Sez. Torino, Turin (Italy); Mana, G. [INRIM - Istituto Nazionale di Ricerca Metrologica, Turin (Italy); Palmisano, C. [UNITO - Università di Torino, Dipartimento di Fisica, Turin (Italy); INRIM - Istituto Nazionale di Ricerca Metrologica, Turin (Italy)

    2016-07-11

    The need to perform γ-ray measurements with HpGe detectors is common in many fields such as nuclear physics, radiochemistry, nuclear medicine and neutron activation analysis. HpGe detectors are chosen in situations where isotope identification is needed, because of their excellent resolution. Our challenge is to obtain the “best” spectroscopy data possible in every measurement situation, where “best” is a combination of statistical quality (number of counts) and spectral quality (peak height, width and position) over a wide range of counting rates. In this framework, we applied Bayesian methods and Ellipsoidal Nested Sampling (a multidimensional integration technique) to study the most likely distribution for the shape of HpGe spectra. In treating these experiments, the prior information suggests modelling the likelihood function with a product of Poisson distributions. We present the efforts made to optimize the statistical methods for HpGe detector outputs, with the aim of evaluating, to a better order of precision, the detector efficiency, the absolute measured activity and the spectral background. Reaching a more precise knowledge of the statistical and systematic uncertainties of the measured physical observables is the final goal of this research project.
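
The likelihood the authors adopt, a product of Poisson distributions over spectrum channels, can be sketched directly: for counts nᵢ and model expectations λᵢ, the log-likelihood is Σᵢ (nᵢ ln λᵢ − λᵢ − ln nᵢ!). The spectrum below is illustrative, not data from the paper:

```python
import math

# Poisson likelihood for a spectrum: channel counts n_i given expected
# counts lam_i, summed in log space (lgamma(n+1) = log n!).

def poisson_loglike(counts, expected):
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for n, lam in zip(counts, expected))

counts = [4, 7, 12, 6, 3]
good = [4.0, 7.0, 12.0, 6.0, 3.0]      # model matching the data
bad = [8.0, 8.0, 8.0, 8.0, 8.0]        # flat model
print(poisson_loglike(counts, good) > poisson_loglike(counts, bad))  # True
```

Maximizing (or, in the Bayesian setting, sampling) this likelihood rather than doing least-squares fitting is what keeps the treatment valid at the low-count end of the wide rate range the paper targets.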

  1. Application of high intensity ultrasound treatment on Enterobacteriae count in milk

    Directory of Open Access Journals (Sweden)

    Anet Režek Jambrak

    2011-06-01

    Ultrasonication is a non-thermal method of food preservation that has the advantage of inactivating microbes in food without causing the common side-effects associated with conventional heat treatments, such as nutrient and flavour loss. In this work high intensity ultrasound was used to investigate the inactivation of Enterobacteriae in raw milk. Raw milk with 4% milk fat was treated directly with an immersed ultrasonic probe 12 mm in diameter operating at 20 kHz. For the ultrasound treatments, three parameters were varied according to the statistical experimental design. A central composite design was used to optimize and design the experimental parameters: temperature (20, 40 and 60 °C), amplitude (120, 90 and 60 μm) and time (6, 9 and 12 minutes). All analyses were performed immediately after sonication and after 3 and 5 days of storage under refrigeration at 4 °C. The factors that substantially affect the inactivation of microorganisms using ultrasound are the amplitude of the ultrasonic waves, the exposure/contact time with the microorganisms, and the temperature of treatment. The achieved results indicate significant inactivation of microorganisms under longer treatment periods with the ultrasonic probe, particularly in combination with higher temperature and amplitude. The optimal value of Enterobacteriae count was determined by Statgraphics; the lowest Enterobacteriae count (1.06151 log CFU mL-1) was obtained for the following ultrasound parameters: amplitude of 120 μm, treatment time of 12 min and temperature of 60 °C.

  2. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise.Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for ha

  3. Exact Local Correlations and Full Counting Statistics for Arbitrary States of the One-Dimensional Interacting Bose Gas

    Science.gov (United States)

    Bastianello, Alvise; Piroli, Lorenzo; Calabrese, Pasquale

    2018-05-01

    We derive exact analytic expressions for the n-body local correlations in the one-dimensional Bose gas with contact repulsive interactions (Lieb-Liniger model) in the thermodynamic limit. Our results are valid for arbitrary states of the model, including ground and thermal states, stationary states after a quantum quench, and nonequilibrium steady states arising in transport settings. Calculations for these states are explicitly presented and physical consequences are critically discussed. We also show that the n-body local correlations are directly related to the full counting statistics for the particle-number fluctuations in a short interval, for which we provide an explicit analytic result.

  4. Andreev Bound States Formation and Quasiparticle Trapping in Quench Dynamics Revealed by Time-Dependent Counting Statistics.

    Science.gov (United States)

    Souto, R Seoane; Martín-Rodero, A; Yeyati, A Levy

    2016-12-23

    We analyze the quantum quench dynamics in the formation of a phase-biased superconducting nanojunction. We find that in the absence of an external relaxation mechanism and for very general conditions the system gets trapped in a metastable state, corresponding to a nonequilibrium population of the Andreev bound states. The use of the time-dependent full counting statistics analysis allows us to extract information on the asymptotic population of even and odd many-body states, demonstrating that a universal behavior, dependent only on the Andreev state energy, is reached in the quantum point contact limit. These results shed light on recent experimental observations on quasiparticle trapping in superconducting atomic contacts.

  5. Making Women Count: Gender-Typing, Technology and Path Dependencies in Dutch Statistical Data Processing

    NARCIS (Netherlands)

    van den Ende, Jan; van Oost, Elizabeth C.J.

    2001-01-01

    This article is a longitudinal analysis of the relation between gendered labour divisions and new data processing technologies at the Dutch Central Bureau of Statistics (CBS). Following social-constructivist and evolutionary economic approaches, the authors hold that the relation between technology

  6. A question of separation: disentangling tracer bias and gravitational non-linearity with counts-in-cells statistics

    Science.gov (United States)

    Uhlemann, C.; Feix, M.; Codis, S.; Pichon, C.; Bernardeau, F.; L'Huillier, B.; Kim, J.; Hong, S. E.; Laigle, C.; Park, C.; Shin, J.; Pogosyan, D.

    2018-02-01

    Starting from a very accurate model for density-in-cells statistics of dark matter based on large deviation theory, a bias model for the tracer density in spheres is formulated. It adopts a mean bias relation based on a quadratic bias model to relate the log-densities of dark matter to those of mass-weighted dark haloes in real and redshift space. The validity of the parametrized bias model is established using a parametrization-independent extraction of the bias function. This average bias model is then combined with the dark matter PDF, neglecting any scatter around it: it nevertheless yields an excellent model for densities-in-cells statistics of mass tracers that is parametrized in terms of the underlying dark matter variance and three bias parameters. The procedure is validated on measurements of both the one- and two-point statistics of subhalo densities in the state-of-the-art Horizon Run 4 simulation showing excellent agreement for measured dark matter variance and bias parameters. Finally, it is demonstrated that this formalism allows for a joint estimation of the non-linear dark matter variance and the bias parameters using solely the statistics of subhaloes. Having verified that galaxy counts in hydrodynamical simulations sampled on a scale of 10 Mpc h-1 closely resemble those of subhaloes, this work provides important steps towards making theoretical predictions for density-in-cells statistics applicable to upcoming galaxy surveys like Euclid or WFIRST.

  7. Subcritical Multiplicative Chaos for Regularized Counting Statistics from Random Matrix Theory

    Science.gov (United States)

    Lambert, Gaultier; Ostrovsky, Dmitry; Simm, Nick

    2018-05-01

    For an N × N Haar distributed random unitary matrix U_N, we consider the random field defined by counting the number of eigenvalues of U_N in a mesoscopic arc centered at the point u on the unit circle. We prove that after regularizing at a small scale ε_N > 0, the renormalized exponential of this field converges as N → ∞ to a Gaussian multiplicative chaos measure in the whole subcritical phase. We discuss implications of this result for obtaining a lower bound on the maximum of the field. We also show that the moments of the total mass converge to a Selberg-like integral and, by taking a further limit as the size of the arc diverges, we establish part of the conjectures in Ostrovsky (Nonlinearity 29(2):426-464, 2016). By an analogous construction, we prove that the multiplicative chaos measure coming from the sine process has the same distribution, which strongly suggests that this limiting object should be universal. Our approach to the L¹-phase is based on a generalization of the construction in Berestycki (Electron Commun Probab 22(27):12, 2017) to random fields which are only asymptotically Gaussian. In particular, our method could have applications to other random fields coming from either random matrix theory or a different context.

  8. Soft X ray spectrometry at high count rates

    International Nuclear Information System (INIS)

    Blanc, P.; Brouquet, P.; Uhre, N.

    1978-06-01

    Two modifications of the classical method of X-ray spectrometry by a semiconductor diode permit a count rate of 10⁵ c/s with an energy resolution of 350 eV. With a specially constructed pulse height analyzer, this detector can measure four spectra of 5 ms each, in the range of 1-30 keV, during a plasma shot.

  9. High sensitivity neutron activation analysis using coincidence counting method

    International Nuclear Information System (INIS)

    Suzuki, Shogo; Okada, Yukiko; Hirai, Shoji

    1999-01-01

    Four kinds of standard samples, river sediment (NIES CRM No.16), Typical Japanese Diet, otoliths and river water, were irradiated by TRIGA-II (100 kW, 3.7×10¹² n cm⁻² s⁻¹) for 6 h. After irradiation and cooling, they were analyzed by the coincidence counting method and by conventional γ-ray spectrometry. Se, Ba and Hf were determined via ⁷⁵Se 265 keV, ¹³¹Ba 496 keV and ¹⁸¹Hf 482 keV. For the river sediment sample, Ba and Hf showed the same values by the two methods, but the Se value obtained by the conventional method was biased by Ta interference, whereas the coincidence counting method could determine Se. For Typical Japanese Diet and otoliths, Se could be determined by both methods, while Ba and Hf were determined by the coincidence counting method but not by the conventional method. The Se value in the river water agreed with the certified value. (S.Y.)

  10. Correcting the Count: Improving Vital Statistics Data Regarding Deaths Related to Obesity.

    Science.gov (United States)

    McCleskey, Brandi C; Davis, Gregory G; Dye, Daniel W

    2017-11-15

    Obesity can involve any organ system and compromise the overall health of an individual, including premature death. Despite the increased risk of death associated with being obese, obesity itself is infrequently indicated on the death certificate. We performed an audit of our records to identify how often "obesity" was listed on the death certificate to determine how our practices affected national mortality data collection regarding obesity-related mortality. During the span of nearly 25 years, 0.2% of deaths were attributed to or contributed by obesity. Over the course of 5 years, 96% of selected natural deaths were likely underreported as being associated with obesity. We present an algorithm for certifiers to use to determine whether obesity should be listed on the death certificate and guidelines for certifying cases in which this is appropriate. Use of this algorithm will improve vital statistics concerning the role of obesity in causing or contributing to death. © 2017 American Academy of Forensic Sciences.

  11. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We will provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition to that, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, very particularly in the LHC literature.

  12. High Channel Count Time-to-Digital Converter and Lasercom Processor, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — High-channel-count, high-precision, and high-throughput time-to-digital converters (TDC) are needed to support detector arrays used in deep-space optical...

  13. High resolution gamma-ray spectroscopy at high count rates with a prototype High Purity Germanium detector

    Science.gov (United States)

    Cooper, R. J.; Amman, M.; Vetter, K.

    2018-04-01

    High-resolution gamma-ray spectrometers are required for applications in nuclear safeguards, emergency response, and fundamental nuclear physics. To overcome one of the shortcomings of conventional High Purity Germanium (HPGe) detectors, we have developed a prototype device capable of achieving high event throughput and high energy resolution at very high count rates. This device, the design of which we have previously reported on, features a planar HPGe crystal with a reduced-capacitance strip electrode geometry. This design is intended to provide good energy resolution at the short shaping or digital filter times that are required for high rate operation and which are enabled by the fast charge collection afforded by the planar geometry crystal. In this work, we report on the initial performance of the system at count rates up to and including two million counts per second.

  14. Counting and classifying attractors in high dimensional dynamical systems.

    Science.gov (United States)

    Bagley, R J; Glass, L

    1996-12-07

    Randomly connected Boolean networks have been used as mathematical models of neural, genetic, and immune systems. A key quantity of such networks is the number of basins of attraction in the state space. The number of basins of attraction changes as a function of the size of the network, its connectivity and its transition rules. In discrete networks, a simple count of the number of attractors does not reveal the combinatorial structure of the attractors. These points are illustrated in a reexamination of dynamics in a class of random Boolean networks considered previously by Kauffman. We also consider comparisons between dynamics in discrete networks and continuous analogues. A continuous analogue of a discrete network may have a different number of attractors for many different reasons. Some attractors in discrete networks may be associated with unstable dynamics, and several different attractors in a discrete network may be associated with a single attractor in the continuous case. Special problems in determining attractors in continuous systems arise when there are aperiodic dynamics associated with quasiperiodicity or deterministic chaos.
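    Counting attractors in a discrete network of this kind amounts to following every state to its cycle and keeping one canonical representative per cycle. A minimal sketch, assuming a synchronous Kauffman-style random Boolean network; the size, connectivity, and seed below are arbitrary illustrative choices, not parameters from the paper:

```python
import random

def random_boolean_network(n, k, seed=0):
    """Each of n nodes reads k random inputs through a random truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """One synchronous update; states are n-bit integers."""
    new = 0
    for i in range(len(inputs)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | ((state >> j) & 1)
        new |= tables[i][idx] << i
    return new

def count_attractors(n, k, seed=0):
    """Exhaustively follow all 2**n states to their cycles (small n only)."""
    inputs, tables = random_boolean_network(n, k, seed)
    attractors = set()
    for s in range(2 ** n):
        seen = set()
        while s not in seen:            # walk until the trajectory revisits a state
            seen.add(s)
            s = step(s, inputs, tables)
        cycle = [s]                      # s now lies on the attractor cycle
        t = step(s, inputs, tables)
        while t != s:
            cycle.append(t)
            t = step(t, inputs, tables)
        attractors.add(min(cycle))       # canonical representative of the cycle
    return len(attractors)
```

As the abstract notes, this raw count says nothing about the combinatorial structure of the attractors; the `cycle` lists collected above would be the starting point for any such classification.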

  15. Development of bonded semiconductor device for high counting rate high efficiency photon detectors

    International Nuclear Information System (INIS)

    Kanno, Ikuo

    2008-01-01

    We are trying to decrease dose exposure in medical diagnosis by way of measuring the energy of X-rays. For this purpose, radiation detectors for X-ray energy measurement with high counting rate should be developed. Direct bonding of Si wafers was carried out to make a radiation detector, which had separated X-ray absorber and detector. The resistivity of bonding interface was estimated with the results of four-probe measurements and model calculations. Direct bonding of high resistivity p and n-Si wafers was also performed. The resistance of the pn bonded diode was 0.7 MΩ. The resistance should be increased in the future. (author)

  16. A Statistical Perspective on Highly Accelerated Testing

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Edward V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning

  17. High Triglycerides Are Associated with Low Thrombocyte Counts and High VEGF in Nephropathia Epidemica.

    Science.gov (United States)

    Martynova, Ekaterina V; Valiullina, Aygul H; Gusev, Oleg A; Davidyuk, Yuriy N; Garanina, Ekaterina E; Shakirova, Venera G; Khaertynova, Ilsiyar; Anokhin, Vladimir A; Rizvanov, Albert A; Khaiboullina, Svetlana F

    2016-01-01

    Nephropathia epidemica (NE) is a mild form of hemorrhagic fever with renal syndrome. Several reports have demonstrated a severe alteration in lipoprotein metabolism. However, little is known about changes in circulating lipids in NE. The objectives of this study were to evaluate changes in serum total cholesterol, high density cholesterol (HDCL), and triglycerides. In addition to evaluation of serum cytokine activation associations, changes in lipid profile and cytokine activation were determined for gender, thrombocyte counts, and VEGF. Elevated levels of triglycerides and decreased HDCL were observed in NE, while total cholesterol did not differ from controls. High triglycerides were associated with both the lowest thrombocyte counts and high serum VEGF, as well as a high severity score. Additionally, there were higher levels of triglycerides in male than female NE patients. Low triglycerides were associated with upregulation of IFN-γ and IL-12, suggesting activation of Th1 helper cells. Furthermore, levels of IFN-γ and IL-12 were increased in patients with lower severity scores, suggesting that a Th1 type immune response plays a protective role in NE. These combined data advance the understanding of NE pathogenesis and indicate a role for high triglycerides in disease severity.

  18. A high dynamic range pulse counting detection system for mass spectrometry.

    Science.gov (United States)

    Collings, Bruce A; Dima, Martian D; Ivosev, Gordana; Zhong, Feng

    2014-01-30

    A high dynamic range pulse counting system has been developed that demonstrates an ability to operate at up to 2e8 counts per second (cps) on a triple quadrupole mass spectrometer. Previous pulse counting detection systems have typically been limited to about 1e7 cps at the upper end of the system's dynamic range. Modifications to the detection electronics and dead time correction algorithm are described in this paper. A high gain transimpedance amplifier is employed that allows a multi-channel electron multiplier to be operated at a significantly lower bias potential than in previous pulse counting systems. The system utilises a high-energy conversion dynode, a multi-channel electron multiplier, a high gain transimpedance amplifier, non-paralysing detection electronics and a modified dead time correction algorithm. Modification of the dead time correction algorithm is necessary due to a characteristic of the pulse counting electronics. A pulse counting detection system with the capability to count at ion arrival rates of up to 2e8 cps is described. This is shown to provide a linear dynamic range of nearly five orders of magnitude for a sample of alprazolam with concentrations ranging from 0.0006970 ng/mL to 3333 ng/mL while monitoring the m/z 309.1 → m/z 205.2 transition. This represents an upward extension of the detector's linear dynamic range of about two orders of magnitude. A new high dynamic range pulse counting system has been developed demonstrating the ability to operate at up to 2e8 cps on a triple quadrupole mass spectrometer. This provides an upward extension of the detector's linear dynamic range by about two orders of magnitude over previous pulse counting systems. Copyright © 2013 John Wiley & Sons, Ltd.
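    To illustrate why a dead-time correction is essential at such rates, here is a sketch of the textbook non-paralysable model, n = m / (1 − mτ), which recovers the true rate n from the measured rate m and dead time τ. This is the standard formula, not the modified algorithm the abstract refers to, and the 5 ns dead time in the example is an assumed illustrative value:

```python
def corrected_rate(measured_cps, dead_time_s):
    """Non-paralysable dead-time correction: n = m / (1 - m * tau).

    measured_cps : observed count rate m (counts per second)
    dead_time_s  : detector/electronics dead time tau (seconds)
    Returns the estimated true arrival rate n.
    """
    loss = measured_cps * dead_time_s   # fraction of time the counter is dead
    if loss >= 1.0:
        raise ValueError("measured rate is outside the model's valid range")
    return measured_cps / (1.0 - loss)

# With an assumed 5 ns dead time, a measured 1e8 cps implies the counter
# was dead half the time, so the true rate is 2e8 cps.
true_rate = corrected_rate(1e8, 5e-9)
```

The correction is negligible at 1e6 cps (about 0.5% for this τ) but a factor of two at 1e8 cps, which is why the dead-time model dominates the accuracy of the upper end of the dynamic range.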

  19. Improved Yield, Performance and Reliability of High-Actuator-Count Deformable Mirrors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The project team will conduct processing and design research aimed at improving yield, performance, and reliability of high-actuator-count micro-electro-mechanical...

  20. Development of a high-count-rate neutron detector with position sensitivity and high efficiency

    International Nuclear Information System (INIS)

    Nelson, R.; Sandoval, J.

    1996-01-01

    While the neutron scattering community is bombarded with hints of new technologies that may deliver detectors with high-count-rate capability, high efficiency, gamma-ray insensitivity, and high resolution across large areas, only the time-tested, gas-filled ³He and scintillation detectors are in widespread use. Future spallation sources with higher fluxes simply must exploit some of the advanced detector schemes that are as yet unproved as production systems. Technologies indicating promise as neutron detectors include pixel arrays of amorphous silicon, silicon microstrips, microstrips with gas, and new scintillation materials. This project sought to study the competing neutron detector technologies and determine which or what combination will lead to a production detector system well suited for use at a high-intensity neutron scattering source

  1. Homeless High School Students in America: Who Counts?

    Science.gov (United States)

    Cumming, John M.; Gloeckner, Gene W.

    2012-01-01

    After interviewing homeless high school students, the research team in a Colorado school district discovered that many students had not revealed their true living conditions (homelessness) to anyone in the school district. This research team developed an anonymous survey written around the homeless categories identified in the McKinney-Vento…

  2. Effect of finite Coulomb interaction on full counting statistics of electronic transport through single-molecule magnet

    Energy Technology Data Exchange (ETDEWEB)

    Xue Haibin, E-mail: xhb98326110@163.co [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Nie, Y.-H., E-mail: nieyh@sxu.edu.c [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Li, Z.-J.; Liang, J.-Q. [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China)

    2011-01-17

    We study the full counting statistics (FCS) in a single-molecule magnet (SMM) with finite Coulomb interaction U. For finite U the FCS, differing from U→∞, shows a symmetric gate-voltage-dependence when the coupling strengths with two electrodes are interchanged, which can be observed experimentally just by reversing the bias-voltage. Moreover, we find that the effect of finite U on shot noise depends on the internal level structure of the SMM and the coupling asymmetry of the SMM with two electrodes as well. When the coupling of the SMM with the incident-electrode is stronger than that with the outgoing-electrode, the super-Poissonian shot noise in the sequential tunneling regime appears under relatively small gate-voltage and relatively large finite U, and does not for U→∞; while it occurs at relatively large gate-voltage for the opposite coupling case. The formation mechanism of super-Poissonian shot noise can be qualitatively attributed to the competition between fast and slow transport channels.

  3. Effect of finite Coulomb interaction on full counting statistics of electronic transport through single-molecule magnet

    International Nuclear Information System (INIS)

    Xue Haibin; Nie, Y.-H.; Li, Z.-J.; Liang, J.-Q.

    2011-01-01

    We study the full counting statistics (FCS) in a single-molecule magnet (SMM) with finite Coulomb interaction U. For finite U the FCS, differing from U→∞, shows a symmetric gate-voltage-dependence when the coupling strengths with two electrodes are interchanged, which can be observed experimentally just by reversing the bias-voltage. Moreover, we find that the effect of finite U on shot noise depends on the internal level structure of the SMM and the coupling asymmetry of the SMM with two electrodes as well. When the coupling of the SMM with the incident-electrode is stronger than that with the outgoing-electrode, the super-Poissonian shot noise in the sequential tunneling regime appears under relatively small gate-voltage and relatively large finite U, and does not for U→∞; while it occurs at relatively large gate-voltage for the opposite coupling case. The formation mechanism of super-Poissonian shot noise can be qualitatively attributed to the competition between fast and slow transport channels.

  4. Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions

    Science.gov (United States)

    Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong

    2018-01-01

    Layer count control and uniformity of two dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy spectrum (EELS) of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions with discrepancies within  ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.

  5. MERLIN, a new high count rate spectrometer at ISIS

    International Nuclear Information System (INIS)

    Bewley, R.I.; Eccleston, R.S.; McEwen, K.A.; Hayden, S.M.; Dove, M.T.; Bennington, S.M.; Treadgold, J.R.; Coleman, R.L.S.

    2006-01-01

    MERLIN is designed to be a high intensity, medium energy resolution spectrometer. As such, it will complement the high-resolution MAPS spectrometer at ISIS. MERLIN will utilise all the latest advances in technology with a supermirror guide to enhance flux as well as 3 m long position-sensitive detectors in a vacuum making it ideal for single-crystal users. The detector bank will cover a massive π steradians of solid angle with an angular range from −45° to +135° in the horizontal plane and ±30° in the vertical plane. This will allow large swathes of Q,ω space to be accessed in a single run. The instrument will be ready for commissioning in February 2006. This paper presents details of design and performance of this new instrument

  6. High impact = high statistical standards? Not necessarily so.

    Science.gov (United States)

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.

  7. High Impact = High Statistical Standards? Not Necessarily So

    Science.gov (United States)

    Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors. PMID:23418533

  8. Gain reduction due to space charge at high counting rates in multiwire proportional chambers

    International Nuclear Information System (INIS)

    Smith, G.C.; Mathieson, E.

    1986-10-01

    Measurements with a small MWPC of gas gain reduction, due to ion space charge at high counting rates, have been compared with theoretical predictions. The quantity ln(q/q₀)/(q/q₀), where q/q₀ is the relative reduced avalanche charge, has been found to be closely proportional to count rate, as predicted. The constant of proportionality is in good agreement with calculations made with a modified version of the original, simplified theory.
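    Given the stated proportionality ln(q/q₀)/(q/q₀) = −kR, the relative gain x = q/q₀ at count rate R can be recovered numerically, since ln(x)/x is monotonically increasing on (0, 1]. A small sketch under that assumption, where the proportionality constant k is an arbitrary input rather than a value from the paper:

```python
import math

def relative_gain(k_times_rate):
    """Solve ln(x)/x = -(k*R) for x = q/q0 in (0, 1] by bisection.

    k_times_rate : the non-negative product k*R (proportionality constant
                   times count rate); 0 corresponds to no space-charge loss.
    """
    y = -k_times_rate
    lo, hi = 1e-12, 1.0
    for _ in range(200):                 # ln(x)/x is monotone on (0, 1]
        mid = 0.5 * (lo + hi)
        if math.log(mid) / mid < y:
            lo = mid                     # root lies above mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, a gain reduced to half its zero-rate value (x = 0.5) corresponds to kR = −ln(0.5)/0.5 ≈ 1.386, so inverting the relation at that abscissa returns 0.5.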

  9. Atom-counting in High Resolution Electron Microscopy:TEM or STEM - That's the question.

    Science.gov (United States)

    Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S

    2017-03-01

    In this work, a recently developed quantitative approach based on the principles of detection theory is used in order to determine the possibilities and limitations of High Resolution Scanning Transmission Electron Microscopy (HR STEM) and HR TEM for atom-counting. So far, HR STEM has been shown to be an appropriate imaging mode to count the number of atoms in a projected atomic column. Recently, it has been demonstrated that HR TEM, when using negative spherical aberration imaging, is suitable for atom-counting as well. The capabilities of both imaging techniques are investigated and compared using the probability of error as a criterion. It is shown that for the same incoming electron dose, HR STEM outperforms HR TEM under common practice standards, i.e. when the decision is based on the probability function of the peak intensities in HR TEM and of the scattering cross-sections in HR STEM. If the atom-counting decision is based on the joint probability function of the image pixel values, the dependence of all image pixel intensities as a function of thickness should be known accurately. Under this assumption, the probability of error may decrease significantly for atom-counting in HR TEM and may, in theory, become lower as compared to HR STEM under the predicted optimal experimental settings. However, the commonly used standard for atom-counting in HR STEM leads to a high performance and has been shown to work in practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics.

  11. The power of statistical tests using field trial count data of non-target organisms in environmental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

    Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation
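A power analysis of the kind these studies report can be sketched by Monte Carlo simulation: generate overdispersed counts (Poisson with a lognormal plot effect), apply a simple two-sample test, and record the rejection rate. All parameter values below are illustrative, not taken from the study:

```python
import numpy as np

def power_count_trial(n_plots, mean_ctrl=10.0, ratio=0.5, cv=0.6,
                      n_sim=2000, seed=1):
    """Monte-Carlo power of a two-sample test on overdispersed field counts.

    Counts are Poisson with a lognormal plot effect of coefficient of
    variation 'cv', mimicking the overdispersion typical of non-target
    organism counts; the test is a z-test on log(count + 1) at the
    two-sided 5% level. 'ratio' is the treatment/control abundance ratio.
    """
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1 + cv ** 2))   # lognormal shape for given CV
    hits = 0
    for _ in range(n_sim):
        lam_c = mean_ctrl * rng.lognormal(-sigma**2 / 2, sigma, n_plots)
        lam_t = ratio * mean_ctrl * rng.lognormal(-sigma**2 / 2, sigma, n_plots)
        x = np.log1p(rng.poisson(lam_c))
        y = np.log1p(rng.poisson(lam_t))
        se = np.sqrt(x.var(ddof=1) / n_plots + y.var(ddof=1) / n_plots)
        if se > 0 and abs(x.mean() - y.mean()) / se > 1.96:
            hits += 1
    return hits / n_sim
```

Under the null (ratio = 1) the rejection rate stays near the nominal level, while a strong effect with more plots yields substantially higher power, which is the behaviour such replication studies quantify.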

  12. Smart pile-up consideration for evaluation of high count rate EDS spectra

    International Nuclear Information System (INIS)

    Eggert, F; Anderhalt, R; Nicolosi, J; Elam, T

    2012-01-01

    This work describes a new pile-up consideration for the very high count rate spectra that can be acquired with silicon drift detector (SDD) technology. Pile-up effects are the major remaining challenge in the use of SDDs for EDS in scanning electron microscopes (SEM) with ultra-thin windows for soft X-ray detection. The ability to increase count rates by up to a factor of 100 compared with conventional Si(Li) detectors comes with the problem that pile-up recognition (pile-up rejection) in pulse processors has not improved by the same order of magnitude, but only by a factor of about 3. It is therefore common for spectra to show significant pile-up effects at count rates above 10000 counts per second (10 kcps). These false counts affect both automatic qualitative analysis and quantitative evaluation of the spectra. The new idea is to use additional inputs for the pile-up calculation, shifting its applicability towards very high count rates of 200 kcps and more, which can easily be reached with the SDD. The additional input is the 'known' (estimated) background distribution, calculated iteratively during all automated qualitative or quantitative evaluations. This additional knowledge allows self-adjustment of the pile-up calculation parameters and avoids the over-corrections that hamper the evaluation as much as the pile-up artefacts themselves. With the proposed method the pile-up correction is no longer a 'correction' but an integral part of all spectrum evaluation steps. Examples are given for the evaluation of very high count rate spectra.
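The core of any pile-up estimate is that two events arriving within the resolving time are recorded as one count at the summed energy, so the first-order pile-up continuum is proportional to the self-convolution of the spectrum. A minimal sketch (without the paper's background feedback) might look like:

```python
import numpy as np

def pileup_estimate(spectrum, rate_cps, resolve_time_s=1e-7):
    """First-order pile-up continuum for a pulse-height spectrum.

    Two events within the resolving time are recorded as one count at the
    summed energy, so the pile-up distribution follows the self-convolution
    of the per-count energy distribution. The fraction of piled-up pairs is
    approximately rate * resolving time. Illustrative model only; the
    record's method additionally feeds back the estimated background.
    """
    s = np.asarray(spectrum, float)
    total = s.sum()
    p = s / total                            # per-count energy distribution
    pair = np.convolve(p, p)[:len(s)]        # summed-energy distribution
    frac = rate_cps * resolve_time_s         # expected pile-up fraction
    return frac * total * pair
```

For a monoenergetic line at channel 10, the estimated pile-up appears at channel 20 (the sum peak), scaled by the pile-up fraction.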

  13. Evaluation of high-energy electron detectors for probing the inner magnetosphere under high-counting condition

    International Nuclear Information System (INIS)

    Tamada, Yukihiro; Takashima, Takeshi; Mitani, Takefumi; Miyake, Wataru

    2013-01-01

    The ERG (Energization and Radiation in Geospace) satellite will be launched to study the acceleration processes of energetic particles in the radiation belt surrounding the Earth. Revealing the acceleration process of high-energy particles is very important both for science and for application to space weather forecasting. Drastic increases of high-energy electrons in the radiation belt are sometimes observed during geomagnetic storms. When a large magnetic storm occurs, energetic electron count rates may exceed the flux limits assumed in the nominal design, and the large number of incident electrons leads to detection loss. The purpose of this study is to demonstrate that the count rate range of a single detector on board the ERG satellite can be expanded by reading-circuit operations that decrease the detection area. In our ground experiment, we also found an unexpected result: count peaks shift to the higher energy side under high counting conditions. (author)
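The rationale for reading out a smaller detection area can be sketched with a standard non-paralyzable dead-time model: scaling the incident rate by the area fraction pushes the onset of saturation to higher true rates. The area fraction and dead time below are illustrative, not the instrument's values:

```python
def observed_rate(true_rate_hz, area_fraction=0.1, dead_time_s=1e-6):
    """Observed count rate when only a fraction of the detector is read out.

    Reading a smaller area scales the incident rate by 'area_fraction'
    before the non-paralyzable dead-time saturation n / (1 + n * tau),
    so the usable range of true rates is extended by ~1/area_fraction.
    Illustrative parameter values only.
    """
    n = true_rate_hz * area_fraction
    return n / (1.0 + n * dead_time_s)
```

At low rates the observed rate is simply the true rate times the area fraction; at very high rates it saturates near 1/tau regardless of input.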

  14. A pulse shape discriminator with high precision of neutron and gamma ray selection at high counting rate

    International Nuclear Information System (INIS)

    Bialkowski, J.; Moszynski, M.; Wolski, D.

    1989-01-01

    A pulse shape discriminator based on the zero-crossing principle is described. Due to dc negative feedback loops stabilizing the shaping amplifier and the zero-crossing discriminator, the operation of the circuit is not affected by high counting rates or temperature variations. The pileup rejection circuit built into the discriminator improves the quality of the n-γ separation at high counting rates. Full γ-ray rejection is obtained for recoil electron energies down to 25 keV. At high counting rates the remaining γ-ray contribution is evidently due to the pileup effect, which amounts to about 2% at 4×10⁵ counts/s. (orig.)

  15. An integral whole circuit of amplifying and discriminating suited to high counting rate

    International Nuclear Information System (INIS)

    Dong Chengfu; Su Hong; Wu Ming; Li Xiaogang; Peng Yu; Qian Yi; Liu Yicai; Xu Sijiu; Ma Xiaoli

    2007-01-01

    A hybrid circuit consisting of a charge-sensitive preamplifier, main amplifier, discriminator and shaping circuit is described. The instrument is characterized by low power consumption, small volume, high sensitivity and portability, and is convenient for use in the field. Its output pulses are directly compatible with CMOS or TTL logic levels. The instrument is mainly used for count measurements, for example with a high-sensitivity ³He neutron detector, but it may also be used with other heavy-ion detectors; the highest counting rate can reach 10⁶/s. (authors)

  16. Performance of Drift-Tube Detectors at High Counting Rates for High-Luminosity LHC Upgrades

    CERN Document Server

    Bittner, Bernhard; Kortner, Oliver; Kroha, Hubert; Manfredini, Alessandro; Nowak, Sebastian; Ott, Sebastian; Richter, Robert; Schwegler, Philipp; Zanzi, Daniele; Biebel, Otmar; Hertenberger, Ralf; Ruschke, Alexander; Zibell, Andre

    2016-01-01

    The performance of pressurized drift-tube detectors at very high background rates has been studied at the Gamma Irradiation Facility (GIF) at CERN and in an intense 20 MeV proton beam at the Munich Van de Graaff tandem accelerator for applications in large-area precision muon tracking at high-luminosity upgrades of the Large Hadron Collider (LHC). The ATLAS muon drift-tube (MDT) chambers with 30 mm tube diameter have been designed to cope with γ-ray and neutron background hit rates of up to 500 Hz/cm². Background rates of up to 14 kHz/cm² are expected at LHC upgrades. The test results with standard MDT readout electronics show that reducing the drift-tube diameter to 15 mm, while leaving the operating parameters unchanged, vastly increases the rate capability well beyond the requirements. The development of new small-diameter muon drift-tube (sMDT) chambers for LHC upgrades is completed. Further improvements of tracking efficiency and spatial resolution at high counting rates will be achieved with ...

  17. Improvements in the energy resolution and high-count-rate performance of bismuth germanate

    International Nuclear Information System (INIS)

    Koehler, P.E.; Wender, S.A.; Kapustinsky, J.S.

    1985-01-01

    Several methods for improving the energy resolution of bismuth germanate (BGO) have been investigated. It is shown that some of these methods resulted in a substantial improvement in the energy resolution. In addition, a method to improve the performance of BGO at high counting rates has been systematically studied. The results of this study are presented and discussed

  18. Exponential decay and exponential recovery of modal gains in high count rate channel electron multipliers

    International Nuclear Information System (INIS)

    Hahn, S.F.; Burch, J.L.

    1980-01-01

    A series of data on high count rate channel electron multipliers revealed an initial drop and subsequent recovery of gains in exponential fashion. The FWHM of the pulse height distribution at the initial stage of testing can be used as a good criterion for the selection of operating bias voltage of the channel electron multiplier

  19. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  20. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  1. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchanne...

  2. Generalized linear models and point count data: statistical considerations for the design and analysis of monitoring studies

    Science.gov (United States)

    Nathaniel E. Seavy; Suhel Quader; John D. Alexander; C. John Ralph

    2005-01-01

    The success of avian monitoring programs to effectively guide management decisions requires that studies be efficiently designed and data be properly analyzed. A complicating factor is that point count surveys often generate data with non-normal distributional properties. In this paper we review methods of dealing with deviations from normal assumptions, and we focus...

  3. Development of Fast High-Resolution Muon Drift-Tube Detectors for High Counting Rates

    CERN Document Server

    INSPIRE-00287945; Dubbert, J.; Horvat, S.; Kortner, O.; Kroha, H.; Legger, F.; Richter, R.; Adomeit, S.; Biebel, O.; Engl, A.; Hertenberger, R.; Rauscher, F.; Zibell, A.

    2011-01-01

    Pressurized drift-tube chambers are efficient detectors for high-precision tracking over large areas. The Monitored Drift-Tube (MDT) chambers of the muon spectrometer of the ATLAS detector at the Large Hadron Collider (LHC) reach a spatial resolution of 35 μm and almost 100% tracking efficiency with 6 layers of 30 mm diameter drift tubes operated with Ar:CO2 (93:7) gas mixture at 3 bar and a gas gain of 20000. The ATLAS MDT chambers are designed to cope with background counting rates due to neutrons and gamma rays of up to about 300 kHz per tube, which will be exceeded for LHC luminosities larger than the design value of 10³⁴ cm⁻²s⁻¹. Decreasing the drift-tube diameter to 15 mm while keeping the other parameters, including the gas gain, unchanged reduces the maximum drift time from about 700 ns to 200 ns and the drift-tube occupancy by a factor of 7. New drift-tube chambers for the endcap regions of the ATLAS muon spectrometer have been designed. A prototype chamber consisting of 12 times 8 l...
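The quoted factor-of-7 occupancy reduction follows directly from the geometry and the drift time: halving the tube diameter halves the background hit rate (the exposed area scales with diameter times length), and the shorter maximum drift time (700 ns to 200 ns) contributes the remaining factor of 3.5. A one-line check:

```python
def tube_occupancy(flux_hz_per_cm2, diameter_cm, length_cm, max_drift_s):
    """Fraction of time a drift tube is busy with background hits.

    Hit rate scales with the projected tube area (diameter x length);
    occupancy is that rate times the maximum drift time. With the numbers
    quoted in the record (30 mm -> 15 mm tubes, 700 ns -> 200 ns drift
    time) the occupancy drops by a factor 2 x 3.5 = 7.
    """
    rate_hz = flux_hz_per_cm2 * diameter_cm * length_cm
    return rate_hz * max_drift_s
```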

  4. Statistical evidences of absorption at high latitudes

    International Nuclear Information System (INIS)

    Fesenko, B.I.

    1980-01-01

    Evidence is considered which points to a significant effect of irregular interstellar absorption at high latitudes b. The number density of faint galaxies grows with increasing |b| even at values of |b| exceeding 50 deg. The effects of the interstellar medium are traced even in the directions of stars and globular clusters with very low values of the colour excess. The coefficient of absorption, A_B = 0.29 ± 0.05, was estimated from the colours of bright E-galaxies.

  5. Calibration of the Accuscan II IN Vivo System for High Energy Lung Counting

    Energy Technology Data Exchange (ETDEWEB)

    Ovard R. Perry; David L. Georgeson

    2011-07-01

    This report describes the April 2011 calibration of the Accuscan II HPGe in vivo system for high-energy lung counting. The source used for the calibration was a NIST-traceable lung set manufactured at the University of Cincinnati (UCLL43AMEU and UCSL43AMEU) containing Am-241 and Eu-152 with energies from 26 keV to 1408 keV. The lung set was used in conjunction with a Realistic Torso phantom. The phantom was placed on the RMC II counting table (with pins removed) between the v-ridges on the back wall of the Accuscan II counter. The top of the detector housing was positioned perpendicular to the junction of the phantom clavicle with the sternum. This position places the approximate center line of the detector housing at the center of the lungs. The energy and efficiency calibrations were performed using a Realistic Torso phantom (Appendix I) and the University of Cincinnati lung set. This report includes an overview introduction and records for the energy/FWHM and efficiency calibration, including performance verification and validation counting. The Accuscan II system was successfully calibrated for high-energy lung counting and verified in accordance with ANSI/HPS N13.30-1996 criteria.

  6. Liver stiffness plus platelet count can be used to exclude high-risk oesophageal varices.

    Science.gov (United States)

    Ding, Nik S; Nguyen, Tin; Iser, David M; Hong, Thai; Flanagan, Emma; Wong, Avelyn; Luiz, Lauren; Tan, Jonathan Y C; Fulforth, James; Holmes, Jacinta; Ryan, Marno; Bell, Sally J; Desmond, Paul V; Roberts, Stuart K; Lubel, John; Kemp, William; Thompson, Alexander J

    2016-02-01

    Endoscopic screening for high-risk gastro-oesophageal varices (GOV) is recommended for compensated cirrhotic patients, and transient elastography is identifying increasing numbers of patients with cirrhosis but without portal hypertension. Using liver stiffness measurement (LSM) ± platelet count, the aim was to develop a simple clinical rule to exclude the presence of high-risk GOV in patients with Child-Pugh A cirrhosis. A retrospective analysis was conducted of 71 patients with Child-Pugh A cirrhosis diagnosed by transient elastography (LSM >13.6 kPa) who underwent screening gastroscopy. A predictive model using LSM ± platelet count was assessed to exclude the presence of high-risk GOV (diameter >5 mm and/or the presence of high-risk stigmata) and validated using a second cohort of 200 patients from two independent centres. High-risk GOV were present in 10 (15%) and 16 (8%) of the training and validation cohorts, respectively, and their presence was associated with LSM and platelet count (P < 0.05). A combined model based on LSM and platelet count was more accurate for excluding the presence of high-risk GOV than either alone (training cohort AUROC: 0.87 [0.77-0.96] vs. 0.78 [0.65-0.92] for LSM and 0.71 [0.52-0.90] for platelets), with the combination of LSM ≤25 kPa and platelet count ≥100 having an NPV of 100% in both the training and validation cohorts. A total of 107 (39%) patients met this criterion. The combination of LSM ≤25 kPa and platelet count ≥100 can be used in clinical practice to exclude the presence of high-risk GOV in patients with Child-Pugh A cirrhosis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
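The decision rule from the abstract is simple enough to encode directly; the sketch below applies it to records of (LSM in kPa, platelet count in 10⁹/L, whether high-risk GOV are present) and computes the negative predictive value. The record layout is hypothetical:

```python
def low_risk_gov(lsm_kpa, platelet_e9_per_l):
    """Rule from the abstract: LSM <= 25 kPa AND platelets >= 100
    excludes high-risk gastro-oesophageal varices (reported NPV 100%)."""
    return lsm_kpa <= 25.0 and platelet_e9_per_l >= 100.0

def negative_predictive_value(patients):
    """NPV of the rule over (lsm, platelets, has_high_risk_gov) tuples:
    among patients the rule labels low-risk, the fraction truly without
    high-risk GOV."""
    negatives = [p for p in patients if low_risk_gov(p[0], p[1])]
    if not negatives:
        return float('nan')
    return sum(1 for p in negatives if not p[2]) / len(negatives)
```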

  7. Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.

    Science.gov (United States)

    Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio

    2016-01-01

    The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single-photon detectors with avalanche photodiodes (APDs), a photon counting system with an optical fiber has become a powerful tool for detecting bioluminescence at an optical fiber end, because it allows full use of the compactness, simple operation, and high quantum efficiency of APD detectors. This optical fiber-based system can also improve the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.

  8. High-Throughput Quantification of Bacterial-Cell Interactions Using Virtual Colony Counts

    Directory of Open Access Journals (Sweden)

    Stefanie Hoffmann

    2018-02-01

    The quantification of bacteria in cell culture infection models is of paramount importance for the characterization of host-pathogen interactions and the pathogenicity factors involved. The standard way to enumerate bacteria in these assays is plating of a dilution series on solid agar and counting of the resulting colony-forming units (CFU). In contrast, the virtual colony count (VCC) method is a high-throughput compatible alternative with minimized manual input. Based on the recording of quantitative growth kinetics, VCC relates the time to reach a given absorbance threshold to the initial cell count using a series of calibration curves. Here, we adapted the VCC method using the model organism Salmonella enterica sv. Typhimurium (S. Typhimurium) in combination with established cell culture-based infection models. For HeLa infections, a direct side-by-side comparison showed a good correlation of VCC with CFU counting after plating. For MDCK cells and RAW macrophages we found that VCC reproduced the expected phenotypes of different S. Typhimurium mutants. Furthermore, we demonstrated the use of VCC to test the inhibition of Salmonella invasion by the probiotic E. coli strain Nissle 1917. Taken together, VCC provides a flexible, label-free, automation-compatible methodology to quantify bacteria in in vitro infection assays.
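The core of VCC, relating time-to-threshold to initial cell count, can be illustrated with an idealized exponential growth model (the real method uses empirical calibration curves rather than a fixed doubling time). The threshold and doubling-time values below are hypothetical:

```python
from math import log2

def threshold_time(n0, n_threshold=1e7, doubling_time_min=30.0):
    """Time (minutes) for an exponentially growing culture starting from
    n0 cells to reach the absorbance-equivalent threshold population."""
    return doubling_time_min * log2(n_threshold / n0)

def initial_count(t_threshold_min, n_threshold=1e7, doubling_time_min=30.0):
    """Invert the calibration: a later threshold crossing implies fewer
    cells in the initial inoculum."""
    return n_threshold * 2.0 ** (-t_threshold_min / doubling_time_min)
```

The two functions are exact inverses in this idealized model: a culture starting from fewer cells crosses the threshold later, and the crossing time uniquely identifies the initial count.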

  9. Mutacins and bacteriocins like genes in Streptococcus mutans isolated from participants with high, moderate, and low salivary count.

    Science.gov (United States)

    Soto, Carolina; Padilla, Carlos; Lobos, Olga

    2017-02-01

    To detect S. mutans producers of mutacins and bacteriocin-like inhibitory substances (BLIS) from the saliva of participants with low, moderate, and high salivary counts. 123 strains of S. mutans were obtained from participants with low, moderate, and high salivary counts (aged 18 to 20 years) and their antibacterial capacity was analyzed. Using PCR amplification, the expression levels of mutacin and BLIS genes were studied (expressed in arbitrary units/ml) at all three levels. S. mutans strains from participants with low salivary counts show high production of mutacins (63%). In contrast, participants with moderate and high salivary counts show relatively low levels of mutacins (22% and 15%, respectively). Moreover, participants with low salivary counts showed high expression levels of genes encoding mutacins, a result that correlates with the strong antimicrobial activity of the group. Participants with moderate and high salivary counts, however, show low expression levels of mutacin-related genes and little antimicrobial activity. No BLIS were detected in any of the groups studied. S. mutans isolated from the saliva of participants with low bacterial counts has significant antibacterial capacity compared to that of participants with moderate and high salivary counts. The superior lethality of S. mutans in participants with low salivary counts is likely due to the augmented expression of mutacin-related genes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. A simple and robust statistical framework for planning, analysing and interpreting faecal egg count reduction test (FECRT) studies

    DEFF Research Database (Denmark)

    Denwood, M.J.; McKendrick, I.J.; Matthews, L.

    Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has...... been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power...... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple...
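The idea of working from a few simple summary statistics can be illustrated with a delta-method sketch for the observed efficacy, one minus the ratio of post- to pre-treatment mean counts. This is not the paper's exact test, only a minimal example of the summary-statistics approach:

```python
import numpy as np

def fecr_summary(pre_counts, post_counts):
    """Observed faecal egg count reduction with a rough delta-method CI.

    Uses only simple summary statistics (means, variances, sample sizes)
    of independent pre- and post-treatment groups, in the spirit of the
    framework described. Illustrative only; the paper's hypothesis tests
    differ.
    """
    pre = np.asarray(pre_counts, float)
    post = np.asarray(post_counts, float)
    r = post.mean() / pre.mean()
    # delta-method variance of a ratio of independent sample means
    var_r = r**2 * (pre.var(ddof=1) / (len(pre) * pre.mean()**2)
                    + post.var(ddof=1) / (len(post) * post.mean()**2))
    se = np.sqrt(var_r)
    return {'efficacy': 1.0 - r,
            'ci95': (1.0 - (r + 1.96 * se), 1.0 - (r - 1.96 * se))}
```

With noiseless counts of 100 pre-treatment and 5 post-treatment, the observed efficacy is 95% with a degenerate (zero-width) interval.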

  11. High energy X-ray photon counting imaging using linear accelerator and silicon strip detectors

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Y., E-mail: cycjty@sophie.q.t.u-tokyo.ac.jp [Department of Bioengineering, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Shimazoe, K.; Yan, X. [Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Ueda, O.; Ishikura, T. [Fuji Electric Co., Ltd., Fuji, Hino, Tokyo 191-8502 (Japan); Fujiwara, T. [National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568 (Japan); Uesaka, M.; Ohno, M. [Nuclear Professional School, the University of Tokyo, 2-22 Shirakata-shirane, Tokai, Ibaraki 319-1188 (Japan); Tomita, H. [Department of Quantum Engineering, Nagoya University, Furo, Chikusa, Nagoya 464-8603 (Japan); Yoshihara, Y. [Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Takahashi, H. [Department of Bioengineering, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2016-09-11

    A photon counting imaging detector system for high energy X-rays is developed for on-site non-destructive testing of thick objects. One-dimensional silicon strip (1 mm pitch) detectors are stacked to form a two-dimensional edge-on module. Each detector is connected to a 48-channel application specific integrated circuit (ASIC). The threshold-triggered events are recorded by a field programmable gate array based counter in each channel. The detector prototype is tested using 950 kV linear accelerator X-rays. The fast CR shaper (300 ns pulse width) of the ASIC makes it possible to deal with the high instant count rate during the 2 μs beam pulse. The preliminary imaging results of several metal and concrete samples are demonstrated.

  12. High energy X-ray photon counting imaging using linear accelerator and silicon strip detectors

    International Nuclear Information System (INIS)

    Tian, Y.; Shimazoe, K.; Yan, X.; Ueda, O.; Ishikura, T.; Fujiwara, T.; Uesaka, M.; Ohno, M.; Tomita, H.; Yoshihara, Y.; Takahashi, H.

    2016-01-01

    A photon counting imaging detector system for high energy X-rays is developed for on-site non-destructive testing of thick objects. One-dimensional silicon strip (1 mm pitch) detectors are stacked to form a two-dimensional edge-on module. Each detector is connected to a 48-channel application specific integrated circuit (ASIC). The threshold-triggered events are recorded by a field programmable gate array based counter in each channel. The detector prototype is tested using 950 kV linear accelerator X-rays. The fast CR shaper (300 ns pulse width) of the ASIC makes it possible to deal with the high instant count rate during the 2 μs beam pulse. The preliminary imaging results of several metal and concrete samples are demonstrated.

  13. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  14. A high-throughput, multi-channel photon-counting detector with picosecond timing

    International Nuclear Information System (INIS)

    Lapington, J.S.; Fraser, G.W.; Miller, G.M.; Ashton, T.J.R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-01-01

    High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies; small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  15. Fluorescence decay data analysis correcting for detector pulse pile-up at very high count rates

    Science.gov (United States)

    Patting, Matthias; Reisch, Paja; Sackrow, Marcus; Dowler, Rhys; Koenig, Marcelle; Wahl, Michael

    2018-03-01

    The speed of fluorescence lifetime measurements by time-correlated single photon counting is usually limited by pile-up. With modern instrumentation this limitation can be lifted significantly, but artifacts due to frequent merging of closely spaced detector pulses (detector pulse pile-up) remain an issue to be addressed. We propose a data analysis method correcting for this type of artifact and the resulting systematic errors. It physically models the photon losses due to detector pulse pile-up and incorporates the loss in the decay fit model employed to obtain fluorescence lifetimes and relative amplitudes of the decay components. Comparison of results with and without this correction shows a significant reduction of systematic errors at count rates approaching the excitation rate. This allows quantitatively accurate fluorescence lifetime imaging at very high frame rates.
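For context, the classical single-photon pile-up correction (as distinct from the detector-pulse merging this record addresses) rescales each histogram bin by the fraction of excitation cycles still "live" when that bin starts, since only the first photon per cycle is recorded. A textbook sketch:

```python
import numpy as np

def pileup_corrected(histogram, n_excitation_cycles):
    """Classical single-photon pile-up correction for a TCSPC histogram.

    Only the first photon per excitation cycle is recorded, so late time
    bins are depleted; each bin is divided by the fraction of cycles not
    yet 'consumed' by an earlier photon. Shown for context only; the
    paper's method models detector-pulse merging instead.
    """
    h = np.asarray(histogram, float)
    earlier = np.concatenate(([0.0], np.cumsum(h)[:-1]))   # counts before each bin
    live_fraction = 1.0 - earlier / n_excitation_cycles
    return h / live_fraction
```

The first bin is unchanged (no earlier photons can suppress it), while later bins are scaled up progressively.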

  16. Photon-counting digital radiography using high-pressure xenon filled detectors

    CERN Document Server

    Li, Maozhen; Johns, P C

    2001-01-01

    Digital radiography overcomes many of the limitations of the traditional screen/film system. Further enhancements in the digital radiography image are possible if the X-ray image receptor can measure the energy of individual photons instead of simply integrating their energy, as is the case at present. A prototype photon-counting scanned projection radiography system has been constructed, which combines a Gas Electron Multiplier (GEM) and a Gas Microstrip Detector (GMD) using Xe:CH4 (90:10) at high pressure. With the gain contribution from the GEM, the GMD can be operated at lower and safer voltages, making the imaging system more reliable. Good energy resolution, and spatial resolution comparable to that of screen/film, have been demonstrated for the GEM/GMD hybrid imaging system in photon counting mode for X-ray spectra up to 50 kV.

  17. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    Science.gov (United States)

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  18. Students’ Learning Obstacles and Alternative Solution in Counting Rules Learning Levels Senior High School

    Directory of Open Access Journals (Sweden)

    M A Jatmiko

    2017-12-01

    Counting rules are a topic in senior high school mathematics. In the learning process, teachers often find students who have difficulties with this topic. Knowing the characteristics of students' learning difficulties and analyzing their causes is important for the teacher, both as an effort to reflect on the learning process and as a reference for constructing alternative learning solutions that anticipate students' learning obstacles. This study uses qualitative methods and involves 70 students of class XII as research subjects. The data collection techniques used in this study are a diagnostic test instrument on learning difficulties in counting rules, observation, and interviews. The data are used to identify the learning difficulties experienced by students and their causes, and to develop alternative learning solutions. From the diagnostic test results, the researcher found several obstacles faced by students: for example, students become confused when stating definitions, and students have difficulty understanding the procedure for applying the multiplication rule. Based on these problems, the researcher analyzed the causes of the difficulties and constructed a hypothetical learning trajectory as an alternative solution for teaching counting rules.

  19. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this study formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in a second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  20. High energy behaviour of particles and unified statistics

    International Nuclear Information System (INIS)

    Chang, Y.

    1984-01-01

    Theories and experiments suggest that particles at high energy appear to possess a new statistics unifying Bose-Einstein and Fermi-Dirac statistics via the GAMMA distribution. This hypothesis can be obtained from many models, and agrees quantitatively with scaling, the multiplicity, large transverse momentum, the mass spectrum, and other data. It may be applied to scatterings at high energy, and agrees with experiments and known QED results. The Veneziano model and other theories have implied new statistics, such as the B distribution and the Polya distribution. These revert to the GAMMA distribution at high energy. The possible inapplicability of Pauli's exclusion principle within the unified statistics is considered and associated with the quark constituents

  1. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-scale approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  2. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
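The interplay between measured cumulants and their estimated statistical uncertainties can be sketched with a toy calculation (illustrative only: the binomial "event" samples and the bootstrap error estimate below are generic stand-ins, not the analysis used in the paper):

```python
import random

def cumulants(samples):
    """Cumulants C1..C4 from central moments; C4 = mu4 - 3*mu2^2."""
    n = len(samples)
    mean = sum(samples) / n
    mu2, mu3, mu4 = (sum((x - mean) ** k for x in samples) / n for k in (2, 3, 4))
    return mean, mu2, mu3, mu4 - 3 * mu2 * mu2

def bootstrap_errors(samples, n_boot=50, seed=1):
    """Statistical uncertainty of each cumulant via bootstrap resampling."""
    rng = random.Random(seed)
    n = len(samples)
    draws = [cumulants([samples[rng.randrange(n)] for _ in range(n)])
             for _ in range(n_boot)]
    errs = []
    for i in range(4):
        vals = [d[i] for d in draws]
        m = sum(vals) / n_boot
        errs.append((sum((v - m) ** 2 for v in vals) / n_boot) ** 0.5)
    return errs

rng = random.Random(42)
# Toy "conserved charge per event": binomial(20, 0.5), 5000 events.
events = [sum(rng.random() < 0.5 for _ in range(20)) for _ in range(5000)]
c1, c2, c3, c4 = cumulants(events)
errors = bootstrap_errors(events)
```

With a fixed number of events, the estimated errors grow rapidly with the cumulant order, which is the effect behind the paper's statistics requirements for high cumulants.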

  3. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  4. Counting on Early Math Skills: Preliminary Kindergarten Impacts of the Making Pre-K Count and High 5s Programs

    Science.gov (United States)

    Mattera, Shira; Morris, Pamela

    2017-01-01

    Early math ability is one of the best predictors of children's math and reading skills into late elementary school. Children with stronger math proficiency in elementary school, in turn, are more likely to graduate from high school and attend college. However, early math skills have not historically been a major focus of instruction in preschool…

  5. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    Science.gov (United States)

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of single-molecule detection and reporting. We cover the basics of target quantification and the sources of imprecision. We describe the basic test concepts in the context of dPCR: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and what factors may influence overall test performance in practice.
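The quantification step underlying dPCR is Poisson statistics over partitions: if a fraction p of partitions is positive, the mean number of target copies per partition is lambda = -ln(1 - p). A minimal sketch (the partition counts below are hypothetical, and the confidence interval uses a simple normal approximation):

```python
import math

def dpcr_lambda(positives: int, total: int):
    """Mean copies per partition and its standard error under the Poisson model."""
    p = positives / total
    lam = -math.log(1.0 - p)
    # Propagate binomial sampling error of p through dlam/dp = 1/(1-p).
    se = math.sqrt(p / ((1.0 - p) * total))
    return lam, se

# Hypothetical run: 20,000 partitions, 1,500 of them positive.
lam, se = dpcr_lambda(1500, 20000)
copies_loaded = lam * 20000                     # expected copies in the reaction
ci95 = (lam - 1.96 * se, lam + 1.96 * se)       # approximate 95% interval
```

Note that lambda exceeds the positive fraction (0.078 vs. 0.075) because some positive partitions hold more than one copy, which is exactly the multiple-occupancy effect the Poisson correction accounts for.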

  6. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
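The sensitivity of higher-order counting statistics to fission-chain correlations can be illustrated with the classic variance-to-mean excess (Feynman-Y style); this is a generic sketch, not the report's specific source-driven signatures, and the "chain" model below is a crude stand-in:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (fine for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def excess_variance(counts):
    """Variance-to-mean ratio minus one: zero for pure Poisson counts,
    positive when counts arrive in correlated bursts (e.g. fission chains)."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean - 1.0

rng = random.Random(7)
uncorrelated = [poisson(5.0, rng) for _ in range(20000)]
# Crude correlated stand-in: every trigger deposits a burst of 2 counts,
# same mean count per gate as above.
correlated = [2 * poisson(2.5, rng) for _ in range(20000)]
y_poisson = excess_variance(uncorrelated)
y_chains = excess_variance(correlated)
```

The second-order excess already separates the two cases at equal mean rate; higher-order moments sharpen this separation further, which is the sensitivity-to-reactivity effect the report exploits.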

  7. Statistical behavior of high doses in medical radiodiagnosis

    International Nuclear Information System (INIS)

    Barboza, Adriana Elisa

    2014-01-01

    The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) working in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose management database of IRD/CNEN-RJ, Brazil. Identifying these states allows the responsible Sanitary Surveillance (VISA) agency to become aware of such events and to work on programs to reduce them. (author)

  8. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  9. A cell-based high-throughput screening assay for radiation susceptibility using automated cell counting

    International Nuclear Information System (INIS)

    Hodzic, Jasmina; Dingjan, Ilse; Maas, Mariëlle JP; Meulen-Muileman, Ida H van der; Menezes, Renee X de; Heukelom, Stan; Verheij, Marcel; Gerritsen, Winald R; Geldof, Albert A; Triest, Baukelien van; Beusechem, Victor W van

    2015-01-01

    Radiotherapy is one of the mainstays in the treatment for cancer, but its success can be limited due to inherent or acquired resistance. Mechanisms underlying radioresistance in various cancers are poorly understood and available radiosensitizers have shown only modest clinical benefit. There is thus a need to identify new targets and drugs for more effective sensitization of cancer cells to irradiation. Compound and RNA interference high-throughput screening technologies allow comprehensive enterprises to identify new agents and targets for radiosensitization. However, the gold standard assay to investigate radiosensitivity of cancer cells in vitro, the colony formation assay (CFA), is unsuitable for high-throughput screening. We developed a new high-throughput screening method for determining radiation susceptibility. Fast and uniform irradiation of batches up to 30 microplates was achieved using a Perspex container and a clinically employed linear accelerator. The readout was done by automated counting of fluorescently stained nuclei using the Acumen eX3 laser scanning cytometer. Assay performance was compared to that of the CFA and the CellTiter-Blue homogeneous uniform-well cell viability assay. The assay was validated in a whole-genome siRNA library screening setting using PC-3 prostate cancer cells. On 4 different cancer cell lines, the automated cell counting assay produced radiation dose response curves that followed a linear-quadratic equation and that exhibited a better correlation to the results of the CFA than did the cell viability assay. Moreover, the cell counting assay could be used to detect radiosensitization by silencing DNA-PKcs or by adding caffeine. In a high-throughput screening setting, using 4 Gy irradiated and control PC-3 cells, the effects of DNA-PKcs siRNA and non-targeting control siRNA could be clearly discriminated. We developed a simple assay for radiation susceptibility that can be used for high-throughput screening. This will aid…
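The linear-quadratic dose response mentioned above, S(D) = exp(-(alpha*D + beta*D^2)), can be fitted by linear least squares on ln S. This is a generic sketch with made-up alpha and beta values, not the study's fitting procedure:

```python
import math

def fit_linear_quadratic(doses, survival):
    """Least-squares fit of ln S = -(alpha*D + beta*D^2).
    Columns of the design matrix are D and D^2; solve the 2x2 normal equations."""
    y = [math.log(s) for s in survival]
    s11 = sum(d ** 2 for d in doses)
    s12 = sum(d ** 3 for d in doses)
    s22 = sum(d ** 4 for d in doses)
    b1 = sum(d * yi for d, yi in zip(doses, y))
    b2 = sum(d * d * yi for d, yi in zip(doses, y))
    det = s11 * s22 - s12 * s12
    a1 = (s22 * b1 - s12 * b2) / det   # coefficient of D   -> -alpha
    a2 = (s11 * b2 - s12 * b1) / det   # coefficient of D^2 -> -beta
    return -a1, -a2

# Synthetic survival data from assumed parameters (illustration only).
alpha_true, beta_true = 0.3, 0.03
doses = [0.0, 1.0, 2.0, 4.0, 6.0, 8.0]          # Gy
surv = [math.exp(-(alpha_true * d + beta_true * d * d)) for d in doses]
alpha, beta = fit_linear_quadratic(doses, surv)  # recovers 0.3, 0.03
```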

  10. HIGH-RESOLUTION IMAGING OF THE ATLBS REGIONS: THE RADIO SOURCE COUNTS

    Energy Technology Data Exchange (ETDEWEB)

    Thorat, K.; Subrahmanyan, R.; Saripalli, L.; Ekers, R. D., E-mail: kshitij@rri.res.in [Raman Research Institute, C. V. Raman Avenue, Sadashivanagar, Bangalore 560080 (India)

    2013-01-01

    The Australia Telescope Low-brightness Survey (ATLBS) regions have been mosaic imaged at a radio frequency of 1.4 GHz with 6'' angular resolution and 72 μJy beam^-1 rms noise. The images (centered at R.A. 00h35m00s, decl. -67°00'00'' and R.A. 00h59m17s, decl. -67°00'00'', J2000 epoch) cover 8.42 deg^2 of sky area and have no artifacts or imaging errors above the image thermal noise. Multi-resolution radio and optical r-band images (made using the 4 m CTIO Blanco telescope) were used to recognize multi-component sources and prepare a source list; the detection threshold was 0.38 mJy in a low-resolution radio image made with a beam FWHM of 50''. Radio source counts in the flux density range 0.4-8.7 mJy are estimated, with corrections applied for noise bias, effective area, and resolution bias. The resolution bias is mitigated using low-resolution radio images, while effects of source confusion are removed by using high-resolution images for identifying blended sources. Below 1 mJy the ATLBS counts are systematically lower than the previous estimates. Showing no evidence for an upturn down to 0.4 mJy, they do not require any changes in the radio source population down to the limit of the survey. The work suggests that automated image analysis for counts may be dependent on the ability of the imaging to reproduce connecting emission with low surface brightness and on the ability of the algorithm to recognize sources, which may require that source finding algorithms effectively work with multi-resolution and multi-wavelength data. The work underscores the importance of using source lists, as opposed to component lists, and of correcting for the noise bias in order to precisely estimate counts close to the image noise and determine the upturn at sub-mJy flux density.

  11. Gating circuit for single photon-counting fluorescence lifetime instruments using high repetition pulsed light sources

    International Nuclear Information System (INIS)

    Laws, W.R.; Potter, D.W.; Sutherland, J.C.

    1984-01-01

    We have constructed a circuit that permits conventional timing electronics to be used in single photon-counting fluorimeters with high repetition rate excitation sources (synchrotrons and mode-locked lasers). Most commercial time-to-amplitude and time-to-digital converters introduce errors when processing very short time intervals and when subjected to high-frequency signals. This circuit reduces the frequency of signals representing the pulsed light source (stops) to the rate of detected fluorescence events (starts). Precise timing between the start/stop pair is accomplished by using the second stop pulse after a start pulse. Important features of our design are that the circuit is insensitive to the simultaneous occurrence of start and stop signals and that the reduction in the stop frequency allows the start/stop time interval to be placed in linear regions of the response functions of commercial timing electronics

  12. Mu-Spec - A High Performance Ultra-Compact Photon Counting spectrometer for Space Submillimeter Astronomy

    Science.gov (United States)

    Moseley, H.; Hsieh, W.-T.; Stevenson, T.; Wollack, E.; Brown, A.; Benford, D.; Sadleir; U-Yen, I.; Ehsan, N.; Zmuidzinas, J.

    2011-01-01

    We have designed and are testing elements of a fully integrated submillimeter spectrometer based on superconducting microstrip technology. The instrument can offer a resolving power of approximately R = 1500, and its high-frequency cutoff is set by the gap of available high-performance superconductors. All functions of the spectrometer are integrated: light is coupled to the microstrip circuit with a planar antenna, spectral discrimination is achieved using a synthetic grating, orders are separated using a planar filter, and light is detected using photon-counting MKID detectors. This spectrometer promises to revolutionize submillimeter spectroscopy from space. It replaces instruments on the scale of 1 m with a spectrometer on a 10 cm Si wafer. The reduction in mass and volume promises a much higher performance system within the available resources of a space mission. We describe the system and the performance of the components that have been fabricated and tested.

  13. High count-rate study of two TES x-ray microcalorimeters with different transition temperatures

    Science.gov (United States)

    Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.; Smith, Stephen J.; Wassell, Edward J.

    2017-10-01

    We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures (Tc). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate, where the thermal conductance of the microcalorimeter is set by the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different Tc had very different thermal decay times, approximately one order of magnitude apart. In an earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from the lower- and higher-Tc devices, respectively, using a standard analysis method based on optimal filtering in the low-flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high-flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution at a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays at 1025 Hz per pixel with the higher-Tc (faster) device, and 5.8 eV FWHM with 97% throughput at 722 Hz with the lower-Tc (slower) device.

  14. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  15. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  16. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  19. Achieving a high mode count in the exact electromagnetic simulation of diffractive optical elements.

    Science.gov (United States)

    Junker, André; Brenner, Karl-Heinz

    2018-03-01

    The application of rigorous optical simulation algorithms, both in the modal and in the time domain, is known to be limited to the nano-optical scale due to severe computing time and memory constraints. This is true even for today's high-performance computers. To address this problem, we develop the fast rigorous iterative method (FRIM), an algorithm based on an iterative approach which, under certain conditions, also allows solving large-size problems approximation-free. We achieve this in the case of a modal representation by avoiding the computationally complex eigenmode decomposition. Thereby, the numerical cost is reduced from O(N^3) to O(N log N), enabling the simulation of structures such as certain diffractive optical elements with a significantly higher mode count than presently possible. Apart from speed, another major advantage of the iterative FRIM over standard modal methods is the possibility to trade runtime against accuracy.

  20. Signal shaping and tail cancellation for gas proportional detectors at high counting rates

    International Nuclear Information System (INIS)

    Boie, R.A.; Hrisoho, A.T.; Rehak, P.

    1982-01-01

    A low-noise, wide-bandwidth preamplifier and signal processing filter were developed for high counting rate proportional counters. The filter consists of a seven-pole Gaussian integrator with symmetrical weighting function and continuously variable shaping time τs of 8-50 ns (FWHM), preceded by a second-order pole/zero circuit which cancels the long (1/t) tails of the chamber signals. The preamplifier is an optimized common-base input design with 2 ns rise time and an equivalent noise input charge < 2000 r.m.s. electrons when connected to a chamber with 10 pF capacitance at a filtering time τs of 10 ns. (orig.)

  1. Two dimensional localization of electrons and positrons under high counting rate

    International Nuclear Information System (INIS)

    Barbosa, A.F.; Anjos, J.C.; Sanchez-Hernandez, A.; Pepe, I.M.; Barros, N.

    1997-12-01

    The construction of two wire chambers for the experiment E831 at Fermilab is reported. Each chamber includes three wire planes, one anode and two orthogonal cathodes, in which the wires operate as independent proportional counters. One of the chambers is rotated with respect to the other, so that four position coordinates may be encoded for a charged particle crossing both chambers. Spatial resolution is determined by the wire pitch: 1 mm for cathodes, 2 mm for anodes. 320 electronic channels are involved in the detection system readout. Global counting rates in excess of 10^7 events per second have been measured, while the average electron-positron beam intensity may be as high as 3 x 10^7 events per second. (author)

  2. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  3. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO₂ emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees

  4. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO₂ emissions; Electricity supply; Energy imports by country of origin in January-June 2000; Energy exports by recipient country in January-June 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  5. ChromAIX2: A large area, high count-rate energy-resolving photon counting ASIC for a Spectral CT Prototype

    Science.gov (United States)

    Steadman, Roger; Herrmann, Christoph; Livne, Amir

    2017-08-01

    Spectral CT based on energy-resolving photon counting detectors is expected to deliver additional diagnostic value at a lower dose than current state-of-the-art CT [1]. The capability of simultaneously providing a number of spectrally distinct measurements not only allows distinguishing between photo-electric and Compton interactions but also discriminating contrast agents that exhibit a K-edge discontinuity in the absorption spectrum, referred to as K-edge imaging [2]. Such detectors are based on direct-converting sensors (e.g. CdTe or CdZnTe) and high-rate photon counting electronics. To support the development of Spectral CT and show the feasibility of obtaining rates exceeding 10 Mcps/pixel (Poissonian observed count-rate), the ChromAIX ASIC has previously been reported, showing 13.5 Mcps/pixel (150 Mcps/mm² incident) [3]. The ChromAIX has been improved to allow large-area detector coverage and increased overall performance. The new ASIC, called ChromAIX2, delivers count-rates exceeding 15 Mcps/pixel with an rms noise performance of approximately 260 e⁻. It has an isotropic pixel pitch of 500 μm in an array of 22×32 pixels and is tileable on three of its sides. The pixel topology consists of a two-stage amplifier (CSA and shaper) and a number of test features that allow thorough characterization of the ASIC without a sensor. A total of 5 independent thresholds are available within each pixel, allowing 5 spectrally distinct measurements to be acquired simultaneously. The ASIC also incorporates a baseline restorer to eliminate excess currents induced by the sensor (e.g. dark current and low-frequency drifts) which would otherwise cause an energy estimation error. In this paper we report on the inherent electrical performance of the ChromAIX2 as well as measurements obtained with CdZnTe (CZT)/CdTe sensors and X-rays and radioactive sources.

  6. Experimental characterization of the COndensation PArticle counting System for high altitude aircraft-borne application

    Directory of Open Access Journals (Sweden)

    S. Borrmann

    2009-06-01

    A characterization of the ultra-fine aerosol particle counter COPAS (COndensation PArticle counting System) for operation on board the Russian high altitude research aircraft M-55 Geophysika is presented. The COPAS instrument consists of an aerosol inlet and two dual-channel continuous flow Condensation Particle Counters (CPCs) operated with the chlorofluorocarbon FC-43. It operates at pressures between 400 and 50 hPa for aerosol detection in the particle diameter (dp) range from 6 nm up to 1 μm. The aerosol inlet, designed for the M-55, is characterized with respect to aspiration, transmission, and transport losses. The experimental characterization of the counting efficiencies of three CPCs yields dp50 (50% detection particle diameter) values of 6 nm, 11 nm, and 15 nm at temperature differences (ΔT) between saturator and condenser of 17°C, 30°C, and 33°C, respectively. Non-volatile particles are quantified with a fourth CPC with dp50 = 11 nm. It includes an aerosol heating line (250°C) to evaporate H₂SO₄-H₂O particles of 11 nm < dp < 200 nm at pressures between 70 and 300 hPa. An instrumental in-flight inter-comparison of the different COPAS CPCs yields correlation coefficients of 0.996 and 0.985. The particle emission index for the M-55, in the range of 1.4-8.4×10¹⁶ kg⁻¹ of fuel burned, has been estimated based on measurements of the Geophysika's own exhaust.

  7. Determination of Np, Pu and Am in high level radioactive waste with extraction-liquid scintillation counting

    International Nuclear Information System (INIS)

    Yang Dazhu; Zhu Yongjun; Jiao Rongzhou

    1994-01-01

    A new method for the determination of the transuranium elements Np, Pu and Am with extraction-liquid scintillation counting has been studied systematically. Procedures for the separation of Pu and Am by HDEHP-TRPO extraction and for the separation of Np by TTA-TiOA extraction have been developed, by which the recovery of Np, Pu and Am is 97%, 99% and 99%, respectively, and the decontamination factors for the major fission products (⁹⁰Sr, ¹³⁷Cs, etc.) are 10⁴-10⁶. The pulse shape discrimination (PSD) technique has been introduced to liquid scintillation counting, by which the counting efficiency for α-activity is >99% and the rejection of β-counts is >99.95%. This new method, combining extraction and pulse shape discrimination with the liquid scintillation technique, has been successfully applied to the assay of Np, Pu and Am in high level radioactive waste. (author) 7 refs.; 7 figs.; 4 tabs

  8. Relationship of long-term highly active antiretroviral therapy on salivary flow rate and CD4 Count among HIV-infected patients.

    Science.gov (United States)

    Kumar, J Vijay; Baghirath, P Venkat; Naishadham, P Parameswar; Suneetha, Sujai; Suneetha, Lavanya; Sreedevi, P

    2015-01-01

    To determine if long-term highly active antiretroviral therapy (HAART) alters salivary flow rate and also to compare its relation to CD4 count with unstimulated and stimulated whole saliva. A cross-sectional study was performed on 150 individuals divided into three groups. Group I (50 human immunodeficiency virus (HIV) seropositive patients, but not on HAART therapy), Group II (50 HIV-infected subjects on HAART for less than 3 years, called short-term HAART), Group III (50 HIV-infected subjects on HAART for more than or equal to 3 years, called long-term HAART). The spitting method proposed by Navazesh and Kumar was used for the measurement of unstimulated and stimulated salivary flow rate. The chi-square test and analysis of variance (ANOVA) were used for statistical analysis. The mean CD4 count was 424.78 ± 187.03, 497.82 ± 206.11 and 537.6 ± 264.00 in the respective groups. The majority of the patients in all the groups had a CD4 count between 401 and 600. Both unstimulated and stimulated whole salivary (UWS and SWS) flow rates in Group I were found to be significantly higher than in Group II (P flow rate between Group II and III subjects were also found to be statistically significant (P relationship in Group II (P flow rates of HIV-infected individuals who are on long-term HAART.

  9. RBC count

    Science.gov (United States)

    ... by kidney disease) RBC destruction (hemolysis) due to transfusion, blood vessel injury, or other cause Leukemia Malnutrition Bone ... slight risk any time the skin is broken) Alternative Names: Erythrocyte count; Red blood cell count; Anemia - RBC count ...

  10. Population-based CD4 counts in a rural area in South Africa with high HIV prevalence and high antiretroviral treatment coverage.

    Directory of Open Access Journals (Sweden)

    Abraham Malaza

    Little is known about the variability of CD4 counts in the general population of sub-Saharan African countries affected by the HIV epidemic. We investigated factors associated with CD4 counts in a rural area in South Africa with high HIV prevalence and high antiretroviral treatment (ART) coverage. CD4 counts, health status, body mass index (BMI), demographic characteristics and HIV status were assessed in 4990 adult resident participants of a demographic surveillance in rural KwaZulu-Natal in South Africa; antiretroviral treatment duration was obtained from a linked clinical database. Multivariable regression analysis, overall and stratified by HIV status, was performed with CD4 count levels as the outcome. Median CD4 counts were significantly higher in women than in men overall (714 vs. 630 cells/µl, p<0.0001), both in HIV-uninfected (833 vs. 683 cells/µl, p<0.0001) and HIV-infected adults (384.5 vs. 333 cells/µl, p<0.0001). In multivariable regression analysis, women had 19.4% (95% confidence interval (CI) 16.1-22.9) higher CD4 counts than men, controlling for age, HIV status, urban/rural residence, household wealth, education, BMI, self-reported tuberculosis, high blood pressure, other chronic illnesses and sample processing delay. At ART initiation, HIV-infected adults had 21.7% (95% CI 14.6-28.2) lower CD4 counts than treatment-naive individuals; CD4 counts were estimated to increase by 9.2% (95% CI 6.2-12.4) per year of treatment. CD4 counts are primarily determined by sex in HIV-uninfected adults, and by sex, age and duration of antiretroviral treatment in HIV-infected adults. Lower CD4 counts at ART initiation in men could be a consequence of lower CD4 cell counts before HIV acquisition.
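
    The style of analysis behind these figures - a multivariable regression on log-transformed CD4 counts, so that coefficients convert to percent differences - can be sketched on synthetic data. The covariates, effect sizes and sample below are invented for illustration; this is not the study's dataset or full model (which also adjusted for residence, wealth, education, BMI and illness).

```python
import numpy as np

# Synthetic cohort: log CD4 depends on sex and years on ART (invented effects).
rng = np.random.default_rng(1)
n = 500
female = rng.integers(0, 2, n)
art_years = rng.uniform(0, 5, n)
log_cd4 = 6.3 + 0.18 * female + 0.09 * art_years + rng.normal(0, 0.2, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), female, art_years])
beta, *_ = np.linalg.lstsq(X, log_cd4, rcond=None)

# On the log scale, a coefficient b corresponds to a (e^b - 1) * 100 percent
# difference in the CD4 count itself.
pct_female = (np.exp(beta[1]) - 1) * 100    # women vs. men
pct_per_year = (np.exp(beta[2]) - 1) * 100  # per year of treatment
```

    With these invented effects, the fitted percent differences land near the 19.4% (sex) and 9.2% (per treatment year) figures quoted in the abstract, illustrating how such numbers arise from log-scale coefficients.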

  11. Reliability of spring interconnects for high channel-count polyimide electrode arrays

    Science.gov (United States)

    Khan, Sharif; Ordonez, Juan Sebastian; Stieglitz, Thomas

    2018-05-01

    Active neural implants with a high channel count need robust and reliable operational assembly for the targeted environment in order to be classified as viable fully implantable systems. The discrete functionality of the electrode array and the implant electronics is vital for an intact assembly. A critical interface exists at the interconnection sites between the electrode array and the implant electronics, especially in hybrid assemblies (e.g. retinal implants) where electrodes and electronics are not on the same substrate. Since the interconnects in such assemblies cannot be hermetically sealed, reliable protection against the physiological environment is essential for delivering high insulation resistance and low diffusibility of salt ions, which are limited in complexity by current assembly techniques. This work reports on a combination of spring-type interconnects on a polyimide array with silicone rubber gasket insulation for chronically active implantable systems. The spring design of the interconnects on the back end of the electrode array compensates for the uniform thickness of the sandwiched gasket during bonding in assembly and relieves the propagation of extrinsic stresses to the bulk polyimide substrate. The contact resistance of the microflex-bonded spring interconnects with the underlying metallized ceramic test vehicles, and the insulation through the gasket between adjacent contacts, were investigated against the MIL883 standard. The contact and insulation resistances remained stable under the demanding environmental test conditions.

  12. Optimization of the ATLAS (s)MDT readout electronics for high counting rates

    Energy Technology Data Exchange (ETDEWEB)

    Kortner, Oliver; Kroha, Hubert; Nowak, Sebastian; Schmidt-Sommerfeld, Korbinian [Max-Planck-Institut fuer Physik (Werner-Heisenberg-Institut), Foehringer Ring 6, 80805 Muenchen (Germany)

    2016-07-01

    In the ATLAS muon spectrometer, Monitored Drift Tube (MDT) chambers are used for precise muon track measurement. For the high background rates expected at the HL-LHC, which are mainly due to neutrons and photons produced by interactions of the proton collision products in the detector and shielding, new small-diameter muon drift tube (sMDT) chambers with half the drift tube diameter of the MDT chambers and ten times higher rate capability have been developed. The standard MDT readout electronics uses bipolar shaping in front of a discriminator. This shaping leads to an undershoot of equal charge but opposite polarity following each pulse. With increasing count rates, the probability that a subsequent pulse falls into this undershoot also increases, which causes a deterioration of signal pulses by preceding background hits, leading to losses in muon efficiency and drift tube spatial resolution. In order to mitigate these so-called signal pile-up effects, new readout electronics with active baseline restoration (BLR) is under development. Discrete prototype electronics with BLR functionality has been tested in laboratory measurements, in muon beam time measurements, and in the Gamma Irradiation Facility at CERN under high γ-irradiation rates. Results of the measurements are presented.

  13. Toward an understanding of the building blocks: constructing programs for high processor count systems

    International Nuclear Information System (INIS)

    Reilly, M H

    2008-01-01

    Technology and industry trends have clearly shown that the future of technical computing lies in exploitation of more processors in larger multiprocessor systems. Exploitation of high processor count architectures demands a more thorough understanding of the underlying system dynamics and an accounting for them in the design of high-performance applications. Currently these dynamics are incompletely described by the widely adopted benchmarks and kernel metrics. Systems are most often characterized to allow comparisons and ranking. Often the characterizations are in the form of a scalar measure of some aspect of system performance that is a 'not to exceed' number: the maximum possible level of performance that could be attained. While such comparisons typically drive both system design and procurement, more useful characterizations can be used to drive application development and design. This paper explores a few of these measures and presents a few simple examples of their application. The first set of metrics addresses individual processor performance, specifically performance related to memory references. The second set of metrics attempts to describe the behavior of the message-passing system under load and across a range of conditions

  14. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  15. A Near-Infrared Photon Counting Camera for High Sensitivity Astronomical Observation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a Near Infrared Photon-Counting Sensor (NIRPCS), an imaging device with sufficient sensitivity to capture the spectral signatures, in the...

  16. A Near-Infrared Photon Counting Camera for High Sensitivity Astronomical Observation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a Near Infrared Photon-Counting Sensor (NIRPCS), an imaging device with sufficient sensitivity to capture the spectral signatures, in the...

  17. Highly Sensitive Photon Counting Detectors for Deep Space Optical Communications, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A new type of a photon-counting photodetector is proposed to advance the state-of the-art in deep space optical communications technology. The proposed detector...

  18. High Resolution Gamma Ray Spectroscopy at MHz Counting Rates With LaBr3 Scintillators for Fusion Plasma Applications

    Science.gov (United States)

    Nocente, M.; Tardocchi, M.; Olariu, A.; Olariu, S.; Pereira, R. C.; Chugunov, I. N.; Fernandes, A.; Gin, D. B.; Grosso, G.; Kiptily, V. G.; Neto, A.; Shevelev, A. E.; Silva, M.; Sousa, J.; Gorini, G.

    2013-04-01

    High resolution γ-ray spectroscopy measurements at MHz counting rates were carried out at nuclear accelerators, combining a LaBr₃(Ce) detector with dedicated hardware and software solutions based on digitization and off-line analysis. Spectra were measured at counting rates up to 4 MHz, with little or no degradation of the energy resolution, adopting a pile-up rejection algorithm. The reported results represent a step forward towards the final goal of high resolution γ-ray spectroscopy measurements on a burning plasma device.
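
    The pile-up rejection idea can be illustrated with a deliberately crude sketch: locate above-threshold peaks in a digitized trace and accept only pulses with no second peak within a fixed window. The actual off-line algorithm works on full pulse shapes; the threshold and window values here are arbitrary illustrative choices.

```python
import numpy as np

def reject_pileup(trace, threshold, window):
    """Return (all_peaks, accepted_peaks) for a digitized pulse trace.

    A peak is a local maximum above `threshold`; a pulse is accepted only
    if no other peak lies within `window` samples (i.e. no pile-up).
    """
    peaks = [i for i in range(1, len(trace) - 1)
             if trace[i] > threshold
             and trace[i] >= trace[i - 1] and trace[i] > trace[i + 1]]
    accepted = [p for p in peaks
                if all(abs(p - q) > window for q in peaks if q != p)]
    return peaks, accepted

# One clean pulse at sample 20, two piled-up pulses at samples 60 and 65.
trace = np.zeros(100)
trace[19:22] = [1, 3, 1]
trace[59:62] = [1, 3, 1]
trace[64:67] = [1, 3, 1]
peaks, accepted = reject_pileup(trace, threshold=2.0, window=10)
```

    Only the isolated pulse survives; the two overlapping pulses are rejected, which is why the energy resolution can be preserved even at high rates at the cost of some throughput.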

  19. Disseminated HIV-Associated Kaposi’s Sarcoma With High CD4 Cell Count And Low Viral Load

    Directory of Open Access Journals (Sweden)

    Diana Pereira Anjos

    2017-12-01

    Kaposi’s sarcoma is considered an acquired immunodeficiency syndrome-defining illness and is caused by human herpesvirus 8. It has been associated with patients infected with human immunodeficiency virus (HIV) who have CD4 T lymphocyte counts <200 cells/µL and high viral loads. We report a case of a 23-year-old woman infected with HIV-1 and receiving antiretroviral treatment since diagnosis, with a high CD4 cell count and low viral load, who presented with disseminated Kaposi’s sarcoma. Clinicians should be aware of the occurrence of Kaposi’s sarcoma despite robust CD4 cell counts.

  20. Construction and Test of Muon Drift Tube Chambers for High Counting Rates

    CERN Document Server

    Schwegler, Philipp; Dubbert, Jörg

    2010-01-01

    Since the start of operation of the Large Hadron Collider (LHC) at CERN on 20 November 2009, the instantaneous luminosity has been steadily increasing. The muon spectrometer of the ATLAS detector at the LHC is instrumented with trigger and precision tracking chambers in a toroidal magnetic field. Monitored Drift-Tube (MDT) chambers are employed as precision tracking chambers, complemented by Cathode Strip Chambers (CSC) in the very forward region, where the background counting rate due to neutrons and γ's produced in shielding material and detector components is too high for the MDT chambers. After several upgrades of the CERN accelerator system over the coming decade, the instantaneous luminosity is expected to be raised to about five times the LHC design luminosity. This necessitates the replacement of the muon chambers in the regions with the highest background radiation rates in the so-called Small Wheels, which constitute the innermost layers of the muon spectrometer end-caps, by new detectors with higher rate capability.

  1. Preamplifier development for high count-rate, large dynamic range readout of inorganic scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Keshelashvili, Irakli; Erni, Werner; Steinacher, Michael; Krusche, Bernd; Collaboration: PANDA-Collaboration

    2013-07-01

    Electromagnetic calorimeters are a central component of many experiments in nuclear and particle physics. Modern ''trigger-less'' detectors run at very high count-rates and require good time and energy resolution as well as a large dynamic range. In addition, the photosensors and preamplifiers must work in hostile environments (magnetic fields). Due to the latter constraint, mainly Avalanche Photo Diodes (APDs), Vacuum Photo Triodes (VPTs), and Vacuum Photo Tetrodes (VPTTs) are used. A disadvantage is their low gain, which together with the other requirements is a challenge for the preamplifier design. Our group has developed a special Low Noise / Low Power (LNP) preamplifier for this purpose. It will be used to equip the PANDA EMC forward end-cap (dynamic range 15'000, rate 1 MHz), where the PWO II crystals and preamplifiers have to run in an environment cooled down to -25°C. A further application is the upgrade of the Crystal Barrel detector at the Bonn ELSA accelerator with APD readout, for which temperature compensation of the APD gain and good time resolution are necessary. The development, and all test procedures carried out after the mass production by our group at Basel University over the past several years, will be reported.

  2. Centroid based clustering of high throughput sequencing reads based on n-mer counts.

    Science.gov (United States)

    Solovyov, Alexander; Lipkin, W Ian

    2013-09-08

    Many problems in computational biology require alignment-free sequence comparisons. One of the common tasks involving sequence comparison is sequence clustering. Here we apply methods of alignment-free comparison (in particular, comparison using sequence composition) to the challenge of sequence clustering. We study several centroid-based algorithms for clustering sequences based on word counts. A study of their performance shows that using the k-means algorithm, with or without data whitening, is efficient from the computational point of view. A higher clustering accuracy can be achieved using the soft expectation maximization method, whereby each sequence is attributed to each cluster with a specific probability. We implement an open source tool for alignment-free clustering. It is publicly available from github: https://github.com/luscinius/afcluster. We show the utility of alignment-free sequence clustering for high throughput sequencing analysis despite its limitations. In particular, it allows one to perform assembly with reduced resources and a minimal loss of quality. The major factor affecting the performance of alignment-free read clustering is the length of the read.
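
    The pipeline the abstract describes - fixed-length word (n-mer) count profiles followed by centroid clustering - can be sketched as below. This is an illustrative reimplementation, not the afcluster code; the word length k=3 and the deterministic farthest-point seeding are choices made here for reproducibility.

```python
import numpy as np
from itertools import product

def kmer_profile(seq, k=3):
    """Normalized k-mer count vector of a DNA sequence (its composition)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:          # skip words containing ambiguous bases
            v[j] += 1
    total = v.sum()
    return v / total if total else v

def kmeans(X, n_clusters, n_iter=50):
    """Plain Lloyd's k-means with deterministic farthest-point seeding."""
    centroids = [X[0]]
    for _ in range(1, n_clusters):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])   # next seed: farthest remaining point
    centroids = np.array(centroids)
    for _ in range(n_iter):
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dist.argmin(axis=1)      # assign each read to nearest centroid
        for c in range(n_clusters):
            if np.any(labels == c):
                centroids[c] = X[labels == c].mean(axis=0)
    return labels

reads = ["ATATATATATAT", "ATATATATATAA", "GCGCGCGCGCGC", "GCGCGCGCGCGG"]
labels = kmeans(np.array([kmer_profile(r) for r in reads]), n_clusters=2)
```

    The AT-rich and GC-rich reads fall into separate clusters purely from their composition vectors, with no alignment performed.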

  3. Lower white blood cell counts in elite athletes training for highly aerobic sports.

    Science.gov (United States)

    Horn, P L; Pyne, D B; Hopkins, W G; Barnes, C J

    2010-11-01

    White cell counts at rest might be lower in athletes participating in selected endurance-type sports. Here, we analysed blood tests of elite athletes collected over a 10-year period. Reference ranges were established for 14 female and 14 male sports involving 3,679 samples from 937 females and 4,654 samples from 1,310 males. Total white blood cell counts and counts of neutrophils, lymphocytes and monocytes were quantified. Each sport was scaled (1-5) for its perceived metabolic stress (aerobic-anaerobic) and mechanical stress (concentric-eccentric) by 13 sports physiologists. Substantially lower total white cell and neutrophil counts were observed in aerobic sports of cycling and triathlon (~16% of test results below the normal reference range) compared with team or skill-based sports such as water polo, cricket and volleyball. Mechanical stress of sports had less effect on the distribution of cell counts. The lower white cell counts in athletes in aerobic sports probably represent an adaptive response, not underlying pathology.

  4. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG

    Science.gov (United States)

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Sertac Artan, N.; Froemke, Robert C.; Viventi, Jonathan

    2017-04-01

    Objective. High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high bandwidth front-end electronics. In the present study, we investigated the use of a small, lightweight, and low-cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. Approach. We recorded both acute and chronic μECoG signals from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct the cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. Main results. The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals was not significantly different. Recordings from an implanted system showed AERs that were detectable and decodable for 52 days. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced the overall SNR. Significance. We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but

  5. Counting probe

    International Nuclear Information System (INIS)

    Matsumoto, Haruya; Kaya, Nobuyuki; Yuasa, Kazuhiro; Hayashi, Tomoaki

    1976-01-01

    An electron counting method has been devised and tested experimentally for the purpose of measuring electron temperature and density, the most fundamental quantities characterizing plasma conditions. Electron counting is a method to count the electrons in a plasma directly by equipping a probe with a secondary electron multiplier. It has three advantages: adjustable sensitivity, the high sensitivity of the secondary electron multiplier, and directionality. Sensitivity adjustment is performed by changing the size of the collecting hole (pinhole) on the incident front of the multiplier. The probe is usable as a direct-reading thermometer of electron temperature because it needs to collect only a very small number of electrons, so it does not disturb the surrounding plasma, and a narrow sweep width of the probe voltage is sufficient. It can therefore measure anisotropy more sensitively than a Langmuir probe, and it can be used for very low density plasma. Though many problems remain regarding anisotropy, computer simulation has been carried out. It is also planned to install a Helmholtz coil in the vacuum chamber to eliminate the effect of the Earth's magnetic field. In practical experiments, measurement with a Langmuir probe and an emission probe mounted on a movable structure, comparison with results obtained in a reversed magnetic field using the Helmholtz coil, and measurement of ion acoustic waves are scheduled. (Wakatsuki, Y.)

  6. Improving EWMA Plans for Detecting Unusual Increases in Poisson Counts

    Directory of Open Access Journals (Sweden)

    R. S. Sparks

    2009-01-01

    An adaptive exponentially weighted moving average (EWMA) plan is developed for signalling unusually high incidence when monitoring a time series of nonhomogeneous daily disease counts. A Poisson transitional regression model is used to fit the background/expected trend in counts and provides "one-day-ahead" forecasts of the next day's count. Departures of counts from their forecasts are monitored. The paper outlines an approach for improving early outbreak detection by dynamically adjusting the exponential weights to be efficient at signalling local persistent high-side changes. We emphasise outbreak signals in steady-state situations; that is, changes that occur after the EWMA statistic has run through several in-control counts.
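
    The monitoring scheme can be sketched with a fixed (non-adaptive) weight: standardize each day's departure from its forecast using the Poisson variance, update an EWMA, and signal when it crosses a control limit. The weight `lam` and the `threshold` multiplier below are illustrative choices, not the paper's tuned values, and the dynamic weight adjustment is omitted.

```python
import math

def ewma_poisson_monitor(counts, forecasts, lam=0.3, threshold=3.0):
    """Flag days whose EWMA of standardized Poisson residuals is high.

    counts    -- observed daily counts
    forecasts -- "one-day-ahead" expected counts from a fitted model
    """
    ewma = 0.0
    sigma = math.sqrt(lam / (2 - lam))     # asymptotic sd of the EWMA of z
    signals = []
    for y, mu in zip(counts, forecasts):
        z = (y - mu) / math.sqrt(mu)       # Poisson: variance equals mean
        ewma = lam * z + (1 - lam) * ewma  # exponentially weighted update
        signals.append(ewma > threshold * sigma)
    return signals

# Ten in-control days, then observed counts jump above the forecast of 10.
flags = ewma_poisson_monitor([10] * 10 + [25] * 3, [10.0] * 13)
```

    The scheme stays quiet through the in-control run and signals on the first day of the sustained excess, since a single large standardized residual already pushes the EWMA past the limit.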

  7. Statistical approach for calculating opacities of high-Z plasmas

    International Nuclear Information System (INIS)

    Nishikawa, Takeshi; Nakamura, Shinji; Takabe, Hideaki; Mima, Kunioki

    1992-01-01

    For simulating the X-ray radiation from laser-produced high-Z plasmas, an appropriate atomic model is necessary. Based on the average ion model, we have used a rather simple atomic model for opacity calculation in a hydrodynamic code and obtained fairly good agreement with experiment on the X-ray spectra from laser-produced plasmas. We have investigated the accuracy of the atomic model used in the hydrodynamic code. It is found that the transition energies of 4p-4d, 4d-4f, 4p-5d, 4d-5f and 4f-5g, which are important in laser-produced high-Z plasmas, can be given within an error of 15% compared with the values from a Hartree-Fock-Slater (HFS) calculation, while their oscillator strengths obtained by the HFS calculation vary by a factor of two depending on the charge state. We also propose a statistical method to carry out detailed configuration accounting of the electronic state using the populations of bound electrons calculated with the average ion model. The statistical method is relatively simple and provides much improvement in calculating the spectral opacities of line radiation when the average ion model is used to determine the electronic state. (author)

  8. High-Throughput Nanoindentation for Statistical and Spatial Property Determination

    Science.gov (United States)

    Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.

    2018-04-01

    Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including the choice of indentation depth/spacing to avoid overlap of plastic zones, pileup, and the influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique are presented, including mapping of welds, microstructures, and composites with varying length scales, along with studies of the effect of surface roughness on nominally homogeneous specimens.

  9. Topics in statistical data analysis for high-energy physics

    International Nuclear Information System (INIS)

    Cowan, G.

    2011-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability not only to cover the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
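
A minimal signal-versus-background classification with a boosted decision tree, in the spirit of the multivariate part of these lectures, might look like the following sketch. The two "event characteristics" are toy Gaussian variables invented for the example, and scikit-learn's `GradientBoostingClassifier` stands in for whatever BDT implementation an analysis actually uses.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Two measured characteristics per event; signal and background events
# are drawn from shifted Gaussians (purely illustrative distributions).
signal = rng.normal(loc=[1.0, 1.0], scale=0.8, size=(n, 2))
background = rng.normal(loc=[-1.0, -1.0], scale=0.8, size=(n, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3)
bdt.fit(X_tr, y_tr)
acc = bdt.score(X_te, y_te)   # fraction of test events classified correctly
```

In a real analysis one would of course use many more event characteristics and validate against overtraining; the point here is only the API shape of training and evaluating a BDT.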

  10. Renal stone characterization using high resolution imaging mode on a photon counting detector CT system

    Science.gov (United States)

    Ferrero, A.; Gutjahr, R.; Henning, A.; Kappler, S.; Halaweish, A.; Abdurakhimova, D.; Peterson, Z.; Montoya, J.; Leng, S.; McCollough, C.

    2017-03-01

    In addition to the standard-resolution (SR) acquisition mode, a high-resolution (HR) mode is available on a research photon-counting-detector (PCD) whole-body CT system. In the HR mode each detector pixel consists of a 2x2 array of 0.225 mm x 0.225 mm subpixel elements, in contrast to the SR mode, in which each pixel consists of a 4x4 array of the same subelements; this yields 0.25 mm isotropic resolution at isocenter for the HR mode. In this study, we quantified ex vivo the capabilities of the HR mode to characterize renal stones in terms of morphology and mineral composition. Forty pure stones - 10 uric acid (UA), 10 cystine (CYS), 10 calcium oxalate monohydrate (COM) and 10 apatite (APA) - and 14 mixed stones were placed in a 20 cm water phantom and scanned in HR mode at a radiation dose matched to that of routine dual-energy stone exams. Data from micro CT provided a reference for the quantification of morphology and mineral composition of the mixed stones. The area under the ROC curve was 1.0 for discriminating UA from CYS, 0.89 for CYS vs COM and 0.84 for COM vs APA. The root mean square error (RMSE) of the percent UA in mixed stones was 11.0% with a medium-sharp kernel and 15.6% with the sharpest kernel. The HR mode showed qualitatively accurate characterization of stone morphology relative to micro CT.
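
The area-under-the-ROC-curve figures quoted above summarize how well a scalar discriminator separates two stone types. A minimal sketch of computing such an AUC, using entirely invented measurement distributions (not the study's data), could look like this:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Simulated scalar discriminator values for two stone types; the means
# and spreads are invented for illustration only.
ua = rng.normal(1.0, 0.05, 50)    # "uric acid" stones
cys = rng.normal(1.3, 0.05, 50)   # "cystine" stones

scores = np.concatenate([ua, cys])
labels = np.concatenate([np.zeros(50), np.ones(50)])
auc = roc_auc_score(labels, scores)   # 1.0 means perfect separation
```

Well-separated distributions, as for UA vs CYS in the study, give an AUC near 1.0; overlapping ones, as for COM vs APA, pull it toward 0.5.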

  11. Time-over-threshold readout to enhance the high flux capabilities of single-photon-counting detectors

    International Nuclear Information System (INIS)

    Bergamaschi, Anna; Dinapoli, Roberto; Greiffenberg, Dominic; Henrich, Beat; Johnson, Ian; Mozzanica, Aldo; Radicci, Valeria; Schmitt, Bernd; Shi, Xintian; Stoppani, Laura

    2011-01-01

    The MYTHEN photon-counting ASIC operated in time-over-threshold mode shows an innovative approach towards the development of a detector operating with very high photon intensities while maintaining the single-photon sensitivity for synchrotron radiation experiments. The MYTHEN single-photon-counting (SPC) detector has been characterized using the time-over-threshold (ToT) readout method, i.e. measuring the time that the signal produced by the detected X-rays remains above the comparator threshold. In the following it is shown that the ToT readout preserves the sensitivity, dynamic range and capability of background suppression of the SPC mode, while enhancing the count-rate capability, which is the main limitation of state-of-the-art SPC systems.

  12. Time-over-threshold readout to enhance the high flux capabilities of single-photon-counting detectors

    Energy Technology Data Exchange (ETDEWEB)

    Bergamaschi, Anna, E-mail: anna.bergamaschi@psi.ch; Dinapoli, Roberto; Greiffenberg, Dominic; Henrich, Beat; Johnson, Ian; Mozzanica, Aldo; Radicci, Valeria; Schmitt, Bernd; Shi, Xintian; Stoppani, Laura [Paul Scherrer Institut, CH-5232 Villigen (Switzerland)

    2011-11-01

    The MYTHEN photon-counting ASIC operated in time-over-threshold mode shows an innovative approach towards the development of a detector operating with very high photon intensities while maintaining the single-photon sensitivity for synchrotron radiation experiments. The MYTHEN single-photon-counting (SPC) detector has been characterized using the time-over-threshold (ToT) readout method, i.e. measuring the time that the signal produced by the detected X-rays remains above the comparator threshold. In the following it is shown that the ToT readout preserves the sensitivity, dynamic range and capability of background suppression of the SPC mode, while enhancing the count-rate capability, which is the main limitation of state-of-the-art SPC systems.
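
The essential idea of ToT readout, namely that the time a shaped pulse spends above the comparator threshold encodes the deposited charge, can be illustrated with an idealized exponentially decaying pulse. This is a deliberate simplification: the real MYTHEN shaper response is not a pure exponential, and `tau` below is an arbitrary time constant.

```python
import numpy as np

def time_over_threshold(amplitude, threshold, tau=1.0):
    """For an idealized pulse v(t) = amplitude * exp(-t / tau), return
    the time the pulse stays above 'threshold', i.e.
    tau * ln(amplitude / threshold). ToT grows only logarithmically with
    amplitude, which is what extends the usable dynamic range."""
    if amplitude <= threshold:
        return 0.0
    return tau * np.log(amplitude / threshold)
```

A pulse ten times the threshold stays above it for about 2.3 time constants, while a pulse a hundred times the threshold stays for only about 4.6, so even very intense signals remain measurable.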

  13. Ballistic deficit correction methods for large Ge detectors-high counting rate study

    International Nuclear Information System (INIS)

    Duchene, G.; Moszynski, M.

    1995-01-01

    This study compares different ballistic deficit correction methods as a function of input count rate (from 3 to 50 kcounts/s) using four large Ge detectors of about 70% relative efficiency. The Tennelec TC245 linear amplifier in BDC mode (Hinshaw method) turns out to be the best compromise for energy resolution over the whole range. All correction methods lead to narrow sum peaks indistinguishable from single γ lines. The full-energy-peak throughput is found to be representative of the pile-up inspection dead time of the corrector circuits. This work also presents a new and simple representation that plots energy resolution and throughput simultaneously versus input count rate. (TEC). 12 refs., 11 figs

  14. High Sensitivity Detection of Xe Isotopes Via Beta-Gamma Coincidence Counting

    International Nuclear Information System (INIS)

    Bowyer, Ted W.; McIntyre, Justin I.; Reeder, Paul L.

    1999-01-01

    Measurement of xenon fission product isotopes is a key element in the global network being established to monitor the Comprehensive Nuclear-Test-Ban Treaty. Pacific Northwest National Laboratory has developed an automated system for separating Xe from air which includes a beta-gamma counting system for 131mXe, 133mXe, 133Xe, and 135Xe. Betas and conversion electrons are detected in a plastic scintillation cell containing the Xe sample. The counting efficiency is nearly 100% for betas and conversion electrons. The resolution of the pulse-height spectrum from the plastic scintillator is sufficient to observe distinct peaks for specific conversion electrons. Gamma rays and X-rays are detected in a NaI(Tl) scintillation detector which surrounds the plastic scintillator sample cell. Two-dimensional pulse-height spectra of gamma energy versus beta energy are obtained. Each of the four xenon isotopes has a distinctive signature in the two-dimensional energy array. The details of the counting system, examples of two-dimensional beta-gamma data, and operational experience with this counting system will be described.

  15. Automatic Counting of Large Mammals from Very High Resolution Panchromatic Satellite Imagery

    NARCIS (Netherlands)

    Xue, Yifei; Wang, Tiejun; Skidmore, Andrew K.

    2017-01-01

    Estimating animal populations by direct counting is an essential component of wildlife conservation and management. However, conventional approaches (i.e., ground survey and aerial survey) have intrinsic constraints. Advances in image data capture and processing provide new opportunities for using

  16. Study on the behaviour of timing photomultipliers at a high counting rate

    International Nuclear Information System (INIS)

    Gladyshev, D.A.; Li, B.N.; Yunusov, Kh.R.

    1978-01-01

    Variations in the gain K of a photomultiplier tube (PMT) are studied in pulse mode with an accuracy of 1%. Measurements were performed with a light pulse generator based on a light-emitting diode generating pulses at repetition rates from 250 to 10^5 pulses/s. Relative variations in K were determined from the position of the centroid of the light-diode peak using a pulse analyzer and a frequency meter. The tests show that, upon a sudden increase in counting rate, the gain rises within less than the measurement time (under 1 s) and then returns to its stationary value. When the counting rate is returned from 10^5 pulses/s to the initial 250 pulses/s, the gain first decreases and then recovers to the stationary value. The total stabilization time of K after the counting rate is applied is 10-70 min. Recovery of K after the counting rate is removed is much slower, taking up to 3 h 40 min. Variations in K ranged from 1 to 12%.

  17. Computerized pattern recognition used for grain counting in high resolution autoradiographs with low grain densities

    NARCIS (Netherlands)

    Schellart, N. A.; Zweijpfenning, R. C.; van Marle, J.; Huijsmans, D. P.

    1986-01-01

    Using a video-image system coupled to a minicomputer with commercial image handling software, autoradiographic grains displayed in dark-field are counted with a fast (ca. 3.5 min for 120,000 microns 2) and reliable (false scores less than 5%) grain-recognizing FORTRAN program executed in the users

  18. Counting carbohydrates

    Science.gov (United States)

    Carb counting; Carbohydrate-controlled diet; Diabetic diet; Diabetes-counting carbohydrates ... Many foods contain carbohydrates (carbs), including: Fruit and fruit juice Cereal, bread, pasta, and rice Milk and milk products, soy milk Beans, legumes, ...

  19. Nondestructive detection of total viable count changes of chilled pork in high oxygen storage condition based on hyperspectral technology

    Science.gov (United States)

    Zheng, Xiaochun; Peng, Yankun; Li, Yongyu; Chao, Kuanglin; Qin, Jianwei

    2017-05-01

    The plate count method commonly used to detect the total viable count (TVC) of bacteria in pork is time-consuming and destructive. It has also been used to study changes in the TVC of pork under different storage conditions. In recent years, many researchers have explored non-destructive TVC detection using visible/near-infrared (VIS/NIR) spectroscopy and hyperspectral technology. In this study, the TVC in chilled pork stored under a high-oxygen condition was monitored with hyperspectral technology in order to evaluate changes in the total bacterial count during storage, and thereby assess the advantages and disadvantages of the storage condition. VIS/NIR hyperspectral images of the samples were acquired with a hyperspectral system over the 400-1100 nm range. Reference TVC values were measured by the standard plate count method, with results obtained in 48 hours. The reflectance spectra of the samples were extracted and used to establish a prediction model for TVC. Standard normal variate transformation (SNV), multiplicative scatter correction (MSC) and derivative preprocessing were applied to the original reflectance spectra. Partial least squares regression (PLSR) of TVC was performed and optimized as the prediction model. The results show that NIR hyperspectral technology over 400-1100 nm combined with a PLSR model can describe the growth pattern of the total bacterial count of chilled pork under the high-oxygen condition clearly and rapidly. The results obtained in this study demonstrate that non-destructive TVC determination based on NIR hyperspectral imaging has great potential for monitoring edible safety in the processing and storage of meat.

  20. Unaccounted Workload Factor: Game-Day Pitch Counts in High School Baseball Pitchers-An Observational Study.

    Science.gov (United States)

    Zaremski, Jason L; Zeppieri, Giorgio; Jones, Deborah L; Tripp, Brady L; Bruner, Michelle; Vincent, Heather K; Horodyski, MaryBeth

    2018-04-01

    Throwing injuries are common in high school baseball. Known risk factors include excessive pitch counts, year-round pitching, and pitching with arm pain and fatigue. Despite the evidence, the prevalence of pitching injuries among high school players has not decreased. One possibility to explain this pattern is that players accumulate unaccounted pitch volume during warm-up and bullpen activity, but this has not yet been examined. Our primary hypothesis was that approximately 30% to 40% of pitches thrown off a mound by high school pitchers during a game-day outing are unaccounted for in current data but will be revealed when bullpen sessions and warm-up pitches are included. Our secondary hypothesis was that there is wide variability among players in the number of bullpen pitches thrown per outing. Cross-sectional study; Level of evidence, 3. Researchers counted all pitches thrown off a mound during varsity high school baseball games played by 34 high schools in North Central Florida during the 2017 season. We recorded 13,769 total pitches during 115 varsity high school baseball starting pitcher outings. The mean ± SD pitch numbers per game were calculated for bullpen activity (27.2 ± 9.4), warm-up (23.6 ± 8.0), live games (68.9 ± 19.7), and total pitches per game (119.7 ± 27.8). Thus, 42.4% of the pitches performed were not accounted for in the pitch count monitoring of these players. The number of bullpen pitches thrown varied widely among players, with 25% of participants in our data set throwing fewer than 22 pitches and 25% throwing more than 33 pitches per outing. In high school baseball players, pitch count monitoring does not account for the substantial volume of pitching that occurs during warm-up and bullpen activity during the playing season. These extra pitches should be closely monitored to help mitigate the risk of overuse injury.

  1. A high count rate position decoding and energy measuring method for nuclear cameras using Anger logic detectors

    International Nuclear Information System (INIS)

    Wong, W.H.; Li, H.; Uribe, J.

    1998-01-01

    A new method for processing signals from Anger position-sensitive detectors used in gamma cameras and PET is proposed for very high count-rate imaging where multiple-event pileups are the norm. This method is designed to sort out and recover every impinging event from multiple-event pileups while maximizing the collection of scintillation signal for every event, to achieve optimal accuracy in the measurement of energy and position. For every detected event, this method cancels the remnant signals from previous events and excludes the pileup of signals from following events. The remnant subtraction is exact even for multiple pileup events. A prototype circuit for energy recovery demonstrated that the maximum count rate can be increased by more than 10 times compared with the pulse-shaping method, while the energy resolution is as good as pulse shaping (or fixed integration) at low count rates. At 2 × 10^6 events/s on NaI(Tl), the true counts acquired with this method are 3.3 times more than with the delay-line clipping method (256 ns clipping), owing to events recovered from pileups. Pulse-height spectra up to 3.5 × 10^6 events/s have been studied. Monte Carlo simulation studies have been performed for image-quality comparisons between the different processing methods.
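
The remnant-cancellation idea can be sketched for idealized exponentially decaying scintillation pulses: at each event arrival, the summed tails of all earlier pulses are predicted from their already-recovered amplitudes and subtracted from the observed signal, leaving the new pulse's own amplitude. This is a toy numerical model, not the authors' circuit; `tau` is an arbitrary decay constant.

```python
import numpy as np

def recover_amplitudes(arrival_times, samples, tau):
    """Toy remnant cancellation for pulses v(t) = A * exp(-(t - t0)/tau):
    'samples' are the total observed signal values at the arrival times,
    including tails of earlier pulses. Subtracting the predicted tails
    recovers each pulse's own amplitude exactly, even for multiple
    overlapping events."""
    amplitudes = []
    for i, t0 in enumerate(arrival_times):
        remnant = sum(a * np.exp(-(t0 - tj) / tau)
                      for a, tj in zip(amplitudes, arrival_times[:i]))
        amplitudes.append(samples[i] - remnant)
    return amplitudes
```

For two pulses half a decay constant apart, the second observed sample contains the first pulse's tail, yet both amplitudes come back exactly, mirroring the "exact even for multiple pileup events" property claimed for the hardware subtraction.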

  2. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    Statistical inference for sparse signals or low-rank matrices in high-dimensional settings is of significant interest in a range of contemporary applications. It has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. In this thesis, we consider several problems, including sparse signal recovery (compressed sensing under restricted isometry) and low-rank matrix recovery (matrix recovery via rank-one projections and structured matrix completion). The first part of the thesis discusses compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and low-rank matrix recovery. The analysis relies on a key technical tool which represents points in a polytope by convex combinations of sparse vectors. The technique is elementary yet leads to sharp results. It is shown that, in compressed sensing, for any ε > 0 the conditions δ_k^A < 1/3 + ε, δ_k^A + θ_{k,k}^A < 1 + ε, or δ_{tk}^A < √((t − 1)/t) + ε are not sufficient to guarantee the exact recovery of all k-sparse signals for large k. A similar result also holds for matrix recovery. In addition, the conditions δ_k^A < 1/3, δ_k^A + θ_{k,k}^A < 1, δ_{tk}^A < √((t − 1)/t) and δ_r^M < 1/3, δ_r^M + θ_{r,r}^M < 1, δ_{tr}^M < √((t − 1)/t) are also shown to be sufficient, respectively, for stable recovery of approximately sparse signals and low-rank matrices in the noisy case. For the second part of the thesis, we introduce a rank-one projection model for low-rank matrix recovery and propose a constrained nuclear norm minimization method for stable recovery of low-rank matrices in the noisy case. The procedure is adaptive to the rank and robust against small perturbations. Both upper and lower bounds for the estimation accuracy under the Frobenius norm loss are obtained. The proposed estimator is shown to be rate-optimal under certain conditions. The
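
To make the sparse-recovery setting concrete, here is a small orthogonal matching pursuit demo. OMP is a standard greedy recovery algorithm used here only as an illustration of exact k-sparse recovery from Gaussian measurements; it is not one of the methods analyzed in the thesis, and the matrix dimensions are arbitrary.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily add the column of A most
    correlated with the current residual, then refit the coefficients on
    the selected support by least squares."""
    n = A.shape[1]
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(n)
    x[support] = x_s
    return x
```

With 100 Gaussian measurements of a 200-dimensional vector, a 3-sparse signal is recovered exactly with overwhelming probability, which is the kind of guarantee that restricted isometry conditions formalize.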

  3. Statistical mechanics of high-density bond percolation

    Science.gov (United States)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are the compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the classical clusters' structure, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks, ranging from the magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to study HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdős–Rényi graph. The application of the method to Euclidean lattices is also discussed.
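
Since the κ-clusters above are built on κ-cores of the classical clusters, the standard k-core peeling procedure gives a concrete picture of the objects involved. This is the generic graph algorithm on a plain adjacency dict, not the paper's statistical-mechanics machinery.

```python
def k_core(adj, k):
    """Return the vertex set of the k-core of a graph given as an
    adjacency dict: repeatedly peel off vertices with fewer than k
    neighbours remaining until no such vertex is left."""
    alive = set(adj)
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            if sum(1 for u in adj[v] if u in alive) < k:
                alive.discard(v)
                changed = True
    return alive
```

On a triangle with one pendant vertex, the 2-core is the triangle: the pendant vertex has only one neighbour and is peeled away, and the nesting of k-cores for increasing k mirrors the nested κ-clusters described in the abstract.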

  4. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high-throughput end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over the microchannel plates (MCPs) that are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  5. Comparison of Drive Counts and Mark-Resight As Methods of Population Size Estimation of Highly Dense Sika Deer (Cervus nippon) Populations.

    Directory of Open Access Journals (Sweden)

    Kazutaka Takeshita

    Assessing temporal changes in abundance indices is an important issue in the management of large herbivore populations. The drive count method has frequently been used as a deer abundance index in mountainous regions. However, despite the inherent risk of observation errors in drive counts, which increase with deer density, evaluations of the utility of drive counts at high deer density remain scarce. We compared the drive count and mark-resight (MR) methods in the evaluation of a highly dense sika deer population (MR estimates ranged between 11 and 53 individuals/km2) on Nakanoshima Island, Hokkaido, Japan, between 1999 and 2006. This deer population experienced two large reductions in density; approximately 200 animals in total were taken from the population through a large-scale population removal and a separate winter mass mortality event. Although the drive counts tracked temporal changes in deer abundance on the island, they overestimated abundance in all years relative to the MR method. The increased overestimation in drive count estimates after the winter mass mortality event may be due to double counting arising from increased deer movement and recovery of body condition secondary to the mitigation of density-dependent food limitations. Drive counts are unreliable because they are affected by unfavorable factors such as bad weather, and they are cost-prohibitive to repeat, which precludes the calculation of confidence intervals. Therefore, the use of drive counts to infer deer abundance needs to be reconsidered.
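
For readers unfamiliar with mark-resight estimation, the classical Chapman bias-corrected form of the Lincoln-Petersen estimator gives the flavor of how MR abundance estimates are obtained: mark a known number of animals, then estimate population size from the fraction of marked animals among those seen on a resight survey. This is a generic sketch with invented numbers; the study's actual MR model is more elaborate.

```python
def chapman_estimate(n_marked, n_resighted, m_marked_resighted):
    """Chapman's bias-corrected Lincoln-Petersen estimator:
    N_hat = (M + 1)(C + 1) / (R + 1) - 1, where M animals are marked,
    C animals are observed on the resight survey, and R of those
    observed animals are marked."""
    return ((n_marked + 1) * (n_resighted + 1)
            / (m_marked_resighted + 1) - 1)
```

For example, if 50 deer are marked and 25 of 100 resighted deer carry marks, roughly a quarter of the population is marked, giving an estimate near 200 animals.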

  6. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  7. CLARO: an ASIC for high rate single photon counting with multi-anode photomultipliers

    Science.gov (United States)

    Baszczyk, M.; Carniti, P.; Cassina, L.; Cotta Ramusino, A.; Dorosz, P.; Fiorini, M.; Gotti, C.; Kucewicz, W.; Malaguti, R.; Pessina, G.

    2017-08-01

    The CLARO is a radiation-hard 8-channel ASIC designed for single photon counting with multi-anode photomultiplier tubes. Each channel outputs a digital pulse when the input signal from the photomultiplier crosses a configurable threshold. The fast return to baseline, typically within 25 ns and below 50 ns in all conditions, allows counting up to 10^7 hits/s on each channel, with a power consumption of about 1 mW per channel. The ASIC presented here is a much improved version of the first 4-channel prototype. The threshold can be precisely set over a wide range, between 30 ke- (5 fC) and 16 Me- (2.6 pC). The noise of the amplifier with a 10 pF input capacitance is 3.5 ke- (0.6 fC) RMS. All settings are stored in a 128-bit configuration and status register, protected against soft errors with triple modular redundancy. The paper describes the design of the ASIC at the transistor level and demonstrates its performance on the test bench.

  8. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    Science.gov (United States)

    Lu, Wei; Krainak, Michael A.; Yang, Guan; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise, free-running, high-rate photon-counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the two-detected-photon threshold and logic AND-gate coincidence methods. These methods allow mitigation of dark count, after-pulsing and background noise effects without resorting to time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We compare their performance using both the two-detected-photon threshold and coincidence methods.
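
The AND-gate coincidence idea, requiring both detectors to fire in the same time slot so that uncorrelated dark counts are suppressed quadratically, can be sketched with a toy discretised simulation. All rates below are invented, and detector efficiency, after-pulsing and jitter are ignored.

```python
import numpy as np

rng = np.random.default_rng(3)
n_slots = 100_000           # discretised time slots
p_signal = 0.01             # probability a signal photon pair arrives
p_dark = 0.001              # independent dark-count probability per detector

photon = rng.random(n_slots) < p_signal
det_a = photon | (rng.random(n_slots) < p_dark)   # detector A fires
det_b = photon | (rng.random(n_slots) < p_dark)   # detector B fires

coincidences = det_a & det_b   # AND-gate coincidence channel
singles = det_a                # single-detector channel
# Accidental coincidences occur at rate ~p_dark**2 per slot, so the
# coincidence channel is far cleaner than the singles channel.
```

Here the singles channel carries roughly a hundred dark counts on top of the signal, while the coincidence channel carries almost none, since two independent dark counts must land in the same slot.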

  9. High absolute basophil count is a powerful independent predictor of inferior overall survival in patients with primary myelofibrosis.

    Science.gov (United States)

    Lucijanic, Marko; Livun, Ana; Stoos-Veic, Tajana; Pejsa, Vlatko; Jaksic, Ozren; Cicic, David; Lucijanic, Jelena; Romic, Zeljko; Orehovec, Biserka; Aralica, Gorana; Miletic, Marko; Kusec, Rajko

    2018-05-01

    To investigate the clinical and prognostic significance of absolute basophil count (ABC) in patients with primary myelofibrosis (PMF). We retrospectively investigated 58 patients with PMF treated in our institution in the period from 2006 to 2017. ABC was obtained in addition to other hematological and clinical parameters. Patients were separated into high and low ABC groups using the Receiver operating characteristic curve analysis. ABC was higher in PMF patients than in healthy controls (P constitutional symptoms (P = 0.030) and massive splenomegaly (P = 0.014). ABC was also positively correlated with absolute monocyte count (AMC) (P processes. Hence, both have a potential for improvement of current prognostic scores. Basophils represent a part of malignant clone in PMF and are associated with unfavorable disease features and poor prognosis which is independent of currently established prognostic scoring system and monocytosis.

  10. High levels of viral suppression among East African HIV-infected women and men in serodiscordant partnerships initiating antiretroviral therapy with high CD4 counts and during pregnancy.

    Science.gov (United States)

    Mujugira, Andrew; Baeten, Jared; Kidoguchi, Lara; Haberer, Jessica; Celum, Connie; Donnell, Deborah; Ngure, Kenneth; Bukusi, Elizabeth; Mugo, Nelly; Asiimwe, Stephen; Odoyo, Josephine; Tindimwebwa, Edna; Bulya, Nulu; Katabira, Elly; Heffron, Renee

    2017-09-13

    People who are asymptomatic and feel healthy, including pregnant women, may be less motivated to initiate ART or achieve high adherence. We assessed whether ART initiation, and viral suppression 6, 12 and 24-months after ART initiation, were lower in HIV-infected members of serodiscordant couples who initiated during pregnancy or with higher CD4 counts. We used data from the Partners Demonstration Project, an open-label study of the delivery of integrated PrEP and ART (at any CD4 count) for HIV prevention among high-risk HIV serodiscordant couples in Kenya and Uganda. Differences in viral suppression (HIV RNA 500 cells/mm3) and during pregnancy were estimated using Poisson regression. Of 865 HIV-infected participants retained after becoming eligible for ART during study follow-up, 95% initiated ART. Viral suppression 24-months after ART initiation was high overall (97%), and comparable among those initiating ART at CD4 counts >500, 351-500 and ≤350 cells/mm3 (96% vs 97% vs 97%; relative risk [RR] 0.98; 95% CI: 0.93-1.03 for CD4 >500 vs <350 and RR 0.99; 95% CI: (0.93-1.06) for CD4 351-500 vs ≤350). Viral suppression was as likely among women initiating ART primarily to prevent perinatal transmission as ART initiation for other reasons (p=0.9 at 6 months and p=0.5 at 12 months). Nearly all HIV-infected partners initiating ART were virally suppressed by 24 months, irrespective of CD4 count or pregnancy status. These findings suggest that people initiating ART at high CD4 counts or due to pregnancy can adhere to ART as well as those starting treatment with symptomatic HIV disease or low CD4 counts.

  11. Counting cormorants

    DEFF Research Database (Denmark)

    Bregnballe, Thomas; Carss, David N; Lorentsen, Svein-Håkon

    2013-01-01

    This chapter focuses on Cormorant population counts for both summer (i.e. breeding) and winter (i.e. migration, winter roosts) seasons. It also explains differences in the data collected from undertaking ‘day’ versus ‘roost’ counts, gives some definitions of the term ‘numbers’, and presents two...

  12. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    Science.gov (United States)

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independency assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap testing hypothesis methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independency assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independency assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.
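
A generic percentile-bootstrap confidence interval for a function of table counts, in the spirit of the bootstrap methods developed in the paper, can be sketched as follows. Note the simplification: this resamples from a plain multinomial fitted to the observed cell proportions, not from the paper's valid joint sampling distribution for tables with incomplete margins, and the table below is invented.

```python
import numpy as np

def bootstrap_ci(counts, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI for a statistic of a vector of cell
    counts, resampling whole tables from the observed proportions."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts)
    n = counts.sum()
    stats = [stat(rng.multinomial(n, counts / n)) for _ in range(n_boot)]
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

def odds_ratio(c):
    # 2x2 table flattened as [a, b, c, d]; guard against zero cells
    # in resampled tables
    return (c[0] * c[3]) / max(c[1] * c[2], 1)

lo, hi = bootstrap_ci([30, 10, 20, 40], odds_ratio)
```

The observed odds ratio of this table is 6, and the bootstrap interval straddles it; under the paper's joint sampling distribution the resampling step would draw complete and incomplete counts jointly instead.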

  13. Improvement of statistical counting conditions for the determination of chloride in bean leaves (Phaseolus vulgaris L.) by thermal neutron activation analysis

    International Nuclear Information System (INIS)

    Ferraz, E.S.B.; Nascimento Filho, V.F.

    1975-04-01

    The use of two radiation peaks from the same gamma-emitting source to calculate the corresponding net counting rate in multi-element gamma spectrometry is discussed. It is shown that this approach, applied to the determination of chlorine in Phaseolus vulgaris L. by neutron activation analysis, increases the measurement accuracy by approximately 40%.

  14. Unaccounted Workload Factor: Game-Day Pitch Counts in High School Baseball Pitchers—An Observational Study

    Science.gov (United States)

    Zaremski, Jason L.; Zeppieri, Giorgio; Jones, Deborah L.; Tripp, Brady L.; Bruner, Michelle; Vincent, Heather K.; Horodyski, MaryBeth

    2018-01-01

    Background: Throwing injuries are common in high school baseball. Known risk factors include excessive pitch counts, year-round pitching, and pitching with arm pain and fatigue. Despite the evidence, the prevalence of pitching injuries among high school players has not decreased. One possibility to explain this pattern is that players accumulate unaccounted pitch volume during warm-up and bullpen activity, but this has not yet been examined. Hypotheses: Our primary hypothesis was that approximately 30% to 40% of pitches thrown off a mound by high school pitchers during a game-day outing are unaccounted for in current data but will be revealed when bullpen sessions and warm-up pitches are included. Our secondary hypothesis was that there is wide variability among players in the number of bullpen pitches thrown per outing. Study Design: Cross-sectional study; Level of evidence, 3. Methods: Researchers counted all pitches thrown off a mound during varsity high school baseball games played by 34 high schools in North Central Florida during the 2017 season. Results: We recorded 13,769 total pitches during 115 varsity high school baseball starting pitcher outings. The mean ± SD pitch numbers per game were calculated for bullpen activity (27.2 ± 9.4), warm-up (23.6 ± 8.0), live games (68.9 ± 19.7), and total pitches per game (119.7 ± 27.8). Thus, 42.4% of the pitches performed were not accounted for in the pitch count monitoring of these players. The number of bullpen pitches thrown varied widely among players, with 25% of participants in our data set throwing fewer than 22 pitches and 25% throwing more than 33 pitches per outing. Conclusion: In high school baseball players, pitch count monitoring does not account for the substantial volume of pitching that occurs during warm-up and bullpen activity during the playing season. These extra pitches should be closely monitored to help mitigate the risk of overuse injury. PMID:29662911
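
    The reported 42.4% figure follows directly from the mean per-game volumes given in the Results; a quick check of the arithmetic:

```python
# Mean per-game pitch volumes reported in the study
bullpen, warmup, live = 27.2, 23.6, 68.9
total = 119.7  # bullpen + warm-up + live-game pitches per game

# Share of pitches not captured by official pitch count monitoring
unaccounted = (bullpen + warmup) / total
print(f"{unaccounted:.1%}")  # → 42.4%
```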

  15. High resolution micro-CT of low attenuating organic materials using large area photon-counting detector

    International Nuclear Information System (INIS)

    Kumpová, I.; Jandejsek, I.; Jakůbek, J.; Vopálenský, M.; Vavřík, D.; Fíla, T.; Koudelka, P.; Kytýř, D.; Zlámal, P.; Gantar, A.

    2016-01-01

    To overcome certain limitations of contemporary materials used for bone tissue engineering, such as inflammatory response after implantation, a whole new class of materials based on polysaccharide compounds is being developed. Here, nanoparticulate bioactive glass reinforced gelan-gum (GG-BAG) has recently been proposed for the production of bone scaffolds. This material offers promising biocompatibility properties, including bioactivity and biodegradability, with the possibility of producing scaffolds with directly controlled microgeometry. However, to utilize such a scaffold with application-optimized properties, large sets of complex numerical simulations using the real microgeometry of the material have to be carried out during the development process. Because the GG-BAG is a material with intrinsically very low attenuation to X-rays, its radiographical imaging, including tomographical scanning and reconstructions, with resolution required by numerical simulations might be a very challenging task. In this paper, we present a study on X-ray imaging of GG-BAG samples. High-resolution volumetric images of investigated specimens were generated on the basis of micro-CT measurements using a large area flat-panel detector and a large area photon-counting detector. The photon-counting detector was composed of a 10 × 1 matrix of Timepix edgeless silicon pixelated detectors with tiling based on overlaying rows (i.e. assembled so that no gap is present between individual rows of detectors). We compare the results from both detectors with scanning electron microscopy on selected slices in the transversal plane. It has been shown that the photon counting detector can provide approx. 3× better resolution of the details in low-attenuating materials than the integrating flat panel detectors. We demonstrate that employment of a large area photon counting detector is a good choice for imaging of low attenuating materials with the resolution sufficient for numerical simulations.

  16. High resolution micro-CT of low attenuating organic materials using large area photon-counting detector

    Science.gov (United States)

    Kumpová, I.; Vavřík, D.; Fíla, T.; Koudelka, P.; Jandejsek, I.; Jakůbek, J.; Kytýř, D.; Zlámal, P.; Vopálenský, M.; Gantar, A.

    2016-02-01

    To overcome certain limitations of contemporary materials used for bone tissue engineering, such as inflammatory response after implantation, a whole new class of materials based on polysaccharide compounds is being developed. Here, nanoparticulate bioactive glass reinforced gelan-gum (GG-BAG) has recently been proposed for the production of bone scaffolds. This material offers promising biocompatibility properties, including bioactivity and biodegradability, with the possibility of producing scaffolds with directly controlled microgeometry. However, to utilize such a scaffold with application-optimized properties, large sets of complex numerical simulations using the real microgeometry of the material have to be carried out during the development process. Because the GG-BAG is a material with intrinsically very low attenuation to X-rays, its radiographical imaging, including tomographical scanning and reconstructions, with resolution required by numerical simulations might be a very challenging task. In this paper, we present a study on X-ray imaging of GG-BAG samples. High-resolution volumetric images of investigated specimens were generated on the basis of micro-CT measurements using a large area flat-panel detector and a large area photon-counting detector. The photon-counting detector was composed of a 10 × 1 matrix of Timepix edgeless silicon pixelated detectors with tiling based on overlaying rows (i.e. assembled so that no gap is present between individual rows of detectors). We compare the results from both detectors with scanning electron microscopy on selected slices in the transversal plane. It has been shown that the photon counting detector can provide approx. 3× better resolution of the details in low-attenuating materials than the integrating flat panel detectors. We demonstrate that employment of a large area photon counting detector is a good choice for imaging of low attenuating materials with the resolution sufficient for numerical simulations.

  17. Statistical emission of complex fragments from highly excited compound nucleus

    International Nuclear Information System (INIS)

    Matsuse, T.

    1991-01-01

    A full statistical analysis has been given in terms of the Extended Hauser-Feshbach method. The charge and kinetic energy distributions of the ³⁵Cl + ¹²C reaction at E_lab = 180 and 200 MeV and the ²³Na + ²⁴Mg reaction at E_lab = 89 MeV, which form the ⁴⁷V compound nucleus, are investigated as a prototype of the light-mass system. The measured kinetic energy distributions of the complex fragments are shown to be well reproduced by the Extended Hauser-Feshbach method, so the observed complex fragment production is understood as statistical binary decay from the compound nucleus induced by the heavy-ion reaction. Next, this method is applied to the study of complex fragment production from the ¹¹¹In compound nucleus, which is formed by the ⁸⁴Kr + ²⁷Al reaction at E_lab = 890 MeV. (K.A.) 18 refs., 10 figs

  18. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations…

  19. Use of domestic detergents in the California mastitis test for high somatic cell counts in milk.

    Science.gov (United States)

    Leach, K A; Green, M J; Breen, J E; Huxley, J N; Macaulay, R; Newton, H T; Bradley, A J

    2008-11-08

    The California mastitis test (CMT) is used on farms to identify subclinical mastitis by an indirect estimation of the somatic cell count (SCC) in milk. Four commercially available detergents were compared with a bespoke CMT fluid for their ability to detect milk samples with an SCC above 200,000 cells/ml; differences between the interpretation of the results of the tests by eight operators were also investigated. The sensitivity and specificity of the test were affected by the type of detergent, and by the operators' interpretations. When used by the most sensitive operator, suitably diluted Fairy Liquid performed almost identically to CMT fluid in identifying milk samples with more than 200,000 cells/ml. The average sensitivities achieved by the eight operators for detecting this threshold were 82 per cent for Fairy Liquid and 84 per cent for CMT fluid, and the specificities were 93 and 91 per cent respectively. The other detergents contained less anionic surfactant and were less sensitive but similarly specific.

  20. A High-Speed, Event-Driven, Active Pixel Sensor Readout for Photon-Counting Microchannel Plate Detectors

    Science.gov (United States)

    Kimble, Randy A.; Pain, Bedabrata; Norton, Timothy J.; Haas, J. Patrick; Oegerle, William R. (Technical Monitor)

    2002-01-01

    Silicon array readouts for microchannel plate intensifiers offer several attractive features. In this class of detector, the electron cloud output of the MCP intensifier is converted to visible light by a phosphor; that light is then fiber-optically coupled to the silicon array. In photon-counting mode, the resulting light splashes on the silicon array are recognized and centroided to fractional pixel accuracy by off-chip electronics. This process can result in very high (MCP-limited) spatial resolution while operating at a modest MCP gain (desirable for dynamic range and long-term stability). The principal limitation of intensified CCD systems of this type is their severely limited local dynamic range, as accurate photon counting is achieved only if there are not overlapping event splashes within the frame time of the device. This problem can be ameliorated somewhat by processing events only in pre-selected windows of interest or by using an addressable charge injection device (CID) for the readout array. We are currently pursuing the development of an intriguing alternative readout concept based on using an event-driven CMOS Active Pixel Sensor. APS technology permits the incorporation of discriminator circuitry within each pixel. When coupled with suitable CMOS logic outside the array area, the discriminator circuitry can be used to trigger the readout of small sub-array windows only when and where an event splash has been detected, completely eliminating the local dynamic range problem, while achieving a high global count rate capability and maintaining high spatial resolution. We elaborate on this concept and present our progress toward implementing an event-driven APS readout.

  1. Tower counts

    Science.gov (United States)

    Woody, Carol Ann; Johnson, D.H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.

    2007-01-01

    Counting towers provide an accurate, low-cost, low-maintenance, low-technology, and easily mobilized escapement estimation program compared to other methods (e.g., weirs, hydroacoustics, mark-recapture, and aerial surveys) (Thompson 1962; Siebel 1967; Cousens et al. 1982; Symons and Waldichuk 1984; Anderson 2000; Alaska Department of Fish and Game 2003). Counting tower data have been found to be consistent with digital video counts (Edwards 2005). Counting towers do not interfere with natural fish migration patterns, nor are fish handled or stressed; however, their use is generally limited to clear rivers that meet specific site selection criteria. The data provided by counting tower sampling allow fishery managers to determine reproductive population size, estimate total return (escapement + catch) and its uncertainty, evaluate population productivity and trends, set harvest rates, determine spawning escapement goals, and forecast future returns (Alaska Department of Fish and Game 1974-2000 and 1975-2004). The number of spawning fish is determined by subtracting subsistence catch, sport-caught fish, and prespawn mortality from the total estimated escapement. The methods outlined in this protocol for tower counts can be used to provide reasonable estimates (±6%-10%) of reproductive salmon population size and run timing in clear rivers.
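
    The spawning-population bookkeeping described above is a simple subtraction of removals and losses from the tower-estimated escapement; a minimal sketch, with hypothetical run numbers:

```python
def spawners(escapement, subsistence, sport_catch, prespawn_mortality):
    """Spawning population = total estimated escapement minus
    subsistence catch, sport-caught fish, and prespawn mortality."""
    return escapement - subsistence - sport_catch - prespawn_mortality

# Hypothetical run: 50,000 fish estimated past the counting tower
print(spawners(50_000, 3_000, 1_500, 500))  # → 45000
```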

  2. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

    Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, the second leading cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. In the past few years, higher order statistics have been an extensive field of research, as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third-moment skewness and fourth-moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to those emboli. Applied to real signals, detection of microemboli through the skewness and kurtosis signals outperformed standard methods: the skewness detector reached a sensitivity of 78% and a specificity of 91%, and the kurtosis detector 80% and 90%, respectively.
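
    The windowed higher-order-moment detection described here can be sketched as population skewness and kurtosis computed over a sliding window, flagging windows whose moments peak. The thresholds, window length, and toy energy signal below are illustrative assumptions, not the paper's tuned values:

```python
import math

def moments(window):
    """Population skewness and kurtosis of one window of energy samples."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    sd = math.sqrt(var) or 1e-12  # guard against a perfectly flat window
    skew = sum((x - mean) ** 3 for x in window) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in window) / (n * sd ** 4)
    return skew, kurt

def detect(energy, win=8, skew_thr=1.5, kurt_thr=4.0):
    """Flag start indices of windows whose skewness or kurtosis peaks."""
    hits = []
    for i in range(len(energy) - win + 1):
        s, k = moments(energy[i:i + win])
        if s > skew_thr or k > kurt_thr:
            hits.append(i)
    return hits

# Flat embolus-free energy with one embolus-like spike distorting
# the distribution: only windows containing the spike are flagged
signal = [1.0] * 20
signal[10] = 8.0
print(detect(signal))
```

Flat stretches yield zero skewness and kurtosis, so no constant energy threshold is needed; the spike alone produces the moment peaks.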

  3. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basal element is given, and the statistical estimation Monte Carlo estimators are derived. Direct Monte Carlo simulation, bounding sampling, forced transitions Monte Carlo, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest calculating efficiency.
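
    As a baseline for the estimators compared above, direct (analog) Monte Carlo simply simulates missions and counts system failures; for highly reliable systems the failure count is tiny, which is why variance-reduced estimators pay off. The two-component parallel system below is a hypothetical example, chosen so the estimate can be checked against the analytic value:

```python
import random

def analog_mc_unreliability(p_fail, n_trials=200_000, seed=42):
    """Direct (analog) Monte Carlo: a two-component parallel system
    fails in a mission only when both components fail."""
    random.seed(seed)
    failures = sum(
        1 for _ in range(n_trials)
        if random.random() < p_fail and random.random() < p_fail
    )
    return failures / n_trials

p = 0.1
est = analog_mc_unreliability(p)
exact = p * p  # analytic unreliability of the parallel pair
```

With an unreliability of 10⁻², roughly 1 trial in 100 registers a failure, so the relative error of the analog estimator shrinks only as the square root of the trial count.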

  4. High Performance Negative Feedback Near Infrared Single Photon Counting Detectors & Arrays, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Amplification Technologies Inc ("ATI") proposes to develop the enabling material and device technology for the design of ultra low noise, high gain and high speed...

  5. High statistics inclusive phi-meson production at SPS energies

    International Nuclear Information System (INIS)

    Dijkstra, H.B.

    1985-01-01

    This thesis describes an experiment studying the inclusive reaction hadron + Be → φ + anything → K⁺ + K⁻ + anything in 100 GeV/c, 120 GeV/c and 200 GeV/c hadron interactions. A total of 8×10⁶ events were recorded using both positively and negatively charged unseparated hadron beams supplied by the CERN SPS. The experiment made use of an intelligent on-line event selection system based on micro-processors (FAMPs) in conjunction with a system of large MWPCs to increase the number of phi-events recorded per unit time. In 32 days of data taking over 600,000 phi-mesons were recorded onto magnetic tape. The physics motivation for collecting a large statistics sample of inclusive phi-mesons was the investigation of the inclusive phi-meson production mechanism and phi-spectroscopy. (Auth.)

  6. Technical feasibility proof for high-resolution low-dose photon-counting CT of the breast

    Energy Technology Data Exchange (ETDEWEB)

    Kalender, Willi A.; Kolditz, Daniel; Lueck, Ferdinand [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); CT Imaging GmbH, Erlangen (Germany); Steiding, Christian [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); CT Imaging GmbH, Erlangen (Germany); University Hospital of Erlangen, Institute of Radiology, Erlangen (Germany); Ruth, Veikko; Roessler, Ann-Christin [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); Wenkel, Evelyn [University Hospital of Erlangen, Institute of Radiology, Erlangen (Germany)

    2017-03-15

    X-ray computed tomography (CT) has been proposed and evaluated multiple times as a potentially alternative method for breast imaging. All efforts shown so far have been criticized and partly disapproved because of their limited spatial resolution and higher patient dose when compared to mammography. Our concept for a dedicated breast CT (BCT) scanner therefore aimed at novel apparatus and detector design to provide high spatial resolution of about 100 μm and average glandular dose (AGD) levels of 5 mGy or below. Photon-counting technology was considered as a solution to reach these goals. The complete concept was previously evaluated and confirmed by simulations and basic experiments on laboratory setups. We here present measurements of dose, technical image quality parameters and surgical specimen results on such a scanner. For comparison purposes, the specimens were also imaged with digital mammography (DM) and breast tomosynthesis (BT) apparatus. Results show that photon-counting BCT (pcBCT) at 5 mGy AGD offers sufficiently high 3D spatial resolution for reliable detectability of calcifications and soft tissue delineation. (orig.)

  7. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

    We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb⁻¹ recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test the homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)

  8. Photon-Counting Microwave Kinetic Inductance Detectors (MKIDs) for High Resolution Far-Infrared Spectroscopy

    Data.gov (United States)

    National Aeronautics and Space Administration — We are developing ultrasensitive Microwave Kinetic Inductance Detectors (MKIDs) for high resolution far-infrared spectroscopy applications, with a long-term goal of...

  9. Economic consequences of mastitis and withdrawal of milk with high somatic cell count in Swedish dairy herds

    DEFF Research Database (Denmark)

    Nielsen, C; Østergaard, Søren; Emanuelson, U

    2010-01-01

    The main aim was to assess the impact of mastitis on technical and economic results of a dairy herd under current Swedish farming conditions. The second aim was to investigate the effects obtained by withdrawing milk with high somatic cell count (SCC). A dynamic and stochastic simulation model, SimHerd, was used to study the effects of mastitis in a herd with 150 cows. Results given the initial incidence of mastitis (32 and 33 clinical and subclinical cases per 100 cow-years, respectively) were studied, together with the consequences of reducing or increasing the incidence of mastitis by 50%, modelling… % of the herd net return given the initial incidence of mastitis. Expressed per cow-year, the avoidable cost of mastitis was €55. The costs per case of CM and SCM were estimated at €278 and €60, respectively. Withdrawing milk with high SCC was never profitable because this generated a substantial amount of milk…

  10. An investigation of the trade-off between the count level and image quality in myocardial perfusion SPECT using simulated images: the effects of statistical noise and object variability on defect detectability

    International Nuclear Information System (INIS)

    He Xin; Links, Jonathan M; Frey, Eric C

    2010-01-01

    Quantum noise as well as anatomic and uptake variability in patient populations limits observer performance on a defect detection task in myocardial perfusion SPECT (MPS). The goal of this study was to investigate the relative importance of these two effects by varying acquisition time, which determines the count level, and assessing the change in performance on a myocardial perfusion (MP) defect detection task using both mathematical and human observers. We generated ten sets of projections of a simulated patient population with count levels ranging from 1/128 to around 15 times a typical clinical count level to simulate different levels of quantum noise. For the simulated population we modeled variations in patient, heart and defect size, heart orientation and shape, defect location, organ uptake ratio, etc. The projection data were reconstructed using the OS-EM algorithm with no compensation or with attenuation, detector response and scatter compensation (ADS). The images were then post-filtered and reoriented to generate short-axis slices. A channelized Hotelling observer (CHO) was applied to the short-axis images, and the area under the receiver operating characteristics (ROC) curve (AUC) was computed. For each noise level and reconstruction method, we optimized the number of iterations and cutoff frequencies of the Butterworth filter to maximize the AUC. Using the images obtained with the optimal iteration and cutoff frequency and ADS compensation, we performed human observer studies for four count levels to validate the CHO results. Both CHO and human observer studies demonstrated that observer performance was dependent on the relative magnitude of the quantum noise and the patient variation. When the count level was high, the patient variation dominated, and the AUC increased very slowly with changes in the count level for the same level of anatomic variability. When the count level was low, however, quantum noise dominated, and changes in the count level…
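
    The AUC figure of merit used for both the CHO and the human-observer studies is equivalent to the Mann-Whitney probability that a defect-present image receives a higher rating than a defect-absent one; a minimal sketch with hypothetical observer ratings:

```python
def auc(pos_scores, neg_scores):
    """AUC as the probability that a defect-present image scores
    higher than a defect-absent one (ties count as 1/2)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical ratings for defect-present vs defect-absent studies
print(auc([0.9, 0.8, 0.6], [0.7, 0.3, 0.2]))  # 8 of 9 pairs ordered correctly
```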

  11. HEPS-BPIX, a single photon counting pixel detector with a high frame rate for the HEPS project

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Wei, E-mail: weiw@ihep.ac.cn [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); State Key Laboratory of Particle Detection and Electronics, Beijing 100049 (China); Zhang, Jie; Ning, Zhe; Lu, Yunpeng; Fan, Lei; Li, Huaishen; Jiang, Xiaoshan; Lan, Allan K.; Ouyang, Qun; Wang, Zheng; Zhu, Kejun; Chen, Yuanbo [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); State Key Laboratory of Particle Detection and Electronics, Beijing 100049 (China); Liu, Peng [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China)

    2016-11-01

    China's next generation light source, named the High Energy Photon Source (HEPS), is currently under construction. HEPS-BPIX (HEPS-Beijing PIXel) is a dedicated pixel readout chip that operates in single photon counting mode for X-ray applications in HEPS. Designed using CMOS 0.13 µm technology, the chip contains a matrix of 104×72 pixels. Each pixel measures 150 µm×150 µm and has a counting depth of 20 bits. A bump-bonded prototyping detector module with a 300-µm thick silicon sensor was tested in the beamline of Beijing Synchrotron Radiation Facility. A fast stream of X-ray images was demonstrated, and a frame rate of 1.2 kHz was proven, with negligible dead time. The test results showed an equivalent noise charge of 115 e⁻ rms after bump bonding and a threshold dispersion of 55 e⁻ rms after calibration.

  12. Time and position resolution of high granularity, high counting rate MRPC for the inner zone of the CBM-TOF wall

    CERN Document Server

    Petriş, M.

    2016-09-13

    Multi-gap RPC prototypes with readout on a multi-strip electrode were developed for the small polar angle region of the CBM-TOF subdetector, the most demanding zone in terms of granularity and counting rate. The prototypes are based on low-resistivity (~10¹⁰ Ω·cm) glass electrodes for operation in a high counting rate environment. The strip width/pitch size was chosen to fulfill the impedance matching with the front-end electronics and the granularity requirements of the innermost zone of the CBM-TOF wall. The in-beam tests using secondary particles produced in heavy ion collisions on a Pb target at SIS18 - GSI Darmstadt and SPS - CERN were focused on the performance of the prototype in conditions similar to the ones expected at SIS100/FAIR. An efficiency larger than 98% and a system time resolution on the order of 70-80 ps were obtained in a high counting rate and high multiplicity environment.

  13. Single Photon Counting Large Format Imaging Sensors with High Spatial and Temporal Resolution

    Science.gov (United States)

    Siegmund, O. H. W.; Ertley, C.; Vallerga, J. V.; Cremer, T.; Craven, C. A.; Lyashenko, A.; Minot, M. J.

    High time resolution astronomical and remote sensing applications have been addressed with sealed-tube microchannel plate based imaging, photon time tagging detector schemes. These are being realized with the advent of cross strip readout techniques with high performance encoding electronics and atomic layer deposited (ALD) microchannel plate technologies. Sealed tube devices up to 20 cm square have now been successfully implemented with sub-nanosecond timing and imaging. The objective is to provide sensors with large areas (25 cm² to 400 cm²), high spatial resolution, event rates of 5 MHz, and event timing accuracy of 100 ps. High-performance ASIC versions of these electronics are in development with better event rate, power and mass suitable for spaceflight instruments.

  14. Clicks versus Citations: Click Count as a Metric in High Energy Physics Publishing

    Energy Technology Data Exchange (ETDEWEB)

    Bitton, Ayelet; /UC, San Diego /SLAC

    2011-06-22

    High-energy physicists worldwide rely on online resources such as SPIRES and arXiv to gather research and share their own publications. SPIRES is a tool designed to search the literature within high-energy physics, while arXiv provides the actual full-text documents of this literature. In high-energy physics, papers are often ranked according to the number of citations they acquire, meaning the number of times a later paper references the original. This paper investigates the correlation between the number of times a paper is clicked in order to be downloaded and the number of citations it receives following the click. It explores how physicists truly read what they cite.

  15. A High School Statistics Class Investigates the Death Penalty

    Science.gov (United States)

    Brelias, Anastasia

    2015-01-01

    Recommendations for reforming high school mathematics curricula emphasize the importance of engaging students in mathematical investigations of societal issues (CCSSI [Common Core State Standards Initiative] 2010; NCTM [National Council of Teachers of Mathematics] 2000). Proponents argue that these investigations can positively influence students'…

  16. EcoCount

    Directory of Open Access Journals (Sweden)

    Phillip P. Allen

    2014-05-01

    Techniques that analyze biological remains from sediment sequences for environmental reconstructions are well established and widely used. Yet identifying, counting, and recording biological evidence such as pollen grains remains a highly skilled, demanding, and time-consuming task. Standard procedure requires the classification and recording of between 300 and 500 pollen grains from each representative sample. Recording the data from a pollen count requires significant effort and focused resources from the palynologist. However, when an adaptation to the recording procedure is utilized, efficiency and time economy improve. We describe EcoCount, which represents a development in environmental data recording procedure. EcoCount is a voice-activated, fully customizable digital count sheet that allows the investigator to continuously interact with a field of view during the data recording. Continuous viewing allows the palynologist the opportunity to remain engaged with the essential task, identification, for longer, making pollen counting more efficient and economical. EcoCount is a versatile software package that can be used to record a variety of environmental evidence and can be installed on different computer platforms, making adoption by users and laboratories simple and inexpensive. The user-friendly format of EcoCount allows any novice to become competent and functional in a very short time.

  17. Your Voice Counts: Listening to the Voice of High School Students with Autism Spectrum Disorder

    Science.gov (United States)

    Saggers, Beth; Hwang, Yoon-Suk; Mercer, K. Louise

    2011-01-01

    Supporting students with autism spectrum disorders (ASDs) in inclusive settings presents both opportunities and significant challenges to school communities. This study, which explored the lived experience of nine students with ASD in an inclusive high school in Australia, is based on the belief that by listening to the voices of students, school…

  18. Statistical mechanics of flux lines in high-temperature superconductors

    International Nuclear Information System (INIS)

    Dasgupta, C.

    1992-01-01

    The shortness of the low-temperature coherence lengths of high-Tc materials leads to new mechanisms of pinning of flux lines. Lattice-periodic modulation of the order parameter itself acts to pin vortex lines in regions of the unit cell where the order parameter is small. Flux creep and flux noise at low temperatures and magnetic fields are presented in terms of the motion of simple metastable defects on flux lines, together with a calculation of flux-lattice melting. 12 refs

  19. Two types of photomultiplier voltage dividers for high and changing count rates

    International Nuclear Information System (INIS)

    Reiter, W.L.; Stengl, G.

    1980-01-01

    We report on the design of two types of voltage distribution circuits for high-stability photomultiplier operation. The 'type A' voltage divider is an ohmic divider with a high bleeder current (up to 10 mA) and the resistor chain split at one of the last dynodes, usually the dynode from which the analog signal is derived. This simple constructive measure improves the stability of the dynode voltage by a factor of 5 compared with an unsplit conventional resistor chain. 'Type B' is a novel active voltage divider using cold-cathode tubes as regulating elements. This divider exhibits excellent temperature stability (about 10⁻⁴/°C). With 'type B', stability equal to that of conventional ohmic dividers can be achieved at a bleeder current smaller by one order of magnitude. Of course, both concepts, 'type A' and 'type B', can be combined. (orig.)

  20. Counting Highly Cited Papers for University Research Assessment: Conceptual and Technical Issues

    Science.gov (United States)

    Rodríguez-Navarro, Alonso

    2012-01-01

    A Kuhnian approach to research assessment requires us to consider that the important scientific breakthroughs that drive scientific progress are infrequent and that the progress of science does not depend on normal research. Consequently, indicators of research performance based on the total number of papers do not accurately measure scientific progress. Similarly, those universities with the best reputations in terms of scientific progress differ widely from other universities in terms of the scale of investments made in research and in the higher concentrations of outstanding scientists present, but less so in terms of the total number of papers or citations. This study argues that indicators for the 1% high-citation tail of the citation distribution reveal the contribution of universities to the progress of science and provide quantifiable justification for the large investments in research made by elite research universities. In this tail, which follows a power law, the number of the less frequent and highly cited important breakthroughs can be predicted from the frequencies of papers in the upper part of the tail. This study quantifies the false impression of excellence produced by multinational papers, and by other types of papers that do not contribute to the progress of science. Many of these papers are concentrated in and dominate lists of highly cited papers, especially in lower-ranked universities. The h-index obscures the differences between higher- and lower-ranked universities because the proportion of h-core papers in the 1% high-citation tail is not proportional to the value of the h-index. PMID:23071759
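The claim that rare, highly cited breakthroughs can be predicted from the frequencies of papers lower in the tail can be illustrated with a toy power-law extrapolation. This is a sketch under the stated power-law assumption; the function and the thresholds are illustrative, not the study's actual indicator.

```python
import math

def extrapolate_tail(c1, n1, c2, n2, c_target):
    """Given cumulative counts n1 = N(>c1) and n2 = N(>c2) with c1 < c2,
    fit a power-law tail N(>c) = n1 * (c / c1) ** (-alpha) and
    extrapolate the expected count above c_target."""
    alpha = math.log(n1 / n2) / math.log(c2 / c1)
    return n1 * (c_target / c1) ** (-alpha)

# With an exactly alpha = 1 tail (100 papers above 10 citations,
# 10 papers above 100), about 1 paper is expected above 1000 citations:
print(extrapolate_tail(10, 100, 100, 10, 1000))
```

This is the sense in which the upper part of the tail "predicts" the frequency of the rarest, most-cited papers without having to wait for them to occur.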

  1. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    Science.gov (United States)

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research method and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  2. Analysis of the emission characteristics of ion sources for high-quality optical coating processes

    International Nuclear Information System (INIS)

    Beermann, Nils

    2009-01-01

    The production of complex high-quality thin-film systems requires a detailed understanding of all partial processes. One of the most relevant is the condensation of the coating material on the substrate surface. The optical and mechanical material properties can be adjusted by well-defined impingement of energetic ions during deposition, and a variety of ion sources have accordingly been developed. With respect to present and future challenges in the production of precisely fabricated high-performance optical coatings, however, the ion emission of such sources has generally not been characterized sufficiently. This question is addressed within the scope of this work, which is thematically integrated into the field of process development and control for ion-assisted deposition processes. In a first step, a Faraday cup measurement system was developed which allows the spatially resolved determination of the ion energy distribution as well as the ion current distribution. Subsequently, the ion emission profiles of six ion sources were determined as functions of the relevant operating parameters, making a data pool available for process planning and supplementary process analysis. On the basis of the acquired results, the basic correlations between the operating parameters and the ion emission are demonstrated. The specific properties of the individual sources, as well as the respective control strategies, are pointed out with regard to thin-film properties and production yield. Finally, a synthesis of the results and perspectives for future activities are given. (orig.)

  3. Development of nuclear counting system for plateau high voltage scintillation detector test facilities

    International Nuclear Information System (INIS)

    Sarizah Mohamed Nor; Siti Hawa Md Zain; Muhd Izham Ahmad; Izuhan Ismail

    2010-01-01

    The nuclear counting system is a radioactivity monitoring and analysis system used in scientific and technical research and development at the Malaysian Nuclear Agency. It consists of three basic parts: sensors, signal conditioning, and monitoring. The system is set up for testing nuclear detectors using radioactive sources such as ⁶⁰Co and ¹³⁷Cs, among others. It can verify that scintillation detectors and their equivalents function properly, operate within their plateau high-voltage range, and meet specifications. The procedure should therefore be implemented on all nuclear counting systems at the Malaysian Nuclear Agency and documented as a Standard Working Procedure (SWP) for reference by technicians, IPTA/IPTS trainees, and related workers. (author)
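Locating a detector's plateau from a high-voltage scan can be sketched as follows. This is an assumed procedure for illustration, not the agency's documented SWP: scan count rate versus HV and keep the longest span whose relative slope stays below a tolerance; the names and the 2.5%/100 V tolerance are illustrative.

```python
def plateau_region(hv, rate, max_slope_pct_per_100v=2.5):
    """Return the (start, end) voltages of the longest contiguous span
    whose relative count-rate slope (percent change per 100 V) stays
    below the tolerance; None if no such span exists."""
    best = None
    start = 0
    for j in range(1, len(hv)):
        slope = (abs(rate[j] - rate[j - 1]) / rate[j - 1] * 100.0
                 / (hv[j] - hv[j - 1]) * 100.0)
        if slope > max_slope_pct_per_100v:
            start = j  # plateau broken; restart after this point
        elif best is None or hv[j] - hv[start] > best[1] - best[0]:
            best = (hv[start], hv[j])
    return best

# Synthetic scan: steep knee, flat plateau 800-900 V, then breakdown.
hv = [700, 750, 800, 850, 900, 950]
rate = [100, 180, 200, 202, 204, 300]
print(plateau_region(hv, rate))
```

The operating point would then typically be chosen inside the returned span, away from both the knee and the breakdown region.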

  4. Tunable high-channel-count bandpass plasmonic filters based on an analogue of electromagnetically induced transparency

    International Nuclear Information System (INIS)

    Lu Hua; Liu Xueming; Wang Guoxi; Mao Dong

    2012-01-01

    We have proposed a novel type of bandpass plasmonic filter consisting of metal–insulator–metal bus waveguides coupled with a series of side-coupled cavities and stub waveguides. The theoretical modeling demonstrates that our waveguide-resonator system performs a plasmonic analogue of electromagnetically induced transparency (EIT) in atomic systems, as is confirmed by numerical experiments. The plasmonic EIT-like response enables the realization of nanoscale bandpass filters with multiple channels. Additionally, the operating wavelengths and bandwidths of our filters can be efficiently tuned by adjusting the geometric parameters such as the lengths of stub waveguides and the coupling distances between the cavities and stub waveguides. The ultracompact configurations contribute to the achievement of wavelength division multiplexing systems for optical computing and communications in highly integrated optical circuits. (paper)

  5. Effects of high-energy electron irradiation of chicken meat on Salmonella and aerobic plate count

    International Nuclear Information System (INIS)

    Heath, J.L.; Owens, S.L.; Tesch, S.; Hannah, K.W.

    1990-01-01

    Four experiments were used to determine the effects of high-energy irradiation on the number of aerobic microorganisms and Salmonella on broiler breasts and thighs. Irradiation ranging from 100 to 700 kilorads (krads) was provided by a commercial-scale, electron-beam accelerator. Irradiation of broiler breast and thigh pieces with electron beams at levels of 100, 200, 300, 400, 500, and 600 krads showed that levels as low as 100 krads would eliminate Salmonella. When 33 thighs were tested after irradiation at 200 krads, only one thigh tested presumptive positive. The total number of aerobic organisms was reduced by 2 to 3 log10 cycles at irradiation levels of 100, 200, 300, 400, 500, 600, and 700 krads. Increasing the dose above 100 krads gave little if any additional benefit
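The reported 2 to 3 log10 reductions translate directly into surviving counts; the arithmetic below is a simple illustration, not the study's code.

```python
def surviving_cfu(initial_cfu, log10_reduction):
    """Microbial count remaining after a given log10 reduction."""
    return initial_cfu * 10.0 ** (-log10_reduction)

# A 2 to 3 log10 reduction of an initial 10^6 aerobic organisms
# leaves 10^4 down to 10^3 survivors:
print(surviving_cfu(1e6, 2.0))
print(surviving_cfu(1e6, 3.0))
```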

  6. The CAMEO project: high sensitivity quest for Majorana neutrino mass with the BOREXINO counting test facility

    International Nuclear Information System (INIS)

    Bellini, G.; Caccianiga, B.; Giammarchi, M.G.

    2001-01-01

    The unique features of the CTF and BOREXINO set-ups are used for a high sensitivity study of ¹⁰⁰Mo and ¹¹⁶Cd neutrinoless 2β decay. Pilot measurements with ¹¹⁶Cd and Monte Carlo simulation show that the sensitivity of the CAMEO experiment (in terms of the T₁/₂ limit for 0ν2β decay) is (3-5)·10²⁴ y with a 1 kg source of ¹⁰⁰Mo (¹¹⁶Cd, ⁸²Se, ¹⁵⁰Nd) and ∼10²⁶ y with 65 kg of ¹¹⁶CdWO₄ crystals placed in the CTF. The latter value corresponds to a limit on the neutrino mass of mν ≤ 0.06 eV. Moreover, with 1000 kg of ¹¹⁶CdWO₄ crystals located in the BOREXINO apparatus, the neutrino mass limit can be pushed down to mν ≤ 0.02 eV

  7. High-Voltage Clock Driver for Photon-Counting CCD Characterization

    Science.gov (United States)

    Baker, Robert

    2013-01-01

    A document discusses the CCD97 from e2v technologies as it is being evaluated at Goddard Space Flight Center's Detector Characterization Laboratory (DCL) for possible use in ultra-low background noise space astronomy applications, such as the Terrestrial Planet Finder Coronagraph (TPF-C). The CCD97 includes a photon-counting mode where the equivalent output noise is less than one electron. Use of this mode requires a clock signal at a voltage level greater than the level achievable by the existing CCD (charge-coupled device) electronics. A high-voltage waveform generator has been developed in code 660/601 to support the CCD97 evaluation. The unit generates the required clock waveforms at voltage levels from -20 to +50 V. It handles standard and arbitrary waveforms and supports pixel rates from 50 to 500 kHz. The system is designed to interface with existing Leach CCD electronics.

  8. Auto-counting of high density overlapping tracks and neutron spectrum measurement using CR-39 detectors and in-house image analysis program

    International Nuclear Information System (INIS)

    Paul, Sabyasachi; Tripathy, S.P.; Sahoo, G.S.; Joshi, D.S.; Bandyopadhyay, T.

    2014-01-01

    An effort is made in this work to overcome the difficulty of counting highly dense and overlapping tracks in solid polymeric track detectors (SPTD) such as CR-39. A program is developed to automatically count the track density which is found to be faster and more precise compared to other commonly used image analysing software. The results obtained by the present methodology are compared with those obtained using other software. (author)
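Automatic track counting of the kind described typically reduces to labeling connected dark regions in a thresholded image. The following is a minimal pure-Python sketch of that step (illustrative only; the in-house program is not described in enough detail here to reproduce, and real track images would first need thresholding and overlap separation):

```python
def count_tracks(binary):
    """Count 4-connected components of 1s in a 2D 0/1 grid
    (each component standing in for one etched track)."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]           # iterative flood fill
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# Toy thresholded image with two separate "tracks":
example = [[1, 1, 0, 0],
           [0, 1, 0, 1],
           [0, 0, 0, 1]]
print(count_tracks(example))
```

For genuinely overlapping tracks, a labeling pass like this is only the first stage; the hard part the paper addresses is separating merged regions, which this sketch does not attempt.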

  9.  Risk of discontinuation of nevirapine due to toxicities in antiretroviral naive and experienced HIV-infected patients with high and low CD4 counts

    DEFF Research Database (Denmark)

    Mocroft, A; Staszewski, S; Weber, R

    2007-01-01

    METHODS: 1,571 EuroSIDA patients started NVPc after 1/1/1999, with CD4+ T-cell counts and viral load measured in the 6 months before starting treatment, and were stratified into four groups based on CD4+ T-cell counts at initiation of NVPc (high [H], > 400/mm3 or > 250/mm3 for male or female, respectively...

  10. {sup 90}Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    Energy Technology Data Exchange (ETDEWEB)

    Carlier, Thomas, E-mail: thomas.carlier@chu-nantes.fr [Department of Nuclear Medicine, University Hospital of Nantes, Place Alexis Ricordeau, Nantes 44093, France and CRCNA–UMR 892 INSERM 6299 CNRS, 8 quai Moncousu BP 70721, Nantes 44007 (France); Willowson, Kathy P. [Institute of Medical Physics, University of Sydney, Camperdown, New South Wales 2006 (Australia); Fourkal, Eugene [Department of Radiation Oncology, Allegheny General Hospital, Pittsburgh, Pennsylvania 15212 (United States); Bailey, Dale L. [Faculty of Health Sciences, University of Sydney, Lidcombe 2141, Australia and Department of Nuclear Medicine, Royal North Shore Hospital, St Leonards, New South Wales 2065 (Australia); Doss, Mohan [Department of Diagnostic Imaging, Fox Chase Cancer Center, Philadelphia, Pennsylvania 19111 (United States); Conti, Maurizio [Siemens Healthcare Molecular Imaging, 810 Innovation Drive, Knoxville, Tennessee 37932 (United States)

    2015-07-15

    Purpose: ⁹⁰Y-positron emission tomography (PET) imaging is becoming a recognized modality for postinfusion quantitative assessment following radioembolization therapy. However, the extremely low counts and high random fraction associated with ⁹⁰Y-PET may significantly impair both qualitative and quantitative results. The aim of this work was to study image quality and noise level in relation to the quantification and bias performance of two types of Siemens PET scanners when imaging ⁹⁰Y and to compare experimental results with clinical data from two types of commercially available ⁹⁰Y microspheres. Methods: Data were acquired on both Siemens Biograph TruePoint [non-time-of-flight (TOF)] and Biograph microcomputed tomography (mCT) (TOF) PET/CT scanners. The study was conducted in three phases. The first aimed to assess quantification and bias for different reconstruction methods according to random fraction and number of true counts in the scan. The NEMA 1994 PET phantom was filled with water with one cylindrical insert left empty (air) and the other filled with a solution of ⁹⁰Y. The phantom was scanned for 60 min in the PET/CT scanner every one or two days. The second phase used the NEMA 2001 PET phantom to derive noise and image quality metrics. The spheres and the background were filled with a ⁹⁰Y solution in an 8:1 contrast ratio and four 30 min acquisitions were performed over a one week period. Finally, 32 patient data sets (8 treated with Therasphere® and 24 with SIR-Spheres®) were retrospectively reconstructed and activity in the whole field of view and the liver was compared to theoretical injected activity. Results: The contribution of both bremsstrahlung and LSO trues was found to be negligible, allowing data to be decay corrected to obtain correct quantification. In general, the recovered activity for all reconstruction methods was stable over the range studied, with a small bias appearing at extremely

  11. (90)Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction.

    Science.gov (United States)

    Carlier, Thomas; Willowson, Kathy P; Fourkal, Eugene; Bailey, Dale L; Doss, Mohan; Conti, Maurizio

    2015-07-01

    (90)Y-positron emission tomography (PET) imaging is becoming a recognized modality for postinfusion quantitative assessment following radioembolization therapy. However, the extremely low counts and high random fraction associated with (90)Y-PET may significantly impair both qualitative and quantitative results. The aim of this work was to study image quality and noise level in relation to the quantification and bias performance of two types of Siemens PET scanners when imaging (90)Y and to compare experimental results with clinical data from two types of commercially available (90)Y microspheres. Data were acquired on both Siemens Biograph TruePoint [non-time-of-flight (TOF)] and Biograph microcomputed tomography (mCT) (TOF) PET/CT scanners. The study was conducted in three phases. The first aimed to assess quantification and bias for different reconstruction methods according to random fraction and number of true counts in the scan. The NEMA 1994 PET phantom was filled with water with one cylindrical insert left empty (air) and the other filled with a solution of (90)Y. The phantom was scanned for 60 min in the PET/CT scanner every one or two days. The second phase used the NEMA 2001 PET phantom to derive noise and image quality metrics. The spheres and the background were filled with a (90)Y solution in an 8:1 contrast ratio and four 30 min acquisitions were performed over a one week period. Finally, 32 patient data sets (8 treated with Therasphere® and 24 with SIR-Spheres®) were retrospectively reconstructed and activity in the whole field of view and the liver was compared to theoretical injected activity. The contribution of both bremsstrahlung and LSO trues was found to be negligible, allowing data to be decay corrected to obtain correct quantification. In general, the recovered activity for all reconstruction methods was stable over the range studied, with a small bias appearing at extremely high random fraction and low counts for iterative algorithms

  12. 90Y -PET imaging: Exploring limitations and accuracy under conditions of low counts and high random fraction

    International Nuclear Information System (INIS)

    Carlier, Thomas; Willowson, Kathy P.; Fourkal, Eugene; Bailey, Dale L.; Doss, Mohan; Conti, Maurizio

    2015-01-01

    Purpose: ⁹⁰Y-positron emission tomography (PET) imaging is becoming a recognized modality for postinfusion quantitative assessment following radioembolization therapy. However, the extremely low counts and high random fraction associated with ⁹⁰Y-PET may significantly impair both qualitative and quantitative results. The aim of this work was to study image quality and noise level in relation to the quantification and bias performance of two types of Siemens PET scanners when imaging ⁹⁰Y and to compare experimental results with clinical data from two types of commercially available ⁹⁰Y microspheres. Methods: Data were acquired on both Siemens Biograph TruePoint [non-time-of-flight (TOF)] and Biograph microcomputed tomography (mCT) (TOF) PET/CT scanners. The study was conducted in three phases. The first aimed to assess quantification and bias for different reconstruction methods according to random fraction and number of true counts in the scan. The NEMA 1994 PET phantom was filled with water with one cylindrical insert left empty (air) and the other filled with a solution of ⁹⁰Y. The phantom was scanned for 60 min in the PET/CT scanner every one or two days. The second phase used the NEMA 2001 PET phantom to derive noise and image quality metrics. The spheres and the background were filled with a ⁹⁰Y solution in an 8:1 contrast ratio and four 30 min acquisitions were performed over a one week period. Finally, 32 patient data sets (8 treated with Therasphere® and 24 with SIR-Spheres®) were retrospectively reconstructed and activity in the whole field of view and the liver was compared to theoretical injected activity. Results: The contribution of both bremsstrahlung and LSO trues was found to be negligible, allowing data to be decay corrected to obtain correct quantification. In general, the recovered activity for all reconstruction methods was stable over the range studied, with a small bias appearing at extremely high random fraction and low counts for
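The decay correction mentioned in the results can be sketched as a one-line exponential correction, assuming the ⁹⁰Y physical half-life of about 64.1 h; the function and variable names are illustrative, not the study's implementation.

```python
import math

Y90_HALF_LIFE_H = 64.1  # 90Y physical half-life in hours (approximate)

def decay_correct(measured_bq, hours_since_reference):
    """Correct a measured activity back to the reference (e.g. infusion)
    time: A0 = A_measured * exp(lambda * dt), lambda = ln 2 / T_half."""
    lam = math.log(2.0) / Y90_HALF_LIFE_H
    return measured_bq * math.exp(lam * hours_since_reference)

# One half-life after infusion the measured activity has halved,
# so the correction restores the original value:
print(decay_correct(50.0, Y90_HALF_LIFE_H))
```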

  13. TREATMENT OF ACUTE PROMYELOCYTIC LEUKEMIA WITH HIGH WHITE CELL BLOOD COUNTS.

    Directory of Open Access Journals (Sweden)

    Charicleia Kelaidi

    2011-09-01

    Acute promyelocytic leukemia (APL) with WBC above 10 G/L has long been considered, even in the all-trans retinoic acid (ATRA) era, to carry a relatively poor prognosis compared to APL with WBC below 10 G/L, due to increased early mortality and relapse. However, early deaths can to a large extent be avoided if specific measures are rapidly instituted, including prompt referral to a specialized center, immediate onset of ATRA and chemotherapy, treatment of coagulopathy with adequate platelet transfusional support, and prevention and management of differentiation syndrome. Strategies to reduce the relapse rate include chemotherapy reinforcement with cytarabine and/or arsenic trioxide during consolidation, prolonged maintenance treatment, especially with ATRA and low-dose chemotherapy, and possibly, although this is debated, intrathecal prophylaxis to prevent central nervous system relapse. By applying these measures, outcomes of patients with high-risk APL have improved considerably and have become, in many studies, almost similar to those of standard-risk APL patients.

  14. Landing Site Studies Using High Resolution MGS Crater Counts and Phobos-2 Termoskan Data

    Science.gov (United States)

    Hartmann, William K.; Berman, Daniel C.; Betts, Bruce H.

    1999-06-01

    We have examined a number of potential landing sites to study effects associated with impact crater populations. We used Mars Global Surveyor high-resolution MOC images and emphasized "ground truth" by calibrating with the MOC images of the Viking 1 and Pathfinder sites. An interesting result is that most of Mars (all surfaces with model ages older than 100 My) has small crater populations in saturation equilibrium below diameters D ≈ 60 m (and down to the smallest resolvable, countable sizes, ≈ 15 m). This may have consequences for preservation of surface bedrock exposures accessible to rovers. In the lunar maria, a similar saturation equilibrium is reached for crater diameters below about 300 m, and this has produced a regolith depth of about 10-20 m in those areas. Assuming linear scaling, we infer that saturation at D ≈ 60 m would produce gardening and a Martian regolith, or fragmental layer, about 2 to 4 m deep over all but extremely young surfaces (such as the very fresh thin surface flows in southern Elysium Planitia, which have model ages around 10 My or less). This result may explain the global production of ubiquitous dust and fragmental material on Mars. Removal of fines may leave the boulders that have been seen at all three of the first landing sites. Accumulation of the fines elsewhere produces dunes. Due to these effects, it may be difficult to set down rovers in areas where bedrock is well preserved at depths of centimeters, unless we find cliff sides or areas of deflation where wind has exposed clean surfaces (among residual boulders?). We have also surveyed the PHOBOS 2 Termoskan data to look for regions of thermal anomalies that might produce interesting landing sites. For landing site selection, two of the more interesting types of features are thermally distinct ejecta blankets and thermally distinct channels and valleys. Martian "thermal features" such as these that correlate closely with nonaeolian
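The linear-scaling inference above is simple arithmetic, using the paper's own numbers (a sketch of the reasoning step, not the authors' code):

```python
# Lunar maria: saturation equilibrium below D ~ 300 m produced a
# regolith about 10-20 m deep. Assuming regolith depth scales
# linearly with the saturation diameter, Mars (saturation below
# D ~ 60 m) should carry a proportionally thinner fragmental layer.
LUNAR_SATURATION_D_M = 300.0
LUNAR_REGOLITH_M = (10.0, 20.0)
MARS_SATURATION_D_M = 60.0

scale = MARS_SATURATION_D_M / LUNAR_SATURATION_D_M
mars_regolith = tuple(scale * depth for depth in LUNAR_REGOLITH_M)
print(mars_regolith)  # matches the 2 to 4 m range inferred in the abstract
```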

  15. The Kruskal Count

    OpenAIRE

    Lagarias, Jeffrey C.; Rains, Eric; Vanderbei, Robert J.

    2001-01-01

    The Kruskal Count is a card trick invented by Martin J. Kruskal in which a magician "guesses" a card selected by a subject according to a certain counting procedure. With high probability the magician can correctly "guess" the card. The success of the trick is based on a mathematical principle related to coupling methods for Markov chains. This paper analyzes in detail two simplified variants of the trick and estimates the probability of success. The model predictions are compared with simula...
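The coupling effect behind the trick can be checked with a quick Monte Carlo sketch. This uses one common convention, which is an assumption here and not necessarily the variant analyzed in the paper: number cards step by their rank and face cards step by 5; both players start somewhere in the first ten cards.

```python
import random

def key_card(deck, start):
    """Follow the Kruskal walk from index `start`: repeatedly step
    forward by the value of the current card; return the index of the
    final key card reached before the walk runs off the deck."""
    i = start
    while i + deck[i] < len(deck):
        i += deck[i]
    return i

def trial(rng):
    # Ranks 1..13 in four suits; face cards (11-13) step 5.
    deck = [r if r <= 10 else 5 for r in list(range(1, 14)) * 4]
    rng.shuffle(deck)
    subject = key_card(deck, rng.randrange(10))   # subject's secret start
    magician = key_card(deck, rng.randrange(10))  # magician's own start
    return subject == magician                    # chains have coupled

rng = random.Random(42)
n = 20_000
p = sum(trial(rng) for _ in range(n)) / n
print(f"estimated success probability: {p:.3f}")
```

Once the two walks land on a common card they remain locked together forever, which is exactly the coupling argument for Markov chains mentioned in the abstract; the simulation typically gives a success probability in the rough vicinity of 0.8 under this convention.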

  16. The Top 100 Linked-To Pages on UK University Web Sites: High Inlink Counts Are Not Usually Associated with Quality Scholarly Content.

    Science.gov (United States)

    Thelwall, Mike

    2002-01-01

    Reports on an investigation into the most highly linked pages on United Kingdom university Web sites. Concludes that simple link counts are highly unreliable indicators of the average behavior of scholars, and that the most highly linked-to pages are those that facilitate access to a wide range of information rather than providing specific…

  17. Intraoperative detection of 18F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria

    International Nuclear Information System (INIS)

    Povoski, Stephen P; Chapman, Gregg J; Murrey, Douglas A; Lee, Robert; Martin, Edward W; Hall, Nathan C

    2013-01-01

    Intraoperative detection of 18F-FDG-avid tissue sites during 18F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of 18F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Of 58 patients undergoing 18F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine 18F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging, and for which each 18F-FDG-avid tissue site underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (K-alpha probe, and two commercially available PET-probe systems), and then was subsequently surgically excised. The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2-15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0-2.1) and 1.0 (± 0, range 1.0-1.0), respectively, for the two commercially available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of 18F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, specifically with the three-sigma method being significantly better than the ratiometric method for determining probe positivity for the K-alpha probe (P = 0.05). Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three

  18. Intraoperative detection of ¹⁸F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria.

    Science.gov (United States)

    Povoski, Stephen P; Chapman, Gregg J; Murrey, Douglas A; Lee, Robert; Martin, Edward W; Hall, Nathan C

    2013-03-04

    Intraoperative detection of (18)F-FDG-avid tissue sites during (18)F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of (18)F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Of 58 patients undergoing (18)F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine (18)F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging, and for which each (18)F-FDG-avid tissue site underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (K-alpha probe, and two commercially available PET-probe systems), and then was subsequently surgically excised. The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2-15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0-2.1) and 1.0 (± 0, range 1.0-1.0), respectively, for the two commercially available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of (18)F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, specifically with the three-sigma method being significantly better than the ratiometric method for determining probe positivity for the K-alpha probe (P = 0.05). Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical
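A variance-based three-sigma decision rule of the kind named in these two records can be sketched as follows. This is a minimal illustration assuming Poisson counting statistics (variance ≈ mean); the function name, background samples, and threshold form are illustrative, not the study's implementation.

```python
import math

def three_sigma_positive(target_counts, background_counts):
    """Variance-based positivity: a site is 'positive' when its counts
    exceed the mean background by three Poisson standard deviations
    (sigma ~ sqrt(mean) for counting data)."""
    mean_bg = sum(background_counts) / len(background_counts)
    threshold = mean_bg + 3.0 * math.sqrt(mean_bg)
    return target_counts > threshold

bg = [100, 105, 95]   # background counts: mean 100, so sigma ~ 10
print(three_sigma_positive(200, bg))  # well above the 130 threshold
print(three_sigma_positive(125, bg))  # below the 130 threshold
```

Unlike a fixed T/B ratiometric cutoff, the threshold here adapts to the statistical spread of the background, which is why low-count sites can still be classified meaningfully.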

  19. Lymphocytosis (High Lymphocyte Count)

    Science.gov (United States)


  20. Behavioral and cellular consequences of high-electrode count Utah Arrays chronically implanted in rat sciatic nerve

    Science.gov (United States)

    Wark, H. A. C.; Mathews, K. S.; Normann, R. A.; Fernandez, E.

    2014-08-01

    Objective. Before peripheral nerve electrodes can be used for the restoration of sensory and motor functions in patients with neurological disorders, the behavioral and histological consequences of these devices must be investigated. These indices of biocompatibility can be defined in terms of desired functional outcomes; for example, a device may be considered for use as a therapeutic intervention if the implanted subject retains functional neurons post-implantation even in the presence of a foreign body response. The consequences of an indwelling device may remain localized to cellular responses at the device-tissue interface, such as fibrotic encapsulation of the device, or they may affect the animal more globally, such as impacting behavioral or sensorimotor functions. The objective of this study was to investigate the overall consequences of implantation of high-electrode count intrafascicular peripheral nerve arrays, High Density Utah Slanted Electrode Arrays (HD-USEAs; 25 electrodes mm⁻²). Approach. HD-USEAs were implanted in rat sciatic nerves for one and two month periods. We monitored wheel running, noxious sensory paw withdrawal reflexes, footprints, nerve morphology and macrophage presence at the tissue-device interface. In addition, we used a novel approach to contain the arrays in actively behaving animals that consisted of an organic nerve wrap. A total of 500 electrodes were implanted across all ten animals. Main results. The results demonstrated that chronic implantation (≤8 weeks) of HD-USEAs into peripheral nerves can evoke behavioral deficits that recover over time. Morphology of the nerve distal to the implantation site showed variable signs of nerve fiber degeneration and regeneration. Cytology adjacent to the device-tissue interface also showed a variable response, with some electrodes having many macrophages surrounding the electrodes, while other electrodes had few or no macrophages present. This variability was also seen along the length

  1. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  2. Counting Tm dopant atoms around GaN dots using high-angle annular dark field images

    International Nuclear Information System (INIS)

    Rouvière, J-L; Okuno, H; Jouneau, P H; Bayle-Guillemaud, P; Daudin, B

    2011-01-01

High resolution Z-contrast STEM imaging is used to study the Tm doping of GaN quantum dots grown in AlN by molecular beam epitaxy (MBE). High-angle annular dark field (HAADF) imaging allows us to visualize individual Tm atoms directly in the AlN matrix and even to count the number of Tm atoms in a given AlN atomic column. A new visibility coefficient is introduced to determine quantitatively the number of Tm atoms in a given atomic column. It is based on locally integrated intensities rather than on peak intensities of HAADF images. STEM image simulations show that this new visibility coefficient is less sensitive to defocus-induced blurring or to the position of the Tm atom within the thin lamella. Most of the Tm atoms diffuse out of the GaN dots, and they are found at several positions in the AlN matrix: (i) above the wetting layer, Tm atoms are spread within a thickness of 14 AlN monolayers (MLs); (ii) above the quantum dots, all the Tm atoms are located in the same plane, situated 2-3 MLs above the apex of the GaN dot, i.e. at a distance of 14 MLs from the wetting layer; (iii) in addition, Tm can diffuse very far from the GaN dot by following threading dislocation lines.
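The advantage of locally integrated intensities over peak intensities can be illustrated with a small sketch (a 1D toy model, not the authors' simulation): blurring that mimics defocus strongly attenuates a column's peak value, while the intensity integrated over a local window around the column is nearly conserved.

```python
import numpy as np

def gaussian(x, sigma):
    return np.exp(-0.5 * (x / sigma) ** 2)

x = np.arange(-50, 51)            # pixel grid
column = gaussian(x, 2.0)         # narrow "atomic column" intensity peak

def blur(sig, sigma):
    """Convolve with a normalized Gaussian kernel (mimics defocus blur)."""
    k = gaussian(np.arange(-25, 26), sigma)
    return np.convolve(sig, k / k.sum(), mode="same")

window = np.abs(x) <= 10          # local integration window around the column

peaks, integrals = [], []
for defocus in (0.5, 2.0, 4.0):   # increasing defocus blur
    b = blur(column, defocus)
    peaks.append(b.max())
    integrals.append(b[window].sum())

print(peaks)      # drops sharply with blur
print(integrals)  # nearly unchanged
```

Because blurring redistributes intensity locally rather than destroying it, the windowed integral is a far more stable per-column measure than the peak value.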

  3. A transimpedance CMOS multichannel amplifier with a 50 Ω-wide output range buffer for high counting rate applications

    International Nuclear Information System (INIS)

    Haralabidis, N.; Loukas, D.; Misiakos, K.; Katsafouros, S.

    1997-01-01

A fast transimpedance multichannel amplifier has been designed, fabricated in a 1.2 μm CMOS technology, and tested. Each channel consists of a current-sensitive preamplifier followed by a voltage amplification stage and an on-chip buffer able to drive 50 Ω loads with an output range of ±800 mV. The measured peaking time at the output is 40 ns and the circuit recovers to baseline in 90 ns, giving a counting capability of more than 10⁷ hits/s. Signals of both polarities can be handled. The first two stages consume a total of 2 mW per channel and the 50 Ω buffer consumes another 17 mW. The equivalent noise charge (ENC) is 1100 e⁻ rms with a slope of 40 e⁻/pF. The IC is intended for use in gas and solid-state detectors with high particle rates and extensive charge release, as in high-energy calorimetry.
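The quoted noise figures imply a simple linear scaling of equivalent noise charge with detector capacitance. A minimal sketch using the abstract's numbers (ENC = 1100 e⁻ rms plus 40 e⁻/pF; the capacitance values are hypothetical):

```python
E_CHARGE = 1.602e-19  # coulombs per electron

def enc_electrons(c_det_pf, enc0=1100.0, slope=40.0):
    """Equivalent noise charge (e- rms) for a detector capacitance in pF,
    using the linear noise model quoted in the abstract."""
    return enc0 + slope * c_det_pf

# Hypothetical detector capacitances, for illustration only
for c in (0, 5, 10, 20):
    enc = enc_electrons(c)
    print(f"C = {c:4.1f} pF -> ENC = {enc:6.0f} e- rms "
          f"({enc * E_CHARGE * 1e15:.3f} fC)")
```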

  4. Counting Possibilia

    Directory of Open Access Journals (Sweden)

    Alfredo Tomasetta

    2010-06-01

Full Text Available Timothy Williamson supports the thesis that every possible entity necessarily exists, and so he needs to explain how a possible son of Wittgenstein's, for example, exists in our world: he exists as a merely possible object (MPO), a pure locus of potential. Williamson presents a short argument for the existence of MPOs: how many knives can be made by fitting together two blades and two handles? Four: at most two are concrete objects, the others being merely possible knives and merely possible objects. This paper defends the idea that one can avoid reference and ontological commitment to MPOs. My proposal is that MPOs can be dispensed with by using the notion of rules of knife-making. I first present a solution according to which we count lists of instructions - selected by the rules - describing physical combinations between components. This account, however, has its own difficulties, and I eventually suggest that one can find a way out by admitting possible worlds, entities which are more commonly accepted - at least by philosophers - than MPOs. I maintain that, in answering Williamson's questions, we count classes of physically possible worlds in which the same instance of a general rule is applied.

  5. A high-sensitivity neutron counter and waste-drum counting with the high-sensitivity neutron instrument

    International Nuclear Information System (INIS)

    Hankins, D.E.; Thorngate, J.H.

    1993-04-01

    At Lawrence Livermore National Laboratory (LLNL), a highly sensitive neutron counter was developed that can detect and accurately measure the neutrons from small quantities of plutonium or from other low-level neutron sources. This neutron counter was originally designed to survey waste containers leaving the Plutonium Facility. However, it has proven to be useful in other research applications requiring a high-sensitivity neutron instrument

  6. State of the art of D ampersand D Instrumentation Technology: Alpha counting in the presence of high background

    International Nuclear Information System (INIS)

    Dickerman, C.E.

    1995-08-01

Discrimination of alpha activity in the presence of a high radiation background has been identified as an area of concern to be studied for D ampersand D applications. Upon evaluating the range of alpha detection needs for D ampersand D operations, we have expanded this study to address the operational concern of greatly expediting alpha counting of rough surfaces and rubble. Note that the term "rough surfaces" includes a wide range of practical cases, including contaminated equipment and work surfaces. We have developed provisional application requirements for instrumentation of this type, and we have also generated the scope of a program of instrument evaluation and testing, with emphasis on practical implementation. In order to obtain the full operational benefit of alpha discrimination in the presence of strong beta-gamma radiation background, the detection system must be capable of some form of remote or semi-remote operation in order to reduce operator exposure. We have identified a highly promising technique, the long-range alpha detector (LRAD), for alpha discrimination in the presence of high radiation background. This technique operates on the principle of transporting alpha-ionized air to an ionization detector; a transport time within a few seconds is adequate. Neither the provisional requirements nor the evaluation and testing scope were expressly tailored to force the selection of a LRAD technology, and they could be used as a basis for studies of other promising technologies. However, a technology that remotely detects alpha-ionized air (e.g., LRAD) is a natural fit to the key requirements of rejection of high background at the survey location and operator protection. Also, LRAD appears to be valuable for D ampersand D applications as a means of greatly expediting surface alpha-activity surveys that otherwise would require performing time-consuming scans over surfaces of interest with alpha detector probes, and even more labor-intensive surface

  7. A novel high electrode count spike recording array using an 81,920 pixel transimpedance amplifier-based imaging chip.

    Science.gov (United States)

    Johnson, Lee J; Cohen, Ethan; Ilg, Doug; Klein, Richard; Skeath, Perry; Scribner, Dean A

    2012-04-15

Microelectrode recording arrays of 60-100 electrodes are commonly used to record neuronal biopotentials, and these have aided our understanding of brain function, development and pathology. However, higher density microelectrode recording arrays of larger area are needed to study neuronal function over broader brain regions such as in cerebral cortex or hippocampal slices. Here, we present a novel design of a high-electrode-count picocurrent imaging array (PIA), based on an 81,920 pixel Indigo ISC9809 readout integrated circuit camera chip. While originally developed for interfacing to infrared photodetector arrays, we have adapted the chip for neuron recording by bonding it to microwire glass, resulting in an array with an inter-electrode pixel spacing of 30 μm. In a high-density electrode array, the ability to record selectively from neural regions at high speed and with a good signal-to-noise ratio are both functionally important. A critical feature of our PIA is that each pixel contains a dedicated low-noise transimpedance amplifier (∼0.32 pA rms), which allows recording high signal-to-noise-ratio biocurrents comparable to single-electrode voltage amplifier recordings. Using selective sampling of 256-pixel subarray regions, we recorded the extracellular biocurrents of rabbit retinal ganglion cell spikes at sampling rates up to 7.2 kHz. Full-array local electroretinogram currents could also be recorded at frame rates up to 100 Hz. A PIA with a full complement of 4 readout circuits would span 1 cm and could acquire simultaneous data from selected regions of 1024 electrodes at sampling rates up to 9.3 kHz. Published by Elsevier B.V.

  8. Strengthening Children's Math Skills with Enhanced Instruction: The Impacts of Making Pre-K Count and High 5s on Kindergarten Outcomes

    Science.gov (United States)

    Mattera, Shira K.; Jacob, Robin; Morris, Pamela A.

    2018-01-01

    Early math skills are a strong predictor of later achievement for young children, not only in math, but in other domains as well. Exhibiting strong math skills in elementary school is predictive of later high school completion and college attendance. To that end, the Making Pre-K Count and High 5s studies set out to rigorously assess whether…

  9. Eulerian and Lagrangian statistics from high resolution numerical simulations of weakly compressible turbulence

    NARCIS (Netherlands)

    Benzi, R.; Biferale, L.; Fisher, R.T.; Lamb, D.Q.; Toschi, F.

    2009-01-01

We report a detailed study of Eulerian and Lagrangian statistics from high resolution Direct Numerical Simulations of isotropic weakly compressible turbulence. The Reynolds number at the Taylor microscale is estimated to be around 600. Eulerian and Lagrangian statistics are evaluated over a huge data

  10. High-Sensitivity Semiconductor Photocathodes for Space-Born UV Photon-Counting and Imaging, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Many UV photon-counting and imaging applications, including space-borne astronomy, missile tracking and guidance, UV spectroscopy for chemical/biological...

  11. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

Full Text Available Abstract Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results: The instruments under study

  12. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: an example from a vertigo phase III study with longitudinal count data as primary endpoint.

    Science.gov (United States)

    Adrion, Christine; Mansmann, Ulrich

    2012-09-10

A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions
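The probability integral transform mentioned in both records can be sketched for count data. For discrete outcomes a randomized PIT is commonly used; under a well-calibrated model the transformed values are uniform on (0, 1). This is a generic illustration with a plain Poisson model, not the authors' INLA-based GLMM analysis:

```python
import math
import random

random.seed(0)
LAM = 4.0  # hypothetical Poisson rate

def pois_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def pois_cdf(k, lam):
    return sum(pois_pmf(i, lam) for i in range(k + 1)) if k >= 0 else 0.0

def sample_poisson(lam):
    """Inverse-CDF sampling; adequate for small rates."""
    u, k, c = random.random(), 0, pois_pmf(0, lam)
    while u > c:
        k += 1
        c += pois_pmf(k, lam)
    return k

# Randomized PIT for discrete data: u = F(y-1) + v * (F(y) - F(y-1)),
# with v ~ Uniform(0,1). Calibrated model => u ~ Uniform(0,1).
pit = []
for _ in range(5000):
    y = sample_poisson(LAM)
    v = random.random()
    pit.append(pois_cdf(y - 1, LAM) + v * pois_pmf(y, LAM))

mean_pit = sum(pit) / len(pit)
print(round(mean_pit, 3))   # near 0.5 when the model matches the data
```

In practice one inspects the full PIT histogram: U-shapes indicate underdispersion of the model, hump shapes overdispersion.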

  13. Automated high performance liquid chromatography and liquid scintillation counting determination of pesticide mixture octanol/water partition rates

    International Nuclear Information System (INIS)

    Moody, R.P.; Carroll, J.M.; Kresta, A.M.

    1987-01-01

Two novel methods are reported for measuring octanol/water partition rates of pesticides. A liquid scintillation counting (LSC) method was developed for automated monitoring of ¹⁴C-labeled pesticides partitioning in biphasic water/octanol cocktail systems, with limited success. A high performance liquid chromatography (HPLC) method was developed for automated partition rate monitoring of several constituents of a pesticide mixture simultaneously. The mean log Kow ± SD values determined from triplicate experimental runs were: 2,4-D-DMA (2,4-dichlorophenoxyacetic acid dimethylamine), 0.65 ± 0.17; Deet (N,N-diethyl-m-toluamide), 2.02 ± 0.01; Guthion (O,O-dimethyl-S-(4-oxo-1,2,3-benzotriazin-3(4H)-ylmethyl) phosphorodithioate), 2.43 ± 0.03; Methyl-Parathion (O,O-dimethyl-O-(p-nitrophenyl) phosphorothioate), 2.68 ± 0.05; and Fenitrothion (O,O-dimethyl O-(4-nitro-m-tolyl) phosphorothioate), 3.16 ± 0.03. A strong positive linear correlation (r = 0.9979) was obtained between log Kow and log k' (log Kow = 2.35 (log k') + 0.63). The advantages of this automated procedure over the standard manual shake-flask procedure are discussed.
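The reported calibration line can be applied directly to convert an HPLC capacity factor into an estimated partition coefficient. A minimal sketch using the abstract's regression (the example log k' inputs are hypothetical):

```python
# Reported calibration between the octanol/water partition coefficient and
# the HPLC capacity factor: log Kow = 2.35 * log k' + 0.63  (r = 0.9979)
def log_kow_from_log_k(log_k_prime, slope=2.35, intercept=0.63):
    """Estimate log Kow from a measured log k' via the reported fit."""
    return slope * log_k_prime + intercept

# Hypothetical capacity factors, for illustration only
for log_k in (0.0, 0.6, 1.0):
    print(log_k, "->", round(log_kow_from_log_k(log_k), 2))
```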

  14. High-performance integrated pick-up circuit for SPAD arrays in time-correlated single photon counting

    Science.gov (United States)

    Acconcia, Giulia; Cominelli, Alessandro; Peronio, Pietro; Rech, Ivan; Ghioni, Massimo

    2017-05-01

The analysis of optical signals by means of Single Photon Avalanche Diodes (SPADs) has been the subject of widespread interest in recent years, and the development of multichannel high-performance Time Correlated Single Photon Counting (TCSPC) acquisition systems has advanced rapidly. Concerning detector performance, best-in-class results have been obtained by resorting to custom technologies, which also leads to a strong dependence of the detector timing jitter on the threshold used to determine the onset of the photogenerated current flow. In this scenario, the avalanche current pick-up circuit plays a key role in determining the timing performance of the TCSPC acquisition system, especially with a large array of SPAD detectors, because of electrical crosstalk issues. We developed a new current pick-up circuit based on a transimpedance amplifier structure that is able to extract the timing information from a 50 μm diameter custom-technology SPAD with a state-of-the-art timing jitter as low as 32 ps, and that is suitable for use with SPAD arrays. In this paper we discuss the key features of this structure and present a new version of the pick-up circuit that also provides quenching capabilities, in order to minimize the number of interconnections required, an aspect that becomes more and more crucial in densely integrated systems.

  15. A high resolution, high counting rate bidimensional, MWPC imaging detector for small angle X-ray diffraction studies

    International Nuclear Information System (INIS)

    Bateman, J.E.; Connolly, J.F.; Sawyer, E.C.; Stephenson, R.

    1981-07-01

The performance is reported of a 200 mm x 200 mm X-ray imaging MWPC aimed at applications in small angle X-ray diffraction and scattering. At quantum energies of approximately 8 keV, high spatial resolution (±0.5 mm x ±0.14 mm) and a data-taking capability above approximately 350 kHz are reported. The detection efficiency is approximately 75% and the detector operates as a sealed unit with a long lifetime. (author)

  16. Bonding of Si wafers by surface activation method for the development of high efficiency high counting rate radiation detectors

    International Nuclear Information System (INIS)

    Kanno, Ikuo; Yamashita, Makoto; Onabe, Hideaki

    2006-01-01

Si wafers with two different resistivities, ranging over two orders of magnitude, were bonded by the surface activation method. The resistivities of the bonded Si wafers were measured as a function of annealing temperature. Using calculations based on a model, the interface resistivities of the bonded Si wafers were estimated from the measured resistivities. After thermal treatment from 500 °C to 900 °C, all interfaces showed high resistivity, with behavior close to that of an insulator. Annealing at 1000 °C decreased the interface resistivity, and close-to-ideal bonding was observed after thermal treatment at 1100 °C. (author)
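One way to read "interface resistivity estimated from measured resistivities" is a one-dimensional series-resistance model. This is an assumption for illustration, not necessarily the authors' exact model, and all numbers below are hypothetical:

```python
# Series model (assumption): the bonded pair is two bulk slabs plus an
# interface contribution, all in series along the current path.
def interface_resistance_area(rho_eff, rho1, t1, rho2, t2):
    """Specific interface resistance (ohm*cm^2) from the measured effective
    resistivity of the stack; thicknesses in cm, resistivities in ohm*cm."""
    return rho_eff * (t1 + t2) - rho1 * t1 - rho2 * t2

# Hypothetical numbers: a near-insulating interface after a low-temperature
# anneal dominates the measured stack resistivity.
r_int = interface_resistance_area(rho_eff=5000.0, rho1=10.0, t1=0.05,
                                  rho2=1000.0, t2=0.05)
print(r_int)   # ohm*cm^2 attributable to the interface
```

As annealing temperature rises, the measured rho_eff falls toward the thickness-weighted average of the bulk resistivities, driving the inferred interface term toward zero (ideal bonding).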

  17. A user configurable data acquisition and signal processing system for high-rate, high channel count applications

    International Nuclear Information System (INIS)

    Salim, Arwa; Crockett, Louise; McLean, John; Milne, Peter

    2012-01-01

    Highlights: ► The development of a new digital signal processing platform is described. ► The system will allow users to configure the real-time signal processing through software routines. ► The architecture of the DRUID system and signal processing elements is described. ► A prototype of the DRUID system has been developed for the digital chopper-integrator. ► The results of acquisition on 96 channels at 500 kSamples/s per channel are presented. - Abstract: Real-time signal processing in plasma fusion experiments is required for control and for data reduction as plasma pulse times grow longer. The development time and cost for these high-rate, multichannel signal processing systems can be significant. This paper proposes a new digital signal processing (DSP) platform for the data acquisition system that will allow users to easily customize real-time signal processing systems to meet their individual requirements. The D-TACQ reconfigurable user in-line DSP (DRUID) system carries out the signal processing tasks in hardware co-processors (CPs) implemented in an FPGA, with an embedded microprocessor (μP) for control. In the fully developed platform, users will be able to choose co-processors from a library and configure programmable parameters through the μP to meet their requirements. The DRUID system is implemented on a Spartan 6 FPGA, on the new rear transition module (RTM-T), a field upgrade to existing D-TACQ digitizers. As proof of concept, a multiply-accumulate (MAC) co-processor has been developed, which can be configured as a digital chopper-integrator for long pulse magnetic fusion devices. The DRUID platform allows users to set options for the integrator, such as the number of masking samples. Results from the digital integrator are presented for a data acquisition system with 96 channels simultaneously acquiring data at 500 kSamples/s per channel.
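A digital chopper-integrator of the kind described (a multiply-accumulate against a ±1 chopper reference, with a programmable number of masking samples after each transition) can be sketched as follows; the signal values and masking length are hypothetical:

```python
import numpy as np

def chopped_integrate(samples, chop_period, n_mask):
    """Multiply-accumulate against a +/-1 chopper reference, skipping the
    first n_mask samples after each chopper transition (the masking option)."""
    acc = 0.0
    for i, s in enumerate(samples):
        phase = i % (2 * chop_period)
        if phase % chop_period < n_mask:     # mask samples after a transition
            continue
        acc += s if phase < chop_period else -s
    return acc

# Demo: a chopped 0.7-unit input riding on a 3.0-unit offset/drift.
n = 4000
chop = np.where((np.arange(n) % 200) < 100, 1.0, -1.0)
signal = 0.7 * chop + 3.0
out = chopped_integrate(signal, chop_period=100, n_mask=4)
print(out)   # the offset cancels; only the chopped input accumulates
```

Masking the same number of samples in each half-period keeps the positive and negative accumulation windows balanced, so constant offsets and slow drifts still cancel.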

  18. Quantum optical signatures in strong-field laser physics: Infrared photon counting in high-order-harmonic generation.

    Science.gov (United States)

    Gonoskov, I A; Tsatrafyllis, N; Kominis, I K; Tzallas, P

    2016-09-07

    We analytically describe the strong-field light-electron interaction using a quantized coherent laser state with arbitrary photon number. We obtain a light-electron wave function which is a closed-form solution of the time-dependent Schrödinger equation (TDSE). This wave function provides information about the quantum optical features of the interaction not accessible by semi-classical theories. With this approach we can reveal the quantum optical properties of high harmonic generation (HHG) process in gases by measuring the photon statistics of the transmitted infrared (IR) laser radiation. This work can lead to novel experiments in high-resolution spectroscopy in extreme-ultraviolet (XUV) and attosecond science without the need to measure the XUV light, while it can pave the way for the development of intense non-classical light sources.
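Photon statistics of the transmitted IR radiation are commonly summarized by the Mandel Q parameter (zero for coherent, Poissonian light; negative for sub-Poissonian, non-classical light). A generic sketch, not taken from the paper:

```python
import numpy as np

def mandel_q(counts):
    """Mandel Q = Var(n)/<n> - 1: zero for Poissonian (coherent) light,
    negative for sub-Poissonian (non-classical) photon statistics."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean() - 1.0

rng = np.random.default_rng(0)
coherent = rng.poisson(50, size=100_000)      # coherent-state count record
sub = rng.binomial(100, 0.5, size=100_000)    # sub-Poissonian toy example

print(mandel_q(coherent))   # near 0
print(mandel_q(sub))        # near -0.5
```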

19. Studies on the Pulse Rate, Pedometer Count and Satisfaction Degree at Various Exercise

    OpenAIRE

    小原, 史朗

    2004-01-01

This investigation examined, from the relationship between pulse rate and pedometer count, whether students' free exercise provided good stimulation of cardiorespiratory function, and also examined the degree of satisfaction after exercise. The subjects were 432 male students (1391 observations in total) and 94 female students (472 in total). A statistically significant relationship between pulse rate and pedometer count was found for both men and women. Exercise in which both the pulse rate and the pedometer count were high seemed ...

  20. Categorical counting.

    Science.gov (United States)

    Fetterman, J Gregor; Killeen, P Richard

    2010-09-01

    Pigeons pecked on three keys, responses to one of which could be reinforced after a few pecks, to a second key after a somewhat larger number of pecks, and to a third key after the maximum pecking requirement. The values of the pecking requirements and the proportion of trials ending with reinforcement were varied. Transits among the keys were an orderly function of peck number, and showed approximately proportional changes with changes in the pecking requirements, consistent with Weber's law. Standard deviations of the switch points between successive keys increased more slowly within a condition than across conditions. Changes in reinforcement probability produced changes in the location of the psychometric functions that were consistent with models of timing. Analyses of the number of pecks emitted and the duration of the pecking sequences demonstrated that peck number was the primary determinant of choice, but that passage of time also played some role. We capture the basic results with a standard model of counting, which we qualify to account for the secondary experiments. Copyright 2010 Elsevier B.V. All rights reserved.
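The approximately proportional growth of switch-point variability with the pecking requirement (Weber's law) can be sketched with scalar noise, where the standard deviation of the subjective count grows linearly with the requirement; the Weber fraction used here is an assumed value, not fitted to the data:

```python
import numpy as np

rng = np.random.default_rng(42)
WEBER = 0.15   # assumed Weber fraction (illustrative only)

def switch_points(requirement, trials=10_000):
    """Simulated pecks-at-switch for a given pecking requirement, with scalar
    (Weber-law) noise: sd of the subjective count = WEBER * requirement."""
    return rng.normal(requirement, WEBER * requirement, size=trials)

# The coefficient of variation stays constant while the sd grows with n
for n in (8, 16, 32):
    sp = switch_points(n)
    print(n, round(sp.std(), 2), round(sp.std() / sp.mean(), 3))
```

A constant coefficient of variation across requirements is the signature of scalar counting; deviations from it (e.g. slower within-condition growth of the sd, as reported above) call for the model qualifications the authors describe.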

  1. Role of quenching on alpha/beta separation in liquid scintillation counting for several high capacity cocktails

    International Nuclear Information System (INIS)

    Pujol, L.; Sanchez-Cabeza, J.-A.

    1997-01-01

    The optimization of alpha/beta separation in liquid scintillation using pulse shape analysis is convenient for the simultaneous determination of alpha and beta emitters in natural water and other samples. In this work, alpha/beta separation was studied for different scintillant/vial combinations and it was observed that both the optimum pulse shape discrimination level and the total interference value (that is, the summed relative interference between alpha and beta spectra) were dependent on the sample quenching and independent of the scintillant/vial combination. These results provide a simple method for modifying the counting configuration, such as a change in the cocktail, vial or sample characteristics, without the need to perform exhaustive parameter optimizations. Also, it was observed that, for our counting conditions, the combination of Ultima Gold AB scintillation cocktail with Zinsser low diffusion vials presented the lowest total interference, namely 0.94 ± 0.28%, which is insignificant for the counting of environmental samples. (Author)
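Pulse-shape discrimination and the "total interference" figure of merit can be sketched with a toy model: two Gaussian pulse-shape distributions and a scan for the discrimination level that minimizes the summed misclassification. The distribution parameters are hypothetical, not from the paper:

```python
import math

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical pulse-shape distributions (arbitrary PSA units)
MU_B, SIG_B = 30.0, 5.0    # beta events: short pulses
MU_A, SIG_A = 55.0, 6.0    # alpha events: long pulses

def total_interference(threshold):
    """Summed relative interference: betas classified as alphas plus
    alphas classified as betas at a given discrimination level."""
    beta_as_alpha = 1.0 - norm_cdf(threshold, MU_B, SIG_B)
    alpha_as_beta = norm_cdf(threshold, MU_A, SIG_A)
    return beta_as_alpha + alpha_as_beta

# Scan for the optimum pulse shape discrimination level
best = min(range(300, 551), key=lambda t: total_interference(t / 10.0)) / 10.0
print(best, total_interference(best))
```

Quenching shifts and broadens these distributions, which is why both the optimum discrimination level and the achievable total interference track the sample quench rather than the scintillant/vial combination.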

  2. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

This book consists of 15 chapters: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory and the theory of liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of high-polymer systems; and quantum statistics.

  3. Pyogenic arthritis, pyoderma gangrenosum, and acne (PAPA) syndrome: differential diagnosis of septic arthritis by regular detection of exceedingly high synovial cell counts.

    Science.gov (United States)

    Löffler, W; Lohse, P; Weihmayr, T; Widenmayer, W

    2017-08-01

Pyogenic arthritis, pyoderma gangrenosum and acne syndrome was diagnosed in a 42-year-old patient after an unusual persistence of high synovial cell counts had been noticed. Clinical peculiarities and the problems of diagnosing septic versus non-septic arthritis are discussed.

  4. [Corrected count].

    Science.gov (United States)

    1991-11-27

    The data of the 1991 census indicated that the population count of Brazil fell short of a former estimate by 3 million people. The population reached 150 million people with an annual increase of 2%, while projections in the previous decade expected an increase of 2.48% to 153 million people. This reduction indicates more widespread use of family planning (FP) and control of fertility among families of lower social status as more information is being provided to them. However, the Ministry of Health ordered an investigation of foreign family planning organizations because it was suspected that women were forced to undergo tubal ligation during vaccination campaigns. A strange alliance of left wing politicians and the Roman Catholic Church alleges a conspiracy of international FP organizations receiving foreign funds. The FP strategies of Bemfam and Pro-Pater offer women who have little alternative the opportunity to undergo tubal ligation or to receive oral contraceptives to control fertility. The ongoing government program of distributing booklets on FP is feeble and is not backed up by an education campaign. Charges of foreign interference are leveled while the government hypocritically ignores the grave problem of 4 million abortions a year. The population is expected to continue to grow until the year 2040 and then to stabilize at a low growth rate of .4%. In 1980, the number of children per woman was 4.4 whereas the 1991 census figures indicate this has dropped to 3.5. The excess population is associated with poverty and a forsaken caste in the interior. The population actually has decreased in the interior and in cities with 15,000 people. The phenomenon of the drop of fertility associated with rural exodus is contrasted with cities and villages where the population is 20% less than expected.

  5. High energy resolution and high count rate gamma spectrometry measurement of primary coolant of generation 4 sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Coulon, R.

    2010-01-01

Sodium-cooled Fast Reactors (SFRs) are under development for the fourth generation of nuclear reactors. Breeder reactors could provide solutions to energy needs while preserving uranium resources. Further aims are the reduction of radioactive waste production by transmutation and the control of non-proliferation using a closed fuel cycle. This thesis shows the safety and economic advantages that could be obtained with a new generation of gamma spectrometry systems for SFRs. High count rate capabilities now allow us to study new methods for accurate power measurement and fast clad-failure detection. Simulations have been carried out, and an experimental test performed at the French Phenix SFR at CEA Marcoule has shown promising results for these new measurements. (author) [fr

  6. Determining Gate Count Reliability in a Library Setting

    Directory of Open Access Journals (Sweden)

    Jeffrey Phillips

    2016-09-01

Full Text Available Objective – Patron counts are a common form of measurement for library assessment. To develop accurate library statistics, it is necessary to determine any differences between various counting devices. A yearlong comparison between card-reader turnstiles and laser gate counters in a university library sought to offer a standard percentage of variance and provide suggestions to increase the precision of counts. Methods – The collection of library exit counts identified the differences between turnstile and laser gate counter data. Statistical software helped to eliminate any inaccuracies in the collection of turnstile data, allowing this data set to be the base for comparison. Collection intervals were randomly determined and covered periods of slow, average, and heavy traffic. Results – After analyzing 1,039,766 patron visits throughout a year, the final totals showed a difference of only 0.43% (0.0043) between the two devices. The majority of collection periods did not exceed a difference of 3% between the counting instruments. Conclusion – Turnstile card readers and laser gate counters provide similar levels of reliability when measuring patron activity. Each system has potential counting inaccuracies, but several methods exist to create more precise totals. Turnstile card readers are capable of offering greater detail about patron identity, but their high cost makes them inaccessible for libraries with lower budgets. This makes laser gate counters an affordable alternative for reliable patron counting in an academic library.
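The study's headline figure can be reproduced arithmetically: with roughly 1.04 million turnstile-recorded visits, a 0.43% device difference corresponds to about 4,500 visits. A minimal sketch (the laser gate total is back-calculated for illustration):

```python
def percent_variance(count_a, count_b):
    """Relative difference between two counting devices, as a percentage
    of the larger total."""
    return abs(count_a - count_b) / max(count_a, count_b) * 100.0

turnstile = 1_039_766                       # turnstile total from the study
laser = round(turnstile * (1 - 0.0043))     # laser total, back-calculated
print(turnstile - laser)                    # about 4,500 visits apart
print(percent_variance(turnstile, laser))   # ~0.43
```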

  7. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    Science.gov (United States)

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some commonly used methods, such as the paired t-test, the Wilcoxon signed-rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
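    The abstract does not give the distance-based statistic itself, but the baselines it is compared against are standard. The sketch below runs two of them (paired t-test and Wilcoxon signed-rank) on simulated paired expression values for a single gene; the data and parameters are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.lognormal(mean=2.0, sigma=0.4, size=30)    # skewed, non-normal
treated = control * 1.5 + rng.normal(0.0, 0.5, size=30)  # up-regulated gene

# two of the baseline tests named in the abstract
t_stat, t_p = stats.ttest_rel(treated, control)
w_stat, w_p = stats.wilcoxon(treated, control)

print(f"paired t-test p = {t_p:.3g}")
print(f"Wilcoxon signed-rank p = {w_p:.3g}")
```

    Under non-normal data like this, the rank-based test keeps its validity where the t-test's normality assumption is strained, which is the setting the abstract's power comparison addresses.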

  8. Alpha scintillation radon counting

    International Nuclear Information System (INIS)

    Lucas, H.F. Jr.

    1977-01-01

    Radon counting chambers which utilize the alpha-scintillation properties of silver-activated zinc sulfide are simple to construct, have a high efficiency, and, with proper design, may be relatively insensitive to variations in the pressure or purity of the counter filling. Chambers which were constructed from glass, metal, or plastic in a wide variety of shapes and sizes were evaluated for the accuracy and the precision of the radon counting. The principles affecting the alpha-scintillation radon counting chamber design and an analytic system suitable for a large-scale study of the 222Rn and 226Ra content of either air or other environmental samples are described. Particular note is taken of those factors which affect the accuracy and the precision of the method for monitoring radioactivity around uranium mines.

  9. Every photon counts: improving low, mid, and high-spatial frequency errors on astronomical optics and materials with MRF

    Science.gov (United States)

    Maloney, Chris; Lormeau, Jean Pierre; Dumas, Paul

    2016-07-01

    Many astronomical sensing applications operate in low-light conditions; for these applications every photon counts. Controlling mid-spatial frequencies and surface roughness on astronomical optics is critical for mitigating scattering effects such as flare and energy loss. By improving these two frequency regimes, higher contrast images can be collected with improved efficiency. Classically, Magnetorheological Finishing (MRF) has offered an optical fabrication technique to correct low-order errors as well as quilting/print-through errors left in light-weighted optics by conventional polishing techniques. MRF is a deterministic, sub-aperture polishing process that has been used to improve figure on an ever expanding assortment of optical geometries, such as planos, spheres, on- and off-axis aspheres, primary mirrors and freeform optics. Precision optics are routinely manufactured by this technology with sizes ranging from 5 to 2,000 mm in diameter. MRF can be used for form corrections, turning a sphere into an asphere or freeform, but more commonly for figure corrections, achieving figure errors as low as 1 nm RMS when careful metrology setups are used. Recent advancements in MRF technology have improved the polishing performance expected for astronomical optics in the low, mid and high spatial frequency regimes. Deterministic figure correction with MRF is compatible with most materials, including some recent examples on Silicon Carbide and RSA905 Aluminum. MRF also has the ability to produce 'perfectly-bad' compensating surfaces, which may be used to compensate for measured or modeled optical deformation from sources such as gravity or mounting. In addition, recent advances in MRF technology allow for corrections of mid-spatial wavelengths as small as 1 mm simultaneously with form error correction. Efficient mid-spatial frequency corrections make use of optimized process conditions including raster polishing in combination with a small tool size.
Furthermore, a novel MRF

  10. Statistical issues in searches for new phenomena in High Energy Physics

    Science.gov (United States)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.

  11. Distribution of non-aureus staphylococci species in udder quarters with low and high somatic cell count, and clinical mastitis.

    Science.gov (United States)

    Condas, Larissa A Z; De Buck, Jeroen; Nobrega, Diego B; Carson, Domonique A; Roy, Jean-Philippe; Keefe, Greg P; DeVries, Trevor J; Middleton, John R; Dufour, Simon; Barkema, Herman W

    2017-07-01

    The effect of non-aureus staphylococci (NAS) on bovine mammary health is controversial. Overall, NAS intramammary infections (IMI) increase somatic cell count (SCC), with an effect categorized as mild, mostly causing subclinical or mild to moderate clinical mastitis. However, based on recent studies, specific NAS may affect the udder more severely. Some of these apparent discrepancies could be attributed to the large number of species that compose the NAS group. The objectives of this study were to determine (1) the SCC of quarters infected by individual NAS species compared with NAS as a group, culture-negative, and major pathogen-infected quarters; (2) the distribution of NAS species isolated from quarters with low SCC, high SCC, and clinical mastitis; and (3) the prevalence of NAS species across quarters with low and high SCC. A total of 5,507 NAS isolates, 3,561 from low SCC quarters, 1,873 from high SCC quarters, and 73 from clinical mastitis cases, were obtained from the National Cohort of Dairy Farms of the Canadian Bovine Mastitis Research Network. Of quarters with low SCC, high SCC, or clinical mastitis, 7.6, 18.5, and 4.3% were NAS positive, respectively. The effect of NAS IMI on SCC was estimated using mixed-effect linear regression; prevalence of NAS IMI was estimated using Bayesian analyses. Mean SCC of NAS-positive quarters was 70,000 cells/mL, which was higher than culture-negative quarters (32,000 cells/mL) and lower than major pathogen-positive quarters (129,000 to 183,000 cells/mL). Compared with other NAS species, SCC was highest in quarters positive for Staphylococcus capitis, Staphylococcus gallinarum, Staphylococcus hyicus, Staphylococcus agnetis, or Staphylococcus simulans. 
In NAS-positive quarters, Staphylococcus xylosus (12.6%), Staphylococcus cohnii (3.1%), and Staphylococcus equorum (0.6%) were more frequently isolated from quarters with low SCC than other NAS species, whereas Staphylococcus sciuri (14%) was most frequently isolated from clinical mastitis cases

  12. High performance statistical computing with parallel R: applications to biology and climate modelling

    International Nuclear Information System (INIS)

    Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth

    2006-01-01

    Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable, high-performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open-source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high-performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines

  13. Copy Counts

    Science.gov (United States)

    Beaumont, Lee R.

    1970-01-01

    The level of difficulty of straight copy, which is used to measure typewriting speed, is influenced by syllable intensity (the average number of syllables per word), stroke intensity (average number of strokes per word), and high-frequency words. (CH)
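    The two intensity measures above can be sketched directly. The syllable counter below is a naive vowel-group heuristic, an assumption for illustration only (not the counting rule used in the article), and stroke intensity is simplified to characters per word.

```python
def syllables(word):
    """Rough syllable estimate: count contiguous vowel groups (min 1)."""
    vowels = "aeiouy"
    groups, prev = 0, False
    for ch in word.lower():
        is_v = ch in vowels
        if is_v and not prev:
            groups += 1
        prev = is_v
    return max(groups, 1)

def copy_metrics(text):
    """Average syllables per word and average strokes (chars) per word."""
    words = text.split()
    syllable_intensity = sum(syllables(w) for w in words) / len(words)
    stroke_intensity = sum(len(w) for w in words) / len(words)
    return syllable_intensity, stroke_intensity

si, st = copy_metrics("the quick brown fox jumps over the lazy dog")
print(f"syllable intensity {si:.2f}, stroke intensity {st:.2f}")
```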

  14. Spectrometry with high count rate for the study of the soft X-rays. Application for the plasma of WEGA

    International Nuclear Information System (INIS)

    Brouquet, P.

    1979-04-01

    The plasma of the WEGA torus, whose electron temperature varies between 0.5 and 1 keV, emits electromagnetic radiation extending to wavelengths of the order of 1 Å. Different improvements made to a semiconductor spectrometer have permitted the study of this emission in the soft X-ray region (1 keV - 30 keV) at a count rate of 3×10^5 counts/s with an energy resolution of 350 eV. For each plasma shot, this diagnostic gives 4 measurements of the plasma electron temperature and of the effective charge, Zeff, with a time resolution of 5 ms. The values of the electron temperature and of the effective charge derived from the study of soft X rays are in agreement with those given by other diagnostic methods [fr]

  15. An Odyssey of Connecticut's Children: KIDS COUNT Data Book 2001.

    Science.gov (United States)

    Sampson, Amy E.

    This Kids Count Data Book provides state and regional trends in the well-being of Connecticut's children. The statistical portrait is based on 19 indicators of well-being: (1) children in families receiving welfare; (2) children receiving free or reduced-price meals; (3) high school employment; (4) births to teen mothers; (5) low birth weight; (6)…

  16. Statistical behavior of high doses in medical radiodiagnosis; Comportamento estatistico das altas doses em radiodiagnostico medico

    Energy Technology Data Exchange (ETDEWEB)

    Barboza, Adriana Elisa, E-mail: adrianaebarboza@gmail.com, E-mail: elisa@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed workers (IOEs) in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose management sector database of IRD/CNEN-RJ, Brazil. The identification of these states allows the responsible Sanitary Surveillance (VISA) to become aware of the events and to work with programs to reduce them. (author)

  17. A compact 7-cell Si-drift detector module for high-count rate X-ray spectroscopy.

    Science.gov (United States)

    Hansen, K; Reckleben, C; Diehl, I; Klär, H

    2008-05-01

    A new Si-drift detector module for fast X-ray spectroscopy experiments was developed and realized. The Peltier-cooled module comprises a sensor with a 7 × 7-mm² active area; an integrated circuit for amplification, shaping and detection, storage, and derandomized parallel readout of signal pulses; and amplifiers for line driving. The compactness and hexagonal shape of the module, with a wrench size of 16 mm, allow very short distances to the specimen and multi-module arrangements. The power dissipation is 186 mW. At a shaper peaking time of 190 ns and an integration time of 450 ns, an electronic rms noise of ~11 electrons was achieved. When operated at 7 °C, FWHM line widths around 260 and 460 eV (Cu-Kα) were obtained at low rates and at sum-count rates of 1.7 MHz, respectively. The peak shift is below 1% for a broad range of count rates. At a 1.7-MHz sum-count rate the throughput loss amounts to 30%.

  18. Characteristics of Febrile Patients with Normal White Blood Cell Counts and High C-Reactive Protein Levels in an Emergency Department

    Directory of Open Access Journals (Sweden)

    Kuan-Ting Liu

    2008-05-01

    Full Text Available Fever is one of the more common chief complaints of patients who visit emergency departments (ED). Many febrile patients have markedly elevated C-reactive protein (CRP) levels and normal white blood cell (WBC) counts. Most of these patients have bacterial infection and no previous underlying disease of impaired WBC functioning. We reviewed patients who visited our ED between November 2003 and July 2004. The WBC count and CRP level of patients over 18 years of age who visited the ED because of or with fever were recorded. Patients who had a normal WBC count (4,000–10,000/mL) and a high CRP level (> 100 mg/L) were included. The data, including gender, age and length of hospital stay, were reviewed. Underlying diseases, diagnosis of the febrile disease and final condition were recorded according to the chart. Within the study period, 54,078 patients visited our ED. Of 5,628 febrile adults, 214 (3.8%) had an elevated CRP level and a normal WBC count. The major cause of fever was infection (82.24%). Most of these patients were admitted (92.99%). There were 32 patients with malignant neoplasm, nine with liver cirrhosis, 66 with diabetes mellitus and 11 with uremia. There were no significant differences in age and gender between patients with and those without neoplasm. However, a higher in-hospital mortality rate and other causes of fever were noted in patients with neoplasm. It was not rare for febrile patients who visited the ED to have a high CRP level but a normal WBC count. These patients did not necessarily have an underlying malignant neoplasm or hematologic illness. Factors other than malignant neoplasm or hematologic illness may be associated with the WBC response, and CRP may be a better indicator of infection under such conditions.

  19. High somatic cell counts and changes in milk fat and protein contents around insemination are negatively associated with conception in dairy cows.

    Science.gov (United States)

    Albaaj, Ahmad; Foucras, Gilles; Raboisson, Didier

    2017-01-15

    The fertility of dairy cows has decreased dramatically worldwide over the last few decades, and several causes of this trend have been reported. Several studies have associated compromised udder health with deteriorating reproduction performance. Subclinical ketosis (SCK) has also been reported to be a risk factor for decreased conception. The objective of the present study was to describe how SCK might interact with the reported association between udder health and conception in dairy cows. Data from the French Milk Control Program and data on 8,549,667 instances of artificial insemination (AI) and their corresponding preceding and subsequent test-days from 5,979,701 Holstein cows were examined over a 5-year period (2008-2012). The effect of udder health was evaluated through a low (L) or high (H) somatic cell count (SCC) before and after AI using a threshold of 200,000 cells/mL, and transformed into four groups (LL, LH, HL, and HH). Three proxies for defining SCK were proposed based on the milk fat and protein content (or their ratio) before AI. Statistical analysis first included a generalized additive model to help define the optimal threshold values. Next, a logistic regression with a Poisson correction was performed. On average, the risk of conception at first AI was reduced by 14% for LH or HH cows (relative risk [and 95% CI] = 0.86 [0.85-0.87]) when the SCC increased or remained high within 40 days before and after AI, relative to the LL group. The reduction in conception success associated with SCK (changes in fat and protein contents) varied from 3% to 17% depending on the SCK proxy used. Including the interaction term SCC × SCK clearly showed that the association of increased SCC around AI with conception success was modified by the presence of SCK. A cow that already has SCK and experiences an increase in SCC around or after AI exhibits up to 2 times further decrease in conception success compared with a cow with a high SCC and no SCK. In conclusion

  20. New method for eliminating the statistical bias in highly turbulent flow measurements

    International Nuclear Information System (INIS)

    Nakao, S.I.; Terao, Y.; Hirata, K.I.; Kitakyushu Industrial Research Institute, Fukuoka, Japan)

    1987-01-01

    A simple method was developed for eliminating statistical bias which can be applied to highly turbulent flows under sparse and nonuniform seeding conditions. Unlike the methods proposed so far, the weighting function was determined based on the idea that the statistical bias could be eliminated if the asymmetric form of the probability density function of the velocity data were corrected. Moreover, data more than three standard deviations away from the mean were discarded to remove the apparent turbulent intensity resulting from noise. The present method was applied to data obtained in the wake of a block, which provided local turbulent intensities up to about 120 percent, and was found to eliminate the statistical bias with high accuracy. 9 references
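    The three-standard-deviation rejection step described above can be sketched as follows; the paper's asymmetry-correcting weighting function is not reproduced here, and the sample data are synthetic.

```python
import numpy as np

def clip_outliers(v, n_sigma=3.0):
    """Discard samples more than n_sigma standard deviations from the mean."""
    v = np.asarray(v, dtype=float)
    mu, sd = v.mean(), v.std()
    return v[np.abs(v - mu) <= n_sigma * sd]

rng = np.random.default_rng(1)
velocities = rng.normal(10.0, 2.0, size=5000)      # LDV-like velocity samples
velocities = np.append(velocities, [60.0, -40.0])  # spurious noise spikes

kept = clip_outliers(velocities)
print(f"discarded {velocities.size - kept.size} samples")
print(f"apparent turbulence intensity: {kept.std() / kept.mean() * 100:.1f}%")
```

    Removing the two artificial spikes noticeably lowers the apparent turbulence intensity, which is exactly the noise effect the rejection step targets.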

  1. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  2. Track counting in radon dosimetry

    International Nuclear Information System (INIS)

    Fesenbeck, Ingo; Koehler, Bernd; Reichert, Klaus-Martin

    2013-01-01

    The newly developed, computer-controlled track counting system is capable of imaging and analyzing the entire area of nuclear track detectors. The high optical resolution allows a new analysis approach for the process of automated counting using digital image processing technologies. This way, more highly exposed detectors can also be evaluated reliably by an automated process. (orig.)

  3. Excel 2016 in applied statistics for high school students a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2018-01-01

    This textbook is a step-by-step guide for high school, community college, or undergraduate students who are taking a course in applied statistics and wish to learn how to use Excel to solve statistical problems. All of the statistics problems in this book will come from the following fields of study: business, education, psychology, marketing, engineering and advertising. Students will learn how to perform key statistical tests in Excel without being overwhelmed by statistical theory. Each chapter briefly explains a topic and then demonstrates how to use Excel commands and formulas to solve specific statistics problems. This book gives practice in using Excel in two different ways: (1) writing formulas (e.g., confidence interval about the mean, one-group t-test, two-group t-test, correlation) and (2) using Excel’s drop-down formula menus (e.g., simple linear regression, multiple correlations and multiple regression, and one-way ANOVA). Three practice problems are provided at the end of each chapter, along w...

  4. High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.

    Science.gov (United States)

    Algin, Abdullah; Senay, Mustafa

    2012-04-01

    An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. Also, a model to be studied for addressing such an idea could possibly provide us some new consequences about the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work, we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamical limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits the interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful to further study the particle systems with intermediate statistics.

  5. Accuracy in activation analysis: count rate effects

    International Nuclear Information System (INIS)

    Lindstrom, R.M.; Fleming, R.F.

    1980-01-01

    The accuracy inherent in activation analysis is ultimately limited by the uncertainty of counting statistics. When careful attention is paid to detail, several workers have shown that all systematic errors can be reduced to an insignificant fraction of the total uncertainty, even when the statistical limit is well below one percent. A matter of particular importance is the reduction of errors due to high counting rates. The loss of counts due to random coincidence (pulse pileup) in the amplifier and to digitization time in the ADC may be treated as a series combination of extending and non-extending dead times, respectively. The two effects are experimentally distinct. Live-timer circuits in commercial multi-channel analyzers compensate properly for ADC dead time for long-lived sources, but not for pileup. Several satisfactory solutions are available, including pileup rejection and dead-time correction circuits, loss-free ADCs, and computed corrections in a calibrated system. These methods are sufficiently reliable and well understood that a decaying source can be measured routinely with acceptably small errors at dead times as high as 20 percent.
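    The series dead-time model mentioned above is commonly written as an extending (paralyzable) stage followed by a non-extending (non-paralyzable) one. A hedged numerical sketch, with illustrative time constants rather than values from the paper:

```python
import math

def observed_rate(n, tau_ext, tau_nonext):
    """True rate n (counts/s) passed through an extending dead time
    (pulse pileup in the amplifier) and then a non-extending dead time
    (ADC digitization) in series."""
    m1 = n * math.exp(-n * tau_ext)      # survivors of pulse pileup
    return m1 / (1.0 + m1 * tau_nonext)  # ADC busy-time loss

n = 2.0e4             # true rate, counts/s (illustrative)
tau_ext = 5.0e-6      # amplifier resolving time, s (illustrative)
tau_nonext = 10.0e-6  # ADC conversion time, s (illustrative)
m = observed_rate(n, tau_ext, tau_nonext)
print(f"observed {m:.0f} of {n:.0f} counts/s ({(1 - m / n) * 100:.1f}% loss)")
```

    Inverting this relation numerically for the true rate is one way to implement the "computed corrections in a calibrated system" that the abstract mentions.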

  6. Combined steam-ultrasound treatment of 2 seconds achieves significant high aerobic count and Enterobacteriaceae reduction on naturally contaminated food boxes, crates, conveyor belts, and meat knives.

    Science.gov (United States)

    Musavian, Hanieh S; Butt, Tariq M; Larsen, Annette Baltzer; Krebs, Niels

    2015-02-01

    Food contact surfaces require rigorous sanitation procedures for decontamination, although these methods very often fail to efficiently clean and disinfect surfaces that are visibly contaminated with food residues and possible biofilms. In this study, the results of a short treatment (1 to 2 s) of combined steam (95°C) and ultrasound (SonoSteam) of industrial fish and meat transportation boxes and live-chicken transportation crates naturally contaminated with food and fecal residues were investigated. Aerobic counts of 5.0 to 6.0 log CFU/24 cm² and an Enterobacteriaceae spp. level of 2.0 log CFU/24 cm² were found on the surfaces prior to the treatment. After 1 s of treatment, the aerobic counts were significantly (P < 0.05) reduced. The treatment was also tested on conveyor belts with hinge pins and on one type of flat flexible rubber belt, all visibly contaminated with food residues. The aerobic counts of 3.0 to 5.0 log CFU/50 cm² were significantly (P < 0.05) reduced, while Enterobacteriaceae spp. were reduced to a level below the detection limit. Industrial meat knives were contaminated with aerobic counts of 6.0 log CFU/5 cm² on the handle and 5.2 log CFU/14 cm² on the steel. The level of Enterobacteriaceae spp. contamination was approximately 2.5 log CFU on the handle and steel. Two seconds of steam-ultrasound treatment reduced the aerobic counts and Enterobacteriaceae spp. to levels below the detection limit on both handle and steel. This study shows that the steam-ultrasound treatment may be an effective replacement for disinfection processes and that it can be used for continuous disinfection at fast process lines. However, the treatment may not be able to replace efficient cleaning processes used to remove high loads of debris.

  7. Statistical modeling in phenomenological description of electromagnetic cascade processes produced by high-energy gamma quanta

    International Nuclear Information System (INIS)

    Slowinski, B.

    1987-01-01

    A description of a simple phenomenological model of the electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents is given. Within this model, the spatial structure and fluctuations of the ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)

  8. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

    The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive

  9. Distribution of ion space charge in the volume of a multiwire proportional chamber at high counting rates

    International Nuclear Information System (INIS)

    Dmitriev, G.D.; Frumkin, I.B.

    1990-01-01

    The accumulation of ion space charge in the drift gap of a MWPC, separated from the amplification gap by a cathode grid, is studied. A significant dependence of the cathode grid transparency to ions on the counting rate is observed. It is shown that the decrease of ion transparency with beam intensity is defined both by the parameters and the mode of operation of the MWPC as well as by the gas mixture chosen. On the basis of suggested explanation of this effect a formula is obtained, that proved to be in good agreement with the experimental results. (orig.)

  10. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya [Jadavpur Univ., Calcutta (India)

    1994-07-01

    Analysis of target-fragmented ''black'' particles in nuclear emulsion from high-energy relativistic interactions initiated by 16O at 2.1 GeV/nucleon and by 12C and 24Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or the non-statistical fluctuations, while found to be independent of projectile mass or incident energy, are dependent on the excitation energy of the target nucleus. (Author).

  11. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya

    1994-01-01

    Analysis of target-fragmented ''black'' particles in nuclear emulsion from high-energy relativistic interactions initiated by 16O at 2.1 GeV/nucleon and by 12C and 24Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or the non-statistical fluctuations, while found to be independent of projectile mass or incident energy, are dependent on the excitation energy of the target nucleus. (Author)

  12. Model Accuracy Comparison for High Resolution Insar Coherence Statistics Over Urban Areas

    Science.gov (United States)

    Zhang, Yue; Fu, Kun; Sun, Xian; Xu, Guangluan; Wang, Hongqi

    2016-06-01

    The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there has been much less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. Firstly, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.

  13. MODEL ACCURACY COMPARISON FOR HIGH RESOLUTION INSAR COHERENCE STATISTICS OVER URBAN AREAS

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2016-06-01

    Full Text Available The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there has been much less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. Firstly, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.
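    The model-comparison step described in this record can be sketched with maximum-likelihood fits ranked by AIC. The coherence samples below are synthetic Beta-distributed stand-ins for real InSAR coherence over a building region, and the use of AIC as the accuracy criterion is an assumption; the paper does not state its metric.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
coherence = rng.beta(5.0, 2.0, size=5000)  # synthetic coherence in (0, 1)

candidates = {
    "gaussian": stats.norm,
    "rayleigh": stats.rayleigh,
    "weibull": stats.weibull_min,
    "beta": stats.beta,
    "nakagami": stats.nakagami,
}
# coherence lives on (0, 1), so the Beta support is pinned there
fixed = {"beta": dict(floc=0, fscale=1)}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(coherence, **fixed.get(name, {}))
    loglik = np.sum(dist.logpdf(coherence, *params))
    k_free = len(params) - len(fixed.get(name, {}))
    aic[name] = 2 * k_free - 2 * loglik

best = min(aic, key=aic.get)
print("AIC ranking (best first):", sorted(aic, key=aic.get))
```

    On Beta-generated data the Beta fit wins by construction; on real coherence maps the per-class ranking is what the paper evaluates.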

  14. Statistical and direct decay of high-lying single-particle excitations

    International Nuclear Information System (INIS)

    Gales, S.

    1993-01-01

    Transfer reactions induced by hadronic probes at intermediate energies have revealed a rich spectrum of high-lying excitations embedded in the nuclear continuum. The investigation of their decay properties is believed to be a severe test of their microscopic structure as predicted by microscopic nuclear models. In addition the degree of damping of these simple modes in the nuclear continuum can be obtained by means of the measured particle (n,p) decay branching ratios. The neutron and proton decay studies of high-lying single-particle states in heavy nuclei are presented. (author). 13 refs., 9 figs

  15. Compton suppression gamma-counting: The effect of count rate

    Science.gov (United States)

    Millard, H.T.

    1984-01-01

    Past research has shown that anticoincidence-shielded Ge(Li) spectrometers enhance the signal-to-background ratio for gamma photopeaks situated on high Compton backgrounds. Ordinarily, an anti- (or non-) coincidence spectrum (A) and a coincidence spectrum (C) are collected simultaneously with these systems. To be useful in neutron activation analysis (NAA), the fractions of the photopeak counts routed to the two spectra must be constant from sample to sample, or variations must be corrected quantitatively. Most Compton suppression counting has been done at low count rates, but in NAA applications count rates may be much higher. To operate over the wider dynamic range, the effect of count rate on the ratio of the photopeak counts in the two spectra (A/C) was studied. It was found that as the count rate increases, A/C decreases for gammas not coincident with other gammas from the same decay. For gammas coincident with other gammas, A/C increases to a maximum and then decreases. These results suggest that calibration curves are required to correct photopeak areas so that quantitative data can be obtained at higher count rates. © 1984.
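
    The calibration-curve correction suggested above can be sketched as follows. This is a hypothetical illustration (the calibration points and the interpolation scheme are assumptions, not values from the study): the fraction of photopeak counts routed to the anticoincidence spectrum is interpolated as a function of count rate, and the measured anticoincidence area is scaled back to the total photopeak area.

```python
def ac_fraction(rate, calibration):
    """Linearly interpolate the fraction A / (A + C) of photopeak counts
    routed to the anticoincidence spectrum at a given input count rate.
    `calibration` is a list of (count_rate, fraction) pairs."""
    pts = sorted(calibration)
    if rate <= pts[0][0]:
        return pts[0][1]
    if rate >= pts[-1][0]:
        return pts[-1][1]
    for (r0, f0), (r1, f1) in zip(pts, pts[1:]):
        if r0 <= rate <= r1:
            return f0 + (f1 - f0) * (rate - r0) / (r1 - r0)

def corrected_photopeak(area_a, rate, calibration):
    """Recover the total (A + C) photopeak area from the anticoincidence area."""
    return area_a / ac_fraction(rate, calibration)

# Hypothetical calibration: the routed fraction drops as the count rate rises.
cal = [(0, 0.80), (10_000, 0.70), (20_000, 0.55)]
total = corrected_photopeak(300.0, 5_000, cal)  # fraction 0.75 -> area 400.0
```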

  16. THE STATISTICS OF RADIO ASTRONOMICAL POLARIMETRY: BRIGHT SOURCES AND HIGH TIME RESOLUTION

    International Nuclear Information System (INIS)

    Van Straten, W.

    2009-01-01

    A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal-broadening phenomenon in single-pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single- and giant-pulse polarimetry typically involves sources with large flux-densities and observations with high time-resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self-noise is shown to fully explain the excess polarization dispersion previously noted in single-pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes parameters may be used to derive mode-separated pulse profiles without any assumptions about the intrinsic degrees of mode polarization. Finally, utilizing the small-number statistics of the Stokes parameters, it is established that the degree of polarization of an unresolved pulse is fundamentally undefined; therefore, previous claims of highly polarized giant pulses are unsubstantiated.

  17. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
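
    The key manipulation, that a physically identical outlier is more surprising under a narrow than a wide distribution, follows directly from the Gaussian negative log-likelihood. The sketch below illustrates this with arbitrary frequencies and standard deviations (assumptions for illustration, not the stimulus parameters of the study).

```python
import math

def surprise(x, mu, sigma):
    """Surprise (negative log-likelihood, in nats) of a tone of frequency x
    under a Gaussian stimulus distribution N(mu, sigma**2)."""
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# A physically identical outlier tone, 200 Hz from the mean of both streams.
outlier = 700.0
narrow = surprise(outlier, mu=500.0, sigma=50.0)   # narrow distribution
wide = surprise(outlier, mu=500.0, sigma=150.0)    # wide distribution
# The same outlier is more surprising under the narrow distribution.
```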

  18. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    International Nuclear Information System (INIS)

    Adams, T.; Batra, P.; Bugel, Leonard G.; Camilleri, Leslie Loris; Conrad, Janet Marie; Fisher, Peter H.; Formaggio, Joseph Angelo; Karagiorgi, Georgia S.; et al.

    2009-01-01

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parameterized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at 'Beyond the Standard Model' physics.

  19. A fast and high-sensitive dual-wavelength diffuse optical tomography system using digital lock-in photon-counting technique

    Science.gov (United States)

    Chen, Weiting; Yi, Xi; Zhao, Huijuan; Gao, Feng

    2014-09-01

    We present a novel dual-wavelength diffuse optical imaging system that can perform fast, high-sensitivity 2-D or 3-D imaging for monitoring dynamic changes of optical parameters. A newly proposed lock-in photon-counting detection method was adopted for collecting weak optical signals, providing excellent performance with a simplified geometry. The fundamental principles of lock-in photon-counting detection are demonstrated in detail, and its feasibility was verified by a linearity experiment. The systemic performance of the prototype was experimentally assessed, including stray light rejection and inherent interference. Results showed that the system possesses superior anti-interference capability (below 0.58% in a darkroom) compared with traditional photon-counting detection, and that the crosstalk between the two wavelengths is lower than 2.28%. For a comprehensive assessment, 2-D phantom experiments on a relatively large model (diameter of 4 cm) were conducted, and different absorption targets were imaged to investigate detection sensitivity. The reconstructed images were encouraging under all conditions, with a desirable SNR. A study of image quality vs. integration time suggests a way to achieve higher SNR at the cost of measurement speed. In summary, the newly developed system shows great potential for improving detection sensitivity as well as measurement speed. This should enable substantial progress in dynamically tracking blood concentration distributions in many clinical areas, such as small-animal disease models, human brain activity research and the diagnosis of thick tissues (for example, breast).
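
    The core of digital lock-in photon counting is projecting the binned count series onto sine/cosine references at the modulation frequency. The sketch below is a minimal simulation under assumed parameters (bin count, modulation depth, mean rate are all invented), not the authors' implementation; a small Knuth-style Poisson sampler models shot noise.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method; adequate for the modest rates used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(1)
n_bins, cycles, mean_rate, depth = 2000, 10, 100.0, 0.5

# Photon counts per bin for a source modulated at the reference frequency.
counts = [poisson(mean_rate * (1 + depth * math.sin(2 * math.pi * cycles * i / n_bins)), rng)
          for i in range(n_bins)]

# Digital lock-in: project the count series onto the reference sin/cos.
x = 2.0 / n_bins * sum(c * math.sin(2 * math.pi * cycles * i / n_bins)
                       for i, c in enumerate(counts))
y = 2.0 / n_bins * sum(c * math.cos(2 * math.pi * cycles * i / n_bins)
                       for i, c in enumerate(counts))
amplitude = math.hypot(x, y)  # estimates mean_rate * depth = 50 counts/bin
```

    Because the projection averages over an integer number of modulation cycles, unmodulated background and out-of-band interference largely cancel, which is the property the abstract credits for the system's anti-interference capability.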

  20. Simulation of statistical γ-spectra of highly excited rare earth nuclei

    International Nuclear Information System (INIS)

    Schiller, A.; Munos, G.; Guttormsen, M.; Bergholt, L.; Melby, E.; Rekstad, J.; Siem, S.; Tveter, T.S.

    1997-05-01

    The statistical γ-spectra of highly excited even-even rare earth nuclei are simulated by applying an appropriate level density and strength function to a given nucleus. Hindrance effects due to K-conservation are taken into account. Simulations are compared to experimental data from the 163Dy(3He,α)162Dy and 173Yb(3He,α)172Yb reactions. The influence of the K quantum number at higher energies is discussed. 21 refs., 7 figs., 2 tabs

  1. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity improved protocols for PTM enrichment and recently established pipelines...... for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis...

  2. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was deployed as a desktop application and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  3. Statistics for products of traces of high powers of the frobenius class of hyperelliptic curves

    OpenAIRE

    Roditty-Gershon, Edva

    2011-01-01

    We study the averages of products of traces of high powers of the Frobenius class of hyperelliptic curves of genus g over a fixed finite field. We show that for increasing genus g, the limiting expectation of these products equals the expectation when the curve varies over the unitary symplectic group USp(2g). We also consider the scaling limit of linear statistics for eigenphases of the Frobenius class of hyperelliptic curves, and show that their first few moments are Gaussian.

  4. Do your syringes count?

    International Nuclear Information System (INIS)

    Brewster, K.

    2002-01-01

    Full text: This study was designed to investigate anecdotal evidence that residual Sestamibi (MIBI) activity varied in certain situations. For rest studies, different brands of syringes were tested to see whether the residuals varied. The period of time MIBI doses remained in the syringe between dispensing and injection was also considered as a possible source of increased residual counts. Stress MIBI syringe residual activities were measured to assess whether the method of stress test affected residual activity. MIBI was reconstituted using 13 GBq of technetium in 3 ml of normal saline, then boiled for 10 minutes. Doses were dispensed according to department protocol and injected via cannula. Residual syringes were collected for three syringe types; in each case the barrel and plunger were measured separately. As the syringe is flushed during the exercise stress test but not during the pharmacological stress test, the chosen method was recorded. No relationship was demonstrated between the time MIBI remained in a syringe prior to injection and residual activity. Residual activity was not affected by the method of stress test used. Actual injected activity can be calculated if the amount of activity remaining in the syringe post injection is known, and imaging time can be adjusted for residual activity to optimise counting statistics. Preliminary results in this study indicate there is no difference in residual activity between syringe brands. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
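
    The residual correction described in the abstract is simple arithmetic. The sketch below uses invented numbers (dispensed dose, barrel/plunger residuals, baseline acquisition time are all hypothetical) to show how injected activity and a compensating imaging time could be computed.

```python
def injected_activity(dispensed_mbq, barrel_mbq, plunger_mbq):
    """Actual injected activity: dispensed dose minus syringe residuals
    (barrel and plunger measured separately, as in the study)."""
    return dispensed_mbq - (barrel_mbq + plunger_mbq)

def adjusted_time(base_time_s, planned_mbq, injected_mbq):
    """Scale the acquisition time up to recover the planned count statistics."""
    return base_time_s * planned_mbq / injected_mbq

inj = injected_activity(900.0, 40.0, 10.0)   # 850.0 MBq actually injected
time = adjusted_time(600.0, 900.0, inj)      # ~635 s per view instead of 600
```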

  5. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number...... of such a feature is the generic implementation of Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility...

  6. High-dimensional data: p >> n in mathematical statistics and bio-medical applications

    OpenAIRE

    Van De Geer, Sara A.; Van Houwelingen, Hans C.

    2004-01-01

    The workshop 'High-dimensional data: p >> n in mathematical statistics and bio-medical applications' was held at the Lorentz Center in Leiden from 9 to 20 September 2002. This special issue of Bernoulli contains a selection of papers presented at that workshop. ¶ The introduction of high-throughput micro-array technology to measure gene-expression levels and the publication of the pioneering paper by Golub et al. (1999) has brought to life a whole new branch of data analysis under the name of...

  7. The naive CD4+ count in HIV-1-infected patients at time of initiation of highly active antiretroviral therapy is strongly associated with the level of immunological recovery

    DEFF Research Database (Denmark)

    Michael, OG; Kirk, O; Mathiesen, Lars Reinhardt

    2002-01-01

    CD4+ count followed a triphasic pattern, reflecting an initial phase of rapid redistribution from lymphoid tissues, followed by a slow increase, partially due to an increase in naive CD4+ cell count. From Month 18 onwards, both naive and total CD4+ cell counts stabilized, although viral suppression......-infected patients. The focus was on the naive CD4+ cell time course and associations between naive CD4+ cell counts and established prognostic markers. Total and naive CD4+ cell counts were measured using flow cytometry. The HIV-RNA detection limit was 20 copies/ml. During 36 months of HAART, the total...... was sustained. There was no association between plasma viral load and the increase in naive CD4+ cell count. Importantly, baseline naive CD4+ cell count was significantly associated with the change in naive CD4+ cell count, suggesting that the naive cell count at baseline does influence the immunological...

  8. Statistical damage analysis of transverse cracking in high temperature composite laminates

    International Nuclear Information System (INIS)

    Sun Zuo; Daniel, I.M.; Luo, J.J.

    2003-01-01

    High temperature polymer composites are receiving special attention because of their potential applications to high speed transport airframe structures and aircraft engine components exposed to elevated temperatures. In this study, a statistical analysis was used to study the progressive transverse cracking in a typical high temperature composite. The mechanical properties of this unidirectional laminate were first characterized both at room and high temperatures. Damage mechanisms of transverse cracking in cross-ply laminates were studied by X-ray radiography at room temperature and in-test photography technique at high temperature. Since the tensile strength of unidirectional laminate along transverse direction was found to follow Weibull distribution, Monte Carlo simulation technique based on experimentally obtained parameters was applied to predict transverse cracking at different temperatures. Experiments and simulation showed that they agree well both at room temperature and 149 deg. C (stress free temperature) in terms of applied stress versus crack density. The probability density function (PDF) of transverse crack spacing considering statistical strength distribution was also developed, and good agreements with simulation and experimental results are reached. Finally, a generalized master curve that predicts the normalized applied stress versus normalized crack density for various lay-ups and various temperatures was established
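
    The Monte Carlo step described above, sampling Weibull-distributed transverse strengths and counting failed sites as the applied stress rises, can be sketched as follows. The Weibull parameters, segment count and segment length here are illustrative assumptions, not the experimentally fitted values from the paper.

```python
import random

random.seed(42)

# Weibull transverse-strength parameters (illustrative, not the paper's values).
shape, scale_mpa = 8.0, 60.0          # Weibull modulus and characteristic strength
n_segments, segment_len_mm = 2000, 0.5

# Assign each potential crack site an independent Weibull strength.
strengths = [random.weibullvariate(scale_mpa, shape) for _ in range(n_segments)]

def crack_density(applied_stress_mpa):
    """Cracks per mm of specimen: fraction of sites whose strength is exceeded."""
    failed = sum(1 for s in strengths if s < applied_stress_mpa)
    return failed / (n_segments * segment_len_mm)

# Applied stress versus crack density, as plotted in the paper's comparison.
curve = [(sigma, crack_density(sigma)) for sigma in range(30, 81, 10)]
# Crack density rises monotonically with stress and saturates near
# 1 / segment_len_mm once essentially every site has failed.
```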

  9. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative. PMID:19588106

  10. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  11. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
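
    The three parameters the article names, annual energy saving, cost saving and payback period, follow from standard motor-economics formulas. The sketch below uses invented figures (motor rating, running hours, efficiencies, tariff and price premium are assumptions, not values from the dataset).

```python
def annual_energy_saving_kwh(shaft_kw, hours, eff_std, eff_he):
    """Energy saved per year by a high-efficiency motor delivering the same
    shaft power as a standard-efficiency motor of equal rating."""
    return shaft_kw * hours * (1.0 / eff_std - 1.0 / eff_he)

def payback_years(price_premium, annual_cost_saving):
    """Simple payback: extra purchase cost divided by yearly cost saving."""
    return price_premium / annual_cost_saving

# Illustrative figures: 15 kW motor running 6000 h/year.
saving_kwh = annual_energy_saving_kwh(15.0, 6000, eff_std=0.88, eff_he=0.93)
saving_cost = saving_kwh * 0.12               # assumed tariff, $/kWh
payback = payback_years(800.0, saving_cost)   # assumed price premium, $
```

    Confidence bounds, as in the cited paper, would then be constructed around these point estimates from the spread of the measured data.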

  12. Determining random counts in liquid scintillation counting

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1979-01-01

    During measurements involving coincidence counting techniques, errors can arise due to the detection of chance or random coincidences in the multiple detectors used. A method and the electronic circuits necessary are here described for eliminating this source of error in liquid scintillation detectors used in coincidence counting. (UK)
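
    The magnitude of the error the method addresses is given by the textbook chance-coincidence relation for two detectors, R_random = 2·τ·N1·N2 (valid when τ·N is small). The sketch below is that standard formula with illustrative rates, not the circuit described in the record.

```python
def random_coincidence_rate(singles_1, singles_2, resolving_time_s):
    """Textbook chance-coincidence rate for a two-detector coincidence system:
    R_random = 2 * tau * N1 * N2, valid when tau * N << 1."""
    return 2.0 * resolving_time_s * singles_1 * singles_2

# Two phototubes each counting 1e5 cps with a 20 ns coincidence window.
r_rand = random_coincidence_rate(1e5, 1e5, 20e-9)  # 400 chance counts/s
```

    In practice the chance rate is often measured directly with a delayed-coincidence channel and subtracted from the prompt rate.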

  13. High white blood cell count at diagnosis of childhood acute lymphoblastic leukaemia: biological background and prognostic impact. Results from the NOPHO ALL-92 and ALL-2000 studies

    DEFF Research Database (Denmark)

    Vaitkeviciene, G; Forestier, E; Hellebostad, M

    2011-01-01

    Prognostic impact of peripheral blood white blood cell count (WBC) at the diagnosis of childhood acute lymphoblastic leukaemia (ALL) was evaluated in a population-based consecutive series of 2666 children aged 1–15 treated for ALL between 1992 and 2008 in the five Nordic countries (Denmark, Finland.......58) and for T-ALL (pEFS5y 0.71 vs. 0.38). Whether the inferior EFS for the subset of patients with high WBC and slow initial response to treatment reflects rare or overlooked cytogenetic aberrations as well as the factors that determine WBC levels at diagnosis awaits exploration....

  14. Statistical study of high-latitude plasma flow during magnetospheric substorms

    Directory of Open Access Journals (Sweden)

    G. Provan

    2004-11-01

    Full Text Available We have utilised the near-global imaging capabilities of the Northern Hemisphere SuperDARN radars to perform a statistical superposed epoch analysis of high-latitude plasma flows during magnetospheric substorms. The study involved 67 substorms, identified using the IMAGE FUV space-borne auroral imager. A substorm co-ordinate system was developed, centred on the magnetic local time and magnetic latitude of substorm onset determined from the auroral images. The plasma flow vectors from all 67 intervals were combined, creating global statistical plasma flow patterns and backscatter occurrence statistics during the substorm growth and expansion phases. The commencement of the substorm growth phase was clearly observed in the radar data 18-20 min before substorm onset, with an increase in the anti-sunward component of the plasma velocity flowing across the dawn sector of the polar cap and a peak in the dawn-to-dusk transpolar voltage. Nightside backscatter moved to lower latitudes as the growth phase progressed. At substorm onset a flow suppression region was observed on the nightside, with fast flows surrounding the suppressed flow region. The dawn-to-dusk transpolar voltage increased from ~40 kV just before substorm onset to ~75 kV 12 min after onset. The low-latitude return flow started to increase at substorm onset and continued to increase until 8 min after onset. The velocity flowing across the polar cap peaked 12-14 min after onset. This increase in the flux of the polar cap and the excitation of large-scale plasma flow occurred even though the IMF Bz component was increasing (becoming less negative) during most of this time. This study is the first to statistically prove that nightside reconnection creates magnetic flux and excites high-latitude plasma flow in a similar way to dayside reconnection, and that dayside and nightside reconnection are two separate time-dependent processes.
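
    The superposed epoch technique used above amounts to aligning every event record on its onset time and averaging lag by lag. The sketch below shows the idea on a toy "transpolar voltage" series with invented onset times and values (assumptions for illustration only).

```python
def superposed_epoch(series, onsets, before, after):
    """Average `series` over windows [onset - before, onset + after] for every
    onset whose window fits inside the record; returns one mean per lag."""
    windows = [series[t - before:t + after + 1]
               for t in onsets if t - before >= 0 and t + after < len(series)]
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(before + after + 1)]

# Toy record: baseline 40 kV with a step to 75 kV after each substorm onset.
series = [40.0] * 100
for onset in (20, 50, 80):
    for t in range(onset, min(onset + 10, 100)):
        series[t] = 75.0

mean_curve = superposed_epoch(series, [20, 50, 80], before=5, after=5)
# mean_curve[0..4] recovers the pre-onset baseline; mean_curve[5..] the step.
```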

  15. VSRR Provisional Drug Overdose Death Counts

    Data.gov (United States)

    U.S. Department of Health & Human Services — This data contains provisional counts for drug overdose deaths based on a current flow of mortality data in the National Vital Statistics System. National...

  16. A rare case of extremely high counts of circulating tumor cells detected in a patient with an oral squamous cell carcinoma

    International Nuclear Information System (INIS)

    Wu, Xianglei; Mastronicola, Romina; Tu, Qian; Faure, Gilbert Charles; De Carvalho Bittencourt, Marcelo; Dolivet, Gilles

    2016-01-01

    Despite aggressive regimens, the clinical outcome of head and neck squamous cell carcinoma remains poor. The detection of circulating tumor cells could potentially improve the management of patients with disseminated cancer, including diagnosis, treatment strategies, and surveillance. Currently, CellSearch® is the most widely used, and the only Food and Drug Administration-cleared, system for circulating tumor cell detection in patients with metastatic breast, colorectal, or prostate cancer. In most cases of head and neck squamous cell carcinoma, only low counts of circulating tumor cells have been reported. A 56-year-old white male with no particular medical history was diagnosed with a squamous cell carcinoma of the oral cavity. According to the imaging results (computed tomography and 18F-fluorodeoxyglucose positron emission tomography / computed tomography) and panendoscopy, the TNM staging was classified as T4N2M0. A non-interruptive pelvimandibulectomy was conducted according to the multidisciplinary meeting advice, and the postoperative observations were normal. The patient complained of painful cervical edema and trismus 6 weeks after the surgery. A relapse was found by computed tomography and the patient died two weeks later. The search for circulating tumor cells in peripheral venous blood using the CellSearch® system revealed a very high count compared with published reports at three time points (pre-operative: 400; intra-operative: 150; post-operative day 7: 1400 circulating tumor cells). Of note, all detected circulating tumor cells were epidermal growth factor receptor negative. We report here for the first time a rare case of oral squamous cell carcinoma with extremely high circulating tumor cell counts using the CellSearch® system. The absolute number of circulating tumor cells might predict a particular phase of cancer development as well as poor survival, potentially contributing to personalized healthcare.

  17. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, the statistical variations in a subthreshold MOSFET's high-frequency characteristics, defined in terms of gate capacitance and transition frequency, are analysed, and comprehensive analytical models of these variations, expressed in terms of their variances, are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET manufacturing process, are taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters is used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level against 65 nm BSIM4-based benchmarks and found to be very accurate, with average percentage errors smaller than 5%. Hence, the analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET-based VHF circuits, systems and applications.

  18. Large-eddy simulation in a mixing tee junction: High-order turbulent statistics analysis

    International Nuclear Information System (INIS)

    Howard, Richard J.A.; Serre, Eric

    2015-01-01

    Highlights: • Mixing and thermal fluctuations in a junction are studied using large eddy simulation. • Adiabatic and conducting steel wall boundaries are tested. • Wall thermal fluctuations are not the same between the flow and the solid. • Solid thermal fluctuations cannot be predicted from the fluid thermal fluctuations. • High-order turbulent statistics show that the turbulent transport term is important. - Abstract: This study analyses the mixing and thermal fluctuations induced in a mixing tee junction with circular cross-sections when cold water flowing in a pipe is joined by hot water from a branch pipe. This configuration is representative of industrial piping systems in which temperature fluctuations in the fluid may cause thermal fatigue damage on the walls. Implicit large-eddy simulations (LES) are performed for equal inflow rates corresponding to a bulk Reynolds number Re = 39,080. Two different thermal boundary conditions are studied for the pipe walls; an insulating adiabatic boundary and a conducting steel wall boundary. The predicted flow structures show a satisfactory agreement with the literature. The velocity and thermal fields (including high-order statistics) are not affected by the heat transfer with the steel walls. However, predicted thermal fluctuations at the boundary are not the same between the flow and the solid, showing that solid thermal fluctuations cannot be predicted from knowledge of the fluid thermal fluctuations alone. The analysis of high-order turbulent statistics provides a better understanding of the turbulence features. In particular, the budgets of the turbulent kinetic energy and temperature variance allow a comparative analysis of dissipation, production and transport terms. It is found that the turbulent transport term is an important term that acts to balance the production. We therefore use a priori tests to evaluate three different models for the triple correlation.

  19. An automated approach for annual layer counting in ice cores

    Directory of Open Access Journals (Sweden)

    M. Winstrup

    2012-11-01

    Full Text Available A novel method for automated annual layer counting in seasonally-resolved paleoclimate records has been developed. It relies on algorithms from the statistical framework of hidden Markov models (HMMs, which originally was developed for use in machine speech recognition. The strength of the layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on statistical criteria for annual layer identification. The most likely positions of multiple layer boundaries in a section of ice core data are determined simultaneously, and a probabilistic uncertainty estimate of the resulting layer count is provided, ensuring an objective treatment of ambiguous layers in the data. Furthermore, multiple data series can be incorporated and used simultaneously. In this study, the automated layer counting algorithm has been applied to two ice core records from Greenland: one displaying a distinct annual signal and one which is more challenging. The algorithm shows high skill in reproducing the results from manual layer counts, and the resulting timescale compares well to absolute-dated volcanic marker horizons where these exist.
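
    The layer-detection idea, finding the most likely positions of layer boundaries under a probabilistic state model, can be illustrated with a minimal two-state Viterbi decoder. This is a toy sketch, far simpler than the HMM framework of the paper: the states, transition probabilities and Gaussian emission parameters below are all invented, and the "data" is a synthetic seasonal proxy with four annual cycles.

```python
import math

def viterbi(obs, states, log_init, log_trans, log_emit):
    """Most likely state path for a discrete-state HMM (log domain)."""
    v = [{s: log_init[s] + log_emit(s, obs[0]) for s in states}]
    back = []
    for x in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: v[-1][p] + log_trans[p][s])
            col[s] = v[-1][prev] + log_trans[prev][s] + log_emit(s, x)
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    path = [max(states, key=lambda s: v[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

def log_gauss(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

states = ("within", "boundary")
log_init = {"within": math.log(0.9), "boundary": math.log(0.1)}
log_trans = {
    "within": {"within": math.log(0.8), "boundary": math.log(0.2)},
    "boundary": {"within": math.log(0.99), "boundary": math.log(0.01)},
}
# Emission model: boundaries show a high proxy value, layer interiors a low one.
log_emit = lambda s, x: log_gauss(x, 0.9 if s == "boundary" else 0.1, 0.1)

signal = [0.1, 0.1, 0.9, 0.1, 0.1] * 4          # four annual cycles
path = viterbi(signal, states, log_init, log_trans, log_emit)
layers = path.count("boundary")                  # -> 4 annual layers
```

    The published method additionally decodes multiple boundaries and data series jointly and returns a probabilistic uncertainty on the layer count, which this sketch does not attempt.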

  20. Statistical approach to predict compressive strength of high workability slag-cement mortars

    International Nuclear Information System (INIS)

    Memon, N.A.; Memon, N.A.; Sumadi, S.R.

    2009-01-01

    This paper reports an attempt to develop empirical expressions to estimate/predict the compressive strength of high workability slag-cement mortars. Experimental data from 54 mortar mixes were used. The mortars were prepared with slag as cement replacement at levels of 0, 50 and 60%. The flow (workability) was maintained at 136 ± 3%. The numerical and statistical analysis was performed using Microsoft Office Excel 2003. Three empirical mathematical models were developed to estimate/predict the 28-day compressive strength of high workability slag-cement mortars with 0, 50 and 60% slag, which predict values with an accuracy between 97 and 98%. Finally, a generalized empirical mathematical model was proposed which can predict the 28-day compressive strength of high workability mortars with an accuracy of up to 95%. (author)

  1. Infrared maritime target detection using the high order statistic filtering in fractional Fourier domain

    Science.gov (United States)

    Zhou, Anran; Xie, Weixin; Pei, Jihong

    2018-06-01

    Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders and carries richer spatial-frequency information. By combining it with high-order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined to make it easier for the ship components and background to be separated. Second, a new high-order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while points outside the interval reflect the background, and that the HOSC value associated with the ship is much larger than that associated with the sea clutter. The curve's maximal target peak interval is then located and extracted by bandpass filtering in the fractional Fourier domain. The HOSC value outside the peak interval decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target-region selection method. The results show that the proposed method performs excellently for maritime target detection under high clutter.
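    A toy illustration of why fourth-order statistics help in this setting (this is not the paper's HOSC construction): the excess kurtosis of Gaussian-like sea clutter is near zero, while a few strong target samples drive it sharply positive, so a kurtosis-style statistic separates the two regimes.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero in expectation for a Gaussian)."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

rng = np.random.default_rng(1)
clutter = rng.standard_normal(4096)      # Gaussian stand-in for sea clutter
target = clutter.copy()
target[2000:2005] += 8.0                 # a few strong target samples

k_clutter = excess_kurtosis(clutter)
k_target = excess_kurtosis(target)
print(round(k_clutter, 2), round(k_target, 2))
```

The clutter-only value sits near 0 while the target-bearing series is strongly super-Gaussian, which is the behavior a high-order statistic filter exploits.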

  2. An automated approach for annual layer counting in ice cores

    Science.gov (United States)

    Winstrup, M.; Svensson, A.; Rasmussen, S. O.; Winther, O.; Steig, E.; Axelrod, A.

    2012-04-01

    The temporal resolution of some ice cores is sufficient to preserve seasonal information in the ice core record. In such cases, annual layer counting represents one of the most accurate methods to produce a chronology for the core. Yet, manual layer counting is a tedious and sometimes ambiguous job. As reliable layer recognition becomes more difficult, a manual approach increasingly relies on human interpretation of the available data. Thus, much may be gained by an automated, and therefore objective, approach to annual layer identification in ice cores. We have developed a novel method for automated annual layer counting in ice cores, which relies on Bayesian statistics. It uses algorithms from the statistical framework of hidden Markov models (HMMs), originally developed for use in machine speech recognition. The strength of this layer detection algorithm lies in the way it imitates the manual procedures for annual layer counting, while being based on purely objective criteria for annual layer identification. With this methodology, it is possible to determine the most likely positions of multiple layer boundaries in an entire section of ice core data at once. It provides a probabilistic uncertainty estimate of the resulting layer count, ensuring a proper treatment of ambiguous layer boundaries in the data. Furthermore, multiple data series can be incorporated and used simultaneously, allowing for a full multi-parameter annual layer counting method similar to a manual approach. In this study, the automated layer counting algorithm has been applied to data from the NGRIP ice core, Greenland. The NGRIP ice core has very high temporal resolution with depth, and hence the potential to be dated by annual layer counting far back in time. In previous studies [Andersen et al., 2006; Svensson et al., 2008], manual layer counting has been carried out back to 60 kyr BP. A comparison between the annual layers counted by the two approaches will be presented.

  3. Data analysis in high energy physics. A practical guide to statistical methods

    International Nuclear Information System (INIS)

    Behnke, Olaf; Schoerner-Sadenius, Thomas; Kroeninger, Kevin; Schott, Gregory

    2013-01-01

    This practical guide covers the essential tasks in statistical data analysis encountered in high energy physics and provides comprehensive advice for typical questions and problems. The basic methods for inferring results from data are presented as well as tools for advanced tasks such as improving the signal-to-background ratio, correcting detector effects, determining systematics and many others. Concrete applications are discussed in analysis walkthroughs. Each chapter is supplemented by numerous examples and exercises and by a list of literature and relevant links. The book targets a broad readership at all career levels - from students to senior researchers.

  4. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
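    The first approach, a distance between mean curves assessed by a permutation test, can be sketched as follows. Group sizes, curve shapes, and noise levels below are invented for illustration; the paper's actual distance measures and PM data are not reproduced here.

```python
import numpy as np

def curve_distance(a, b):
    """L2 distance between the mean curves of two treatment groups (rows = replicates)."""
    return np.sqrt(np.sum((a.mean(axis=0) - b.mean(axis=0)) ** 2))

def permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """P-value for the observed mean-curve distance under random label shuffling."""
    rng = np.random.default_rng(seed)
    observed = curve_distance(group_a, group_b)
    pooled = np.vstack([group_a, group_b])
    n_a = len(group_a)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        d = curve_distance(pooled[idx[:n_a]], pooled[idx[n_a:]])
        if d >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one correction keeps p > 0

# Illustrative growth curves (rows = wells, columns = time points):
# two logistic curves whose midpoints differ, plus measurement noise.
rng = np.random.default_rng(42)
t = np.linspace(0, 1, 20)
group_a = 1 / (1 + np.exp(-10 * (t - 0.5))) + 0.05 * rng.standard_normal((8, 20))
group_b = 1 / (1 + np.exp(-10 * (t - 0.3))) + 0.05 * rng.standard_normal((8, 20))

p = permutation_test(group_a, group_b)
print(p)
```

Because the labels carry real information here, almost no shuffled relabeling reproduces the observed distance, and the p-value is small.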

  5. Pile-up corrections for high-precision superallowed β decay half-life measurements via γ-ray photopeak counting

    Science.gov (United States)

    Grinyer, G. F.; Svensson, C. E.; Andreoiu, C.; Andreyev, A. N.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Chakrawarthy, R. S.; Finlay, P.; Garrett, P. E.; Hackman, G.; Hyland, B.; Kulp, W. D.; Leach, K. G.; Leslie, J. R.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Williams, S. J.; Wong, J.; Wood, J. L.; Zganjar, E. F.

    2007-09-01

    A general technique that corrects γ-ray gated β decay-curve data for detector pulse pile-up is presented. The method includes corrections for non-zero time-resolution and energy-threshold effects in addition to a special treatment of saturating events due to cosmic rays. This technique is verified through a Monte Carlo simulation and experimental data using radioactive beams of Na26 implanted at the center of the 8π γ-ray spectrometer at the ISAC facility at TRIUMF in Vancouver, Canada. The β-decay half-life of Na26 obtained from counting 1809-keV γ-ray photopeaks emitted by the daughter Mg26 was determined to be T=1.07167±0.00055 s following a 27σ correction for detector pulse pile-up. This result is in excellent agreement with the result of a previous measurement that employed direct β counting and demonstrates the feasibility of high-precision β-decay half-life measurements through the use of high-purity germanium γ-ray detectors. The technique presented here, while motivated by superallowed-Fermi β decay studies, is general and can be used for all half-life determinations (e.g. α-, β-, X-ray, fission) in which a γ-ray photopeak is used to select the decays of a particular isotope.
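    Setting aside the pile-up correction itself, the final step of such a measurement, extracting a half-life from photopeak counts versus time, reduces to fitting an exponential decay. Below is a minimal noise-free sketch with synthetic counts (not the paper's Na-26 data; the true half-life is only used to generate the curve):

```python
import numpy as np

# Decay-curve model: N(t) = N0 * exp(-ln(2) * t / T_half).
T_half_true = 1.07167                       # seconds, used only to synthesize the data
t = np.linspace(0.0, 5.0, 50)
counts = 1e6 * np.exp(-np.log(2) * t / T_half_true)

# Linear least squares on log(counts): slope = -ln(2) / T_half.
slope, intercept = np.polyfit(t, np.log(counts), 1)
T_half_fit = -np.log(2) / slope
print(round(T_half_fit, 5))                 # → 1.07167
```

With real, Poisson-distributed photopeak counts one would instead weight the fit by the count uncertainties (or use a maximum-likelihood fit), which is where the pile-up and dead-time corrections described above enter.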

  6. Hanford whole body counting manual

    International Nuclear Information System (INIS)

    Palmer, H.E.; Rieksts, G.A.; Lynch, T.P.

    1990-06-01

    This document describes the Hanford Whole Body Counting Program as it is administered by Pacific Northwest Laboratory (PNL) in support of the US Department of Energy--Richland Operations Office (DOE-RL) and its Hanford contractors. Program services include providing in vivo measurements of internally deposited radioactivity in Hanford employees (or visitors). Specific chapters of this manual deal with the following subjects: program operational charter, authority, administration, and practices, including interpreting applicable DOE Orders, regulations, and guidance into criteria for in vivo measurement frequency, etc., for the plant-wide whole body counting services; state-of-the-art facilities and equipment used to provide the best in vivo measurement results possible for the approximately 11,000 measurements made annually; procedures for performing the various in vivo measurements at the Whole Body Counter (WBC) and related facilities, including whole body counts; operation and maintenance of counting equipment; quality assurance provisions of the program; WBC data processing functions; statistical aspects of in vivo measurements; and whole body counting records and associated guidance documents. 16 refs., 48 figs., 22 tabs.

  7. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.

  8. Clean Hands Count

    Medline Plus


  9. The Big Pumpkin Count.

    Science.gov (United States)

    Coplestone-Loomis, Lenny

    1981-01-01

    Pumpkin seeds are counted after students convert pumpkins to jack-o-lanterns. Among the activities involved, pupils learn to count by 10s, make estimates, and to construct a visual representation of 1,000. (MP)

  10. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    Science.gov (United States)

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  11. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    Energy Technology Data Exchange (ETDEWEB)

    Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)]; Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain)]; Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)]

    2013-02-15

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under −30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  12. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    Science.gov (United States)

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with focus on reproducibility. A statistical design of 32 experiments was performed, and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams' E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 °C was better than 22 °C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient to quickly determine optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  13. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    International Nuclear Information System (INIS)

    Durán-Lobato, Matilde; Enguix-González, Alicia; Fernández-Arévalo, Mercedes; Martín-Banderas, Lucía

    2013-01-01

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under −30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  14. High-resolution Statistics of Solar Wind Turbulence at Kinetic Scales Using the Magnetospheric Multiscale Mission

    Energy Technology Data Exchange (ETDEWEB)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.; Maruca, B. A. [University of Delaware, Newark, DE (United States); Fuselier, S. A.; Burch, J. L. [Southwest Research Institute, San Antonio, TX (United States); Phan, T. D. [Space Sciences Laboratory, University of California, Berkeley, CA (United States); Moore, T. E.; Pollock, C. J.; Gershman, D. J. [NASA Goddard Space Flight Center, Greenbelt, MD (United States); Torbert, R. B. [University of New Hampshire, Durham, NH (United States); Russell, C. T.; Strangeway, R. J., E-mail: chasapis@udel.edu [University of California, Los Angeles, CA (United States)

    2017-07-20

    Using data from the Magnetospheric Multiscale (MMS) and Cluster missions obtained in the solar wind, we examine second-order and fourth-order structure functions at varying spatial lags normalized to ion inertial scales. The analysis includes direct two-spacecraft results and single-spacecraft results employing the familiar Taylor frozen-in flow approximation. Several familiar statistical results, including the spectral distribution of energy and the scale-dependent kurtosis, are extended down to unprecedented spatial scales of ∼6 km, approaching electron scales. The Taylor approximation is also confirmed at those small scales, although small deviations are present in the kinetic range. The kurtosis is seen to attain very high values at sub-proton scales, supporting the previously reported suggestion that monofractal behavior may be due to high-frequency plasma waves at kinetic scales.
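    The quantities analyzed here, second- and fourth-order structure functions and the scale-dependent kurtosis, are straightforward to compute from a time series. The sketch below uses a synthetic Gaussian random walk, for which the kurtosis of increments stays flat near the Gaussian value of 3 at all lags; intermittent solar wind turbulence instead shows the kurtosis growing toward small scales, as the study reports.

```python
import numpy as np

def structure_functions(b, lags):
    """Second- and fourth-order structure functions and the kurtosis S4/S2^2 vs lag."""
    s2, s4, kurt = [], [], []
    for lag in lags:
        db = b[lag:] - b[:-lag]          # increment at this lag
        m2, m4 = np.mean(db ** 2), np.mean(db ** 4)
        s2.append(m2)
        s4.append(m4)
        kurt.append(m4 / m2 ** 2)
    return np.array(s2), np.array(s4), np.array(kurt)

# Synthetic stand-in for a field component: random walk with Gaussian increments.
rng = np.random.default_rng(0)
b = np.cumsum(rng.standard_normal(100_000))
lags = [1, 2, 4, 8, 16, 32]
s2, s4, kurt = structure_functions(b, lags)
print(np.round(kurt, 2))
```

For this non-intermittent signal S2 grows linearly with lag and the kurtosis is lag-independent; departures of the kurtosis from 3 at small lags are the signature of intermittency that the MMS analysis measures.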

  15. An instrument for the high-statistics measurement of plastic scintillating fibers

    International Nuclear Information System (INIS)

    Buontempo, S.; Ereditato, A.; Marchetti-Stasi, F.; Riccardi, F.; Strolin, P.

    1994-01-01

    There is today widespread use of plastic scintillating fibers in particle physics, mainly for calorimetric and tracking applications. In the case of calorimeters, we have to cope with very massive detectors and a large quantity of scintillating fibers. The CHORUS Collaboration has built a new detector to search for ν_μ-ν_τ oscillations in the CERN neutrino beam. A crucial role in the detector is played by the high-energy-resolution calorimeter. For its construction, more than 400 000 scintillating plastic fibers have been used. In this paper we report on the design and performance of a new instrument for the high-statistics measurement of fiber properties, in terms of light yield and light attenuation length. The instrument has been successfully used to test about 3% of the total number of fibers before the construction of the calorimeter. (orig.)

  16. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation by nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be accurately determined from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.

  17. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  18. Fast counting electronics for neutron coincidence counting

    International Nuclear Information System (INIS)

    Swansen, J.E.

    1987-01-01

    This patent describes a high speed circuit for accurate neutron coincidence counting comprising: neutron detecting means for providing an above-threshold signal upon neutron detection; amplifying means inputted by the neutron detecting means for providing a pulse output having a pulse width of about 0.5 microseconds upon the input of each above-threshold signal; digital processing means inputted by the pulse output of the amplifying means for generating a pulse responsive to each input pulse from the amplifying means and having a pulse width of about 50 nanoseconds, effective for processing an expected neutron event rate of about 1 Mpps; pulse stretching means inputted by the digital processing means for producing a pulse having a pulse width of several milliseconds for each pulse received from the digital processing means; visual indicating means inputted by the pulse stretching means for producing a visual output for each pulse received from the digital processing means; and derandomizing means effective to receive the 50 ns neutron event pulses from the digital processing means for storage at a rate up to the neutron event rate of 1 Mpps and having first counter means for storing the input neutron event pulses.

  19. Estimating annual high-flow statistics and monthly and seasonal low-flow statistics for ungaged sites on streams in Alaska and conterminous basins in Canada

    Science.gov (United States)

    Wiley, Jeffrey B.; Curran, Janet H.

    2003-01-01

    Methods for estimating daily mean flow-duration statistics for seven regions in Alaska and low-flow frequencies for one region, southeastern Alaska, were developed from daily mean discharges for streamflow-gaging stations in Alaska and conterminous basins in Canada. The 15-, 10-, 9-, 8-, 7-, 6-, 5-, 4-, 3-, 2-, and 1-percent duration flows were computed for the October-through-September water year for 222 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the individual months of July, August, and September for 226 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the season July-through-September for 65 stations in southeastern Alaska. The 7-day, 10-year and 7-day, 2-year low-flow frequencies for the season July-through-September were computed for 65 stations for most of southeastern Alaska. Low-flow analyses were limited to particular months or seasons in order to omit winter low flows, when ice effects reduce the quality of the records and validity of statistical assumptions. Regression equations for estimating the selected high-flow and low-flow statistics for the selected months and seasons for ungaged sites were developed from an ordinary-least-squares regression model using basin characteristics as independent variables. Drainage area and precipitation were significant explanatory variables for high flows, and drainage area, precipitation, mean basin elevation, and area of glaciers were significant explanatory variables for low flows. The estimating equations can be used at ungaged sites in Alaska and conterminous basins in Canada where streamflow regulation, streamflow diversion, urbanization, and natural damming and releasing of water do not affect the streamflow data for the given month or season. Standard errors of estimate ranged from 15 to 56 percent for high-duration flow

  20. Gas detectors for thermal neutrons at high counting rates; Etude des detecteurs a gaz pour neutrons thermiques fonctionnant en collection de courant

    Energy Technology Data Exchange (ETDEWEB)

    Mai, V Q [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1965-09-01

    After recalling the theory of current-pulse formation in a cylindrical counter operating with and without gas multiplication, the pulse amplifier and level discriminator circuits are described; these made it possible to verify the above calculations experimentally and to bring out the role played, at high counting rates, by the space charge in proportional counters. A theory of this phenomenon is given in Chapter VI. Finally, the results obtained in a nuclear reactor with a counting channel built from the above electronic circuits are reported. (author)

  1. Improved method for the determination of the cortisol production rate using high-performance liquid chromatography and liquid scintillation counting

    NARCIS (Netherlands)

    van Ingen, H. E.; Endert, E.

    1988-01-01

    Two new methods for the determination of the cortisol production rate using reversed-phase high-performance liquid chromatography are described. One uses ultraviolet detection at 205 nm, the other on-line post-column derivatization with benzamidine, followed by fluorimetric detection. The specific

  2. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.

  3. Integration of statistical modeling and high-content microscopy to systematically investigate cell-substrate interactions.

    Science.gov (United States)

    Chen, Wen Li Kelly; Likhitpanichkul, Morakot; Ho, Anthony; Simmons, Craig A

    2010-03-01

    Cell-substrate interactions are multifaceted, involving the integration of various physical and biochemical signals. The interactions among these microenvironmental factors cannot be readily elucidated and quantified by conventional experimentation, and necessitate multifactorial strategies. Here we describe an approach that integrates statistical design and analysis of experiments with automated microscopy to systematically investigate the combinatorial effects of substrate-derived stimuli (substrate stiffness and matrix protein concentration) on mesenchymal stem cell (MSC) spreading, proliferation and osteogenic differentiation. C3H10T1/2 cells were grown on type I collagen- or fibronectin-coated polyacrylamide hydrogels with tunable mechanical properties. Experimental conditions, which were defined according to central composite design, consisted of specific permutations of substrate stiffness (3-144 kPa) and adhesion protein concentration (7-520 μg/mL). Spreading area, BrdU incorporation and Runx2 nuclear translocation were quantified using high-content microscopy and modeled as mathematical functions of substrate stiffness and protein concentration. The resulting response surfaces revealed distinct patterns of protein-specific, substrate stiffness-dependent modulation of MSC proliferation and differentiation, demonstrating the advantage of statistical modeling in the detection and description of higher-order cellular responses. In a broader context, this approach can be adapted to study other types of cell-material interactions and can facilitate the efficient screening and optimization of substrate properties for applications involving cell-material interfaces. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. Waste generated in high-rise buildings construction: a quantification model based on statistical multiple regression.

    Science.gov (United States)

    Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana

    2015-05-01

    Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of the design process and the production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model related the dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data resulted in an adjusted R² value of 0.694, meaning that it explains approximately 69% of the variability in waste generation in similar constructions. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
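The adjusted R² statistic reported above (0.694) corrects the ordinary R² for the number of predictors. A minimal sketch with synthetic data (the predictor values, coefficients, and noise level are invented; the study fit real data from eighteen buildings):

```python
import numpy as np

# Multiple-regression fit and adjusted R^2 on synthetic data. The two
# predictors stand in for design- and production-system variables and are
# assumptions for illustration only.

rng = np.random.default_rng(0)
n, p = 18, 2                              # 18 buildings, 2 predictors
X = rng.normal(size=(n, p))
y = 3.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n)

A = np.column_stack([np.ones(n), X])      # intercept + predictors
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta

ss_res = np.sum(resid**2)
ss_tot = np.sum((y - y.mean())**2)
r2 = 1.0 - ss_res / ss_tot
r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)  # penalize extra predictors
print(f"R^2 = {r2:.3f}, adjusted R^2 = {r2_adj:.3f}")
```

With a small sample like n = 18, the adjustment matters: adding predictors always raises R² but only raises adjusted R² if they genuinely explain variance.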

  5. Statistical characteristics of transient enclosure voltage in ultra-high-voltage gas-insulated switchgear

    Science.gov (United States)

    Cai, Yuanji; Guan, Yonggang; Liu, Weidong

    2017-06-01

    Transient enclosure voltage (TEV), which is a phenomenon induced by the inner dielectric breakdown of SF6 during disconnector operations in a gas-insulated switchgear (GIS), may cause issues relating to shock hazard and electromagnetic interference to secondary equipment. This is a critical factor regarding the electromagnetic compatibility of ultra-high-voltage (UHV) substations. In this paper, the statistical characteristics of TEV at UHV level are collected from field experiments, and are analyzed and compared to those from a repeated strike process. The TEV waveforms during disconnector operations are recorded by a self-developed measurement system first. Then, statistical characteristics, such as the pulse number, duration of pulses, frequency components, magnitude and single pulse duration, are extracted. The transmission line theory is introduced to analyze the TEV and is validated by the experimental results. Finally, the relationship between the TEV and the repeated strike process is analyzed. This proves that the pulse voltage of the TEV is proportional to the corresponding breakdown voltage. The results contribute to the definition of the standard testing waveform of the TEV, and can aid the protection of electronic devices in substations by minimizing the threat of this phenomenon.

  6. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Full Text Available Statistical properties of Joule heating rate, electric field and conductances in the high latitude ionosphere are studied by a unique one-month measurement made by the EISCAT incoherent scatter radar in Tromsø (66.6 cgmlat) from 6 March to 6 April 2006. The data are from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent set of data to study the MLT and Kp dependence of parameters with high temporal and spatial resolution.

    All the parameters show a clear MLT variation, which is different for low and high Kp conditions. Our results indicate that the response of morning sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and the electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates in the nightside are statistically observed at 23 MLT, which is the location of the electric Harang discontinuity.

    An important outcome of the paper is the set of fitted functions for the Joule heating rate as a function of electric field magnitude, given separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3). In addition to the squared electric field, the fit includes a linear term to study the possible anticorrelation or correlation between electric field and conductance. In the midday sector, positive correlation is found, as well as in the morning sector for the high activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to auroral arcs as a result of ionosphere-magnetosphere coupling, as discussed by Aikio et al. (2004).
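The fitted functions described above have the form Q_J = a·E² + b·E, with the linear term absorbing any correlation between electric field and conductance. A least-squares sketch on synthetic data (the coefficients and units are invented; the paper fits real EISCAT measurements per MLT sector and activity level):

```python
import numpy as np

# Fit Q_J = a*E^2 + b*E by least squares. The "measurements" are generated
# from assumed coefficients plus noise, purely to illustrate the fit.

rng = np.random.default_rng(6)
E = rng.uniform(0.0, 50.0, 200)                          # electric field, mV/m
Q = 0.012 * E**2 + 0.05 * E + rng.normal(0.0, 0.5, 200)  # heating rate + noise

A = np.column_stack([E**2, E])          # no intercept: Q_J(0) = 0
(a, b), *_ = np.linalg.lstsq(A, Q, rcond=None)
print(f"a = {a:.4f}, b = {b:.3f}")
```

A positive fitted b corresponds to the positive field-conductance correlation found in the midday sector; a negative b would indicate the anticorrelation seen in the midnight and evening sectors.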

  8. Counting to ten milliseconds: low-anger, but not high-anger, individuals pause following negative evaluations.

    Science.gov (United States)

    Robinson, Michael D; Wilkowski, Benjamin M; Meier, Brian P; Moeller, Sara K; Fetterman, Adam K

    2012-01-01

    Low-anger individuals are less reactive, both emotionally and behaviourally, to a large variety of situational primes to anger and aggression. Why this is so, from an affective processing perspective, has been largely conjectural. Four studies (total N=270) sought to link individual differences in anger to tendencies exhibited in basic affective processing tasks. On the basis of motivational factors and considerations, it was hypothesised that negative evaluations would differentially activate a psychological alarm system at low levels of anger, resulting in a pause that should be evident in the speed of making subsequent evaluations. Just such a pattern was evident in all studies. By contrast, high-anger individuals did not pause following their negative evaluations. In relation to this affective processing tendency, at least, dramatically different effects were observed among low- versus high-anger individuals. Implications for the personality-processing literature, theories of trait anger, and fast-acting regulatory processes are discussed.

  9. A High-resolution Atlas and Statistical Model of the Vocal Tract from Structural MRI.

    Science.gov (United States)

    Woo, Jonghye; Lee, Junghoon; Murano, Emi Z; Xing, Fangxu; Al-Talib, Meena; Stone, Maureen; Prince, Jerry L

    Magnetic resonance imaging (MRI) is an essential tool in the study of muscle anatomy and functional activity in the tongue. Objective assessment of similarities and differences in tongue structure and function has been performed using unnormalized data, but this is biased by the differences in size, shape, and orientation of the structures. To remedy this, we propose a methodology to build a 3D vocal tract atlas based on structural MRI volumes from twenty normal subjects. We first constructed high-resolution volumes from three orthogonal stacks. We then removed extraneous data so that all 3D volumes contained the same anatomy. We used an unbiased diffeomorphic groupwise registration using a cross-correlation similarity metric. Principal component analysis was applied to the deformation fields to create a statistical model from the atlas. Various evaluations and applications were carried out to show the behaviour and utility of the atlas.

  10. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data especially by utilizing Gaussian graphical models and independent component analysis.
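A Gaussian graphical model, one of the two methods highlighted above, encodes conditional dependencies between metabolites via the inverse covariance (precision) matrix P: the partial correlation between variables i and j given all others is −P_ij / √(P_ii·P_jj). A sketch on synthetic data (the "metabolites" and the induced association are invented):

```python
import numpy as np

# Partial correlations from the precision matrix, the core computation
# behind Gaussian graphical models. Data are synthetic for illustration.

rng = np.random.default_rng(1)
n, d = 500, 4
data = rng.normal(size=(n, d))        # four hypothetical metabolites
data[:, 1] += 0.8 * data[:, 0]        # induce one direct association

prec = np.linalg.inv(np.cov(data, rowvar=False))
dsqrt = np.sqrt(np.diag(prec))
partial = -prec / np.outer(dsqrt, dsqrt)   # partial correlation matrix
np.fill_diagonal(partial, 1.0)
print(np.round(partial, 2))
```

Only the directly associated pair shows a large partial correlation; indirect (marginal) correlations are screened out, which is why these models are attractive for reconstructing metabolic pathways.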

  11. Geant4 electromagnetic physics for high statistic simulation of LHC experiments

    CERN Document Server

    Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L

    2012-01-01

    An overview of the current status of electromagnetic (EM) physics of the Geant4 toolkit is presented. Recent improvements are focused on the performance of large scale production for LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy without compromising CPU speed for EM particle transport. New biasing options have been introduced, which are applicable to any EM process. These include algorithms to enhance or suppress processes, force interactions, and split secondary particles. It is shown that the performance of the EM sub-package is improved. We will report extensions of the testing suite allowing high statistics validation of EM physics. It includes validation of multiple scattering, bremsstrahlung and other models. Cross checks between standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.

  12. The Statistical Analysis of Relation between Compressive and Tensile/Flexural Strength of High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Kępniak M.

    2016-12-01

    Full Text Available This paper addresses the tensile and flexural strength of HPC (high performance concrete). The aim of the paper is to analyse the efficiency of models proposed in different codes. In particular, three design procedures from: the ACI 318 [1], Eurocode 2 [2] and the Model Code 2010 [3] are considered. The associations between the design tensile strength of concrete obtained from these three codes and compressive strength are compared with experimental results of tensile strength and flexural strength using statistical tools. Experimental results of tensile strength were obtained in the splitting test. Based on this comparison, conclusions are drawn according to the fit between the design methods and the test data. The comparison shows that the tensile and flexural strength of HPC depend on more factors than compressive strength alone.
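As an illustration of the kind of code relations compared above, the commonly quoted Eurocode 2-style expressions for mean axial tensile strength f_ctm as a function of characteristic compressive strength f_ck can be sketched as follows. Treat the formulas as assumptions to verify against the code text before any design use:

```python
import math

# EC2-style mean tensile strength from characteristic compressive strength
# (MPa). The two branches and their constants are the commonly quoted
# forms; they are an assumption here, not a substitute for the code.

def fctm_ec2(fck: float) -> float:
    """Mean axial tensile strength (MPa), EC2-style relations."""
    if fck <= 50.0:                               # normal-strength branch
        return 0.30 * fck ** (2.0 / 3.0)
    fcm = fck + 8.0                               # mean compressive strength
    return 2.12 * math.log(1.0 + fcm / 10.0)      # high-strength branch

for fck in (40.0, 60.0, 80.0):                    # spanning into the HPC range
    print(f"fck = {fck:.0f} MPa -> fctm ~ {fctm_ec2(fck):.2f} MPa")
```

Note the change of functional form above C50/60: for HPC the logarithmic branch grows much more slowly than the power law, which is one reason single-variable relations fit HPC data poorly, as the paper concludes.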

  13. Dissipative Effects on Inertial-Range Statistics at High Reynolds Numbers.

    Science.gov (United States)

    Sinhuber, Michael; Bewley, Gregory P; Bodenschatz, Eberhard

    2017-09-29

    Using the unique capabilities of the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, we report experimental measurements in classical grid turbulence that uncover oscillations of the velocity structure functions in the inertial range. This was made possible by measuring extremely long time series of up to 10^{10} samples of the turbulent fluctuating velocity, which corresponds to O(10^{7}) integral length scales. The measurements were conducted in a well-controlled environment at a wide range of high Reynolds numbers from R_{λ}=110 up to R_{λ}=1600, using both traditional hot-wire probes and the nanoscale thermal anemometry probe developed at Princeton University. An implication of the observed oscillations is that dissipation influences the inertial-range statistics of turbulent flows at scales significantly larger than predicted by current models and theories.
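The velocity structure functions whose inertial-range behaviour is studied above are defined as S_n(r) = ⟨(u(x+r) − u(x))^n⟩. A minimal estimator on a synthetic correlated signal (a random walk used only as a stand-in; the experiment used up to 10^10 real velocity samples):

```python
import numpy as np

# Estimate the second-order structure function S_2(r) from a 1D series.
# The signal here is a random walk, not turbulence data, so S_2(r) simply
# grows linearly with the lag r.

rng = np.random.default_rng(2)
u = np.cumsum(rng.normal(size=100_000))   # toy correlated signal

def structure_function(u, r, n):
    du = u[r:] - u[:-r]                   # increments at lag r
    return np.mean(du ** n)

for r in (1, 4, 16, 64):
    print(r, structure_function(u, r, 2))
```

In a real analysis one looks at how S_n(r) scales with r across the inertial range; the paper's point is that small oscillations around the expected scaling persist to surprisingly large r, which demands the enormous sample counts quoted above.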

  14. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics as well as researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  15. Statistical dynamic image reconstruction in state-of-the-art high-resolution PET

    International Nuclear Information System (INIS)

    Rahmim, Arman; Cheng, J-C; Blinder, Stephan; Camborde, Maurie-Laure; Sossi, Vesna

    2005-01-01

    Modern high-resolution PET is now more than ever in need of scrutiny into the nature and limitations of the imaging modality itself as well as image reconstruction techniques. In this work, we have reviewed, analysed and addressed the following three considerations within the particular context of state-of-the-art dynamic PET imaging: (i) the typical average number of events per line-of-response (LOR) is now (much) less than unity; (ii) due to the physical and biological decay of the activity distribution, one requires robust and efficient reconstruction algorithms applicable to a wide range of statistics; and (iii) the computational considerations in dynamic imaging are much enhanced (i.e., more frames to be stored and reconstructed). Within the framework of statistical image reconstruction, we have argued theoretically and shown experimentally that the sinogram non-negativity constraint (when using the delayed-coincidence and/or scatter-subtraction techniques) is especially expected to result in an overestimation bias. Subsequently, two schemes are considered: (a) subtraction techniques in which an image non-negativity constraint has been imposed and (b) implementation of random and scatter estimates inside the reconstruction algorithms, thus enabling direct processing of Poisson-distributed prompts. Both techniques are able to remove the aforementioned bias, while the latter, being better conditioned theoretically, is able to exhibit superior noise characteristics. We have also elaborated upon and verified the applicability of the accelerated list-mode image reconstruction method as a powerful solution for accurate, robust and efficient dynamic reconstructions of high-resolution data (as well as a number of additional benefits in the context of state-of-the-art PET).
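The overestimation bias described above is easy to demonstrate numerically: subtracting delayed coincidences from Poisson-distributed prompts is unbiased, but clipping the corrected sinogram at zero shifts the mean upward when counts per LOR are well below unity. The rates below are invented but representative of that regime:

```python
import numpy as np

# Simulate prompts and delayed coincidences per LOR, apply delayed-window
# subtraction, and compare the mean before and after imposing sinogram
# non-negativity (clipping at zero).

rng = np.random.default_rng(3)
n_lor = 1_000_000
trues, randoms = 0.2, 0.2                       # mean events per LOR (<< 1)
prompts = rng.poisson(trues + randoms, n_lor)
delayed = rng.poisson(randoms, n_lor)

corrected = prompts - delayed                    # unbiased, but can go negative
clipped = np.clip(corrected, 0, None)            # sinogram non-negativity

print("true trues rate:       ", trues)
print("mean after subtraction:", corrected.mean())
print("mean after clipping:   ", clipped.mean())
```

The clipped mean exceeds the true rate because every negative bin is raised to zero while positive fluctuations are untouched; this is the bias that modelling Poisson prompts directly inside the reconstruction avoids.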

  16. Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data

    Science.gov (United States)

    Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.

    2018-01-01

    One of the foundations for high school students in statistics is to be able to read data and to present data in the form of tables and diagrams, together with their interpretation. The purpose of this study is to describe high school students' competencies in reading, interpreting and presenting data. The subjects consisted of male and female students who had high levels of mathematical ability. Data were collected in the form of task formulations, which were analyzed by reducing, presenting and verifying the data. Results showed that the students read the data based on explicit explanations on the diagram, such as explaining the points in the diagram as the relation between the x and y axis and determining the simple trend of a graph, including the maximum and minimum points. In interpreting and summarizing the data, both subjects paid attention to general data trends and used them to predict increases or decreases in data. The male student estimated the (n+1)th value of the weight data by using the mode of the data, while the female estimated the weight by using the average. The male tended not to consider the characteristics of the data, while the female considered the characteristics of the data more carefully.

  17. Statistical surrogate models for prediction of high-consequence climate change.

    Energy Technology Data Exchange (ETDEWEB)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
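A minimal sketch of the Bayesian credible interval the report proposes as its confidence measure: draw many posterior samples of a variable of interest (here just a synthetic normal posterior standing in for surrogate-model output) and report an equal-tailed interval plus a tail-risk probability.

```python
import numpy as np

# Equal-tailed 95% credible interval and exceedance probability from
# posterior samples. The posterior is synthetic; in the report's setting
# the samples would come from realizations of the calibrated SSM.

rng = np.random.default_rng(5)
posterior = rng.normal(loc=2.0, scale=0.5, size=100_000)

lo, hi = np.percentile(posterior, [2.5, 97.5])   # equal-tailed 95% interval
p_exceed = np.mean(posterior > 3.0)              # tail-risk probability
print(f"95% credible interval: [{lo:.2f}, {hi:.2f}]")
print(f"P(variable > 3.0) = {p_exceed:.4f}")
```

The exceedance probability is exactly the low-probability, high-consequence quantity the abstract argues cannot be sampled directly from expensive coupled models, but becomes cheap once an SSM generates the samples.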

  18. Statistics of vacuum breakdown in the high-gradient and low-rate regime

    Science.gov (United States)

    Wuensch, Walter; Degiovanni, Alberto; Calatroni, Sergio; Korsbäck, Anders; Djurabekova, Flyura; Rajamäki, Robin; Giner-Navarro, Jorge

    2017-01-01

    In an increasing number of high-gradient linear accelerator applications, accelerating structures must operate with both high surface electric fields and low breakdown rates. Understanding the statistical properties of breakdown occurrence in such a regime is of practical importance for optimizing accelerator conditioning and operation algorithms, as well as of interest for efforts to understand the physical processes which underlie the breakdown phenomenon. Experimental breakdown data have been collected in two distinct high-gradient experimental set-ups: a prototype linear accelerating structure operated in the Compact Linear Collider Xbox 12 GHz test stands, and a parallel plate electrode system operated with pulsed DC in the kV range. The collected data are presented, analyzed and compared. The two systems show similar, distinctive, two-part distributions of the number of pulses between breakdowns, with each part corresponding to a specific, constant event rate. The correlation between distance and number of pulses between breakdowns indicates that the two parts of the distribution, and their corresponding event rates, represent independent primary and induced follow-up breakdowns. The similarity of results from pulsed DC to 12 GHz rf indicates a similar vacuum arc triggering mechanism over the range of conditions covered by the experiments.
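The two-part distribution described above can be reproduced with a toy simulation: primary breakdowns occur with a small constant per-pulse probability, and each breakdown may trigger a follow-up after comparatively few pulses. All rates below are invented, chosen only to make the two components visible:

```python
import numpy as np

# Simulate pulses-between-breakdowns as a mixture of rare primaries
# (geometric with small p) and short-delay induced follow-ups.

rng = np.random.default_rng(4)
p_primary = 1e-4          # per-pulse probability of a primary breakdown
p_followup = 0.3          # chance a breakdown induces a follow-up
p_followup_gap = 0.02     # follow-ups arrive after ~1/0.02 = 50 pulses

gaps = []
for _ in range(20_000):
    gaps.append(rng.geometric(p_primary))        # pulses to next primary
    if rng.random() < p_followup:
        gaps.append(rng.geometric(p_followup_gap))

gaps = np.array(gaps)
short_gaps = gaps[gaps < 1000]                    # follow-up-dominated part
long_gaps = gaps[gaps >= 1000]                    # primary-dominated part
frac_short = len(short_gaps) / len(gaps)
print(f"short-gap fraction: {frac_short:.2f}")
print("median short gap:", np.median(short_gaps))
print("median long gap: ", np.median(long_gaps))
```

On a log-scale histogram the two geometric components appear as two straight segments with different slopes, matching the "two-part distribution, each part a constant event rate" signature reported in the paper.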

  19. Matching of experimental and statistical-model thermonuclear reaction rates at high temperatures

    International Nuclear Information System (INIS)

    Newton, J. R.; Longland, R.; Iliadis, C.

    2008-01-01

    We address the problem of extrapolating experimental thermonuclear reaction rates toward high stellar temperatures (T>1 GK) by using statistical model (Hauser-Feshbach) results. Reliable reaction rates at such temperatures are required for studies of advanced stellar burning stages, supernovae, and x-ray bursts. Generally accepted methods are based on the concept of a Gamow peak. We follow recent ideas that emphasized the fundamental shortcomings of the Gamow peak concept for narrow resonances at high stellar temperatures. Our new method defines the effective thermonuclear energy range (ETER) by using the 8th, 50th, and 92nd percentiles of the cumulative distribution of fractional resonant reaction rate contributions. This definition is unambiguous and has a straightforward probability interpretation. The ETER is used to define a temperature at which Hauser-Feshbach rates can be matched to experimental rates. This matching temperature is usually much higher compared to previous estimates that employed the Gamow peak concept. We suggest that an increased matching temperature provides more reliable extrapolated reaction rates since Hauser-Feshbach results are more trustworthy the higher the temperature. Our ideas are applied to 21 (p,γ), (p,α), and (α,γ) reactions on A=20-40 target nuclei. For many of the cases studied here, our extrapolated reaction rates at high temperatures differ significantly from those obtained using the Gamow peak concept
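The ETER definition above reduces to a percentile computation: order the narrow resonances by energy, accumulate their fractional contributions to the total rate, and read off the energies where the cumulative sum crosses 8%, 50% and 92%. The resonance list below is invented purely to illustrate the mechanics:

```python
import numpy as np

# Effective thermonuclear energy range (ETER) from fractional resonant
# contributions. Energies and contributions are hypothetical.

energies = np.array([0.3, 0.6, 0.9, 1.3, 1.8, 2.5])       # MeV, hypothetical
contrib = np.array([0.02, 0.10, 0.30, 0.35, 0.18, 0.05])  # fractional rates
contrib = contrib / contrib.sum()                          # normalize to 1

order = np.argsort(energies)
cum = np.cumsum(contrib[order])

def crossing(q):
    """Energy at which the cumulative contribution first reaches q."""
    return energies[order][np.searchsorted(cum, q)]

low, mid, high = crossing(0.08), crossing(0.50), crossing(0.92)
print(f"ETER = [{low}, {high}] MeV, median energy {mid} MeV")
```

Unlike a Gamow-peak window, this range follows the actual resonance contributions at each temperature, which is what makes the definition unambiguous.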

  1. The importance of counting cows: Social and economic effects of a high-level nuclear waste repository in Texas

    International Nuclear Information System (INIS)

    Fleishman, J.; Brody, J.; Galavotti, C.

    1987-01-01

    Impact assessments that rely on existing records and extrapolation from broad geographic areas provide inadequate information about social and economic conditions important in siting a high-level nuclear waste repository. Texas has used an alternative approach, involving systematic surveys of representative samples of local residents, farm operators and businesses in the proposed site counties and comparison areas. Results show that this technique is useful in describing current economic conditions, including characteristics of key sectors of the economy, changes related to the siting process, and expectations that may influence investment. In addition, the surveys are useful in assessing the degree of consensus in local communities and in identifying possible differential effects of a repository on particular groups. They also provide a baseline for long-term monitoring of repository effects and contribute to an understanding of the underlying processes that shape public response to the nuclear waste program.

  2. Predictive value of pretreatment lymphocyte count in stage II colorectal cancer and in high-risk patients treated with adjuvant chemotherapy.

    Science.gov (United States)

    Liang, Lei; Zhu, Ji; Jia, Huixun; Huang, Liyong; Li, Dawei; Li, Qingguo; Li, Xinxiang

    2016-01-05

    Pretreatment lymphocyte count (LC) has been associated with prognosis and chemotherapy response in several cancers. The predictive value of LC for stage II colorectal cancer (CRC) and for high-risk patients treated with adjuvant chemotherapy (AC) has not been determined. A retrospective review of prospectively collected data from 1332 consecutive stage II CRC patients who underwent curative tumor resection was conducted. A pretreatment LC value risk, 459 (62.2%) of whom received AC. Patients with low LCs had significantly worse 5-year OS (74.6% vs. 90.2%, p risk patients with low LCs had the poorest DFS (p value or combined with high-risk status were both independent prognostic factors(p risk, AC-treated patients with high LCs had significantly longer DFS than untreated patients (HR, 0.594; 95% CI, 0.364-0.970; p = 0.035). There was no difference or trend for DFS or OS in patients with low LCs, regardless of the use of AC (DFS, p = 0.692; OS, p = 0.522). Low LC was also independently associated with poorer DFS in high-risk, AC-treated patients (HR, 1.885; 95% CI, 1.112-3.196; p = 0.019). Pretreatment LC is an independent prognostic factor for survival in stage II CRC. Furthermore, pretreatment LC reliably predicts chemotherapeutic efficacy in high-risk patients with stage II CRC.

  3. Title V Permitting Statistics Inventory

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Title V Permitting Statistics Inventory contains measured and estimated nationwide statistical data, consisting of counts of permitted sources, types of permits...

  4. Statistics of high-altitude and high-latitude O+ ion outflows observed by Cluster/CIS

    Directory of Open Access Journals (Sweden)

    A. Korth

    2005-07-01

    Full Text Available The persistent outflows of O+ ions observed by the Cluster CIS/CODIF instrument were studied statistically in the high-altitude (from 3 up to 11 RE) and high-latitude (from 70 to ~90 deg invariant latitude, ILAT) polar region. The principal results are: (1) outflowing O+ ions with more than 1 keV are observed above 10 RE geocentric distance and above 85 deg ILAT; (2) at 6-8 RE geocentric distance, the latitudinal distribution of O+ ion outflow is consistent with velocity filter dispersion from a source equatorward of and below the spacecraft (e.g. the cusp/cleft); (3) however, at 8-12 RE geocentric distance the distribution of O+ outflows cannot be explained by velocity filter only. The results suggest that additional energization or acceleration processes for outflowing O+ ions occur at high altitudes and high latitudes in the dayside polar region. Keywords. Magnetospheric physics (magnetospheric configuration and dynamics; solar wind-magnetosphere interactions)

  5. High-temperature short-time pasteurisation of human breastmilk is efficient in retaining protein and reducing the bacterial count.

    Science.gov (United States)

    Klotz, Daniel; Joellenbeck, Mirjam; Winkler, Karl; Kunze, Mirjam; Huzly, Daniela; Hentschel, Roland

    2017-05-01

    Milk banks are advised to use Holder pasteurisation to inactivate the cytomegalovirus, but the process adversely affects the bioactive properties of human breastmilk. This study explored the antibacterial efficacy of an alternative high-temperature short-time (HTST) treatment of human breastmilk and its effect on marker proteins, compared with the Holder method. Breastmilk samples were obtained from 27 mothers with infants in a German neonatal intensive care unit. The samples were either heated to 62°C for five seconds using HTST or processed using Holder pasteurisation, at 63 ± 0.5°C for 30 minutes. Immunoglobulin A, lactoferrin, lysozyme, alkaline phosphatase and bile salt-stimulated lipase concentrations and bacterial colony-forming units/mL were measured before and after heating. HTST-treated samples retained higher rates of immunoglobulin A (95% versus 83%), alkaline phosphatase (6% versus 0%) and bile salt-stimulated lipase (0.8% versus 0.4%) than Holder pasteurisation samples (all p HTST treatment protocol retained some of the bioactive properties of human breastmilk and appeared to have similar antibacterial efficacy to Holder pasteurisation. ©2017 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  6. Oral manifestations of human immunodeficiency virus/acquired immunodeficiency syndrome and their correlation to cluster of differentiation lymphocyte count in population of North-East India in highly active antiretroviral therapy era

    Directory of Open Access Journals (Sweden)

    Sarat Kumar Nayak

    2016-01-01

    Full Text Available Background: The human immunodeficiency virus (HIV) infection, which manifests as acquired immunodeficiency syndrome (AIDS), is a disease involving the defects of the T-lymphocyte arm of the immune system. Certain laboratory parameters such as the cluster of differentiation (CD4) count and clinical parameters have long been used as markers of disease progression. In industrialized countries, many studies show a high correlation between the incidence of oral lesions and immunosuppression, and hence oral lesions can be used as a marker of immunosuppression. This might not be applicable to a developing country like India. In this study, efforts have been made to supplement the present knowledge on various aspects of oral manifestations in HIV patients in the Indian subcontinent. Aims: To correlate the oral manifestations in HIV/AIDS patients to the level of circulating CD4+ T-lymphocyte count and their effect in antiretroviral therapy (ART). Subjects and Methods: A total of 104 HIV-positive patients were examined for oral lesions. The CD4 count, estimated on the same day by a fluorescence-activated cell sorting count machine, was then correlated with various oral lesions. Results: Oral manifestations appeared when the CD4 count decreased below 500 cells/mm3. Moreover, oral lesions found at different stages showed very strong correlation to their respective CD4 counts. Furthermore, there was a considerable decline in the incidence of oral manifestations in patients undergoing highly active ART. Conclusions: Oral manifestations are highly predictive markers of severe immune deterioration and disease progression in HIV patients.

  7. Cluster survey of the high-altitude cusp properties: a three-year statistical study

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2004-09-01

    Full Text Available The global characteristics of the high-altitude cusp and its surrounding regions are investigated using a three-year statistical survey based on data obtained by the Cluster spacecraft. The analysis involves an elaborate orbit-sampling methodology that uses a model field and takes into account the actual solar wind conditions and level of geomagnetic activity. The spatial distribution of the magnetic field and various plasma parameters in the vicinity of the low magnetic field exterior cusp are determined and it is found that: (1) the magnetic field distribution shows the presence of an intermediate region between the magnetosheath and the magnetosphere: the exterior cusp; (2) this region is characterized by the presence of dense plasma of magnetosheath origin; a comparison with the Tsyganenko (1996) magnetic field model shows that it is diamagnetic in nature; (3) the spatial distributions show that three distinct boundaries with the lobes, the dayside plasma sheet and the magnetosheath surround the exterior cusp; (4) the external boundary with the magnetosheath has a sharp bulk velocity gradient, as well as a density decrease and temperature increase as one goes from the magnetosheath to the exterior cusp; (5) while the two inner boundaries form a funnel, the external boundary shows no clear indentation; (6) the plasma and magnetic pressure distributions suggest that the exterior cusp is in equilibrium with its surroundings in a statistical sense; and (7) a preliminary analysis of the bulk flow distributions suggests that the exterior cusp is stagnant under northward IMF conditions but convective under southward IMF conditions.

  8. Effectiveness of mouse minute virus inactivation by high temperature short time treatment technology: a statistical assessment.

    Science.gov (United States)

    Murphy, Marie; Quesada, Guillermo Miro; Chen, Dayue

    2011-11-01

    Viral contamination of mammalian cell cultures in a GMP manufacturing facility represents a serious safety threat to the biopharmaceutical industry. Such adverse events usually require facility shutdown for cleaning/decontamination, and thus result in significant loss of production and/or delay of product development. High temperature short time (HTST) treatment of culture media has been considered an effective method to protect GMP facilities from viral contamination. The log reduction factor (LRF) has commonly been used to measure the effectiveness of HTST treatment for viral inactivation. However, in order to prevent viral contamination, HTST treatment must inactivate all (100%) infectious viruses in the medium batch, since a single virus is sufficient to cause contamination. Therefore, LRF may not be the most appropriate indicator for measuring the effectiveness of HTST in preventing viral contamination. We report here the use of the probability of achieving complete (100%) virus inactivation to assess the effectiveness of HTST treatment. By using mouse minute virus (MMV) as a model virus, we have demonstrated that the effectiveness of HTST treatment depends highly upon the level of viral contaminants, in addition to treatment temperature and duration. We believe that the statistical method described in this report can provide more accurate information about the power and potential limitations of technologies such as HTST in our shared quest to mitigate the risk of viral contamination in manufacturing facilities. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
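
The shift from log reduction factor to a probability of complete inactivation can be illustrated with a simple Poisson model (our own sketch; the paper's exact formulation is not reproduced in the abstract): if a batch initially contains N0 infectious virions and the treatment achieves a given LRF, the expected number of survivors is N0·10^-LRF, and the chance that none survive is exp(-N0·10^-LRF).

```python
import math

def p_complete_inactivation(n0: float, lrf: float) -> float:
    """Probability that zero infectious virions survive treatment,
    assuming survivors are Poisson-distributed with mean n0 * 10**-lrf.
    (Illustrative model; parameter names are ours, not the paper's.)"""
    expected_survivors = n0 * 10.0 ** (-lrf)
    return math.exp(-expected_survivors)

# The same 5-log reduction gives very different assurance levels
# depending on the initial contamination load:
for n0 in (1e3, 1e5, 1e7):
    p = p_complete_inactivation(n0, 5)
    print(f"N0 = {n0:.0e}, LRF = 5 -> P(all inactivated) = {p:.4f}")
```

A fixed 5-log reduction gives near-certain sterility at low contamination loads but almost no assurance at high loads, which is the abstract's central point.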

  9. On the efficiency of high-energy particle identification statistical methods

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1982-01-01

    An attempt is made to analyze the statistical methods of making decisions on high-energy particle identification. The Bayesian approach is shown to provide the most complete account of the primary discriminative information between particles of various types. It does not impose rigid requirements on the form of the probability density function and ensures the account of a priori information, as compared with the Neyman-Pearson approach, the minimax technique and heuristic rules for constructing decision limits in the variant region of a specially chosen parameter. The methods based on the concept of the nearest neighbourhood are shown to be the most effective among the local methods of probability density estimation. Probability distances between the training sample classes are suggested for making decisions on selecting the optimal parameters of a high-energy particle detector. The method proposed and the software constructed are tested on the problem of cosmic radiation hadron identification by means of transition radiation detectors (the ''PION'' experiment)

  10. Statistics of Deep Convection in the Congo Basin Derived From High-Resolution Simulations.

    Science.gov (United States)

    White, B.; Stier, P.; Kipling, Z.; Gryspeerdt, E.; Taylor, S.

    2016-12-01

    Convection transports moisture, momentum, heat and aerosols through the troposphere, and so the temporal variability of convection is a major driver of global weather and climate. The Congo basin is home to some of the most intense convective activity on the planet and is under strong seasonal influence of biomass burning aerosol. However, deep convection in the Congo basin remains understudied compared to other regions of tropical storm systems, especially the neighbouring, relatively well-understood West African climate system. We use the WRF model to perform a high-resolution, cloud-system-resolving simulation to investigate convective storm systems in the Congo. Our setup pushes the boundaries of current computational resources, using a 1 km grid length over a domain covering millions of square kilometres and for a time period of one month. This allows us to draw statistical conclusions on the nature of the simulated storm systems. Comparing data from satellite observations and the model enables us to quantify the diurnal variability of deep convection in the Congo basin. This approach allows us to evaluate our simulations despite the lack of in-situ observational data, and provides a more comprehensive analysis of the diurnal cycle than has previously been shown. Further, we show that high-resolution convection-permitting simulations performed over near-seasonal timescales can be used in conjunction with satellite observations as an effective tool to evaluate new convection parameterisations.

  11. Is total lymphocyte count related to nutritional markers in hospitalized older adults?

    Directory of Open Access Journals (Sweden)

    Vânia Aparecida LEANDRO-MERHI

    Full Text Available ABSTRACT BACKGROUND Older patients are commonly malnourished during hospital stay, and a high prevalence of malnutrition is found in hospitalized patients aged more than 65 years. OBJECTIVE To investigate whether total lymphocyte count is related to other nutritional markers in hospitalized older adults. METHODS Hospitalized older adults (N=131) were recruited for a cross-sectional study. Their nutritional status was assessed by the Nutritional Risk Screening (NRS), anthropometry, and total lymphocyte count. The statistical analyses included the chi-square test, Fisher's exact test, and Mann-Whitney test. Spearman's linear correlation coefficient determined whether total lymphocyte count was correlated with the nutritional markers. Multiple linear regression determined the parameters associated with lymphocyte count. The significance level was set at 5%. RESULTS According to the NRS, 41.2% of the patients were at nutritional risk, and 36% had mild or moderate depletion according to total lymphocyte count. Total lymphocyte count was weakly correlated with mid-upper arm circumference (r=0.20507), triceps skinfold thickness (r=0.29036), and length of hospital stay (r=-0.21518). Total lymphocyte count in different NRS categories differed significantly: older adults who were not at nutritional risk had higher mean and median total lymphocyte counts (P=0.0245). Multiple regression analysis showed that higher lymphocyte counts were associated with higher triceps skinfold thicknesses and no nutritional risk according to the NRS. CONCLUSION Total lymphocyte count was correlated with mid-upper arm circumference, triceps skinfold thickness, and nutritional risk according to the NRS. In multiple regression the combined factors that remained associated with lymphocyte count were NRS and triceps skinfold thickness. Therefore, total lymphocyte count may be considered a nutritional marker. Other studies should confirm these findings.
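
The Spearman rank correlation used in the study is straightforward to sketch in plain Python (the data below are hypothetical; the study's clinical values are not reproduced in the abstract):

```python
def rank(xs):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank over the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical example: lymphocyte count vs triceps skinfold thickness
tlc = [1200, 1500, 900, 2100, 1800, 1100]
tsf = [10.0, 12.5, 9.0, 15.0, 13.0, 11.0]
print(f"Spearman rho = {spearman(tlc, tsf):.4f}")
```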

  12. Serum inhibin-b in fertile men is strongly correlated with low but not high sperm counts: a coordinated study of 1,797 European and US men

    DEFF Research Database (Denmark)

    Jørgensen, Niels; Liu, Fan; Andersson, Anna-Maria

    2010-01-01

    To describe associations between serum inhibin-b and sperm counts, adjusted for effect of time of blood sampling, in larger cohorts than have been previously reported.

  13. What every radiochemist should know about statistics

    International Nuclear Information System (INIS)

    Nicholson, W.L.

    1994-04-01

    Radionuclide decay and measurement with appropriate counting instruments is one of the few physical processes for which exact mathematical/probabilistic models are available. This paper discusses statistical procedures associated with display and analysis of radionuclide counting data that derive from these exact models. For low count situations the attractiveness of fixed-count-random-time procedures is discussed
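
The Poisson model behind radionuclide counting gives the standard results these procedures rely on: the standard deviation of N observed counts is √N, so relative precision improves as 1/√N. A minimal sketch (our own illustration, not taken from the report):

```python
import math

def count_uncertainty(n_counts: int) -> tuple:
    """Poisson standard deviation and relative uncertainty
    for n_counts observed counts."""
    sigma = math.sqrt(n_counts)
    return sigma, sigma / n_counts

def counts_needed(rel_precision: float) -> int:
    """Counts required to reach a target relative precision,
    from 1/sqrt(N) = p."""
    return math.ceil(1.0 / rel_precision ** 2)

sigma, rel = count_uncertainty(10_000)
print(f"10000 counts -> sigma = {sigma:.0f}, relative uncertainty = {rel:.1%}")
print(f"counts for 0.5% precision: {counts_needed(0.005)}")
```

This is why low-count measurements favour fixed-count procedures: stopping at a preset number of counts fixes the relative precision in advance, at the cost of a random counting time.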

  14. First high-statistics and high-resolution recoil-ion data from the WITCH retardation spectrometer

    Science.gov (United States)

    Finlay, P.; Breitenfeldt, M.; Porobić, T.; Wursten, E.; Ban, G.; Beck, M.; Couratin, C.; Fabian, X.; Fléchard, X.; Friedag, P.; Glück, F.; Herlert, A.; Knecht, A.; Kozlov, V. Y.; Liénard, E.; Soti, G.; Tandecki, M.; Traykov, E.; Van Gorp, S.; Weinheimer, Ch.; Zákoucký, D.; Severijns, N.

    2016-07-01

    The first high-statistics and high-resolution data set for the integrated recoil-ion energy spectrum following the β+ decay of 35Ar has been collected with the WITCH retardation spectrometer located at CERN-ISOLDE. Over 25 million recoil-ion events were recorded on a large-area multichannel plate (MCP) detector with a time-stamp precision of 2 ns and position resolution of 0.1 mm due to the newly upgraded data acquisition based on the LPC Caen FASTER protocol. The number of recoil ions was measured for more than 15 different settings of the retardation potential, complemented by dedicated background and half-life measurements. Previously unidentified systematic effects, including an energy-dependent efficiency of the main MCP and a radiation-induced time-dependent background, have been identified and incorporated into the analysis. However, further understanding and treatment of the radiation-induced background requires additional dedicated measurements and remains the current limiting factor in extracting a beta-neutrino angular correlation coefficient for 35Ar decay using the WITCH spectrometer.

  15. On Counting the Rational Numbers

    Science.gov (United States)

    Almada, Carlos

    2010-01-01

    In this study, we show how to construct a function from the set N of natural numbers that explicitly counts the set Q+ of all positive rational numbers using a very intuitive approach. The function has the appeal of Cantor's function and it has the advantage that any high school student can understand the main idea at a glance…
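
The abstract does not reproduce the article's construction, but one well-known explicit enumeration of the positive rationals is the Calkin–Wilf sequence, in which each term q is followed by 1/(2⌊q⌋ − q + 1) and every positive rational appears exactly once. A sketch:

```python
from fractions import Fraction
from math import floor

def calkin_wilf(n):
    """First n terms of the Calkin-Wilf enumeration of the
    positive rationals; every positive rational appears exactly once."""
    q = Fraction(1, 1)
    terms = []
    for _ in range(n):
        terms.append(q)
        q = 1 / (2 * floor(q) - q + 1)   # successor rule
    return terms

print(calkin_wilf(8))  # 1, 1/2, 2, 1/3, 3/2, 2/3, 3, 1/4
```

Note this is one standard bijection N → Q+, offered for illustration; whether it matches the article's "very intuitive approach" cannot be determined from the abstract.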

  16. Determining Gate Count Reliability in a Library Setting

    OpenAIRE

    Jeffrey Phillips

    2016-01-01

    Objective – Patron counts are a common form of measurement for library assessment. To develop accurate library statistics, it is necessary to determine any differences between various counting devices. A yearlong comparison between card reader turnstiles and laser gate counters in a university library sought to offer a standard percentage of variance and provide suggestions to increase the precision of counts. Methods – The collection of library exit counts identified the differences be...
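
The "standard percentage of variance" between two counting devices can be computed directly once paired counts are collected. A minimal sketch with hypothetical daily exit counts (the convention of normalizing by the turnstile figure is our assumption, not necessarily the study's):

```python
def percent_variance(turnstile, laser):
    """Mean absolute difference between paired daily counts,
    expressed as a percentage of the turnstile figure.
    (Hypothetical convention, for illustration only.)"""
    diffs = [abs(t - s) / t for t, s in zip(turnstile, laser) if t]
    return 100.0 * sum(diffs) / len(diffs)

# Hypothetical daily exit counts from the two devices
turnstile = [1040, 985, 1210, 875]
laser     = [1092, 1004, 1281, 910]
print(f"average variance: {percent_variance(turnstile, laser):.1f}%")
```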

  17. A statistical study of high-altitude electric fields measured on the Viking satellite

    International Nuclear Information System (INIS)

    Lindqvist, P.A.; Marklund, G.T.

    1990-01-01

    Characteristics of high-altitude data from the Viking electric field instrument are presented in a statistical study based on 109 Viking orbits. The study is focused in particular on the signatures of, and relationships between, various parameters measured by the electric field instrument, such as the parallel and transverse (to B) components of the electric field and the electric field variability. A major goal of the Viking mission was to investigate the occurrence and properties of parallel electric fields and their role in the auroral acceleration process. The results in this paper on the altitude distribution of the electric field variability confirm earlier findings on the distribution of small-scale electric fields and indicate the presence of parallel fields up to about 11,000 km altitude. The directly measured parallel electric field is also investigated in some detail. It is in general directed upward with an average value of 1 mV/m, but depends on, for example, altitude and plasma density. Possible sources of error in the measurement of the parallel field are also considered and accounted for

  18. Statistical properties of highly excited quantum eigenstates of a strongly chaotic system

    International Nuclear Information System (INIS)

    Aurich, R.; Steiner, F.

    1992-06-01

    Statistical properties of highly excited quantal eigenstates are studied for the free motion (geodesic flow) on a compact surface of constant negative curvature (hyperbolic octagon) which represents a strongly chaotic system (K-system). The eigenstates are expanded in a circular-wave basis, and it turns out that the expansion coefficients behave as Gaussian pseudo-random numbers. It is shown that this property leads to a Gaussian amplitude distribution P(ψ) in the semiclassical limit, i.e. the wavefunctions behave as Gaussian random functions. This behaviour, which should hold for chaotic systems in general, is nicely confirmed for eigenstates lying 10000 states above the ground state thus probing the semiclassical limit. In addition, the autocorrelation function and the path-correlation function are calculated and compared with a crude semiclassical Bessel-function approximation. Agreement with the semiclassical prediction is only found, if a local averaging is performed over roughly 1000 de Broglie wavelengths. On smaller scales, the eigenstates show much more structure than predicted by the first semiclassical approximation. (orig.)

  19. Statistical list-mode image reconstruction for the high resolution research tomograph

    International Nuclear Information System (INIS)

    Rahmim, A; Lenox, M; Reader, A J; Michel, C; Burbar, Z; Ruth, T J; Sossi, V

    2004-01-01

    We have investigated statistical list-mode reconstruction applicable to a depth-encoding high resolution research tomograph. An image non-negativity constraint has been employed in the reconstructions and is shown to effectively remove the overestimation bias introduced by the sinogram non-negativity constraint. We have furthermore implemented a convergent subsetized (CS) list-mode reconstruction algorithm, based on previous work (Hsiao et al 2002 Conf. Rec. SPIE Med. Imaging 4684 10-19; Hsiao et al 2002 Conf. Rec. IEEE Int. Symp. Biomed. Imaging 409-12) on convergent histogram OSEM reconstruction. We have demonstrated that the first step of the convergent algorithm is exactly equivalent (unlike the histogram-mode case) to the regular subsetized list-mode EM algorithm, while the second and final step takes the form of additive updates in image space. We have shown that in terms of contrast, noise as well as FWHM width behaviour, the CS algorithm is robust and does not result in limit cycles. A hybrid algorithm based on the ordinary and the convergent algorithms is also proposed, and is shown to combine the advantages of the two algorithms (i.e. it is able to reach a higher image quality in fewer iterations while maintaining the convergent behaviour), making the hybrid approach a good alternative to the ordinary subsetized list-mode EM algorithm
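
The ordinary (non-convergent) list-mode EM update that the paper starts from multiplies each voxel by the backprojected ratio of detected events to forward projections, normalized by the voxel sensitivity. A toy sketch on a 3-pixel, 2-LOR system (the system matrix and event list are invented for illustration; this is plain list-mode MLEM, not the convergent subsetized variant):

```python
def listmode_mlem(a, events, sens, lam, n_iter=50):
    """Plain list-mode MLEM.
    a[i][j]  : probability that an emission in pixel j is detected on LOR i
    events   : list of detected-event LOR indices (the list-mode data)
    sens[j]  : sensitivity, sum of a[i][j] over all possible LORs
    lam[j]   : initial non-negative image estimate
    """
    n_pix = len(lam)
    for _ in range(n_iter):
        back = [0.0] * n_pix
        for i in events:
            fwd = sum(a[i][j] * lam[j] for j in range(n_pix))  # forward project
            for j in range(n_pix):
                back[j] += a[i][j] / fwd                       # backproject ratio
        lam = [lam[j] * back[j] / sens[j] for j in range(n_pix)]
    return lam

# Toy 2-LOR, 3-pixel system (invented numbers)
a = [[0.8, 0.2, 0.0],
     [0.0, 0.3, 0.7]]
sens = [sum(a[i][j] for i in range(2)) for j in range(3)]
events = [0] * 80 + [1] * 20          # 80 counts on LOR 0, 20 on LOR 1
img = listmode_mlem(a, events, sens, [1.0, 1.0, 1.0])
print([round(x, 2) for x in img])
```

The multiplicative update keeps the image non-negative by construction, which is the constraint the paper exploits to avoid the sinogram-non-negativity overestimation bias.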

  20. Generalized interpolative quantum statistics

    International Nuclear Information System (INIS)

    Ramanathan, R.

    1992-01-01

    A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation achieved through a Bose-counting strategy predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one discovered by Greenberg recently

  1. Platelet Count and Plateletcrit

    African Journals Online (AJOL)

    strated that neonates with late onset sepsis (bacteremia after 3 days of age) had a dramatic increase in MPV and PDW18. We hypothesize that as the MPV and PDW increase and platelet count and PCT decrease in sick children, intuitively, the ratio of MPV to PCT; MPV to Platelet count, PDW to PCT, PDW to platelet ...

  2. Clean Hands Count

    Medline Plus


  3. Counting It Twice.

    Science.gov (United States)

    Schattschneider, Doris

    1991-01-01

    Provided are examples from many domains of mathematics that illustrate the Fubini Principle in its discrete version: the value of a summation over a rectangular array is independent of the order of summation. Included are: counting using partitions as in proof by pictures, combinatorial arguments, indirect counting as in the inclusion-exclusion…

  4. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

    High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comité Euro-International du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better at predicting the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of the fly ash substitution rate was higher than that of the w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.
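
The mean-squared-residual comparison between weight- and volume-based mix proportions can be illustrated with a one-predictor least-squares fit (all numbers below are synthetic; the paper used multiple linear regression over its full factor set):

```python
def ols_fit(x, y):
    """Slope and intercept of an ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def mean_squared_residual(x, y):
    """MSR of the least-squares fit: lower means a better predictor."""
    slope, intercept = ols_fit(x, y)
    return sum((b - (slope * a + intercept)) ** 2
               for a, b in zip(x, y)) / len(x)

# Synthetic 28-day compressive strengths (MPa) against two candidate
# predictors: weight-based and volume-based fly-ash fractions.
strength    = [38.1, 35.2, 31.8, 28.9, 25.4]
weight_frac = [0.40, 0.45, 0.50, 0.55, 0.60]
volume_frac = [0.43, 0.47, 0.54, 0.57, 0.65]
msr_w = mean_squared_residual(weight_frac, strength)
msr_v = mean_squared_residual(volume_frac, strength)
print(f"MSR weight-based: {msr_w:.3f}, volume-based: {msr_v:.3f}")
```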

  5. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  6. Micro-electrodeposition techniques for the preparation of small actinide counting sources for ultra-high resolution alpha spectrometry by microcalorimetry

    International Nuclear Information System (INIS)

    Plionis, A.A.; Hastings, E.P.; LaMont, S.P.; Dry, D.E.; Bacrania, M.K.; Rabin, M.W.; Rim, J.H.

    2009-01-01

    Special considerations and techniques are desired for the preparation of small actinide counting sources. Counting sources have been prepared on metal disk substrates (planchets) with an active area of only 0.079 mm². This represents a 93.75% reduction in deposition area from standard electrodeposition methods. The actinide distribution upon the smaller planchet must remain thin and uniform to allow alpha particle emissions to escape the counting source with a minimal amount of self-attenuation. This work describes the development of micro-electrodeposition methods and optimization of the technique with respect to deposition time and current density for various planchet sizes. (author)

  7. Number projected statistics and the pairing correlations at high excitation energies

    International Nuclear Information System (INIS)

    Esebbag, C.; Egido, J.L.

    1993-01-01

    We analyze the use of particle-number projected statistics (PNPS) as an effective way to include the quantum and statistical fluctuations, associated with the pairing degree of freedom, left out in finite-temperature mean-field theories. As a numerical application the exactly soluble degenerate model is worked out. In particular, we find that the sharp temperature-induced superfluid-normal phase transition, predicted in the mean-field approximations, is washed out in the PNPS. Some approximations, as well as the Landau prescription to include statistical fluctuations, are also discussed. We find that the Landau prescription provides a reasonable approximation to the PNPS. (orig.)

  8. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, H.; Waara, M.; Arvelius, S.; Yamauchi, M.; Lundin, R. [Inst. of Space Physics, Kiruna (Sweden); Marghitu, O. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Inst. for Space Sciences, Bucharest (Romania); Bouhram, M. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); CETP-CNRS, Saint-Maur (France); Hobara, Y. [Inst. of Space Physics, Kiruna (Sweden); Univ. of Sheffield, Sheffield (United Kingdom); Reme, H.; Sauvaud, J.A.; Dandouras, I. [Centre d' Etude Spatiale des Rayonnements, Toulouse (France); Balogh, A. [Imperial Coll. of Science, Technology and Medicine, London (United Kingdom); Kistler, L.M. [Univ. of New Hampshire, Durham (United States); Klecker, B. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Carlson, C.W. [Space Science Lab., Univ. of California, Berkeley (United States); Bavassano-Cattaneo, M.B. [Ist. di Fisica dello Spazio Interplanetario, Roma (Italy); Korth, A. [Max-Planck-Inst. fuer Sonnensystemforschung, Katlenburg-Lindau (Germany)

    2006-07-01

    The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap are reported. Moment data for both hydrogen ions (H+) and oxygen ions (O+) from 3 years (2001-2003) of spring orbits (January to May) have been used. The altitudes covered were mainly in the range 5-12 RE geocentric distance. It was found that O+ is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O+ parallel bulk velocity increases with altitude, in particular for the lowest observed altitude intervals. O+ parallel bulk velocities in excess of 60 km s-1 were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O+ the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocity of the H+ and O+ was found to typically be close to the same throughout the observation interval when the H+ bulk velocity was calculated for all pitch-angles. When the H+ bulk velocity was calculated for upward moving particles only, the H+ parallel bulk velocity was typically higher than that of O+. The parallel bulk velocity is close to the same for a wide range of relative abundances of the two ion species, including when the O+ ions dominate. The thermal velocity of O+ was always well below that of H+. Thus perpendicular energization that is more effective for O+ takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur.
    The results presented constrain the models of perpendicular heating and parallel acceleration.


  10. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Directory of Open Access Journals (Sweden)

    H. Nilsson

    2006-05-01

    Full Text Available The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap are reported. Moment data for both hydrogen ions (H+) and oxygen ions (O+) from 3 years (2001-2003) of spring orbits (January to May) have been used. The altitudes covered were mainly in the range 5–12 RE geocentric distance. It was found that O+ is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O+ parallel bulk velocity increases with altitude, in particular for the lowest observed altitude intervals. O+ parallel bulk velocities in excess of 60 km s-1 were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O+ the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocity of the H+ and O+ was found to typically be close to the same throughout the observation interval when the H+ bulk velocity was calculated for all pitch-angles. When the H+ bulk velocity was calculated for upward moving particles only, the H+ parallel bulk velocity was typically higher than that of O+. The parallel bulk velocity is close to the same for a wide range of relative abundances of the two ion species, including when the O+ ions dominate. The thermal velocity of O+ was always well below that of H+. Thus perpendicular energization that is more effective for O+ takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur. The results presented constrain the models of perpendicular heating and parallel acceleration. In particular centrifugal acceleration of the outflowing ions, which may

  11. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

STATISTICAL LINEAR TIME-VARYING SYSTEM MODEL OF HIGH GRAZING ANGLE SEA CLUTTER FOR COMPUTING INTERFERENCE POWER 1. INTRODUCTION Statistical linear time...beam. We can approximate one of the sinc factors using the Dirichlet kernel to facilitate computation of the integral in (6) as follows: |sinc(WB...plotted in Figure 4. The resultant autocorrelation can then be found by substituting (18) into (28). The Python code used to generate Figures 1-4 is found
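The snippet above mentions approximating a sinc factor by a Dirichlet kernel. As a minimal illustration (not the report's actual code), the following Python compares the ordinary sinc with the periodic-sinc (Dirichlet-kernel) form, which agree closely at small frequencies; the value of N and the frequency samples are arbitrary:

```python
import math

def sinc(x):
    """Ordinary sinc with pi-scaled argument: sin(pi*x)/(pi*x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def dirichlet(f, N):
    """Periodic sinc (Dirichlet-kernel form): sin(N*pi*f) / (N*sin(pi*f))."""
    if f == 0:
        return 1.0
    return math.sin(N * math.pi * f) / (N * math.sin(math.pi * f))

N = 64
for f in (0.001, 0.005, 0.01):
    # For small f, sin(pi*f) ~ pi*f, so the two forms nearly coincide.
    print(f, sinc(N * f), dirichlet(f, N))
```

The approximation holds because for small arguments the denominator sin(pi*f) is well approximated by pi*f, recovering the ordinary sinc.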

  12. A statistical study towards high-mass BGPS clumps with the MALT90 survey

    Science.gov (United States)

    Liu, Xiao-Lan; Xu, Jin-Long; Ning, Chang-Chun; Zhang, Chuan-Peng; Liu, Xiao-Tao

    2018-01-01

In this work, we perform a statistical investigation towards 50 high-mass clumps using data from the Bolocam Galactic Plane Survey (BGPS) and the Millimetre Astronomy Legacy Team 90-GHz survey (MALT90). Eleven dense molecular lines (N2H+(1–0), HNC(1–0), HCO+(1–0), HCN(1–0), HN13C(1–0), H13CO+(1–0), C2H(1–0), HC3N(10–9), SiO(2–1), 13CS(2–1) and HNCO(44,0 ‑ 30,3)) are detected. N2H+ and HNC are shown to be good tracers for clumps in various evolutionary stages since they are detected in all the fields. The detection rates of N-bearing molecules decrease as the clumps evolve, but those of O-bearing species increase with evolution. Furthermore, the abundance ratios [N2H+]/[HCO+] and log([HC3N]/[HCO+]) decline with log([HCO+]) as two linear functions, respectively. This suggests that N2H+ and HC3N transform to HCO+ as the clumps evolve. We also find that C2H is the most abundant molecule, with an abundance of order 10‑8. In addition, three new infall candidates, G010.214–00.324, G011.121–00.128 and G012.215–00.118(a), are discovered to have large-scale infall motions, with infall rates of order 10‑3 M ⊙ yr‑1.
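The abstract describes abundance ratios declining as linear functions of log([HCO+]). A closed-form ordinary-least-squares fit is all such a trend requires; the sketch below uses hypothetical data points, not the survey's measurements:

```python
def fit_line(xs, ys):
    """Closed-form ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical points: log10([HC3N]/[HCO+]) declining with log10([HCO+])
log_hcop = [-9.5, -9.0, -8.5, -8.0]
log_ratio = [0.1, -0.4, -0.9, -1.4]
slope, intercept = fit_line(log_hcop, log_ratio)
print(slope, intercept)  # slope ≈ -1.0 for these illustrative points
```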

  13. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  14. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  15. Analysis for reflection peaks of multiple-phase-shift based sampled fiber Bragg gratings and application in high channel-count filter design.

    Science.gov (United States)

    Wen, Kun Hua; Yan, Lian Shan; Pan, Wei; Luo, Bin; Zou, Xi Hua; Ye, Jia; Ma, Ya Nan

    2009-10-10

An analytical expression for calculating the reflection-peak wavelengths (RPWs) of a uniform sampled fiber Bragg grating (SFBG) with the multiple-phase-shift (MPS) technique is derived through the Fourier transform of the index modulation. The new expression can accurately depict the RPWs incorporating various parameters such as the duty cycle and the DC index change. The effectiveness of the derived expression is further confirmed by comparing the RPWs estimated from the expression with the reflection spectra simulated using the piecewise uniform method. The reflection spectrum is further optimized by introducing a Gaussian apodization function to suppress the sidelobes without any wavelength shift of the RPWs. Then, a high-channel-count comb filter based on MPS is proposed by cascading two or more SFBGs with different Bragg periods but the same RPWs. Notably, the RPWs of the new structured SFBG can also be accurately calculated through the expression. Furthermore, the number of spectral channels can be controlled by choosing gratings with suitably different Bragg periods.
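The abstract derives the RPWs from the Fourier transform of the index modulation. In the standard Fourier-comb picture of a sampled FBG, reflection channels appear around the Bragg wavelength with spacing λ_B²/(2·n_eff·P) for sampling period P. The sketch below computes that comb for illustrative parameter values (1550 nm, n_eff = 1.45, 1 mm sampling period); it is a simplified model, not the paper's full expression with duty cycle and DC index change:

```python
def channel_wavelengths(lambda_bragg_nm, n_eff, period_um, m_range):
    """Reflection-peak wavelengths of a sampled FBG in the basic
    Fourier-comb picture: channels offset from the Bragg wavelength by
    m * lambda_B**2 / (2 * n_eff * P), for integer channel index m."""
    period_nm = period_um * 1e3
    spacing = lambda_bragg_nm ** 2 / (2.0 * n_eff * period_nm)
    return spacing, [lambda_bragg_nm + m * spacing for m in m_range]

# Illustrative values: 1550 nm Bragg wavelength, n_eff = 1.45, 1 mm period
spacing, peaks = channel_wavelengths(1550.0, 1.45, 1000.0, range(-2, 3))
print(round(spacing, 4), [round(p, 2) for p in peaks])
```

For these values the channel spacing comes out near 0.83 nm, the typical sub-nanometer comb spacing of millimeter-period sampled gratings.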

  16. Task-based statistical image reconstruction for high-quality cone-beam CT

    Science.gov (United States)

    Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.

    2017-11-01

Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms that encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR; viz., penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a
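A common form of the task-based detectability index described above is the non-prewhitening-observer model, d′ = ∫(W·T)² df / sqrt(∫(W·T)²·NPS df), with task function W, system MTF T, and noise-power spectrum NPS. The 1-D sketch below assumes illustrative Gaussian MTFs and white NPS values; it is a simplified demonstration, not the paper's 3-D implementation:

```python
import math

def npw_detectability(task, mtf, nps, df):
    """Non-prewhitening-observer detectability index from 1-D frequency
    samples: d' = [sum (W*T)^2 df] / sqrt[sum (W*T)^2 * NPS df]."""
    num = sum((w * t) ** 2 for w, t in zip(task, mtf)) * df
    den = sum((w * t) ** 2 * s for w, t, s in zip(task, mtf, nps)) * df
    return num / math.sqrt(den)

df = 0.05
freqs = [i * df for i in range(40)]            # spatial frequency samples
task = [1.0] * 40                              # flat task (delta-like detection)
mtf_sharp  = [math.exp(-(f / 1.0) ** 2) for f in freqs]   # weak smoothing
mtf_smooth = [math.exp(-(f / 0.5) ** 2) for f in freqs]   # strong smoothing
nps_sharp  = [2.0] * 40                        # higher noise power (arbitrary units)
nps_smooth = [0.8] * 40                        # lower noise power after smoothing
print(npw_detectability(task, mtf_sharp, nps_sharp, df),
      npw_detectability(task, mtf_smooth, nps_smooth, df))
```

Varying the penalty strength trades MTF against NPS; a spatially varying penalty can pick, at each location, the trade that maximizes this index.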

  17. Clean Hands Count

    Medline Plus

Full Text Available

  18. Clean Hands Count

    Medline Plus

Full Text Available

  19. Housing Inventory Count

    Data.gov (United States)

    Department of Housing and Urban Development — This report displays the data communities reported to HUD about the nature of their dedicated homeless inventory, referred to as their Housing Inventory Count (HIC)....

  20. Scintillation counting apparatus

    International Nuclear Information System (INIS)

    Noakes, J.E.

    1978-01-01

    Apparatus is described for the accurate measurement of radiation by means of scintillation counters and in particular for the liquid scintillation counting of both soft beta radiation and gamma radiation. Full constructional and operating details are given. (UK)

  1. Allegheny County Traffic Counts

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Traffic sensors at over 1,200 locations in Allegheny County collect vehicle counts for the Pennsylvania Department of Transportation. Data included in the Health...

  2. Counting Knights and Knaves

    Science.gov (United States)

    Levin,Oscar; Roberts, Gerri M.

    2013-01-01

    To understand better some of the classic knights and knaves puzzles, we count them. Doing so reveals a surprising connection between puzzles and solutions, and highlights some beautiful combinatorial identities.

  3. Clean Hands Count

    Medline Plus

Full Text Available

  4. Clean Hands Count

    Medline Plus

Full Text Available

  5. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm-3 in a pebble bed are presented. The admissibility of homogenizing the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  6. Statistical study of overvoltages by maneuvering in switches in high voltage using EMTP-RV

    International Nuclear Information System (INIS)

    Dominguez Herrera, Diego Armando

    2013-01-01

Transient overvoltages produced by switching maneuvers are studied statistically, by varying the sequential closing times of switches in networks above 230 kV. The study is performed over typical time delays and standard-deviation ranges, using the tool EMTP-RV (ElectroMagnetic Transient Program, Restructured Version). A conceptual framework for electromagnetic switching transients in three-phase switches installed at nominal voltages above 230 kV is developed. The methodology established for statistical studies of switching overvoltages is reviewed and evaluated by simulating two fictitious cases in EMTP-RV.

  7. Relationship between high white blood cell count and insulin resistance (HOMA-IR) in Korean children and adolescents: Korean National Health and Nutrition Examination Survey 2008-2010.

    Science.gov (United States)

    Park, J-M; Lee, D-C; Lee, Y-J

    2017-05-01

Increasing evidence has indicated that insulin resistance is associated with inflammation. However, few studies have investigated the association between white blood cell (WBC) count and insulin resistance, as measured by a homeostasis model assessment of insulin resistance (HOMA-IR) in a general pediatric population. This study aimed to examine the association between WBC count and insulin resistance as measured by HOMA-IR in a nationally representative sample of children and adolescents. In total, 2761 participants (1479 boys and 1282 girls) aged 10-18 years were selected from the 2008-2010 Korean National Health and Nutrition Examination Survey. Insulin resistance was defined as a HOMA-IR value greater than the 90th percentile. The odds ratios and 95% confidence intervals for insulin resistance were determined using multiple logistic regression analysis. The mean values of most cardiometabolic variables tended to increase proportionally with WBC count quartiles. The prevalence of insulin resistance significantly increased in accordance with WBC count quartiles in both boys and girls. Compared to individuals in the lowest WBC count quartile, the odds ratio for insulin resistance for individuals in the highest quartile was 2.84 in boys and 3.20 in girls, after adjusting for age, systolic blood pressure, body mass index, and waist circumference. A higher WBC count was positively associated with an increased risk of insulin resistance in Korean children and adolescents. This study suggests that WBC count could facilitate the identification of children and adolescents with insulin resistance.
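HOMA-IR itself is computed from fasting measurements with the standard formula: glucose (mg/dL) times insulin (µU/mL), divided by 405. A minimal sketch follows; the study's 90th-percentile cutoff is cohort-specific, so the numbers here are illustrative only:

```python
def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """Homeostasis model assessment of insulin resistance (standard formula):
    HOMA-IR = fasting glucose (mg/dL) * fasting insulin (uU/mL) / 405."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

def is_insulin_resistant(value, cutoff_90th):
    """Flag a HOMA-IR value against a cohort-derived 90th-percentile cutoff
    (the cutoff here is an assumption, not the study's value)."""
    return value > cutoff_90th

v = homa_ir(90, 10)       # fasting glucose 90 mg/dL, insulin 10 uU/mL
print(round(v, 2), is_insulin_resistant(v, 3.0))
```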

  8. H I, galaxy counts, and reddening: Variation in the gas-to-dust ratio, the extinction at high galactic latitudes, and a new method for determining galactic reddening

    International Nuclear Information System (INIS)

    Burstein, D.; Heiles, C.

    1978-01-01

We reanalyze the interrelationships among Shane-Wirtanen galaxy counts, H I column densities, and reddenings, and resolve many of the problems raised by Heiles. These problems were caused by two factors: subtle biases in the reddening data and a variable gas-to-dust ratio in the Galaxy. We present a compilation of reddenings for RR Lyrae stars and globular clusters which are on the same system and which we believe to be relatively free of biases. The extinction at the galactic poles, as determined by galaxy counts, is reexamined by using a new method to analyze galaxy counts. This new method partially accounts for the nonrandom clustering of galaxies and permits a reasonable estimate of the error in log N(gal) as a function of latitude. The analysis shows that the galaxy counts (or galaxy cluster counts) are too noisy to allow direct determination of the extinction, or variation in extinction, near the galactic poles. From all available data, we conclude that the reddening at the poles is small [≤0.02 mag in E(B−V) over much of the north galactic pole] and irregularly distributed. We find that there are zero offsets in the relations between E(B−V) and H I, and between galaxy counts and H I, which are at least partly the result of an instrumental effect in the radio data. We also show that the gas-to-dust ratio can vary by a factor of 2 from the average, and we present two methods for correcting for this variability in predicting the reddening of objects which are located outside of the galactic absorbing layer. We present a prescription for predicting these reddenings; in the area of sky covered by the Shane-Wirtanen galaxy counts, the error in these predictions is, on average, less than 0.03 mag in E

  9. Study of mast cell count in skin tags

    Directory of Open Access Journals (Sweden)

    Zaher Hesham

    2007-01-01

Full Text Available Background: Skin tags or acrochordons are common tumors of middle-aged and elderly subjects. They consist of loose fibrous tissue and occur mainly on the neck and major flexures as small, soft, pedunculated protrusions. Objectives: The aim was to compare the mast cell count in skin tags to that in adjacent normal skin in diabetic and nondiabetic participants, in an attempt to elucidate the possible role of mast cells in the pathogenesis of skin tags. Participants and Methods: Thirty participants with skin tags were divided into group I (15 nondiabetic participants) and group II (15 diabetic participants). Three biopsies were obtained from each participant: a large skin tag, a small skin tag and adjacent normal skin. Mast cells were counted in all the obtained sections, and the mast cell density was expressed as the average mast cell count per high power field (HPF). Results: A statistically significant increase in mast cell count in skin tags in comparison to normal skin was detected in both group I and group II. There was no statistically significant difference between the mast cell counts in skin tags of the two groups. Conclusion: Both mast cell mediators and hyperinsulinemia are capable of inducing the fibroblast proliferation and epidermal hyperplasia that are the main pathologic abnormalities seen in all types of skin tags. However, the presence of mast cells in all examined skin tags, regardless of diabetes and obesity, may point to a crucial role of mast cells in the pathogenesis of skin tags through their interaction with fibroblasts and keratinocytes.
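Counts per high-power field are small nonnegative integers, so a Poisson normal-approximation z statistic is one simple way to compare two groups of such counts. The sketch below uses hypothetical counts, not the study's data, and is not necessarily the test the authors applied:

```python
import math

def poisson_rate_z(counts_a, counts_b):
    """Normal-approximation z statistic comparing two mean counts per
    high-power field, using the Poisson property variance = mean."""
    ma = sum(counts_a) / len(counts_a)
    mb = sum(counts_b) / len(counts_b)
    se = math.sqrt(ma / len(counts_a) + mb / len(counts_b))
    return (ma - mb) / se

# Hypothetical counts/HPF: skin-tag biopsies vs adjacent normal skin
tag    = [8, 11, 9, 12, 10, 9, 13, 10]
normal = [3, 4, 2, 5, 3, 4, 3, 4]
z = poisson_rate_z(tag, normal)
print(round(z, 2))   # |z| greater than about 1.96 suggests a difference at the 5% level
```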

  10. Heterogeneous counting on filter support media

    International Nuclear Information System (INIS)

    Long, E.; Kohler, V.; Kelly, M.J.

    1976-01-01

Many investigators in the biomedical research area have used filter paper as the support for radioactive samples. This means that heterogeneous counting of samples sometimes results. The count rate of a sample on a filter will be affected by positioning, degree of dryness, sample application procedure, the type of filter, and the type of cocktail used. Positioning of the filter (up or down) in the counting vial can cause a variation of 35% or more when counting tritiated samples on filter paper. Samples of varying degrees of dryness, when added to the counting cocktail, can give nonreproducible counts if handled improperly. Count rates starting at 2400 CPM initially can become 10,000 CPM in 24 hours for 3H-DNA (deoxyribonucleic acid) samples dried on standard cellulose acetate membrane filters. Data on cellulose nitrate filters show a similar trend. Sample application procedures in which the sample is applied to the filter in a small spot or over a large amount of the surface area can cause nonreproducible or very low counting rates. A tritiated DNA sample, when applied topically, gives a count rate of 4,000 CPM. When the sample is spread over the whole filter, 13,400 CPM are obtained with a much better coefficient of variation (5% versus 20%). Adding protein carrier (bovine serum albumin, BSA) to the sample to trap more of the tritiated DNA on the filter during the filtration process causes a serious beta absorption problem: count rates one-fourth of that applied to the filter are obtained on calibrated runs. Many of the problems encountered can be alleviated by a proper choice of filter and the use of a liquid scintillation cocktail which dissolves the filter. Filter-Solv has been used to dissolve cellulose nitrate filters and filters which are a combination of cellulose nitrate and cellulose acetate. Count rates obtained for these dissolved samples are very reproducible and highly efficient

  11. Expression of androgen-producing enzyme genes and testosterone concentration in Angus and Nellore heifers with high and low ovarian follicle count.

    Science.gov (United States)

    Loureiro, Bárbara; Ereno, Ronaldo L; Favoreto, Mauricio G; Barros, Ciro M

    2016-07-15

    Follicle population is important when animals are used in assisted reproductive programs. Bos indicus animals have more follicles per follicular wave than Bos taurus animals. On the other hand, B taurus animals present better fertility when compared with B indicus animals. Androgens are positively related with the number of antral follicles; moreover, they increase growth factor expression in granulose cells and oocytes. Experimentation was designed to compare testosterone concentration in plasma, and follicular fluid and androgen enzymes mRNA expression (CYP11A1, CYP17A1, 3BHSD, and 17BHSD) in follicles from Angus and Nellore heifers. Heifers were assigned into two groups according to the number of follicles: low and high follicle count groups. Increased testosterone concentration was measured in both plasma and follicular fluid of Angus heifers. However, there was no difference within groups. Expression of CYP11A1 gene was higher in follicles from Angus heifers; however, there was no difference within groups. Expression of CYP17A1, 3BHSD, and 17BHSD genes was higher in follicles from Nellore heifers, and expression of CYP17A1 and 3BHSD genes was also higher in HFC groups from both breeds. It was found that Nellore heifers have more antral follicles than Angus heifers. Testosterone concentration was higher in Angus heifers; this increase could be associated with the increased mRNA expression of CYP11A1. Increased expression of androgen-producing enzyme genes (CYP17A1, 3BHSD, and 17BHSD) was detected in Nellore heifers. It can be suggested that testosterone is acting through different mechanisms to increase follicle development in Nellore and improve fertility in Angus heifers. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Low White Blood Cell Count

    Science.gov (United States)

By Mayo Clinic Staff — A low white blood cell count (leukopenia) is a decrease ... of white blood cell (neutrophil). The definition of low white blood cell count varies from one medical ...

  13. Polarizing a stored proton beam by spin flip? - A high statistic reanalysis

    International Nuclear Information System (INIS)

    Oellers, Dieter

    2011-01-01

Prompted by recent, conflicting calculations, we have carried out a measurement of the spin-flip cross section in low-energy electron-proton scattering. The experiment uses the cooling electron beam at COSY as an electron target. A reanalysis of the data leads to reduced statistical errors, resulting in an upper limit on the spin-flip cross section that is a factor of 4 smaller. The measured cross sections are too small to make spin flip a viable tool for polarizing a stored beam.

  14. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

The Thomas-Fermi statistical model, from the N-body point of view, is used in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation for nonzero temperature. The Thomas-Fermi equation is solved with the code "Golem" written in FORTRAN V (UNIVAC), which also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (author)

  15. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

The Thomas-Fermi statistical model, from the N-body point of view, is used in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation for nonzero temperature. The Thomas-Fermi equation is solved with the code GOLEM written in FORTRAN V (UNIVAC), which also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (Author) 24 refs

  16. A New Statistical Approach to Characterize Chemical-Elicited Behavioral Effects in High-Throughput Studies Using Zebrafish.

    Directory of Open Access Journals (Sweden)

    Guozhu Zhang

Full Text Available Zebrafish have become an important alternative model for characterizing chemical bioactivity, partly due to the efficiency at which systematic, high-dimensional data can be generated. However, these new data present analytical challenges associated with scale and diversity. We developed a novel, robust statistical approach to characterize chemical-elicited effects in behavioral data from high-throughput screening (HTS) of all 1,060 Toxicity Forecaster (ToxCast™) chemicals across 5 concentrations at 120 hours post-fertilization (hpf). Taking advantage of the immense scale of data for a global view, we show that this new approach reduces bias introduced by extreme values yet allows for diverse response patterns that confound the application of traditional statistics. We have also shown that, as a summary measure of response for local tests of chemical-associated behavioral effects, it achieves a significant reduction in coefficient of variation compared to many traditional statistical modeling methods. This effective increase in signal-to-noise ratio augments statistical power and is observed across experimental periods (light/dark conditions) that display varied distributional response patterns. Finally, we integrated results with data from concomitant developmental endpoint measurements to show that appropriate statistical handling of HTS behavioral data can add important biological context that informs mechanistic hypotheses.
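The abstract's key claim is that a robust summary reduces the bias from extreme values. A trimmed mean is one textbook robust summary that illustrates the idea; the authors' actual statistic is not specified here, and the data below are simulated:

```python
import random
import statistics

def trimmed_mean(xs, frac=0.1):
    """Mean after discarding the lowest and highest `frac` of values --
    a simple robust summary that limits the pull of extreme responses."""
    xs = sorted(xs)
    k = int(len(xs) * frac)
    core = xs[k: len(xs) - k] if k else xs
    return sum(core) / len(core)

random.seed(1)
# Hypothetical behavioral responses with a few extreme values mixed in
data = [random.gauss(10, 1) for _ in range(95)] + [60, 70, 80, 90, 100]
print(round(statistics.mean(data), 1), round(trimmed_mean(data), 1))
```

The plain mean is dragged far above the bulk of the responses by the five extremes, while the trimmed mean stays near the central value of 10.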

  17. Nevada Kids Count Data Book, 1997.

    Science.gov (United States)

    We Can, Inc., Las Vegas, NV.

This Kids Count data book is the first to examine statewide indicators of the well-being of Nevada's children. The statistical portrait is based on 15 indicators of child well-being: (1) percent low birth-weight babies; (2) infant mortality rate; (3) percent of children in poverty; (4) percent of children in single-parent families; (5) percent of…

  18. Alabama Kids Count 2002 Data Book.

    Science.gov (United States)

    Curtis, Apreill; Bogie, Don

This Kids Count data book examines statewide trends in the well-being of Alabama's children. The statistical portrait is based on 18 indicators in the areas of child health, education, safety, and security: (1) infant mortality rate; (2) low weight births; (3) child health index; (4) births to unmarried teens; (5) first grade retention; (6) school…

  19. Alabama Kids Count 2001 Data Book.

    Science.gov (United States)

    Curtis, Apreill; Bogie, Don

    This Kids Count data book examines statewide trends in well-being for Alabama's children. The statistical portrait is based on 17 indicators in the areas of health, education, safety, and security. The indicators are: (1) infant mortality rate; (2) low weight births; (3) child health index; (4) births to unmarried teens; (5) first grade retention;…

  20. Principles of correlation counting

    International Nuclear Information System (INIS)

    Mueller, J.W.

    1975-01-01

A review is given of the various applications which have been made of correlation techniques in the field of nuclear physics, in particular for absolute counting. Whereas in most cases the usual coincidence method will be preferable for its simplicity, correlation counting may be the only possible approach in cases where the two radiations of the cascade cannot be well separated or when there is a long-lived intermediate state. The measurement of half-lives and of count rates of spurious pulses is also briefly discussed. The various experimental situations lead to different ways in which the correlation method is best applied (covariance technique with one or with two detectors, application of correlation functions, etc.). Formulae are given for some simple model cases, neglecting dead-time corrections

  1. Interpretation of galaxy counts

    International Nuclear Information System (INIS)

Tinsley, B.M.

    1980-01-01

New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a "local" density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation

  2. An Adaptive Smoother for Counting Measurements

    International Nuclear Information System (INIS)

    Kondrasovs Vladimir; Coulon Romain; Normand Stephane

    2013-06-01

Counting measurements with nuclear instruments are tricky to carry out because radioactive decay is a stochastic process. Counted events have to be processed and filtered in order to display a stable count rate value while still allowing variations in the measured activity to be monitored. Smoothers (such as the moving average) are adjusted by a time constant chosen as a compromise between stability and response time. A new approach has been developed that improves the response time while maintaining count rate stability. It combines a smoother with a detection filter. A memory of counting data is processed to calculate several count rate estimates using several integration times. These estimates are sorted in the memory from short to long integration times. A measurement position, in terms of integration time, is then chosen from this memory after a detection test: an inhomogeneity in the Poisson counting process is detected by comparing the current estimate with the other estimates in the memory, with respect to the statistical variance calculated under the assumption of homogeneity. The measurement position (historical depth) and the decision to discard an obsolete datum or to keep a useful one in memory are managed using the result of the detection test. The proposed smoother is thus an adaptive, learning algorithm that optimizes the response time while maintaining counting stability and converges efficiently to the best estimate after an effective change in activity. The algorithm is also only weakly recursive and thus easily embedded in DSP electronics based on FPGAs or microcontrollers meeting 'real life' time requirements. (authors)
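The scheme described above can be sketched as follows, assuming 1-second count samples, a small set of integration windows, and a 3-sigma Poisson consistency test. The class name, window set, and threshold are illustrative assumptions, not the authors' implementation:

```python
import math
from collections import deque

class AdaptiveRateMeter:
    """Sketch of an adaptive count-rate smoother: keep a memory of 1-s
    count samples, form rate estimates over several integration times
    (shortest to longest), and keep extending the window only while the
    longer-window estimate stays statistically consistent (under Poisson
    variance) with the most recent short-window estimate."""

    def __init__(self, windows=(1, 2, 4, 8, 16, 32), k_sigma=3.0):
        self.windows = windows          # integration times, ascending, in samples
        self.k = k_sigma                # consistency threshold in sigmas
        self.buf = deque(maxlen=max(windows))

    def update(self, counts):
        """Feed one 1-s count sample; return the current rate estimate."""
        self.buf.append(counts)
        data = list(self.buf)
        fast = data[-1]                 # shortest-window estimate: fast but noisy
        best = fast
        for w in self.windows:
            if w > len(data):
                break
            total = sum(data[-w:])
            est = total / w
            sigma = math.sqrt(max(total, 1)) / w   # Poisson sigma of the w-sample mean
            if abs(fast - est) <= self.k * (math.sqrt(max(fast, 1)) + sigma):
                best = est              # still consistent: accept the longer average
            else:
                break                   # change detected: stop extending the memory
        return best

meter = AdaptiveRateMeter()
for _ in range(32):
    steady = meter.update(100)          # steady activity: long, stable average
print(steady, meter.update(400))        # step change: snaps to the fast estimate
```

Fed a steady 100 counts/s the meter settles on the longest-window average; after a step to 400 counts/s the consistency test fails for the longer windows and the estimate follows the short-window value, giving the fast response to activity changes that the abstract describes.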

  3. Microbial counts of food contact surfaces at schools depending on a feeding scheme

    Directory of Open Access Journals (Sweden)

    Nthabiseng Nhlapo

    2014-11-01

Full Text Available The prominence of disease transmission between individuals in confined environments is a concern, particularly in the educational environment. With respect to school feeding schemes, food contact surfaces have been shown to be potential vehicles of foodborne pathogens. The aim of this study was to assess the cleanliness of the surfaces that come into contact with food provided to children through the National School Nutrition Programme in central South Africa. In each school under study, microbiological samples were collected from the preparation surface and the dominant hand and apron of the food handler. The samples were analysed for total viable counts, coliforms, Escherichia coli, Staphylococcus aureus and yeasts and moulds. The criteria specified in the British Columbia Guide for Environmental Health Officers were used to evaluate the results. Total viable counts were high for all surfaces, with the majority of colonies being too numerous to count (over 100 colonies per plate). Counts of indicator organisms were relatively low, with 20% of the surfaces producing unsatisfactory enumeration of S. aureus and E. coli and 30% unsatisfactory for coliforms. Yeasts and moulds produced 50% and 60% unsatisfactory counts from preparation surfaces and aprons, respectively. Statistically significant differences could not be established amongst microbial counts of the surfaces, which suggests cross-contamination may have occurred. Contamination may be attributed to foodstuffs and animals in the vicinity of the preparation area rather than to the food handlers, because hands had the lowest counts of enumerated organisms amongst the analysed surfaces.

  4. Which HIV-infected adults with high CD4 T-cell counts benefit most from immediate initiation of antiretroviral therapy?

    DEFF Research Database (Denmark)

    Molina, Jean-Michel; Grund, Birgit; Gordin, Fred

    2018-01-01

    BACKGROUND: Immediate initiation of antiretroviral therapy (ART) in asymptomatic adults with CD4 counts higher than 500 cells per μL, as recommended, might not always be possible in resource-limited settings. We aimed to identify subgroups of individuals who would benefit most from immediate trea...

  5. Factors associated with development of opportunistic infections in HIV-1 infected adults with high CD4 cell counts: a EuroSIDA study

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Mocroft, A; Dragsted, Ulrik Bak

    2006-01-01

    BACKGROUND: Limited data exist on factors predicting the development of opportunistic infections (OIs) at higher-than-expected CD4(+) cell counts in human immunodeficiency virus (HIV) type 1-infected adults. METHODS: Multivariate Poisson regression models were used to determine factors related to...

  6. The use of polyimide foils to prevent contamination from self-sputtering of {sup 252}Cf deposits in high-accuracy fission counting

    Energy Technology Data Exchange (ETDEWEB)

    Gilliam, David M. [National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States)], E-mail: david.gilliam@nist.gov; Yue, Andrew [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN (United States); Scott Dewey, M. [National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States)

    2008-06-01

    It is demonstrated that a thin polyimide foil can be employed to prevent contamination from the self-sputtering of a {sup 252}Cf source under vacuum, with small energy loss of the emitted fission fragments, with very small effect on the efficiency of counting the fission fragments, and with a long lifetime of the plastic foils.

  7. Statistical evaluation and measuring strategy for extremely small line shifts

    International Nuclear Information System (INIS)

    Hansen, P.G.

    1978-01-01

    For a measuring situation limited by counting statistics, but where the level of precision is such that possible systematic errors are a major concern, it is proposed to determine the position of a spectral line from a measured line segment by applying a bias correction to the centre of gravity of the segment. This procedure is statistically highly efficient and not sensitive to small errors in assumptions about the line shape. The counting strategy for an instrument that takes data point by point is also considered. It is shown that an optimum ("two-point") strategy exists; a scan of the central part of the line is 68% efficient by this standard. (Auth.)
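
    As a toy companion to the abstract, the centre-of-gravity estimate of a line position can be written as follows. This is a generic sketch, not the paper's bias-corrected procedure (the bias correction depends on the assumed line shape and is omitted here); the iterative window re-centring is an illustrative choice.

```python
def centroid(channels, counts):
    """Centre of gravity of a measured line segment."""
    total = sum(counts)
    return sum(c * n for c, n in zip(channels, counts)) / total

def line_position(channels, counts, halfwidth, n_iter=3):
    """Estimate a line position by iteratively re-centring a window of
    +/- halfwidth channels on the running centroid.  A sketch only: the
    paper's bias correction for the residual shape dependence is not
    reproduced here."""
    pos = centroid(channels, counts)
    for _ in range(n_iter):
        sel = [(c, n) for c, n in zip(channels, counts)
               if abs(c - pos) <= halfwidth]
        pos = centroid([c for c, _ in sel], [n for _, n in sel])
    return pos
```

    For a symmetric line the centroid over a symmetric window is unbiased; the bias the paper corrects for arises when the window is off-centre or the line shape is asymmetric.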

  8. Resistance to penicillin of Staphylococcus aureus isolates from cows with high somatic cell counts in organic and conventional dairy herds in Denmark

    DEFF Research Database (Denmark)

    Bennedsgaard, Torben W.; Thamsborg, Stig M.; Aarestrup, Frank Møller

    2006-01-01

    Background: Quarter milk samples from cows with high risk of intramammary infection were examined to determine the prevalence of Staphylococcus aureus (SA) and penicillin resistant SA (SAr) in conventional and organic dairy herds and herds converting to organic farming in a combined longitudinal......: 2%-5%) respectively. The prevalence of penicillin resistance among SA infected cows was 12% (95% confidence interval: 6%-19%) when calculated from the first herd visits. No statistically significant differences were observed in the prevalence of SAr or the proportion of isolates resistant...... to penicillin between herd groups. Conclusion: The proportion of isolates resistant to penicillin was low compared to studies in other countries except Norway and Sweden. Based on the low prevalence of penicillin resistance of SA, penicillin should still be the first choice of antimicrobial agent for treatment...

  9. Computerized radioautographic grain counting

    International Nuclear Information System (INIS)

    McKanna, J.A.; Casagrande, V.A.

    1985-01-01

    In recent years, radiolabeling techniques have become fundamental assays in physiology and biochemistry experiments. They have also assumed increasingly important roles in morphologic studies. Characteristically, radioautographic analysis of structure has been qualitative rather than quantitative; microcomputers, however, have opened the door to several methods for quantifying grain counts and density. The overall goal of this chapter is to describe grain counting using the Bioquant, an image-analysis package based originally on the Apple II+ and now available for several popular microcomputers. The authors discuss their image-analysis procedures by applying them to a study of development in the central nervous system

  10. Rainflow counting revisited

    Energy Technology Data Exchange (ETDEWEB)

    Soeker, H [Deutsches Windenergie-Institut (Germany)

    1996-09-01

    As the state-of-the-art method, the rainflow counting technique is now applied throughout fatigue analysis. However, the author feels that the potential of the technique is not fully recognized in the wind energy industry, as it is mostly used as a mere data-reduction technique, disregarding some of the information inherent in the rainflow counting results. The ideas described in the following aim at exploiting this information and making it available for use in the design and verification process. (au)
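
    For readers who know rainflow counting only as a black box, a compact implementation in the style of ASTM E1049 can be sketched as follows. This is a generic textbook algorithm, not the author's code; it returns half- and full-cycle counts per load range, which is exactly the "inherent information" the abstract argues should not be discarded.

```python
def turning_points(series):
    """Reduce a load history to its sequence of local extrema."""
    tp = [series[0]]
    for x in series[1:]:
        if x == tp[-1]:
            continue
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                    # still moving in the same direction
        else:
            tp.append(x)
    return tp

def rainflow(series):
    """Return a list of (range, count) pairs, count being 1.0 or 0.5 cycles."""
    cycles = []
    stack = []
    for point in turning_points(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            if len(stack) == 3:
                cycles.append((y, 0.5))   # range touches the start: half cycle
                stack.pop(0)
            else:
                cycles.append((y, 1.0))   # enclosed range: full cycle
                last = stack.pop()
                stack.pop()
                stack.pop()
                stack.append(last)
    for a, b in zip(stack, stack[1:]):    # leftover residue: half cycles
        cycles.append((abs(b - a), 0.5))
    return cycles
```

    Run on the classic nine-point example history, this reproduces the standard range histogram of half and full cycles.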

  11. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to simultaneously monitor the change in resonance frequency (dynamic mode), in deflection (static mode), and in surface roughness of hundreds of cantilevers...... in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results performed over a haptens-antibody assay, where complete characterization of the biochemical...

  12. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
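
    One of the recurring problems named above, too many statistical tests without adjustments, has a standard remedy. As an illustration only (the survey itself prescribes no particular method), a Holm-Bonferroni step-down correction applied to a family of p-values looks like this:

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Step-down Holm-Bonferroni procedure: returns one reject/retain flag
    per p-value, controlling the family-wise error rate at alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        # Compare the rank-th smallest p-value against alpha / (m - rank).
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break                         # all larger p-values are retained too
    return reject
```

    Holm's procedure is uniformly more powerful than plain Bonferroni while making no assumptions about dependence between the tests, which is why it is a common default recommendation.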

  13. Clean Hands Count

    Medline Plus


  4. Detection and counting systems

    International Nuclear Information System (INIS)

    Abreu, M.A.N. de

    1976-01-01

    Detection devices based on gaseous ionization are analysed, such as electroscopes, ionization chambers, proportional counters and Geiger-Mueller counters. Scintillation methods are also discussed. A revision of the basic concepts of electronics is given and the main counting equipment is detailed. In the study of gamma spectrometry, scintillation and semiconductor detectors are analysed [pt

  8. Reticulocyte Count Test

    Science.gov (United States)


  9. Clean Hands Count

    Medline Plus

    Full Text Available ... empower patients to play a role in their care by asking or reminding healthcare providers to clean ...

  10. Radiation intensity counting system

    International Nuclear Information System (INIS)

    Peterson, R.J.

    1982-01-01

    A method is described for excluding the natural dead time of the radiation detector (e.g. a Geiger-Mueller counter) in a ratemeter counting circuit, thus eliminating the need for dead-time corrections. Using a pulse generator, an artificial dead time is introduced which is longer than the natural dead time of the detector. (U.K.)
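
    The point of replacing the detector's uncertain natural dead time with a longer, imposed one is that the effective dead time then becomes precisely known, so the standard non-paralyzable relation between measured and true rate can be applied exactly. A sketch of that textbook relation (it is not taken from the cited work):

```python
def true_rate(measured_rate, dead_time):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau).

    measured_rate : observed counting rate m (counts/s)
    dead_time     : known (imposed) dead time tau (s); requires m * tau < 1
    """
    loss_fraction = measured_rate * dead_time
    if loss_fraction >= 1.0:
        raise ValueError("measured rate saturates the imposed dead time")
    return measured_rate / (1.0 - loss_fraction)
```

    For example, a measured 1000 counts/s with a known 100 microsecond dead time implies a 10% loss fraction, i.e. a true rate of about 1111 counts/s.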

  12. Calorie count - fast food

    Science.gov (United States)


  13. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. This novel framework yields simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By an asymptotically tight bound we mean that, in the high-SNR limit, the difference between the actual higher-order statistics of the channel capacity and its asymptotic (lower) bound tends to zero. The mathematical formalism is illustrated with selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
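
    To make the notion of higher-order capacity statistics concrete, the moments can also be estimated numerically. The sketch below Monte-Carlo-samples a Rayleigh fading channel; it is a generic illustration, not the paper's closed-form framework, and the function name, sample count and seed are invented for the example.

```python
import math
import random

def capacity_moment(n, snr_db=10.0, samples=200_000, seed=1):
    """Monte Carlo estimate of the n-th moment E[(ln(1 + rho*X))^n] of the
    instantaneous channel capacity (in nats) over Rayleigh fading, where
    X = |h|^2 ~ Exp(1) and rho is the average SNR."""
    rng = random.Random(seed)
    rho = 10.0 ** (snr_db / 10.0)
    total = 0.0
    for _ in range(samples):
        total += math.log1p(rho * rng.expovariate(1.0)) ** n
    return total / samples
```

    The first moment is the ordinary ergodic capacity; the gap between the second moment and the squared first moment is the capacity variance, one of the quantities the higher-order framework characterizes in closed form.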

  14. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between the two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; it therefore provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  15. High-statistics measurement of the η →3 π0 decay at the Mainz Microtron

    Science.gov (United States)

    Prakhov, S.; Abt, S.; Achenbach, P.; Adlarson, P.; Afzal, F.; Aguar-Bartolomé, P.; Ahmed, Z.; Ahrens, J.; Annand, J. R. M.; Arends, H. J.; Bantawa, K.; Bashkanov, M.; Beck, R.; Biroth, M.; Borisov, N. S.; Braghieri, A.; Briscoe, W. J.; Cherepnya, S.; Cividini, F.; Collicott, C.; Costanza, S.; Denig, A.; Dieterle, M.; Downie, E. J.; Drexler, P.; Ferretti Bondy, M. I.; Fil'kov, L. V.; Fix, A.; Gardner, S.; Garni, S.; Glazier, D. I.; Gorodnov, I.; Gradl, W.; Gurevich, G. M.; Hamill, C. B.; Heijkenskjöld, L.; Hornidge, D.; Huber, G. M.; Käser, A.; Kashevarov, V. L.; Kay, S.; Keshelashvili, I.; Kondratiev, R.; Korolija, M.; Krusche, B.; Lazarev, A.; Lisin, V.; Livingston, K.; Lutterer, S.; MacGregor, I. J. D.; Manley, D. M.; Martel, P. P.; McGeorge, J. C.; Middleton, D. G.; Miskimen, R.; Mornacchi, E.; Mushkarenkov, A.; Neganov, A.; Neiser, A.; Oberle, M.; Ostrick, M.; Otte, P. B.; Paudyal, D.; Pedroni, P.; Polonski, A.; Ron, G.; Rostomyan, T.; Sarty, A.; Sfienti, C.; Sokhoyan, V.; Spieker, K.; Steffen, O.; Strakovsky, I. I.; Strandberg, B.; Strub, Th.; Supek, I.; Thiel, A.; Thiel, M.; Thomas, A.; Unverzagt, M.; Usov, Yu. A.; Wagner, S.; Walford, N. K.; Watts, D. P.; Werthmüller, D.; Wettig, J.; Witthauer, L.; Wolfes, M.; Zana, L. A.; A2 Collaboration at MAMI

    2018-06-01

    A sample of 7 × 10⁶ η → 3π⁰ decays, the largest statistics available at present, based on 6.2 × 10⁷ η mesons produced in the γp → ηp reaction, has been accumulated by the A2 Collaboration at the Mainz Microtron, MAMI. It allowed a detailed study of the η → 3π⁰ dynamics beyond the conventional parametrization with just the quadratic slope parameter α and enabled, for the first time, a measurement of the second-order term and a better understanding of the cusp structure in the neutral decay. The present data are also compared to recent theoretical calculations that predict a nonlinear dependence on the quadratic distance from the Dalitz-plot center.

  16. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  17. High frequency statistical energy analysis applied to fluid filled pipe systems

    NARCIS (Netherlands)

    Beek, P.J.G. van; Smeulers, J.P.M.

    2013-01-01

    In pipe systems, carrying gas with high velocities, broadband turbulent pulsations can be generated causing strong vibrations and fatigue failure, called Acoustic Fatigue. This occurs at valves with high pressure differences (i.e. chokes), relief valves and obstructions in the flow, such as sharp

  18. On the stability of performance of NaI(Tl) scintillation spectrometer with FEhU-49 photomultiplier at high counting rates

    International Nuclear Information System (INIS)

    Belousov, A.S.; Vazdik, Ya.A.; Malinovskij, E.N.; Rusakov, S.V.; Solov'ev, Yu.V.; Fomenko, A.M.; Sharejko, P.N.

    1986-01-01

    The dependence of the instability of the NaI(Tl)-spectrometer characteristics on the instability of the photomultiplier (PM), whose multiplication factor grows with increasing counting rate, is determined. A simple way to stabilize the PM gain to within 1.7% is suggested: stabilizing the voltage on the two terminal dynodes of the PM and illuminating the photocathode with an auxiliary light source

  19. Simultaneous use of 82Br and 24Na radionuclides in the whole-body counting of animals by high-resolution gamma-ray spectrometry

    Czech Academy of Sciences Publication Activity Database

    Pavelka, Stanislav; Babický, Arnošt; Vobecký, Miloslav

    2008-01-01

    Roč. 278, č. 3 (2008), s. 571-574 ISSN 0236-5731 Grant - others:EC(XE) LSHG-CT-2004-511978 Institutional research plan: CEZ:AV0Z50110509; CEZ:AV0Z40310501 Keywords : gamma-spectrometry * whole-body counting * radionuclide Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 0.659, year: 2008

  20. Counting paths with Schur transitions

    Energy Technology Data Exchange (ETDEWEB)

    Díaz, Pablo [Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta, T1K 3M4 (Canada); Kemp, Garreth [Department of Physics, University of Johannesburg, P.O. Box 524, Auckland Park 2006 (South Africa); Véliz-Osorio, Alvaro, E-mail: aveliz@gmail.com [Mandelstam Institute for Theoretical Physics, University of the Witwatersrand, WITS 2050, Johannesburg (South Africa); School of Physics and Astronomy, Queen Mary, University of London, Mile End Road, London E1 4NS (United Kingdom)

    2016-10-15

    In this work we explore the structure of the branching graph of the unitary group using Schur transitions. We find that these transitions suggest a new combinatorial expression for counting paths in the branching graph. This formula, which is valid for any rank of the unitary group, reproduces known asymptotic results. We proceed to establish the general validity of this expression by a formal proof. The form of this equation strongly hints towards a quantum generalization. Thus, we introduce a notion of quantum relative dimension and subject it to the appropriate consistency tests. This new quantity finds its natural environment in the context of RCFTs and fractional statistics; where the already established notion of quantum dimension has proven to be of great physical importance.

  1. Monitoring Milk Somatic Cell Counts

    Directory of Open Access Journals (Sweden)

    Gheorghe Şteţca

    2014-11-01

    Full Text Available The presence of somatic cells in milk is a widely disputed issue in the milk production sector. The somatic cell count in raw milk is a marker for specific cow diseases such as mastitis or swollen udder. A high level of somatic cells causes physical and chemical changes to milk composition and nutritional value, as well as to milk products. Mastitic milk is also not fit for human consumption, because it contributes to the spread of certain diseases and to food poisoning. Accordingly, EU regulations established a maximum admitted somatic cell count in raw milk of 400,000 cells/mL starting in 2014. This study examined raw milk samples from small farms, industrial-type farms and milk processing units. There are several ways to count somatic cells in milk, but the accepted reference method is the microscopic method described in SR EN ISO 13366-1/2008. Generally, the samples registered values within the admissible limit. Periodic monitoring of the somatic cell count avoids certain technological process issues and safeguards consumer health.

  2. High Flux Energy-Resolved Photon-Counting X-Ray Imaging Arrays with CdTe and CdZnTe for Clinical CT

    International Nuclear Information System (INIS)

    Barber, William C.; Hartsough, Neal E.; Gandhi, Thulasidharan; Iwanczyk, Jan S.; Wessel, Jan C.; Nygard, Einar; Malakhov, Nail; Wawrzyniak, Gregor; Dorholt, Ole; Danielsen, Roar

    2013-06-01

    We have fabricated fast room-temperature energy-dispersive photon-counting x-ray imaging arrays using pixellated cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) semiconductors. We have also fabricated fast application-specific integrated circuits (ASICs) with a two-dimensional (2D) array of inputs for readout from the sensors. The new CdTe and CdZnTe sensors have a 2D array of pixels with a 0.5 mm pitch and can be tiled in 2D. The new 2D ASICs have four energy discriminators per pixel with a linear energy response across the entire dynamic range for clinical CT. The ASICs can also be tiled in 2D and are designed to fit within the active area of the 2D sensors. We have measured several important performance parameters, including an output count rate (OCR) in excess of 20 million counts per second per square mm, an energy resolution of 7 keV full width at half maximum (FWHM) across the entire dynamic range, and a noise floor below 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdTe and CdZnTe sensors, incurring very little additional capacitance. We present a comparison of the performance of the CdTe and CdZnTe sensors, including the OCR, FWHM energy resolution, and noise floor. (authors)

  3. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high-resolution gamma spectrometry, registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the distance between neighbouring gamma-line peaks is not smaller than four times the full width at half maximum (FWHM) of the gamma line and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents
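
    For the simplest counting situation the standard family covers, a gross count over time tg against a background rate estimated from n0 counts in time t0, the decision threshold and detection limit for the net count rate can be sketched as below. This is a simplified illustration of the ISO 11929 quantities, not the normative formulas (which for this part are stated for the gamma-spectrometric peak-area case); the k values and the fixed-point form are generic textbook choices.

```python
import math

def decision_threshold(n0, t0, tg, k_alpha=1.645):
    """Decision threshold y* for the net count rate: the smallest observed
    net rate that is significantly above zero at false-positive risk alpha."""
    r0 = n0 / t0
    return k_alpha * math.sqrt(r0 * (1.0 / tg + 1.0 / t0))

def detection_limit(n0, t0, tg, k_alpha=1.645, k_beta=1.645, iters=50):
    """Detection limit y#, solved by fixed-point iteration of
    y# = y* + k_beta * sqrt(y#/tg + r0 * (1/tg + 1/t0))."""
    r0 = n0 / t0
    y_star = decision_threshold(n0, t0, tg, k_alpha)
    y = 2.0 * y_star                     # starting guess
    for _ in range(iters):
        y = y_star + k_beta * math.sqrt(y / tg + r0 * (1.0 / tg + 1.0 / t0))
    return y
```

    With k_alpha = k_beta and a true-signal variance contribution that is small against the background term, the detection limit reduces to roughly twice the decision threshold, a familiar rule of thumb.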

  4. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    Science.gov (United States)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates for open biomass burning are generally based on a single data source (either statistical data or satellite-derived data) and thus carry large uncertainty. In this study, to quantify open biomass burning in 2015, we established a new estimation method for open biomass burning activity levels that combines bottom-up statistical data with top-down MODIS observations, treating three sub-category sources that use different activity data. For open crop-residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and the satellite observations from the MODIS burned-area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active-fire product MCD14ML. Using fire radiative power (FRP), considered a better indicator of active-fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto 3 km × 3 km grids to obtain a high-resolution emission inventory. Our results show that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The developed high-resolution open biomass burning emission inventory can support air quality modeling and policy-making for pollution control.
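
    The "best estimate" step, averaging two activity-data sources weighted by their uncertainties, is presumably a variant of the standard inverse-variance combination; the abstract does not give the exact scheme, so the generic formula is sketched below for illustration.

```python
def inverse_variance_average(estimates, uncertainties):
    """Combine independent estimates x_i with 1-sigma uncertainties u_i
    using weights w_i = 1/u_i^2; returns (weighted mean, its uncertainty)."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    w_sum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, estimates)) / w_sum
    return mean, (1.0 / w_sum) ** 0.5
```

    The combined uncertainty is always smaller than the smaller of the inputs, which is the statistical motivation for merging the statistical-yearbook and MCD64A1 activity estimates rather than picking one.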

  5. Atypical manifestation of progressive outer retinal necrosis in AIDS patient with CD4+ T-cell counts more than 100 cells/microL on highly active antiretroviral therapy.

    Science.gov (United States)

    Vichitvejpaisal, Pornpattana; Reeponmahar, Somporn; Tantisiriwat, Woraphot

    2009-06-01

    Typical progressive outer retinal necrosis (PORN) is an acute ocular infectious disease in acquired immunodeficiency syndrome (AIDS) patients with extremely low CD4+ T-cell counts. It is a form of Varicella-zoster virus (VZV) infection. This destructive infection has an extremely rapid course that may lead to blindness in affected eyes within days or weeks. Attempts at its treatment have had limited success. We describe a case of bilateral PORN in an AIDS patient with an initial CD4+ T-cell count >100 cells/microL that developed after initiation of highly active antiretroviral therapy (HAART). A 29-year-old Thai female, initially diagnosed with human immunodeficiency virus (HIV) in 1998, presented with bilaterally decreased visual acuity after initiating HAART two months earlier. Multiple yellowish spots appeared in the deep retina without evidence of intraocular inflammation or retinal vasculitis. Her CD4+ T-cell count was 127 cells/microL. She was diagnosed as having PORN based on clinical features and positive VZV in the aqueous humor and vitreous by polymerase chain reaction (PCR). Despite combined treatment with intravenous acyclovir and intravitreous ganciclovir, the patient's visual acuity worsened with no light-perception in either eye. This case suggests that PORN should be included in the differential diagnosis of reduced visual acuity in AIDS patients initiating HAART with higher CD4+ T-cell counts. PORN may be a manifestation of the immune reconstitution syndrome.

  6. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    Science.gov (United States)

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
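
    The censoring step itself is simple to express. The sketch below withholds volumes whose framewise displacement exceeds a threshold before GLM estimation; the 0.5 mm default and the input format are illustrative choices, not the authors' exact pipeline.

```python
def censor_high_motion(framewise_displacement, volumes, threshold=0.5):
    """Motion censoring ("scrubbing"): withhold volumes whose framewise
    displacement (in mm) exceeds the threshold, so they can be excluded
    from GLM estimation.  Returns the kept volumes and their indices."""
    kept_idx = [i for i, fd in enumerate(framewise_displacement)
                if fd <= threshold]
    return [volumes[i] for i in kept_idx], kept_idx
```

    In practice the kept indices would be used to drop the corresponding rows of the design matrix as well, so the censored GLM stays properly aligned with the remaining data.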

  7. A Profile of Romanian Highly Educated Eco-Consumers Interested in Product Recycling A Statistical Approach

    Directory of Open Access Journals (Sweden)

    Simionescu Mihaela

    2014-07-01

    Full Text Available The objective of this research is to create a profile of the Romanian eco-consumer with university education. The profile is not limited to information on the environmental and economic benefits of recycling, but focuses on ecological behaviour. A detailed statistical analysis was conducted on a large representative sample of respondents with secondary and university education. The tendency toward practical eco-behaviour is more pronounced among people with university education. For people older than 30, the chance of recognizing the meaning of the recycling symbols on packaging decreases, with the lowest chance among those older than 50. Respondents interested in environmental protection buy products bearing ecological symbols. However, those who already know the meaning of these symbols do not buy such products for ecological reasons, even if they are interested in environmental protection. The study also offers an extensive description of its results, giving respondents an opportunity to learn more about the meaning of the recycling symbols, and provides guidance for consumers. It achieves two main goals: an ecological one (eco-consumers were identified and ordinary consumers were drawn toward ecological behaviour) and an economic one (resource allocation will be more efficient, and marketers will be able to address eco-consumers with specific characteristics).

  8. Statistical fission parameters for nuclei at high excitation and angular momenta

    International Nuclear Information System (INIS)

    Blann, M.; Komoto, T.A.

    1982-01-01

    Experimental fusion/fission excitation functions are analyzed by the statistical model with modified rotating liquid drop model barriers and with single-particle level densities modeled for deformation at the ground state (a_ν) and at the saddle point (a_f). Values are estimated for the errors in rotating liquid drop model barriers for the different systems analyzed. These results correlate well with the trends predicted by the finite range model of Krappe, Nix, and Sierk, although the discrepancies seem to be approximately 1 MeV greater than the finite range model predictions over the limited range tested. The a priori values calculated for a_f and a_ν are within ±2% of the optimum free-parameter values. Analyses for barrier decrements explore the importance of collective enhancement of level densities and of nuclear deformation in calculating transmission coefficients. A calculation is performed for the 97Rh nucleus, for which a first-order angular momentum scaling is used for the J = 0 finite range corrections; an excellent fit is found for the fission excitation function in this approach. Results are compared in which rotating liquid drop model barriers are decremented by a constant energy or, alternatively, multiplied by a constant factor. Either parametrization is shown to be capable of satisfactorily reproducing the data, although their J = 0 extrapolated values differ markedly from one another. This underscores the dangers inherent in arbitrary barrier extrapolations.

  9. Statistical physics of fracture: scientific discovery through high-performance computing

    International Nuclear Information System (INIS)

    Kumar, Phani; Nukala, V V; Simunovic, Srdan; Mills, Richard T

    2006-01-01

    The paper presents state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large-scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, simulations using traditional algorithms were limited to small system sizes. We have developed two algorithms: a multiple sparse Cholesky downdating scheme for simulating 2D random fuse model systems, and a block-circulant preconditioner for simulating 3D random fuse model systems. Using these algorithms, we were able to simulate fracture of the largest lattice system sizes to date (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of Cray-XT3 and IBM Blue-Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, which is a significant computational achievement. These simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on mean fracture strength, and the scaling of crack surface roughness.

  10. Multiplicity counting from fission detector signals with time delay effects

    Science.gov (United States)

    Nagy, L.; Pázsit, I.; Pál, L.

    2018-03-01

    In recent work, we developed the theory of using the first three auto- and joint central moments of the currents of up to three fission chambers to extract the singles, doubles and triples count rates of traditional multiplicity counting (Pázsit and Pál, 2016; Pázsit et al., 2016). The objective is to elaborate a method for determining the fissile mass, neutron multiplication, and (α, n) neutron emission rate of an unknown assembly of fissile material from the statistics of the fission chamber signals, analogous to traditional multiplicity counting with detectors in pulse mode. Such a method would be an alternative to He-3 detector systems and would be free from the dead-time problems encountered in high count rate applications, for example the assay of spent nuclear fuel. A significant restriction of our previous work was that all neutrons born in a source event (spontaneous fission) were assumed to be detected simultaneously, which is not fulfilled in reality. In the present work, this restriction is eliminated by assuming an independent, identically distributed random time delay for each neutron arising from one source event. Expressions are derived for the same auto- and joint central moments of the detector current(s) as in the previous case, expressed with the singles, doubles, and triples (S, D and T) count rates. It is shown that if the time dispersion of neutron detections is of the same order of magnitude as the detector pulse width, as is typically the case in measurements of fast neutrons, the multiplicity rates can still be extracted from the moments of the detector current, although with more involved calibration factors. The presented formulae, and hence the performance of the proposed method, are tested both with analytical models of the time delay and with numerical simulations. Modifications of the method for large time-delay effects (for thermalised neutrons) are also suggested.
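The moment-based approach rests on estimating central moments of a sampled detector current. A minimal sketch, assuming a toy shot-noise current built from Bernoulli-thinned pulse arrivals and a square pulse shape; the real method uses joint moments of up to three chambers and calibration factors not reproduced here:

```python
import numpy as np

def central_moments(x):
    """First moment (mean) and second and third central moments
    of a sampled detector current."""
    mu = x.mean()
    d = x - mu
    return mu, (d ** 2).mean(), (d ** 3).mean()

# toy current: pulses arrive randomly, each contributing a square pulse
rng = np.random.default_rng(1)
T, w, rate = 100_000, 5, 0.02            # samples, pulse width, pulses/sample
arrivals = rng.random(T) < rate          # Bernoulli-thinned arrival times
pulse = np.ones(w)                       # unit-height square pulse
current = np.convolve(arrivals.astype(float), pulse, mode="same")
m1, m2, m3 = central_moments(current)
print(m1, m2, m3)
```

For a filtered Poisson process, Campbell-type relations tie these moments to the pulse shape and event rates (here the mean is approximately rate × pulse area = 0.1), which is the mechanism the abstract exploits to recover S, D and T.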

  11. A Statist Political Economy and High Demand for Education in South Korea

    Directory of Open Access Journals (Sweden)

    Ki Su Kim

    1999-06-01

    Full Text Available In the 1998 academic year, 84 percent of South Korea's high school "leavers" entered a university or college, while almost all children went on to high school. That is to say, South Korea is now moving into a new age of universal higher education. Even so, competition for university entrance remains intense. What is interesting here is South Koreans' unusually high demand for education. In this article, I criticize the existing cultural and socio-economic interpretations of the phenomenon. Instead, I explore a new interpretation by critically referring to the recent political economy debate on South Korea's state-society/market relationship. In my interpretation, the unusually high demand for education is largely due to the powerful South Korean state's losing flexibility in the management of its "developmental" policies. For this, I blame the traditional "personalist ethic" which still prevails as the

  12. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  13. Use of Statistics for Data Evaluation in Environmental Radioactivity Measurements

    International Nuclear Information System (INIS)

    Sutarman

    2001-01-01

    Counting statistics provide corrections to environmental radioactivity measurement results. Statistics supplies formulas for the standard deviation (S_B) and the minimum detectable concentration (MDC) according to the Poisson distribution. Both formulas depend on the background count rate, counting time, counting efficiency, gamma intensity, and sample size. A long background counting time yields a relatively low S_B and MDC, and thus relatively accurate measurement results. (author)
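As a concrete illustration of how the MDC depends on background rate, counting time, efficiency, gamma intensity, and sample size, the sketch below uses the widely used Currie formulation of the detection limit; the report's exact formulas may differ, and all parameter values here are hypothetical.

```python
import math

def minimum_detectable_concentration(b_rate, t, eff, intensity, mass):
    """Currie-type MDC for a gamma counting measurement.

    b_rate    : background count rate (counts/s)
    t         : counting time (s)
    eff       : counting efficiency (counts per decay)
    intensity : gamma emission probability per decay
    mass      : sample size (kg)
    Returns the MDC in Bq/kg.
    """
    B = b_rate * t                   # expected background counts
    s_b = math.sqrt(B)               # Poisson standard deviation of background
    ld = 2.71 + 4.65 * s_b           # Currie detection limit (net counts)
    return ld / (t * eff * intensity * mass)

# hypothetical gamma-spectrometry setup
print(minimum_detectable_concentration(0.05, 60000, 0.3, 0.85, 0.5))  # ~0.034 Bq/kg
```

Doubling the counting time raises B linearly but the detection limit only as its square root, which is why long background counts lower both S_B and the MDC, as the abstract notes.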

  14. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its
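The trend/residual decomposition of step 2 can be sketched with a toy example. The covariate, the unit layout, and the densities below are invented, and the unit residual is simply added back uniformly within each unit in place of the paper's area-to-point kriging:

```python
import numpy as np

# toy setup: 3 admin units, each covering two 5 km pixels (hypothetical data)
forest = np.array([0.2, 0.3, 0.6, 0.7, 0.4, 0.5])   # pixel-level covariate
unit = np.array([0, 0, 1, 1, 2, 2])                  # unit membership per pixel
unit_density = np.array([1.0, 3.0, 2.0])             # boar / km^2 per unit

# trend: least-squares regression of unit densities on unit-mean covariates
unit_mean_forest = np.bincount(unit, forest) / np.bincount(unit)
X = np.column_stack([np.ones(3), unit_mean_forest])
beta, *_ = np.linalg.lstsq(X, unit_density, rcond=None)

# predict the trend at pixel level, then add each unit's residual back
# (uniformly here; the paper downscales residuals by area-to-point kriging)
trend_px = beta[0] + beta[1] * forest
resid = unit_density - X @ beta
estimate = trend_px + resid[unit]
print(estimate.round(2))
```

Because the residual is restored within each unit, the pixel estimates average back to the reported unit density, a mass-preservation property the kriging-based step is designed to retain.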

  15. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    Science.gov (United States)

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  16. The Nonlinear Statistics of High-contrast Patches in Natural Images

    DEFF Research Database (Denmark)

    Lee, Ann; Pedersen, Kim Steenstrup; Mumford, David

    2003-01-01

    described. In this study, we explore the space of data points representing the values of 3 × 3 high-contrast patches from optical and 3D range images. We find that the distribution of data is extremely sparse with the majority of the data points concentrated in clusters and non-linear low...

  17. High blood levels of persistent organic pollutants are statistically correlated with smoking

    DEFF Research Database (Denmark)

    Deutch, Bente; Hansen, Jens C.

    1999-01-01

    , smoking and intake of traditional Inuit food. Multiple linear regression analyses showed highly significant positive associations between the mothers' smoking status (never, previous, present) and plasma concentrations of all the studied organic pollutants both in maternal blood and umbilical cord blood...

  18. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that capture maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in both case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. Contact: reesese@vcu.edu
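A minimal sketch of the gPCA idea: guide the decomposition with a batch indicator matrix and compare the variance of the data along the first guided loading to that along the first ordinary PCA loading. This follows the published description only loosely; the package's δ statistic and its permutation-based p-value involve details not reproduced here.

```python
import numpy as np

def gpca_delta(X, batch):
    """Guided-PCA delta: variance of X along the first batch-guided
    loading divided by its variance along the first unguided PCA
    loading. Values near 1 suggest batch dominates the variance."""
    Xc = X - X.mean(axis=0)
    Y = np.eye(batch.max() + 1)[batch]          # one-hot batch indicator
    # unguided: first right singular vector of centered X
    v_pca = np.linalg.svd(Xc, full_matrices=False)[2][0]
    # guided: first right singular vector of Y'X
    v_gpca = np.linalg.svd(Y.T @ Xc, full_matrices=False)[2][0]
    return np.var(Xc @ v_gpca) / np.var(Xc @ v_pca)

# simulated data with a strong additive batch shift
rng = np.random.default_rng(0)
batch = np.repeat([0, 1], 20)
X = rng.normal(size=(40, 50)) + 3.0 * batch[:, None]
print(gpca_delta(X, batch))   # close to 1: batch is the dominant source
```

In practice the statistic is calibrated by permuting the batch labels and recomputing delta to obtain a null distribution.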

  19. High Resolution 3D Experimental Investigation of Flow Structures and Turbulence Statistics in the Viscous and Buffer Layer

    Science.gov (United States)

    Sheng, Jian; Malkiel, Edwin; Katz, Joseph

    2006-11-01

    Digital holographic microscopy is implemented to perform 3D velocity measurements in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Re_τ = 1,400. The measurements are performed at a resolution of ~1 μm over a sample volume of 1.5×2×1.5 mm (x^+ = 50, y^+ = 60, z^+ = 50), sufficient for resolving buffer-layer structures and for measuring the instantaneous wall shear stress distributions from velocity gradients in the sublayer. The data provide detailed statistics on the spatial distribution of both wall shear stress components, along with the characteristic flow structures, including streamwise counter-rotating vortex pairs, multiple streamwise vortices, and rare hairpins. Conditional sampling identifies characteristic length scales of 70 wall units in the spanwise and 10 wall units in the wall-normal direction. In regions of high stress, the conditionally averaged flow consists of a stagnation-like sweeping motion induced by a counter-rotating pair of streamwise vortices. Regions of low stress are associated with ejection motion, also generated by pairs of counter-rotating vortices. Statistics on the local strain and the geometric alignment between strain and vorticity show that the high-shear-generating vortices are inclined at 45° to the streamwise direction, indicating that the vortices are being stretched. Ongoing analysis examines statistics of helicity and strain and the impact of near-wall structures.

  20. Use of the lymphocyte count as a diagnostic screen in adults with suspected Epstein-Barr virus infectious mononucleosis.

    Science.gov (United States)

    Biggs, Timothy C; Hayes, Stephen M; Bird, Jonathan H; Harries, Philip G; Salib, Rami J

    2013-10-01

    To evaluate the predictive diagnostic accuracy of the lymphocyte count in Epstein-Barr virus-related infectious mononucleosis (IM), a retrospective case note and blood results review was conducted within a university-affiliated teaching hospital. Records of 726 patients undergoing full blood count and Monospot testing were reviewed, and Monospot outcomes were compared with the lymphocyte count, examining for statistically significant correlations. With a lymphocyte count of ≤4 × 10^9/L, 99% of patients had a negative Monospot result (sensitivity 84%, specificity 94%). A subanalysis of the population older than 18 years with a lymphocyte count ≤4 × 10^9/L revealed that 100% were Monospot negative (sensitivity 100%, specificity 97%). A lymphocyte count of ≤4 × 10^9/L thus correlated significantly with a negative Monospot result and appears to be a highly reliable predictor of one, particularly in patients aged >18 years. Pediatric patients, and adults with symptoms and signs strongly suggestive of IM, should still undergo Monospot testing. However, in adults with more subtle symptoms and signs, representing the vast majority, Monospot testing should be restricted to those with a lymphocyte count >4 × 10^9/L. Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.
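The reported operating characteristics follow from a standard 2×2 screening table. The counts below are hypothetical, chosen only to reproduce the quoted adult-subgroup figures (sensitivity 100%, specificity 97%), not taken from the study data.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 screening table.

    tp/fn : Monospot-positive patients above/below the lymphocyte cutoff
    tn/fp : Monospot-negative patients below/above the cutoff
    """
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts illustrating the adult-subgroup result
sens, spec = sens_spec(tp=100, fn=0, tn=97, fp=3)
print(sens, spec)  # 1.0 0.97
```

With zero false negatives (fn = 0) the rule never misses a Monospot-positive adult, which is what makes the ≤4 × 10^9/L cutoff usable as a rule-out screen.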