WorldWideScience

Sample records for high counting statistics

  1. Oscillations in counting statistics

    CERN Document Server

    Wilk, Grzegorz

    2016-01-01

The very large transverse momenta and large multiplicities available in present LHC experiments on pp collisions allow a much closer look at the corresponding distributions. Some time ago we discussed a possible physical meaning of apparent log-periodic oscillations showing up in p_T distributions (suggesting that the exponent of the observed power-like behavior is complex). In this talk we concentrate on another example of oscillations, this time connected with multiplicity distributions P(N). We argue that some combinations of the experimentally measured values of P(N) (satisfying the recurrence relations used in the description of cascade-stochastic processes in quantum optics) exhibit distinct oscillatory behavior, not observed in the usual Negative Binomial Distributions used to fit data. These oscillations provide yet another example of oscillations seen in counting statistics in many different, apparently very disparate branches of physics, further demonstrating the universality of this phenomenon.
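The oscillation diagnostic builds on the recurrence relation (N+1)P(N+1) = g(N)P(N): for a Negative Binomial Distribution, g(N) is exactly linear in N, so deviations from linearity in measured P(N) signal the oscillations discussed above. A minimal numerical check of the linearity (parameter values are purely illustrative):

```python
import numpy as np
from scipy.stats import nbinom

# NBD with dispersion k and mean m; scipy's nbinom uses (n, p) with n = k, p = k/(k+m).
k, m = 2.0, 10.0
p = k / (k + m)
N = np.arange(0, 30)
P = nbinom.pmf(N, k, p)

# Recurrence ratio g(N) = (N+1) P(N+1) / P(N); exactly linear in N for an NBD.
g = (N[:-1] + 1) * P[1:] / P[:-1]
slopes = np.diff(g)
print(np.allclose(slopes, slopes[0]))  # constant slope: no oscillations for a pure NBD
```

Applying the same ratio to experimental P(N) values and looking for a non-linear, oscillating residual is the kind of analysis the abstract describes.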

  2. Multiterminal counting statistics

    OpenAIRE

    2003-01-01

A review is given of calculational schemes that allow for easy evaluation of full current statistics (FCS) in multi-terminal mesoscopic systems. First, the scattering approach to FCS by Levitov {\\it et al.} is outlined. Then the multi-terminal FCS of non-interacting electrons is considered. We show that this theory takes the form of a circuit theory of $2\\times 2$ matrices associated with Keldysh Green functions. Further on, the FCS in the opposite situation of mesoscopic systems placed...

  3. Statistical modelling for falls count data.

    Science.gov (United States)

    Ullah, Shahid; Finch, Caroline F; Day, Lesley

    2010-03-01

Falls and their injury outcomes have count distributions that are highly skewed toward the right with clumping at zero, posing analytical challenges. Different modelling approaches have been used in the published literature to describe falls count distributions, often without consideration of the underlying statistical and modelling assumptions. This paper compares the use of modified Poisson and negative binomial (NB) models as alternatives to Poisson (P) regression for the analysis of fall outcome counts. Four different count-based regression models (P, NB, zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB)) were each individually fitted to four separate fall count datasets from Australia, New Zealand and the United States. Finite mixtures of P and NB regression models were also compared to the standard NB model. Both analytical (F, Vuong and bootstrap tests) and graphical approaches were used to select and compare models. Simulation studies assessed the size and power of each model fit. This study confirms that falls count distributions are over-dispersed, but not because of excess zero counts or a heterogeneous population. Accordingly, the P model generally provided the poorest fit to all datasets. The fit improved significantly with the NB and both zero-inflated models. The standard NB model also fitted better than finite mixtures of either P or NB regressions. Although there was little difference in fit between the NB and ZINB models, in the interests of parsimony it is recommended that future studies modelling falls count data routinely use the NB model in preference to the P, ZINB, or finite mixture models. The fact that these conclusions apply across four separate datasets, from four different samples of older people participating in studies of different methodology, adds strength to this general guiding principle.
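The P-versus-NB comparison can be sketched with simulated data (not data from the paper): a gamma-mixed Poisson, which is exactly negative binomial, mimics over-dispersed falls counts, and moment-fitted log-likelihoods show why the Poisson model fits poorly.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical falls counts: negative binomial (gamma-mixed Poisson) with many zeros.
y = rng.negative_binomial(n=0.8, p=0.8 / (0.8 + 1.2), size=500)  # mean ~1.2

# Method-of-moments fits, then compare log-likelihoods of P vs NB.
mean, var = y.mean(), y.var(ddof=1)
ll_pois = stats.poisson.logpmf(y, mean).sum()
k = mean**2 / (var - mean)                     # NB dispersion from moments
ll_nb = stats.nbinom.logpmf(y, k, k / (k + mean)).sum()
print(var > mean, ll_nb > ll_pois)             # over-dispersed; NB fits better
```

A real analysis would use maximum-likelihood regression with covariates (e.g. via a GLM package), as the paper does; the moment fit above only illustrates the over-dispersion argument.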

  4. Counting statistics of transport through Coulomb blockade nanostructures: High-order cumulants and non-Markovian effects

    DEFF Research Database (Denmark)

    Flindt, Christian; Novotny, Tomás; Braggio, Alessandro

    2010-01-01

    Recent experimental progress has made it possible to detect in real-time single electrons tunneling through Coulomb blockade nanostructures, thereby allowing for precise measurements of the statistical distribution of the number of transferred charges, the so-called full counting statistics...

  5. Mesoscopic full counting statistics and exclusion models

    Science.gov (United States)

    Roche, P.-E.; Derrida, B.; Douçot, B.

    2005-02-01

We calculate the distribution of current fluctuations in two simple exclusion models. Although these models are classical, we recover, even for small systems such as a simple or a double barrier, the same distribution of current as given by traditional formalisms for quantum mesoscopic conductors. Due to their simplicity, the full counting statistics in exclusion models can be reduced to the calculation of the largest eigenvalue of a matrix, the size of which is the number of internal configurations of the system. As examples, we derive the shot noise power and higher-order statistics of current fluctuations (skewness, full counting statistics, ...) of various conductors, including multiple barriers, diffusive islands between tunnel barriers and diffusive media. Special attention is dedicated to the third cumulant, whose experimental measurability has recently been demonstrated.
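The largest-eigenvalue reduction can be illustrated on the smallest possible case, a single site between two reservoirs; the rates and the choice of counted transition below are assumptions of this sketch, not taken from the paper. Tilting the Markov generator by a counting field and differentiating its largest eigenvalue at zero recovers the mean current a·b/(a+b).

```python
import numpy as np

# Single-site exclusion sketch: entry from the left at rate a (empty -> occupied),
# exit to the right at rate b (occupied -> empty); the exit is the counted event.
a, b = 1.0, 2.0

def mu(lmbda):
    """Largest eigenvalue of the counting-field-tilted generator."""
    M = np.array([[-a, b * np.exp(lmbda)],
                  [ a, -b]])
    return np.max(np.linalg.eigvals(M).real)

# First cumulant (mean current) from a central finite difference at lambda = 0.
h = 1e-6
current = (mu(h) - mu(-h)) / (2 * h)
print(abs(current - a * b / (a + b)) < 1e-4)
```

Higher derivatives of mu at zero give the noise, skewness, and so on; for larger systems the matrix simply grows with the number of internal configurations, as the abstract states.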

  6. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculations for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Simply comparing the motif count p-values in each sequence is not sufficient to decide whether this motif is significantly more exceptional in one sequence than in the other; a statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly well adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
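For two Poisson-modeled counts, an exact binomial comparison of this kind conditions on the total: under the null hypothesis that the motif is equally exceptional in both sequences, the count in sequence 1 is binomial with success probability e1/(e1+e2). A sketch with hypothetical counts and expectations (not data from the paper):

```python
from scipy.stats import binomtest

n1, n2 = 40, 15          # observed motif counts in sequences 1 and 2
e1, e2 = 20.0, 20.0      # expected counts given each sequence's composition

# Conditional on the total, n1 ~ Binomial(n1 + n2, e1/(e1 + e2)) under H0.
res = binomtest(n1, n=n1 + n2, p=e1 / (e1 + e2))
print(res.pvalue < 0.05)  # motif significantly more exceptional in sequence 1
```

The paper's actual test additionally handles overlapping occurrences and sequence composition; the conditional binomial above is only the core idea.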

  7. High Red Blood Cell Count

    Science.gov (United States)

A high red blood cell count is an increase in oxygen-carrying cells in your bloodstream. Red blood cells transport oxygen from your lungs to tissues throughout ...

  8. Submillimeter Number Counts From Statistical Analysis of BLAST Maps

    CERN Document Server

    Patanchon, Guillaume; Bock, James J; Chapin, Edward L; Devlin, Mark J; Dicker, Simon R; Griffin, Matthew; Gundersen, Joshua O; Halpern, Mark; Hargrave, Peter C; Hughes, David H; Klein, Jeff; Marsden, Gaelen; Mauskopf, Philip; Moncelsi, Lorenzo; Netterfield, Calvin B; Olmi, Luca; Pascale, Enzo; Rex, Marie; Scott, Douglas; Semisch, Christopher; Thomas, Nicholas; Truch, Matthew D P; Tucker, Carole; Tucker, Gregory S; Viero, Marco P; Wiebe, Donald V

    2009-01-01

We describe the application of a statistical method to estimate submillimeter galaxy number counts from the confusion-limited observations of the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Our method is based on a maximum likelihood fit to the pixel histogram, sometimes called 'P(D)', an approach which has been used before to probe faint counts; the difference is that here we advocate its use even for sources with relatively high signal-to-noise ratios. This method has an advantage over standard source-extraction techniques in providing an unbiased estimate of the counts from the bright end down to flux densities well below the confusion limit. We specifically analyse BLAST observations of a roughly 10 sq. deg map centered on the Great Observatories Origins Deep Survey South field. We provide estimates of number counts at the three BLAST wavelengths, 250, 350, and 500 microns; instead of counting sources in flux bins, we estimate the counts at several flux density nodes connected with ...

  9. Particle number counting statistics in ideal Bose gases

    National Research Council Canada - National Science Library

    Christoph Weiss; Martin Wilkens

    1997-01-01

    We discuss the exact particle number counting statistics of degenerate ideal Bose gases in the microcanonical, canonical, and grand-canonical ensemble, respectively, for various trapping potentials...

  10. Statistical Methods for Unusual Count Data

    DEFF Research Database (Denmark)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

Microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative...

  11. Counting statistics for genetic switches based on effective interaction approximation

    Science.gov (United States)

    Ohkubo, Jun

    2012-09-01

    Applicability of counting statistics for a system with an infinite number of states is investigated. The counting statistics has been studied a lot for a system with a finite number of states. While it is possible to use the scheme in order to count specific transitions in a system with an infinite number of states in principle, we have non-closed equations in general. A simple genetic switch can be described by a master equation with an infinite number of states, and we use the counting statistics in order to count the number of transitions from inactive to active states in the gene. To avoid having the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated as a simple two-state model approximately, which immediately indicates that the switching obeys non-Poisson statistics.
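The non-Poissonian character of the effective two-state switch is easy to see in a Monte Carlo sketch (rates and observation window below are illustrative, not taken from the paper): counting inactive-to-active events whose waiting times are sums of two exponentials gives a Fano factor below the Poisson value of 1 (asymptotically 1/2 for equal rates).

```python
import numpy as np

rng = np.random.default_rng(2)
# Two-state switch: inactive -> active at rate kon, active -> inactive at rate koff.
kon, koff = 1.0, 1.0
T, trials = 200.0, 2000
counts = np.empty(trials)
for i in range(trials):
    t, n = 0.0, 0
    while True:
        # One full switching cycle: wait to activate, then wait to deactivate.
        t += rng.exponential(1 / kon) + rng.exponential(1 / koff)
        if t > T:
            break
        n += 1
    counts[i] = n

fano = counts.var() / counts.mean()
print(fano < 0.9)  # clearly below 1: non-Poisson (sub-Poissonian) switching
```

For a renewal process the long-time Fano factor equals the squared coefficient of variation of the waiting time, which is 1/2 here; a genuinely Poissonian count would give exactly 1.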

  12. Counting statistics for genetic switches based on effective interaction approximation

    CERN Document Server

    Ohkubo, Jun

    2012-01-01

Applicability of counting statistics for a system with an infinite number of states is investigated. The counting statistics has been studied a lot for a system with a finite number of states. While it is possible to use the scheme in order to count specific transitions in a system with an infinite number of states in principle, we have non-closed equations in general. A simple genetic switch can be described by a master equation with an infinite number of states, and we use the counting statistics in order to count the number of transitions from inactive to active states in the gene. To avoid having the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated as a simple two-state model approximately, which immediately indicates that the switching obeys non-Poisson statistics.

  13. Counting Statistics and Ion Interval Density in AMS

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, J S; Ognibene, T; Palmblad, M; Reimer, P

    2004-08-03

    Confidence in the precisions of AMS and decay measurements must be comparable for the application of the {sup 14}C calibration to age determinations using both technologies. We confirmed the random nature of the temporal distribution of {sup 14}C ions in an AMS spectrometer for a number of sample counting rates and properties of the sputtering process. The temporal distribution of ion counts was also measured to confirm the applicability of traditional counting statistics.

  14. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
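The mechanism is easy to reproduce numerically: averaging a Poisson count over a gamma-fluctuating rate yields a negative binomial, whose variance exceeds its mean, whereas a fixed rate gives the classical variance-equals-mean behavior. A minimal Monte Carlo sketch (parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed decay/detection rate: classical Poisson counting, variance equals mean.
fixed = rng.poisson(lam=10.0, size=100_000)

# Lexis-type fluctuating rate (gamma-distributed, same mean rate of 10):
# the mixed count is negative binomial, with variance exceeding the mean.
lam = rng.gamma(shape=5.0, scale=2.0, size=100_000)
fluct = rng.poisson(lam=lam)

print(fixed.var() / fixed.mean())   # close to 1
print(fluct.var() / fluct.mean())   # well above 1: overdispersion
```

For the gamma mixture the dispersion index is 1 + mean/shape = 3 here, matching the overdispersion relative to the Poisson case described in the abstract.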

  15. Counting losses due to saturation effects of scintillation counters at high count rates

    CERN Document Server

    Hashimoto, K

    1999-01-01

The counting statistics of a scintillation counter, with a preamplifier saturated by an overloading input, are investigated. First, formulae for the variance and the mean number of counts accumulated within a given gating time are derived by considering counting-loss effects originating from the saturation and from the finite resolving time of the electronic circuit. Numerical examples based on the formulae indicate that the saturation makes a positive contribution to the variance-to-mean ratio and that this contribution increases with count rate. Next, the ratios are measured at high count rates, where preamplifier saturation can be observed. By fitting the present formula to the measured data, the counting-loss parameters can be evaluated. Corrections based on these parameters are made for various count rates measured in a nuclear reactor. As a result of the corrections, the linearity between count rate and reactor power is restored.
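For orientation, the simplest counting-loss model of this kind is the non-paralyzable dead-time correction; the paper's saturation model is more elaborate, but the sketch below (with assumed, illustrative numbers) shows how an observed rate is corrected back to a true rate.

```python
# Non-paralyzable dead-time model: each registered event blocks the counter
# for a resolving time tau, so observed = true / (1 + true * tau).
tau = 2e-6          # resolving (dead) time in seconds -- assumed value
true_rate = 1e5     # true event rate in counts/s -- assumed value

observed = true_rate / (1 + true_rate * tau)   # losses grow with count rate
recovered = observed / (1 - observed * tau)    # invert the loss formula
print(round(observed), round(recovered))
```

At this rate roughly 17% of events are lost; applying such a correction at each measured rate is what restores the linearity between count rate and reactor power.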

  16. Extreme value statistics of weak lensing shear peak counts

    CERN Document Server

    Reischke, Robert; Bartelmann, Matthias

    2015-01-01

The statistics of peaks in weak gravitational lensing maps is a promising technique to constrain cosmological parameters in present and future surveys. Here we investigate its power when using general extreme value statistics, which are very sensitive to the exponential tail of the halo mass function. To this end, we use an analytic method to quantify the number of weak lensing peaks caused by galaxy clusters, large-scale structures and observational noise. In doing so, we further improve the method in the regime of high signal-to-noise ratios dominated by non-linear structures, by accounting for the embedding of those counts into the surrounding shear caused by large-scale structures. We derive the extreme value and order statistics for both over-densities (positive peaks) and under-densities (negative peaks) and provide an optimized criterion to split a wide-field survey into sub-fields in order to sample the distribution of extreme values, such that the expected objects causing the largest signals are mostly due ...

  17. Gene coexpression measures in large heterogeneous samples using count statistics.

    Science.gov (United States)

    Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan

    2014-11-18

    With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.

  18. High Count Rate Single Photon Counting Detector Array Project

    Data.gov (United States)

    National Aeronautics and Space Administration — An optical communications receiver requires efficient and high-rate photon-counting capability so that the information from every photon, received at the aperture,...

  19. Probing the Conformations of Single Molecule via Photon Counting Statistics

    CERN Document Server

    Peng, Yonggang; Yang, Chuanlu; Zheng, Yujun

    2014-01-01

We suggest an approach to detect the conformation of a single molecule by using photon counting statistics. The generalized Smoluchowski equation is employed to describe the dynamical process of conformational change of the single molecule. The resonant trajectories of the emitted photon number $\\langle N \\rangle$ and Mandel's $Q$ parameter, in the space of conformational coordinates $\\bm{\\mathcal{X}}$ and frequency $\\omega_L$ of the external field (the $\\bm{\\mathcal{X}}-\\omega_L$ space), can be used to rebuild the conformation of the single molecule. As an example, we consider the Thioflavin T molecule. We demonstrate that the conformations extracted by employing the photon counting statistics are in excellent agreement with the results of {\\it ab initio} computation.

  20. Large deviations of ergodic counting processes: a statistical mechanics approach.

    Science.gov (United States)

    Budini, Adrián A

    2011-07-01

The large-deviation method allows one to characterize an ergodic counting process in terms of a thermodynamic frame where a free energy function determines the asymptotic nonstationary statistical properties of its fluctuations. Here we study this formalism through a statistical mechanics approach, that is, with an auxiliary counting process that maximizes an entropy function associated with the thermodynamic potential. We show that the realizations of this auxiliary process can be obtained after applying a conditional measurement scheme to the original ones, providing in this way an alternative measurement interpretation of the thermodynamic approach. General results are obtained for renewal counting processes, that is, those where the time intervals between consecutive events are independent and defined by a unique waiting time distribution. The underlying statistical mechanics is controlled by the same waiting time distribution, rescaled by an exponential decay measured by the free energy function. Scale invariance, shift closure, and intermittency phenomena are obtained and interpreted in this context. Similar conclusions apply to nonrenewal processes when the memory between successive events is induced by a stochastic waiting time distribution.
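For a renewal counting process with waiting-time density $w(t)$, the free energy (scaled cumulant generating function) $\theta(s)$, defined by $\langle e^{sN_t}\rangle \sim e^{t\,\theta(s)}$, is fixed implicitly by the Laplace transform of $w$ (a standard large-deviation result, stated here for orientation rather than quoted from the paper):

```latex
\hat{w}(\theta) \equiv \int_0^{\infty} w(t)\, e^{-\theta t}\, dt ,
\qquad
e^{s}\,\hat{w}\big(\theta(s)\big) = 1 .
```

As a consistency check, the Poissonian case $w(t) = r e^{-rt}$ gives $\hat{w}(\theta) = r/(r+\theta)$ and hence $\theta(s) = r(e^{s}-1)$, the familiar Poisson cumulant generating function. The rescaling of $w(t)$ by the exponential factor $e^{-\theta(s) t}$ is exactly the "exponential decay measured by the free energy function" mentioned above.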

  1. Full counting statistics of a nonadiabatic electron pump

    Science.gov (United States)

    Croy, Alexander; Saalmann, Ulf

    2016-04-01

    Nonadiabatic charge pumping through a single-level quantum dot with periodically modulated parameters is studied theoretically. By means of a quantum-master-equation approach the full counting statistics of the system is obtained. We find a trinomial-probability distribution of the charge transfer, which adequately describes the reversal of the pumping current by sweeping the driving frequency. Further, we derive equations of motion for current and noise and solve those numerically for two different driving schemes. Both show interesting features, which can be fully analyzed due to the simple and generic model studied.

  2. Full counting statistics in the self-dual interacting resonant level model.

    Science.gov (United States)

    Carr, Sam T; Bagrets, Dmitry A; Schmitteckert, Peter

    2011-11-11

    We present a general technique to obtain the zero temperature cumulant generating function of the full counting statistics of charge transfer in interacting impurity models out of equilibrium from time-dependent simulations on a lattice. We demonstrate the technique with application to the self-dual interacting resonant level model, where very good agreement between numerical simulations using the density matrix renormalization group and those obtained analytically from the thermodynamic Bethe ansatz is found. We show from the exact form of counting statistics that the quasiparticles involved in transport carry charge 2e in the low bias regime and e/2 in the high bias regime.

  3. Finite-frequency counting statistics of electron transport: Markovian theory

    Energy Technology Data Exchange (ETDEWEB)

    Marcos, D; Aguado, R [Departamento de Teoria y Simulacion de Materiales, Instituto de Ciencia de Materiales de Madrid, CSIC, Cantoblanco 28049, Madrid (Spain); Emary, C; Brandes, T, E-mail: david.marcos@icmm.csic.es [Institut fuer Theoretische Physik, Hardenbergstrasse 36, TU Berlin, D-10623 Berlin (Germany)

    2010-12-15

    We present a theory of frequency-dependent counting statistics of electron transport through nanostructures within the framework of Markovian quantum master equations. Our method allows the calculation of finite-frequency current cumulants of arbitrary order, as we explicitly show for the second- and third-order cumulants. Our formulae generalize previous zero-frequency expressions in the literature and can be viewed as an extension of MacDonald's formula beyond shot noise. When combined with an appropriate treatment of tunneling using, e.g., the Liouvillian perturbation theory in Laplace space, our method can deal with arbitrary bias voltages and frequencies, as we illustrate with the paradigmatic example of transport through a single resonant level model. We discuss various interesting limits, including the recovery of the fluctuation-dissipation theorem near linear response, as well as some drawbacks inherent to the Markovian description arising from the neglect of quantum fluctuations.

  4. Reprint of : Full counting statistics of Majorana interferometers

    Science.gov (United States)

    Strübi, Grégory; Belzig, Wolfgang; Schmidt, Thomas L.; Bruder, Christoph

    2016-08-01

    We study the full counting statistics of interferometers for chiral Majorana fermions with two incoming and two outgoing Dirac fermion channels. In the absence of interactions, the FCS can be obtained from the 4×4 scattering matrix S that relates the outgoing Dirac fermions to the incoming Dirac fermions. After presenting explicit expressions for the higher-order current correlations for a modified Hanbury Brown-Twiss interferometer, we note that the cumulant-generating function can be interpreted such that unit-charge transfer processes correspond to two independent half-charge transfer processes, or alternatively, to two independent electron-hole conversion processes. By a combination of analytical and numerical approaches, we verify that this factorization property holds for a general SO(4) scattering matrix, i.e. for a general interferometer geometry.

  5. Particle number counting statistics in ideal Bose gases.

    Science.gov (United States)

    Weiss, C; Wilkens, M

    1997-11-10

We discuss the exact particle number counting statistics of degenerate ideal Bose gases in the microcanonical, canonical, and grand-canonical ensemble, respectively, for various trapping potentials. We then invoke the Maxwell's demon ensemble [Navez et al., Phys. Rev. Lett. (1997)] and show that for a large total number of particles the root-mean-square fluctuation of the condensate occupation scales as $\\delta N_0 \\propto (T/T_c)^r N^s$, with scaling exponents $r = 3/2$, $s = 1/2$ for the 3D harmonic-oscillator trapping potential, and $r = 1$, $s = 2/3$ for the 3D box. We derive an explicit expression for $r$ and $s$ in terms of the spatial dimension $D$ and the spectral index $\\sigma$ of the single-particle energy spectrum. Our predictions also apply to systems where Bose-Einstein condensation does not occur. We point out that the condensate fluctuations in the microcanonical and canonical ensembles respect the principle of thermodynamic equivalence.

  6. High Count Rate Electron Probe Microanalysis

    Science.gov (United States)

    Geller, Joseph D.; Herrington, Charles

    2002-01-01

Reducing the measurement uncertainty of quantitative analyses made using electron probe microanalyzers (EPMA) requires a careful study of the individual uncertainties from each definable step of the measurement. Those steps include measuring the incident electron beam current and voltage, knowing the angle between the electron beam and the sample (takeoff angle), collecting the emitted x rays from the sample, comparing the emitted x-ray flux to known standards (to determine the k-ratio), and transforming the k-ratio to concentration using algorithms which include, as a minimum, the atomic number, absorption, and fluorescence corrections. This paper discusses the collection and counting of the emitted x rays, which are diffracted into gas-flow or sealed proportional x-ray detectors. The uncertainty in the number of collected x rays decreases as the number of counts increases; the uncertainty of the collected signal is fully described by Poisson statistics. Increasing the number of x rays collected means either counting longer or counting at a higher rate. Counting longer increases the analysis time, which may become excessive for the desired uncertainty, and instrument drift also becomes an issue. Counting at higher rates has its own limitations, which are a function of the detector physics and the detection electronics. Since the beginning of EPMA analysis, analog electronics have been used to amplify and discriminate the x-ray-induced ionizations within the proportional counter. This paper discusses the use of digital electronics for this purpose. These electronics are similar to those used for energy-dispersive analysis of x rays with either Si(Li) or Ge(Li) detectors, except that the shaping time constants are much smaller. PMID:27446749

  7. RESPONSE OF NEUTRON MONITORS TO COSMIC RAY COUNTS: A STATISTICAL APPROACH

    Directory of Open Access Journals (Sweden)

    R. BHATTACHARYA

    2013-09-01

Cosmic rays became a subject of systematic study with the invention of the neutron monitor by Simpson, but regular recording of cosmic ray counts started only with the International Geophysical Year, at locations in different climatic zones around the globe. Here a statistical analysis is performed to investigate the degree of response of different monitors to cosmic ray counts. No significant difference is observed in the statistical analysis if cosmic ray counts are normalized with respect to their mean counts in the respective solar cycles. The correlation between the cosmic ray counts of any two stations is found to range from 0.88 to 0.99.

  8. Experimental reconstruction of photon statistics without photon counting.

    Science.gov (United States)

    Zambra, Guido; Andreoni, Alessandra; Bondani, Maria; Gramegna, Marco; Genovese, Marco; Brida, Giorgio; Rossi, Andrea; Paris, Matteo G A

    2005-08-05

    Experimental reconstructions of photon number distributions of both continuous-wave and pulsed light beams are reported. Our scheme is based on on/off avalanche photo-detection assisted by maximum-likelihood estimation and does not involve photon counting. Reconstructions of the distribution for both semiclassical and quantum states of light are reported for single-mode as well as for multi-mode beams.

  9. Unveiling the Gamma-ray Source Count Distribution below the Fermi Detection Limit with Photon Statistics

    CERN Document Server

    Zechlin, Hannes-S; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea

    2015-01-01

The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi-LAT photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b|>30 deg) between 1 GeV and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into: (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6-year Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power-law of index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2x10^{-11} cm^{-2}s^{-1}, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken pow...

  10. Statistical Measurement of the Gamma-ray Source-count Distribution as a Function of Energy

    CERN Document Server

    Zechlin, Hannes-S; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco

    2016-01-01

    Statistical properties of photon count maps have recently been proven as a new tool to study the composition of the gamma-ray sky with high precision. We employ the 1-point probability distribution function of 6 years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 GeV and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of 50 GeV. The index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of 2.2^{+0.7}_{-0.3} in the energy band between 50 GeV and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point source populations probed by this method can explain 83^{+7}_{-13}% (81^{+52}_{-19}%) of the extrag...

  11. Counting

    Institute of Scientific and Technical Information of China (English)

    许有国

    2005-01-01

    Most people began to count in tens because they had ten fingers on their hands. But in some countries, people counted on one hand and used the three parts of their four fingers. So they counted in twelves, not in tens.

  12. Monte Carlo study of single-barrier structure based on exclusion model full counting statistics

    Institute of Scientific and Technical Information of China (English)

    Chen Hua; Du Lei; Qu Cheng-Li; He Liang; Chen Wen-Hao; Sun Peng

    2011-01-01

Unlike the usual theoretical work on full counting statistics, which focuses on computing higher-order cumulants from the cumulant generating function in electrical structures, a Monte Carlo simulation of a single-barrier structure is performed to obtain time series for two types of widely applicable exclusion models: the counter-flows model and the tunnel model. Using Matlab high-order spectrum analysis, the validity of the Monte Carlo method is demonstrated through the first four cumulants extracted from the time series, which agree with those obtained from the cumulant generating function. Comparing the counter-flows model with the tunnel model in a single-barrier structure, it is found that the essential difference between them is that the Pauli principle holds strictly in the former and only statistically in the latter.

  13. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-01-25

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials.

  14. Photon counts statistics of squeezed and multi-mode thermal states of light on multiplexed on-off detectors

    CERN Document Server

    Chrapkiewicz, Radosław

    2015-01-01

    Photon number resolving detectors can be highly useful for studying the statistics of multi-photon quantum states of light. In this work we study the count statistics of different states of light measured on multiplexed on-off detectors. We put special emphasis on artificial nonclassical features of the obtained statistics. We show new ways to derive analytical formulas for count statistics and their moments. Using our approach, we derive, for the first time, the statistical moments for multi-mode thermal states measured on multiplexed on-off detectors. We use them to determine empirical Mandel parameters and recently proposed subbinomial parameters suitable for tests of nonclassicality of the measured states. Additionally, we investigate sub-Poissonian and superbunching properties of the two-mode squeezed state measured on a pair of multiplexed detectors, and we present results for the Fano factor and second-order correlation function for these states.
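For reference, the Mandel parameter mentioned above can be estimated from raw count records. The sketch below uses hypothetical coherent and single-mode thermal count data, not the multiplexed-detector statistics derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def mandel_q(counts):
    """Mandel Q = (Var(n) - <n>) / <n>; Q < 0 flags sub-Poissonian light."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean() - 1.0

# Hypothetical count records: coherent light is Poissonian (Q ≈ 0),
# single-mode thermal light is super-Poissonian (Q ≈ <n>).
coherent = rng.poisson(3.0, 100_000)
# Bose-Einstein counts with mean 3 (shifted geometric distribution):
thermal = rng.geometric(p=1.0 / (1.0 + 3.0), size=100_000) - 1

q_coh = mandel_q(coherent)
q_th = mandel_q(thermal)
print(q_coh)  # ≈ 0
print(q_th)   # ≈ 3
```

On real multiplexed on-off detectors the raw click statistics are distorted relative to the photon statistics, which is precisely the artificial-nonclassicality issue the abstract warns about.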

  15. Bias Expansion of Spatial Statistics and Approximation of Differenced Lattice Point Counts

    Indian Academy of Sciences (India)

    Daniel J Nordman; Soumendra N Lahiri

    2011-05-01

    Investigations of spatial statistics, computed from lattice data in the plane, can lead to a special lattice point counting problem. The statistical goal is to expand the asymptotic expectation or large-sample bias of certain spatial covariance estimators, where this bias typically depends on the shape of a spatial sampling region. In particular, such bias expansions often require approximating a difference between two lattice point counts, where the counts correspond to a set of increasing domain (i.e., the sampling region) and an intersection of this set with a vector translate of itself. Non-trivially, the approximation error needs to be of smaller order than the spatial region’s perimeter length. For all convex regions in 2-dimensional Euclidean space and certain unions of convex sets, we show that a difference in areas can approximate a difference in lattice point counts to this required accuracy, even though area can poorly measure the lattice point count of any single set involved in the difference. When investigating large-sample properties of spatial estimators, this approximation result facilitates direct calculation of limiting bias, because, unlike counts, differences in areas are often tractable to compute even with non-rectangular regions. We illustrate the counting approximations with two statistical examples.

  16. Systematic and Statistical Errors Associated with Nuclear Decay Constant Measurements Using the Counting Technique

    Science.gov (United States)

    Koltick, David; Wang, Haoyu; Liu, Shih-Chieh; Heim, Jordan; Nistor, Jonathan

    2016-03-01

    Typical nuclear decay constants are measured at an accuracy level of 10^-2. Numerous applications, such as tests of unconventional theories, dating of materials, and long-term inventory evolution, require decay-constant accuracy at a level of 10^-4 to 10^-5. The statistical and systematic errors associated with precision measurements of decays using the counting technique are presented. Precision requires high count rates, which introduces time-dependent dead-time and pile-up corrections. An approach to overcome these issues, based on continuous recording of the detector current, is presented. Other systematic corrections include the time-dependent dead time due to background radiation, control of target motion and radiation flight-path variation due to environmental conditions, and time-dependent effects caused by scattered events. The incorporation of blind experimental techniques can help make the measurement independent of past results. A spectrometer design and data analysis that can accomplish these goals are reviewed. The author would like to thank TechSource, Inc. and Advanced Physics Technologies, LLC. for their support in this work.
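A minimal sketch of a dead-time correction, assuming the standard non-paralyzable model rather than the time-dependent treatment the abstract alludes to; the rate and dead time below are illustration values:

```python
def deadtime_correct(measured_rate, tau):
    """Non-paralyzable dead-time correction: true = measured / (1 - measured*tau).
    tau is the dead time per recorded event, in the same time units as the rate."""
    loss = measured_rate * tau
    if loss >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_rate / (1.0 - loss)

# E.g. 9.5e4 counts/s observed with a 1 microsecond dead time:
true_rate = deadtime_correct(9.5e4, 1e-6)
print(true_rate)  # ≈ 1.05e5 counts/s
```

At the 10^-4 to 10^-5 accuracy the abstract targets, tau itself must be known to comparable precision, which is one reason the authors advocate recording the detector current instead.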

  17. High counting rate resistive-plate chamber

    Science.gov (United States)

    Peskov, V.; Anderson, D. F.; Kwan, S.

    1993-05-01

    Parallel-plate avalanche chambers (PPAC) are widely used in physics experiments because they are fast (less than 1 ns) and have a very simple construction: just two parallel metallic plates or mesh electrodes. Depending on the applied voltage they may work either in spark mode or avalanche mode. The advantage of the spark mode of operation is a large signal amplitude from the chamber; the disadvantage is that there is a large dead time (msec) for the entire chamber after an event. The main advantage of the avalanche mode is a high rate capability of 10^5 counts/mm^2. A resistive-plate chamber (RPC) is similar to the PPAC in construction except that one or both of the electrodes are made from high-resistivity (greater than 10^10 Ω·cm) materials. In practice RPCs are usually used in the spark mode. Resistive electrodes are charged by sparks, locally reducing the actual electric field in the gap. The size of the charged surface is about 10 mm^2, leaving the rest of the detector unaffected. Therefore, the rate capability of such detectors in the spark mode is considerably higher than that of conventional spark counters. Among the different glasses tested, the best results were obtained with electron-type conductive glasses, which obey Ohm's law. Most of the work with such glasses was done with high-pressure parallel-plate chambers (10 atm) for time-of-flight measurements. Resistive glasses have been expensive and produced only in small quantities. Now resistive glasses are commercially available, although they are still expensive in small-scale production. Motivated by the positive experience of different groups working with resistive glasses, it was decided to revisit the old idea of using this glass for the RPC. This work has investigated the possibility of using the RPC at 1 atm and in the avalanche mode. This has several advantages: simplicity of construction, high rate capability, low-voltage operation, and the ability to work with non-flammable gases.

  18. Unveiling the Gamma-Ray Source Count Distribution Below the Fermi Detection Limit with Photon Statistics

    Science.gov (United States)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea

    2016-08-01

    The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| ≥ 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6 yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2×10^-11 cm^-2 s^-1, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at 2.1^{+1.0}_{-1.3}×10^-8 cm^-2 s^-1. The power-law index n_1 = 3.1^{+0.7}_{-0.5} for bright sources above the break hardens to n_2 = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4×10^-11 cm^-2 s^-1 at 95% confidence level. The high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.
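The broken power law described in this abstract is straightforward to evaluate; the sketch below plugs in the quoted best-fit values (break at 2.1×10^-8 cm^-2 s^-1, indices 3.1 and 1.97) with an arbitrary normalization:

```python
import numpy as np

def dnds_broken(S, S_b, n1, n2, norm=1.0):
    """Broken power law dN/dS: index n1 for fluxes above the break S_b,
    index n2 below it, continuous at S_b."""
    S = np.asarray(S, dtype=float)
    return norm * np.where(S >= S_b, (S / S_b) ** -n1, (S / S_b) ** -n2)

# Fluxes in cm^-2 s^-1, spanning the measured range:
S = np.logspace(-11, -6, 6)
print(dnds_broken(S, S_b=2.1e-8, n1=3.1, n2=1.97))
```

The "hardening" below the break means faint sources follow the shallower index 1.97, so their integrated flux contribution converges rather than diverging toward zero flux.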

  19. Using spatiotemporal statistical models to estimate animal abundance and infer ecological dynamics from survey counts

    Science.gov (United States)

    Conn, Paul B.; Johnson, Devin S.; Ver Hoef, Jay M.; Hooten, Mevin B.; London, Joshua M.; Boveng, Peter L.

    2015-01-01

    Ecologists often fit models to survey data to estimate and explain variation in animal abundance. Such models typically require that animal density remains constant across the landscape where sampling is being conducted, a potentially problematic assumption for animals inhabiting dynamic landscapes or otherwise exhibiting considerable spatiotemporal variation in density. We review several concepts from the burgeoning literature on spatiotemporal statistical models, including the nature of the temporal structure (i.e., descriptive or dynamical) and strategies for dimension reduction to promote computational tractability. We also review several features as they specifically relate to abundance estimation, including boundary conditions, population closure, choice of link function, and extrapolation of predicted relationships to unsampled areas. We then compare a suite of novel and existing spatiotemporal hierarchical models for animal count data that permit animal density to vary over space and time, including formulations motivated by resource selection and allowing for closed populations. We gauge the relative performance (bias, precision, computational demands) of alternative spatiotemporal models when confronted with simulated and real data sets from dynamic animal populations. For the latter, we analyze spotted seal (Phoca largha) counts from an aerial survey of the Bering Sea where the quantity and quality of suitable habitat (sea ice) changed dramatically while surveys were being conducted. Simulation analyses suggested that multiple types of spatiotemporal models provide reasonable inference (low positive bias, high precision) about animal abundance, but have potential for overestimating precision. Analysis of spotted seal data indicated that several model formulations, including those based on a log-Gaussian Cox process, had a tendency to overestimate abundance. By contrast, a model that included a population closure assumption and a scale prior on total

  20. Counting statistics of chaotic resonances at optical frequencies: Theory and experiments

    Science.gov (United States)

    Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng

    2017-07-01

    A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, in the perspective of testing fractal Weyl laws at optical frequencies. In order to surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand, and only count the high-Q whispering-gallery modes (WGMs) directly. That enables us to draw statistical information on the more lossy chaotic resonances, coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, semiclassical] to interpret the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].

  1. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    Science.gov (United States)

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are
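A toy version of the model-selection step described above, comparing a Poisson fit against a negative binomial fit by AIC on simulated overdispersed counts (hypothetical parameters, not the survey data):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
# Hypothetical overdispersed flock counts: NB with mean 4 and size r = 2.
data = rng.negative_binomial(n=2, p=2 / (2 + 4.0), size=2000)

# Poisson MLE: lambda equals the sample mean; AIC = 2k - 2*loglik.
lam = data.mean()
aic_pois = 2 * 1 - 2 * stats.poisson.logpmf(data, lam).sum()

# Negative binomial MLE over (r, p) by numerical optimization.
def nb_nll(params):
    r, p = params
    return -stats.nbinom.logpmf(data, r, p).sum()

res = optimize.minimize(nb_nll, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
aic_nb = 2 * 2 + 2 * res.fun

print(aic_pois, aic_nb)  # the NB model wins (lower AIC) on overdispersed data
```

The zero-inflated and positive (zero-truncated) variants compared in the paper follow the same pattern, with the likelihood swapped out accordingly.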

  2. Statistical analysis of dark count rate in Geiger-mode APD FPAs

    Science.gov (United States)

    Itzler, Mark A.; Krishnamachari, Uppili; Chau, Quan; Jiang, Xudong; Entwistle, Mark; Owens, Mark; Slomkowski, Krystyna

    2014-10-01

    We present a temporal statistical analysis of the array-level dark count behavior of Geiger-mode avalanche photodiode (GmAPD) focal plane arrays that distinguishes between Poissonian intrinsic dark count rate and non-Poissonian crosstalk counts by considering "inter-arrival" times between successive counts from the entire array. For 32 x 32 format sensors with 100 μm pixel pitch, we show the reduction of crosstalk for smaller active area sizes within the pixel. We also compare the inter-arrival time behavior for arrays with narrow band (900 - 1100 nm) and broad band (900 - 1600 nm) spectral response. We then consider a similar analysis of larger format 128 x 32 arrays. As a complement to the temporal analysis, we describe the results of a spatial analysis of crosstalk events. Finally, we propose a simple model for the impact of crosstalk events on the Poissonian statistics of intrinsic dark counts that provides a qualitative explanation for the results of the inter-arrival time analysis for arrays with varying degrees of crosstalk.
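The inter-arrival-time idea can be sketched as follows: a purely Poissonian dark-count stream has exponentially distributed intervals, while crosstalk produces an excess of very short intervals. All rates and delays below are made-up illustration values, not GmAPD measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical event stream: Poissonian dark counts at rate R, plus, for a
# fraction pc of events, a correlated "crosstalk" partner 50 ns later.
R, pc, n = 1e5, 0.1, 50_000                # rate (1/s), crosstalk prob, primaries
t = np.cumsum(rng.exponential(1.0 / R, n))
partners = t[rng.random(n) < pc] + 50e-9
times = np.sort(np.concatenate([t, partners]))

dt = np.diff(times)
# Fraction of intervals shorter than 100 ns, versus the pure-Poisson prediction:
short = (dt < 1e-7).mean()
expected_poisson = 1 - np.exp(-1e-7 * R)   # P(dt < 100 ns) if purely Poissonian
print(short, expected_poisson)
```

The large excess of short intervals over the exponential prediction is exactly the non-Poissonian signature the inter-arrival analysis exploits to separate crosstalk from intrinsic dark counts.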

  3. Statistical connection of peak counts to power spectrum and moments in weak-lensing field

    Science.gov (United States)

    Shirasaki, Masato

    2017-02-01

    The number density of local maxima of weak-lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in weak-lensing field by using 1000 ray-tracing simulations. We construct a local transformation of lensing field K to a new Gaussian field y, named local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of K can be reproduced from a single Gaussian field y and monotonic relation between y and K. Therefore, the correct information of two-point clustering and any order of moments in weak-lensing field should be preserved under local-Gaussianized transformation. We then examine if local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to explain weak-lensing peak counts in the absence of shape noise. The prediction by local-Gaussianized transformation underestimates the simulated peak counts with a level of ~20-30 per cent over a wide range of peak heights. Local-Gaussianized transformation can predict the weak-lensing peak counts with an ~10 per cent accuracy in the presence of shape noise. Our analyses suggest that the cosmological information beyond power spectrum and its moments would be necessary to predict the weak-lensing peak counts with a percent-level accuracy, which is an expected statistical uncertainty in upcoming wide-field galaxy surveys.

  4. High Reproducibility of ELISPOT Counts from Nine Different Laboratories

    Directory of Open Access Journals (Sweden)

    Srividya Sundararaman

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable background intensities. Due to the subjective nature of judging maximal and minimal spot sizes, different investigators come up with different numbers. This study aims to determine whether statistics-based, automated size-gating can harmonize the number of spot counts calculated between different laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate, and an image file of the plate, was counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™, relying on subjective counting parameters set by the respective investigators, and (B) SmartCount™, an automated counting protocol by the ImmunoSpot® Software that uses statistics-based spot size auto-gating with spot intensity auto-thresholding. The average coefficient of variation (CV) for the mean values between independent laboratories was 26.7% when counting with Basic Count™, and 6.7% when counting with SmartCount™. Our data indicate that SmartCount™ allows harmonization of ELISPOT counting results between different laboratories and investigators.
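The reported spread is the usual coefficient of variation across laboratory means; the numbers below are hypothetical lab means for illustration, not the study's data:

```python
import numpy as np

def cv_percent(lab_means):
    """Coefficient of variation (%) across laboratory mean spot counts."""
    x = np.asarray(lab_means, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Hypothetical mean counts reported by nine labs for the same plate:
subjective = [412, 530, 389, 601, 455, 502, 344, 580, 470]
cv = cv_percent(subjective)
print(cv)
```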

  5. Dual adaptive statistical approach for quantitative noise reduction in photon-counting medical imaging: application to nuclear medicine images.

    Science.gov (United States)

    Hannequin, Pascal Paul

    2015-06-07

    Noise reduction in photon-counting images remains challenging, especially at low count levels. We have developed an original procedure which associates two complementary filters using a Wiener-derived approach, combining two statistically adaptive filters into a dual-weighted (DW) filter. The first, a statistically weighted adaptive (SWA) filter, replaces the central pixel of a sliding window with a statistically weighted sum of its neighbors. The second, a statistical and heuristic noise extraction (extended) (SHINE-Ext) filter, performs a discrete cosine transformation (DCT) using sliding blocks; each block is reconstructed using its significant components, which are selected using tests derived from multiple linear regression (MLR). The two filters are weighted according to Wiener theory. This approach has been validated using a numerical phantom and a real planar Jaszczak phantom, and illustrated using planar bone scintigraphy and myocardial single-photon emission computed tomography (SPECT) data. The performance of the filters has been tested using the mean normalized absolute error (MNAE) between the filtered images and the reference noiseless or high-count images. Results show that the proposed filters quantitatively decrease the MNAE in the images and thus increase the signal-to-noise ratio (SNR), allowing one to work with lower-count images. The SHINE-Ext filter is well suited to large images and low-variance areas; DW filtering is efficient for small images and in high-variance areas. The relative proportion of eliminated noise generally decreases as the count level increases. In practice, SHINE filtering alone is recommended when pixel spacing is less than one-quarter of the effective resolution of the system and/or the size of the objects of interest. It can also be used when the practical interest of high frequencies is low. In any case, DW filtering will be preferable. The proposed filters have been applied to nuclear
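A toy block-DCT filter in the spirit of SHINE, but keeping a fixed low-frequency mask instead of the regression-based significance tests that SHINE-Ext actually uses:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(4)

def block_dct_denoise(img, block=8, keep=4):
    """Toy sketch: keep only the lowest keep x keep DCT coefficients of each
    block. SHINE-Ext instead selects significant coefficients with tests
    derived from multiple linear regression."""
    out = np.zeros_like(img, dtype=float)
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            c = dctn(img[i:i + block, j:j + block], norm="ortho")
            mask = np.zeros_like(c)
            mask[:keep, :keep] = 1.0
            out[i:i + block, j:j + block] = idctn(c * mask, norm="ortho")
    return out

# Flat phantom with Poisson counting noise (mean 20 counts/pixel):
img = rng.poisson(20.0, size=(64, 64)).astype(float)
den = block_dct_denoise(img)
print(img.var(), den.var())  # variance drops after filtering
```

Because the DC coefficient of each block is always retained, the filter preserves counts (the mean) while suppressing high-frequency Poisson noise, which is the property a quantitative nuclear-medicine filter needs.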

  6. High Channel Count, Low Cost, Multiplexed FBG Sensor Systems

    Institute of Scientific and Technical Information of China (English)

    J. J. Pan; FengQing Zhou; Kejian Guan; Joy Jiang; Liang Dong; Albert Li; Xiangdong Qiu; Jonathan Zhang

    2003-01-01

    With rich product-development experience in WDM telecommunication networks, we introduce several high channel count, multiplexed FBG fiber optic sensor systems featuring reliable high performance and low cost.

  7. Full counting statistics of level renormalization in electron transport through double quantum dots.

    Science.gov (United States)

    Luo, JunYan; Jiao, HuJun; Shen, Yu; Cen, Gang; He, Xiao-Ling; Wang, Changrong

    2011-04-13

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  8. Full counting statistics of level renormalization in electron transport through double quantum dots

    Energy Technology Data Exchange (ETDEWEB)

    Luo Junyan; Shen Yu; Cen Gang; He Xiaoling; Wang Changrong [School of Science, Zhejiang University of Science and Technology, Hangzhou 310023 (China); Jiao Hujun, E-mail: jyluo@zust.edu.cn [Department of Physics, Shanxi University, Taiyuan, Shanxi 030006 (China)

    2011-04-13

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  9. Full counting statistics of laser excited Rydberg aggregates in a one-dimensional geometry

    CERN Document Server

    Schempp, H; Robert-de-Saint-Vincent, M; Hofmann, C S; Breyel, D; Komnik, A; Schönleber, D W; Gärttner, M; Evers, J; Whitlock, S; Weidemüller, M

    2014-01-01

    We experimentally study the full counting statistics of few-body Rydberg aggregates excited from a quasi-one-dimensional Rydberg gas. We measure asymmetric excitation spectra and increased second and third order statistical moments of the Rydberg number distribution, from which we determine the average aggregate size. Direct comparisons with numerical simulations reveal the presence of liquid-like spatial correlations, and indicate sequential growth of the aggregates around an initial grain. These findings demonstrate the importance of dissipative effects in strongly correlated Rydberg gases and introduce a way to study spatio-temporal correlations in strongly-interacting many-body quantum systems without imaging.

  10. Counting Vesicular Release Events Reveals Binomial Release Statistics at Single Glutamatergic Synapses.

    Science.gov (United States)

    Malagon, Gerardo; Miki, Takafumi; Llano, Isabel; Neher, Erwin; Marty, Alain

    2016-04-06

    Many central glutamatergic synapses contain a single presynaptic active zone and a single postsynaptic density. However, the basic functional properties of such "simple synapses" remain unclear. One important step toward understanding simple synapse function is to analyze the number of synaptic vesicles released in such structures per action potential, but this goal has remained elusive until now. Here, we describe procedures that allow reliable vesicular release counting at simple synapses between parallel fibers and molecular layer interneurons of rat cerebellar slices. Our analysis involves local extracellular stimulation of single parallel fibers and deconvolution of resulting EPSCs using quantal signals as template. We observed a reduction of quantal amplitudes (amplitude occlusion) in pairs of consecutive EPSCs due to receptor saturation. This effect is larger (62%) than previously reported and primarily reflects receptor activation rather than desensitization. In addition to activation-driven amplitude occlusion, each EPSC reduces amplitudes of subsequent events by an estimated 3% due to cumulative desensitization. Vesicular release counts at simple synapses follow binomial statistics with a maximum that varies from 2 to 10 among experiments. This maximum presumably reflects the number of docking sites at a given synapse. These results show striking similarities, as well as significant quantitative differences, with respect to previous results at simple GABAergic synapses. It is generally accepted that the output signal of individual central synapses saturates at high release probability, but it remains unclear whether the source of saturation is presynaptic, postsynaptic, or both presynaptic and postsynaptic. To clarify this and other issues concerning the function of synapses, we have developed new recording and analysis methods at single central glutamatergic synapses. We find that individual release events engage a high proportion of postsynaptic
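Binomial release statistics imply under-dispersion (variance below the mean), which also lets one back out the number of docking sites from moments alone. A sketch with hypothetical parameters, not the paper's recordings:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical simple synapse: N docking sites, each releasing independently
# with probability p per action potential -> binomial release counts.
N_sites, p = 6, 0.4
counts = rng.binomial(N_sites, p, size=5000)

# Binomial counts are under-dispersed relative to Poisson:
# Var = N p (1 - p) < mean = N p.
m, v = counts.mean(), counts.var()
print(m, v)  # ≈ 2.4 and ≈ 1.44

# Moment-based estimates of p and N from the counts alone:
p_hat = 1 - v / m
N_hat = m / p_hat
print(N_hat, p_hat)
```

In the experiment, the maximum observed count per action potential (2 to 10 across synapses) gives an independent handle on N, the presumed number of docking sites.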

  11. Statistical analysis of data from dilution assays with censored correlated counts.

    Science.gov (United States)

    Quiroz, Jorge; Wilson, Jeffrey R; Roychoudhury, Satrajit

    2012-01-01

    Frequently, count data obtained from dilution assays are subject to an upper detection limit, and as such, data obtained from these assays are usually censored. Also, counts from the same subject at different dilution levels are correlated. Ignoring the censoring and the correlation may provide unreliable and misleading results. Therefore, any meaningful data modeling requires that the censoring and the correlation be simultaneously addressed. Such comprehensive approaches to modeling censoring and correlation are not widely used in the analysis of dilution assay data. Traditionally, these data are analyzed using a general linear model on a logarithmic-transformed average count per subject. However, this traditional approach ignores the between-subject variability and risks providing inconsistent results and unreliable conclusions. In this paper, we propose the use of a censored negative binomial model with normal random effects to analyze such data. This model addresses, in addition to the censoring and the correlation, any overdispersion that may be present in count data. The model is shown to be widely accessible through the use of several modern statistical software packages. Copyright © 2012 John Wiley & Sons, Ltd.
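A sketch of the censoring term in such a likelihood, assuming a plain censored negative binomial without the paper's per-subject normal random effects; parameters and the detection limit are illustration values:

```python
import numpy as np
from scipy import stats

def censored_nb_loglik(counts, limit, r, p):
    """Log-likelihood for counts with an upper detection limit: observations
    at the limit are right-censored and contribute P(X >= limit) instead of
    the pmf. (The full model in the paper adds normal random effects.)"""
    counts = np.asarray(counts)
    censored = counts >= limit
    ll = stats.nbinom.logpmf(counts[~censored], r, p).sum()
    # sf(k) = P(X > k), so P(X >= limit) = sf(limit - 1)
    ll += stats.nbinom.logsf(limit - 1, r, p) * censored.sum()
    return ll

rng = np.random.default_rng(6)
raw = rng.negative_binomial(3, 0.3, size=500)
obs = np.minimum(raw, 20)          # assay tops out at 20 counts
ll = censored_nb_loglik(obs, 20, 3, 0.3)
print(ll)
```

Treating the capped observations as exact counts would understate their probability; the survival-function term is what keeps the likelihood, and hence the fitted dilution curve, unbiased.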

  12. Optimization of high count rate event counting detector with Microchannel Plates and quad Timepix readout

    Energy Technology Data Exchange (ETDEWEB)

    Tremsin, A.S., E-mail: ast@ssl.berkeley.edu; Vallerga, J.V.; McPhate, J.B.; Siegmund, O.H.W.

    2015-07-01

    Many high resolution event counting devices process one event at a time and cannot register simultaneous events. In this article a frame-based readout event counting detector consisting of a pair of Microchannel Plates and a quad Timepix readout is described. More than 10^4 simultaneous events can be detected with a spatial resolution of ~55 µm, while >10^3 simultaneous events can be detected with <10 µm spatial resolution when event centroiding is implemented. The fast readout electronics is capable of processing >1200 frames/sec, while the global count rate of the detector can exceed 5×10^8 particles/s when no timing information on every particle is required. For the first generation Timepix readout, the timing resolution is limited by the Timepix clock to 10-20 ns. Optimization of the MCP gain, rear field voltage and Timepix threshold levels is crucial for the device performance, and that is the main subject of this article. These devices can be very attractive for applications where photon/electron/ion/neutron counting with high spatial and temporal resolution is required, such as energy resolved neutron imaging, Time of Flight experiments in lidar applications, experiments on photoelectron spectroscopy and many others.

  13. Automated counting of morphologically normal red blood cells by using digital holographic microscopy and statistical methods

    Science.gov (United States)

    Moon, Inkyu; Yi, Faliu

    2015-09-01

    In this paper we overview a method to automatically count morphologically normal red blood cells (RBCs) by using off-axis digital holographic microscopy and statistical methods. Three kinds of RBC are used as training and testing data. All of the RBC phase images are obtained with digital holographic microscopy (DHM) that is robust to transparent or semitransparent biological cells. For the determination of morphologically normal RBCs, the RBC's phase images are first segmented with marker-controlled watershed transform algorithm. Multiple features are extracted from the segmented cells. Moreover, the statistical method of Hotelling's T-square test is conducted to show that the 3D features from 3D imaging method can improve the discrimination performance for counting of normal shapes of RBCs. Finally, the classifier is designed by using statistical Bayesian algorithm and the misclassification rates are measured with leave-one-out technique. Experimental results show the feasibility of the classification method for calculating the percentage of each typical normal RBC shape.

  14. High quantum efficiency S-20 photocathodes for photon counting applications

    CERN Document Server

    Orlov, Dmitry A; Pinto, Serge Duarte; Glazenborg, Rene; Kernen, Emilie

    2016-01-01

    Based on conventional S-20 processes, a new series of high quantum efficiency (QE) photocathodes has been developed that can be specifically tuned for use in the ultraviolet, blue or green regions of the spectrum. The QE values exceed 30% at maximum response, and the dark count rate is found to be as low as 30 Hz/cm² at room temperature. This combination of properties along with a fast temporal response makes these photocathodes ideal for application in photon counting detectors.

  15. Statistical connection of peak counts to power spectrum and moments in weak lensing field

    CERN Document Server

    Shirasaki, Masato

    2016-01-01

    The number density of local maxima of the weak-lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, the cosmological information it carries is still unclear. We study the relationship between peak counts and other statistics of the weak-lensing field by using 1000 ray-tracing simulations. We construct a local transformation of the lensing field $\cal K$ to a new Gaussian field $y$, named the local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of $\cal K$ can be reproduced from a single Gaussian field $y$ and a monotonic relation between $y$ and $\cal K$. Therefore, the correct information on two-point clustering and moments of any order in the weak-lensing field should be preserved under the local-Gaussianized transformation. We then examine whether the local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to ...

  16. A Statistical Method to Constrain Faint Radio Source Counts Below the Detection Threshold

    CERN Document Server

    Mitchell-Wynne, Ketron; Afonso, Jose; Jarvis, Matt J

    2013-01-01

    We present a statistical method based on a maximum likelihood approach to constrain the number counts of extragalactic sources below the nominal flux-density limit of continuum imaging surveys. We extract flux densities from a radio map using positional information from an auxiliary catalogue and show that we can model the number counts of this undetected population down to flux density levels well below the detection threshold of the radio survey. We demonstrate the capabilities that our method will have with future generation wide-area radio surveys by performing simulations over various sky areas with a power-law dN/dS model. We generate a simulated power-law distribution with flux densities ranging from 0.1σ to 2σ, convolve this distribution with a Gaussian noise distribution of rms 10 µJy/beam, and are able to recover the counts from the noisy distribution. We then demonstrate the application of our method using data from the Faint Images of the Radio Sky at Twenty-Centimeters survey (FI...
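
A minimal sketch of the recovery idea described above, not the paper's exact pipeline: draw fluxes from a power law, add Gaussian noise well above the faint fluxes, and recover the slope by maximizing a likelihood that convolves the power law with the noise kernel. The flux range, noise rms, sample size, and slope grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

S_MIN, S_MAX = 1.0, 20.0   # assumed true flux range (noise-rms ~ microJy units)
SIGMA = 10.0               # assumed Gaussian noise rms
GAMMA_TRUE = 1.6           # assumed true slope, dN/dS ~ S^-gamma

def sample_power_law(gamma, n):
    """Inverse-CDF sampling of a S^-gamma density on [S_MIN, S_MAX] (gamma != 1)."""
    u = rng.random(n)
    a, b = S_MIN ** (1 - gamma), S_MAX ** (1 - gamma)
    return (a + u * (b - a)) ** (1.0 / (1 - gamma))

def log_likelihood(data, gamma, n_grid=400):
    """log L(gamma) with p(d) = integral p(S|gamma) * Normal(d - S; SIGMA) dS."""
    s = np.linspace(S_MIN, S_MAX, n_grid)
    ds = s[1] - s[0]
    p_s = s ** (-gamma)
    p_s /= p_s.sum() * ds                      # normalized flux density prior
    kernel = np.exp(-0.5 * ((data[:, None] - s[None, :]) / SIGMA) ** 2)
    kernel /= SIGMA * np.sqrt(2.0 * np.pi)
    p_d = (kernel * p_s).sum(axis=1) * ds      # noise-convolved model density
    return np.log(p_d).sum()

true_flux = sample_power_law(GAMMA_TRUE, 4000)
data = true_flux + rng.normal(0.0, SIGMA, size=true_flux.size)

grid = np.arange(1.0, 2.5, 0.05)
gamma_hat = grid[np.argmax([log_likelihood(data, g) for g in grid])]
```

Even though almost every individual flux here is below 2σ, the slope is recoverable because the likelihood uses the full shape of the noisy distribution rather than individual detections.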

  17. It's not the voting that's democracy, it's the counting: Statistical detection of systematic election irregularities

    CERN Document Server

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2012-01-01

    Democratic societies are built around the principle of free and fair elections, in which each citizen's vote should count equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies certain statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data collapse, we find that vote distributions of elections with alleged fraud show a kurtosis roughly a hundred times that of normal elections. As an example we show that reported irregularities in the 2011 Duma election are indeed well explained by systematic ballot stuffing, and we develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We show that if specific statistical properties are present in an election, the results do not represent the will of the people. We formulate a parametric test detecting these stati...
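
A toy illustration of why ballot stuffing inflates kurtosis (the district counts, vote shares, and stuffed fraction are invented for illustration, not taken from the paper): adding a small cluster of near-100% districts to an otherwise unimodal vote-share distribution produces a large excess kurtosis.

```python
import numpy as np

rng = np.random.default_rng(0)

def excess_kurtosis(x):
    """Sample excess kurtosis: fourth central moment over squared variance, minus 3."""
    d = x - x.mean()
    return (d ** 4).mean() / ((d ** 2).mean() ** 2) - 3.0

clean = rng.normal(55.0, 5.0, size=5000)          # vote share (%) per district
stuffed = clean.copy()
idx = rng.choice(clean.size, size=250, replace=False)
stuffed[idx] = rng.normal(97.0, 1.0, size=250)    # 5% of districts stuffed

k_clean = excess_kurtosis(clean)      # ~ 0 for a near-Gaussian distribution
k_stuffed = excess_kurtosis(stuffed)  # strongly positive: the far cluster dominates
```

The stuffed cluster sits many standard deviations from the bulk, so it dominates the fourth moment while adding comparatively little to the variance, which is what the kurtosis ratio picks up.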

  18. Full counting statistics of renormalized dynamics in open quantum transport system

    Energy Technology Data Exchange (ETDEWEB)

    Luo, JunYan, E-mail: jyluo@zust.edu.cn [School of Science, Zhejiang University of Science and Technology, Hangzhou, 310023 (China); Shen, Yu; He, Xiao-Ling [School of Science, Zhejiang University of Science and Technology, Hangzhou, 310023 (China); Li, Xin-Qi [Department of Chemistry, Hong Kong University of Science and Technology, Kowloon, Hong Kong SAR (China); State Key Laboratory for Superlattices and Microstructures, Institute of Semiconductors, Chinese Academy of Sciences, P.O. Box 912, Beijing 100083 (China); Department of Physics, Beijing Normal University, Beijing 100875 (China); Yan, YiJing [Department of Chemistry, Hong Kong University of Science and Technology, Kowloon, Hong Kong SAR (China)

    2011-11-28

    The internal dynamics of a double quantum dot system is renormalized due to coupling, respectively, with the transport electrodes and a dissipative heat bath. Their essential differences are identified unambiguously in the context of full counting statistics. The level-detuning renormalization caused by the electrode coupling gives rise to a fast-to-slow transport mechanism, which is not resolved at all in the average current but is revealed uniquely by pronounced super-Poissonian shot noise and skewness. The heat-bath coupling introduces an interdot-coupling renormalization, which results in an asymmetric Fano factor and an intriguing change of line shape in the skewness. -- Highlights: ► We study full counting statistics of electron transport through double quantum dots. ► Essential differences due to coupling to the electrodes and heat bath are identified. ► Level detuning induced by electrodes results in strongly enhanced shot noise and skewness. ► Interdot coupling renormalization due to heat bath leads to asymmetric noise and intriguing skewness.

  19. Statistical analysis of the fluctuating counts of fecal bacteria in the water of Lake Kinneret.

    Science.gov (United States)

    Hadas, Ora; Corradini, Maria G; Peleg, Micha

    2004-01-01

    Counts of E. coli, Enterococci and fecal coliforms at four sites around Lake Kinneret (the Sea of Galilee), collected every 2-4 weeks for about 5 years during 1995-2002, showed irregular fluctuations punctuated by aperiodic outbursts of variable magnitude. Because of the haphazard nature of fecal contamination and the large intervals between successive counts, these patterns were described by probabilistic models based on the truncated Laplace or extreme value distribution. Their applicability was tested by comparing the predicted frequencies of counts exceeding different levels, calculated from the first half of each record, with those actually observed in its second half. Despite the records' imperfections and minor violations of the underlying models' assumptions, there was reasonable agreement between the estimated and actual frequencies. This demonstrated that it is possible to translate the irregular fluctuation pattern into a set of probabilities of future high counts. In principle, such probabilities can be used to quantify the water's fecal contamination pattern and as a tool to assess the efficacy of preventive measures to reduce it.
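
A sketch of the split-record validation idea, with all parameters assumed rather than taken from the Lake Kinneret data: fit a Laplace distribution to the first half of a record, predict the probability of exceeding a high level, and compare with the frequency actually observed in the second half.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for a fluctuating (e.g. log-transformed) count record.
counts = rng.laplace(loc=100.0, scale=30.0, size=4000)
first, second = counts[:2000], counts[2000:]

# Laplace maximum-likelihood fit: location = median, scale = mean |deviation|.
mu = np.median(first)
b = np.mean(np.abs(first - mu))

threshold = 160.0
p_pred = 0.5 * np.exp(-(threshold - mu) / b)   # P(X > x) for x > mu (Laplace)
p_obs = np.mean(second > threshold)            # empirical exceedance, 2nd half
```

The comparison of `p_pred` with `p_obs` mirrors the paper's test of whether a record's first half can forecast the frequency of future high counts in its second half.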

  20. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration.
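
A minimal sketch of the statistical idea, with assumed intensity scaling and widths (the real analysis uses quantitatively estimated scattering cross-sections and a criterion to select the number of components): column intensities cluster around values that grow with the number of atoms, so a 1-D Gaussian mixture fitted by EM can assign each column an atom count.

```python
import numpy as np

rng = np.random.default_rng(1)

K = 3                                        # assumed number of mixture components
true_n = rng.integers(1, K + 1, size=600)    # atoms per column (ground truth)
intensity = rng.normal(true_n * 1.0, 0.1)    # intensity ~ n * I0 + noise (assumed)

# EM for a 1-D Gaussian mixture; means initialized from data quantiles.
mu = np.quantile(intensity, (np.arange(K) + 0.5) / K)
sigma = np.full(K, intensity.std())
w = np.full(K, 1.0 / K)
for _ in range(200):
    dens = w * np.exp(-0.5 * ((intensity[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)   # responsibilities (E-step)
    nk = resp.sum(axis=0)                           # effective component sizes
    w = nk / intensity.size                         # M-step updates
    mu = (resp * intensity[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (intensity[:, None] - mu) ** 2).sum(axis=0) / nk)

# Map each component to an atom count by its rank in mean intensity.
rank = np.argsort(np.argsort(mu))
counts_hat = rank[np.argmax(resp, axis=1)] + 1
accuracy = np.mean(counts_hat == true_n)
```

The widths of the components relative to their spacing control the counting accuracy, which is the paper's point about the inherent limitations of the approach.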

  1. Note: A high count rate real-time digital processing method for PGNAA data acquisition system

    Science.gov (United States)

    Liu, Yuzhe; Chen, Lian; Li, Feng; Liang, Futian; Jin, Ge

    2017-07-01

    The prompt gamma neutron activation analysis (PGNAA) technique is a real-time online method to analyze the composition of industrial materials. This paper presents a data acquisition system with a high count rate and a real-time digital processing method for PGNAA. Limited by the decay time of the detector, the ORTEC multi-channel analyzer (MCA) can normally achieve an average count rate of 100 kcps. However, this system uses an electrical technique to increase the average count rate and reduce dead time while maintaining good accuracy. Since the measuring time is usually limited to about 120 s, in order to accelerate the accumulation of the spectrum and reduce the statistical error, the average count rate is expected to reach more than 500 kcps.
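
To illustrate why dead-time reduction matters at these rates, here is the textbook non-paralyzable dead-time correction (the dead-time value is an assumption for illustration, not a parameter from this paper): events arriving within tau of a processed event are lost, so the measured rate understates the true rate.

```python
# Non-paralyzable dead-time model: measured m = n / (1 + n * tau).
TAU = 0.5e-6  # assumed effective dead time, seconds

def true_rate(measured_cps, tau=TAU):
    """Invert the dead-time relation to recover the true event rate n."""
    return measured_cps / (1.0 - measured_cps * tau)

# A measured 400 kcps corresponds to a true 500 kcps at 0.5 us dead time,
# i.e. 20% of events are already being lost at this rate.
n = true_rate(400e3)
```

The correction blows up as the measured rate approaches 1/tau, which is why shortening the effective dead time electronically, rather than correcting for it afterwards, is preferable at 500 kcps.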

  2. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    Science.gov (United States)

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced by the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimating the detection probability of birds during counts, including distance sampling, double-observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating the probability that a bird is detected during a count into two components: (1) the probability that the bird vocalizes during the count and (2) the probability that this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend that any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
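
The proposed decomposition can be sketched as simple arithmetic (all numbers here are invented for illustration): the overall detection probability is the product of the probability a bird vocalizes during the count and the probability the vocalization is heard, and the abundance estimate scales the raw count by its inverse.

```python
def abundance_estimate(count, p_vocalize, p_heard):
    """Scale a raw point count by the estimated overall detection probability."""
    p_detect = p_vocalize * p_heard   # P(detected) = P(vocalizes) * P(heard | vocalizes)
    return count / p_detect

# Hypothetical count: 28 birds detected, 70% of birds vocalize during the
# count window, and 80% of vocalizations are heard by the observer.
n_hat = abundance_estimate(28, 0.7, 0.8)
```

The two factors matter because they respond to different things: vocalization rate depends on species, season, and time of day, while the probability of hearing a song depends on observer ability and conditions, so estimating them separately (e.g. removal methods for the first, double-observer methods for the second) is more defensible than a single fudge factor.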

  3. Unifying quantum heat transfer in a nonequilibrium spin-boson model with full counting statistics

    Science.gov (United States)

    Wang, Chen; Ren, Jie; Cao, Jianshu

    2017-02-01

    To study the full counting statistics of quantum heat transfer in a driven nonequilibrium spin-boson model, we develop a generalized nonequilibrium polaron-transformed Redfield equation with an auxiliary counting field. This enables us to study the impact of qubit-bath coupling ranging from weak to strong regimes. Without external modulations, we observe maximal values of both steady-state heat flux and noise power in moderate coupling regimes, below which we find that these two transport quantities are enhanced by the finite-qubit-energy bias. With external modulations, the geometric-phase-induced heat flux shows a monotonic decrease upon increasing the qubit-bath coupling at zero qubit energy bias (without bias). While under the finite-qubit-energy bias (with bias), the geometric-phase-induced heat flux exhibits an interesting reversal behavior in the strong coupling regime. Our results unify the seemingly contradictory results in weak and strong qubit-bath coupling regimes and provide detailed dissections for the quantum fluctuation of nonequilibrium heat transfer.

  4. RCT: Module 2.03, Counting Errors and Statistics, Course 8768

    Energy Technology Data Exchange (ETDEWEB)

    Hillmer, Kurt T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-01

    Radiological sample analysis involves the observation of a random process that may or may not occur and an estimation of the amount of radioactive material present based on that observation. Across the country, radiological control personnel are using the activity measurements to make decisions that may affect the health and safety of workers at those facilities and their surrounding environments. This course will present an overview of measurement processes, a statistical evaluation of both measurements and equipment performance, and some actions to take to minimize the sources of error in count room operations. This course will prepare the student with the skills necessary for radiological control technician (RCT) qualification by passing quizzes, tests, and the RCT Comprehensive Phase 1, Unit 2 Examination (TEST 27566) and by providing in the field skills.

  5. Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY Ecole de physique Département de physique nucléaire et corpusculaire 24, quai Ernest-Ansermet 1211 Genève 4 Tél.: (022) 379 62 73 Fax: (022) 379 69 92   Lundi 17 octobre 2011 17h00 - Ecole de Physique, Auditoire Stueckelberg PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics » Prof. Jukka Pekola Low Temperature Laboratory, Aalto University Helsinki, Finland   First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is on developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...

  6. High quantum efficiency S-20 photocathodes in photon counting detectors

    Science.gov (United States)

    Orlov, D. A.; DeFazio, J.; Duarte Pinto, S.; Glazenborg, R.; Kernen, E.

    2016-04-01

    Based on conventional S-20 processes, a new series of high quantum efficiency (QE) photocathodes has been developed that can be specifically tuned for use in the ultraviolet, blue or green regions of the spectrum. The QE values exceed 30% at maximum response, and the dark count rate is found to be as low as 30 Hz/cm² at room temperature. This combination of properties along with a fast temporal response makes these photocathodes ideal for application in photon counting detectors, which is demonstrated with an MCP photomultiplier tube for single and multi-photoelectron detection.

  7. Statistical Methods for Unusual Count Data: Examples From Studies of Microchimerism.

    Science.gov (United States)

    Guthrie, Katherine A; Gammill, Hilary S; Kamper-Jørgensen, Mads; Tjønneland, Anne; Gadi, Vijayakrishna K; Nelson, J Lee; Leisenring, Wendy

    2016-10-21

    Natural acquisition of small amounts of foreign cells or DNA, referred to as microchimerism, occurs primarily through maternal-fetal exchange during pregnancy. Microchimerism can persist long-term and has been associated with both beneficial and adverse human health outcomes. Quantitative microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per total cell equivalents tested utilizes all available data and facilitates a comparison of rates between groups. We found that both the marginalized zero-inflated Poisson model and the negative binomial model can provide unbiased and consistent estimates of the overall association of exposure or study group with microchimerism detection rates. The negative binomial model remains the more accessible of these 2 approaches; thus, we conclude that the negative binomial model may be most appropriate for analyzing quantitative microchimerism data. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
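
A sketch of the overdispersion problem behind the model choice (the rates and dispersion used here are assumptions, not microchimerism data): Poisson counts have variance approximately equal to the mean, while data like these show variance far above the mean, which the negative binomial, a gamma-mixed Poisson, accommodates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Equidispersed counts: Poisson, variance ~= mean.
poisson = rng.poisson(5.0, size=20000)

# Overdispersed counts: negative binomial with n=2, p=0.2,
# i.e. mean = n(1-p)/p = 8 and variance = n(1-p)/p^2 = 40.
negbin = rng.negative_binomial(n=2, p=0.2, size=20000)

disp_poisson = poisson.var() / poisson.mean()   # ~= 1
disp_negbin = negbin.var() / negbin.mean()      # ~= 5, strong overdispersion
```

Fitting a Poisson model to data with a variance-to-mean ratio near 5 would understate the standard errors roughly by a factor of sqrt(5), which is why the abstract's recommendation of the negative binomial matters in practice.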

  8. Full-counting statistics of energy transport of molecular junctions in the polaronic regime

    Science.gov (United States)

    Tang, Gaomin; Yu, Zhizhou; Wang, Jian

    2017-08-01

    We investigate the full-counting statistics (FCS) of energy transport carried by electrons in molecular junctions for the Anderson-Holstein model in the polaronic regime. Using the two-time quantum measurement scheme, the generating function (GF) for the energy transport is derived and expressed as a Fredholm determinant in terms of Keldysh nonequilibrium Green’s function in the time domain. Dressed tunneling approximation is used in decoupling the phonon cloud operator in the polaronic regime. This formalism enables us to analyze the time evolution of energy transport dynamics after a sudden switch-on of the coupling between the dot and the leads towards the stationary state. The steady state energy current cumulant GF in the long time limit is obtained in the energy domain as well. Universal relations for steady state energy current FCS are derived under a finite temperature gradient with zero bias and this enabled us to express the equilibrium energy current cumulant by a linear combination of lower order cumulants. The behaviors of energy current cumulants in steady state under temperature gradient and external bias are numerically studied and explained. The transient dynamics of energy current cumulants is numerically calculated and analyzed. Universal scaling of normalized transient energy cumulants is found under both temperature gradient and external bias.

  9. On temporal correlations in high-resolution frequency counting

    CERN Document Server

    Dunker, Tim; Rønningen, Ole Petter

    2016-01-01

    We analyze noise properties of time series of frequency data from different counting modes of a Keysight 53230A frequency counter. We use a 10 MHz reference signal from a passive hydrogen maser connected via phase-stable Huber+Suhner Sucoflex 104 cables to the reference and input connectors of the counter. We find that the high resolution gap-free (CONT) frequency counting process imposes long-term correlations in the output data, resulting in a modified Allan deviation that is characteristic of random walk phase noise. Equally important, the CONT mode results in a frequency bias. In contrast, the counter's undocumented raw continuous mode (RCON) yields unbiased frequency stability estimates with white phase noise characteristics, and of a magnitude consistent with the counter's 20 ps single-shot resolution. Furthermore, we demonstrate that a 100-point running average filter in conjunction with the RCON mode yields resolution enhanced frequency estimates with flicker phase noise characteristics. For instance,...
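
For readers unfamiliar with the stability measures discussed above, here is a generic overlapping Allan deviation estimator (a textbook sketch, not the paper's counter-specific analysis; the noise level and series length are assumptions). For white frequency noise the Allan deviation falls as tau^(-1/2), so quadrupling the averaging time halves it.

```python
import numpy as np

rng = np.random.default_rng(5)

def overlapping_adev(y, m):
    """Overlapping Allan deviation of fractional-frequency data y at tau = m * tau0."""
    c = np.concatenate(([0.0], np.cumsum(y)))
    avg = (c[m:] - c[:-m]) / m          # all overlapping m-sample averages
    d = avg[m:] - avg[:-m]              # differences of adjacent averages
    return np.sqrt(0.5 * np.mean(d ** 2))

# Simulated white frequency noise (uncorrelated fractional-frequency samples).
y = rng.normal(0.0, 1e-12, size=20000)
ratio = overlapping_adev(y, 1) / overlapping_adev(y, 4)   # expect ~ sqrt(4) = 2
```

Deviations from this scaling are exactly what the authors use diagnostically: random-walk phase noise and correlations imposed by the CONT counting mode show up as a different slope in the (modified) Allan deviation.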

  10. High-fidelity spatially resolved multiphoton counting for quantum imaging applications

    CERN Document Server

    Chrapkiewicz, Radoslaw; Banaszek, Konrad

    2015-01-01

    We present a method for spatially resolved multiphoton counting based on an intensified camera, with retrieval of multimode photon statistics fully accounting for non-linearities in the detection process. The scheme relies on a one-time quantum tomographic calibration of the detector. Faithful, high-fidelity reconstruction of single- and two-mode statistics of multiphoton states is demonstrated for coherent states and their statistical mixtures. The results consistently exhibit classical values of the Mandel and Fano parameters, in contrast to the raw statistics of camera photo-events. Detector operation is reliable for illumination levels up to an average of one photon per event area, substantially higher than in previous approaches to characterizing quantum statistical properties of light with spatial resolution.
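
The classicality check mentioned above can be sketched numerically (the mean photocount and sample size are assumptions): for coherent light the photocounts are Poissonian, so the Fano factor is close to 1 and the Mandel Q parameter close to 0, whereas uncorrected detector non-linearities distort both.

```python
import numpy as np

rng = np.random.default_rng(9)

# Simulated photocounts from coherent light: Poissonian statistics.
counts = rng.poisson(0.8, size=50000)

mean, var = counts.mean(), counts.var()
fano = var / mean        # Fano factor: ~= 1 for Poissonian counting statistics
q_mandel = fano - 1.0    # Mandel Q: ~= 0 for coherent states, < 0 sub-Poissonian
```

Recovering these classical values from camera photo-events, as the abstract reports, indicates that the tomographic calibration has removed the saturation-induced distortion of the raw statistics.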

  11. Full-counting statistics of heat transport in harmonic junctions: transient, steady states, and fluctuation theorems.

    Science.gov (United States)

    Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2012-05-01

    We study the statistics of heat transferred in a given time interval t_{M}, through a finite harmonic chain, called the center, which is connected to two heat baths, the left (L) and the right (R), that are maintained at two temperatures. The center atoms are driven by external time-dependent forces. We calculate the cumulant generating function (CGF) for the heat transferred out of the left lead, Q_{L}, based on the two-time quantum measurement concept and using the nonequilibrium Green's function method. The CGF can be concisely expressed in terms of Green's functions of the center and an argument-shifted self-energy of the lead. The expression of the CGF is valid in both transient and steady-state regimes. We consider three initial conditions for the density operator and show numerically, for a one-atom junction, how their transient behaviors differ from each other but, finally, approach the same steady state, independent of the initial distributions. We also derive the CGF for the joint probability distribution P(Q_{L},Q_{R}), and discuss the correlations between Q_{L} and Q_{R}. We calculate the CGF for total entropy production in the reservoirs. In the steady state we explicitly show that the CGFs obey steady-state fluctuation theorems. We obtain classical results by taking ℏ→0. We also apply our method to the counting of the electron number and electron energy, for which the associated self-energy is obtained from the usual lead self-energy by multiplying a phase and shifting the contour time, respectively.

  12. High-resolution neutron microtomography with noiseless neutron counting detector

    Energy Technology Data Exchange (ETDEWEB)

    Tremsin, A.S., E-mail: ast@ssl.berkeley.edu [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); McPhate, J.B.; Vallerga, J.V.; Siegmund, O.H.W. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Feller, W.B. [Nova Scientific Inc., 10 Picker Road, Sturbridge, MA 01566 (United States); Lehmann, E. [Paul Scherrer Institute, CH-5232 Villigen (Switzerland); Butler, L.G. [Louisiana State University, Baton Rouge, LA 70803 (United States); Dawson, M. [Helmholtz Centre Berlin for Materials and Energy (Germany)

    2011-10-01

    The improved collimation and intensity of thermal and cold neutron beamlines combined with recent advances in neutron imaging devices enable high-resolution neutron radiography and microtomography, which can provide information on the internal structure of objects not achievable with conventional X-ray imaging techniques. Neutron detection efficiency, spatial and temporal resolution (important for the studies of dynamic processes) and low background count rate are among the crucial parameters defining the quality of radiographic images and tomographic reconstructions. The unique capabilities of neutron counting detectors with neutron-sensitive microchannel plates (MCPs) and with Timepix CMOS readouts providing high neutron detection efficiency (~70% for cold neutrons), spatial resolutions ranging from 15 to 55 µm and a temporal resolution of ~1 µs, combined with the virtual absence of readout noise, make these devices very attractive for high-resolution microtomography. In this paper we demonstrate the capabilities of an MCP-Timepix detection system applied to microtomographic imaging, performed at the ICON cold neutron facility of the Paul Scherrer Institute. The high resolution and the absence of readout noise enable accurate reconstruction of texture in a relatively opaque wood sample, differentiation of internal tissues of a fly and imaging of individual ~400 µm grains in an organic powder encapsulated in a ~700 µm thick metal casing.

  13. High resolution cross strip anodes for photon counting detectors

    Science.gov (United States)

    Siegmund, O. H. W.; Tremsin, A. S.; Vallerga, J. V.; Abiad, R.; Hull, J.

    2003-05-01

    A new photon counting, imaging readout for microchannel plate sensors, the cross strip (XS) anode, has been investigated. Charge centroiding of signals detected on two orthogonal layers of sense strip sets is used to derive photon locations. The XS anode spatial resolution (<3 μm FWHM) exceeds the spatial resolution of most direct charge sensing anodes, and does so at low gain (<2×10⁶). The image linearity and fidelity are high enough to resolve and map 7 μm MCP pores, offering new possibilities for astronomical and other applications.

  14. Energy harvesting using AC machines with high effective pole count

    Science.gov (United States)

    Geiger, Richard Theodore

    In this thesis, ways to improve the power conversion of rotating generators at low rotor speeds in energy harvesting applications were investigated. One method is to increase the pole count, which increases the generator back-emf without also increasing the I²R losses, thereby increasing both torque density and conversion efficiency. One machine topology that has a high effective pole count is a hybrid "stepper" machine. However, the large self inductance of these machines decreases their power factor and hence the maximum power that can be delivered to a load. This effect can be cancelled by the addition of capacitors in series with the stepper windings. A circuit was designed and implemented to automatically vary the series capacitance over the entire speed range investigated. The addition of the series capacitors improved the power output of the stepper machine by up to 700%. At low rotor speeds, with the addition of series capacitance, the power output of the hybrid "stepper" was more than 200% that of a similarly sized PMDC brushed motor. Finally, in this thesis a hybrid lumped parameter / finite element model was used to investigate the impact of number, shape and size of the rotor and stator teeth on machine performance. A typical off-the-shelf hybrid stepper machine has significant cogging torque by design. This cogging torque is a major problem in most small energy harvesting applications. In this thesis it was shown that the cogging and ripple torque can be dramatically reduced. These findings confirm that high-pole-count topologies, and specifically the hybrid stepper configuration, are an attractive choice for energy harvesting applications.
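
A back-of-the-envelope version of the series-compensation idea (the inductance and electrical frequency are assumed values, not taken from the thesis): a series capacitor cancels the winding reactance at resonance, when 2·pi·f·L = 1/(2·pi·f·C), i.e. C = 1/((2·pi·f)²·L). Because the electrical frequency tracks rotor speed, the capacitance must be re-tuned across the speed range, which is what the automatic switching circuit provides.

```python
import math

def series_capacitance(f_hz, inductance_h):
    """Capacitance that resonates out a winding inductance at frequency f_hz."""
    return 1.0 / ((2.0 * math.pi * f_hz) ** 2 * inductance_h)

# Hypothetical 50 mH winding at 100 Hz electrical frequency -> ~51 uF.
c = series_capacitance(100.0, 0.05)
```

The strong frequency dependence (C falls as 1/f²) is why a single fixed capacitor cannot compensate the machine over a wide speed range.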

  15. Ecotoxicology is not normal: A comparison of statistical approaches for analysis of count and proportion data in ecotoxicology.

    Science.gov (United States)

    Szöcs, Eduard; Schäfer, Ralf B

    2015-09-01

    Ecotoxicologists often encounter count and proportion data that are rarely normally distributed. To meet the assumptions of the linear model, such data are usually transformed, or non-parametric methods are used if the transformed data still violate the assumptions. Generalized linear models (GLMs) allow such data to be modeled directly, without the need for transformation. Here, we compare the performance of (1) the linear model (assuming normality of transformed data), (2) GLMs (assuming a Poisson, negative binomial, or binomially distributed response), and (3) non-parametric methods. We simulated typical data mimicking low-replicated ecotoxicological experiments of two common data types (counts and proportions from counts). We compared the performance of the different methods in terms of statistical power and Type I error for detecting a general treatment effect and for determining the lowest observed effect concentration (LOEC). In addition, we outlined differences on a real-world mesocosm data set. For count data, we found that the quasi-Poisson model yielded the highest power. The negative binomial GLM resulted in increased Type I errors, which could be fixed using the parametric bootstrap. For proportions, binomial GLMs performed better than the linear model, except for determining the LOEC at extremely low sample sizes. The non-parametric methods compared had generally lower power. We recommend that counts in one-factorial experiments be analyzed using quasi-Poisson models and proportions from counts by binomial GLMs. These methods should become standard in ecotoxicology.
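
A sketch of the dispersion estimate that underlies the quasi-Poisson approach, on an invented one-factorial design (group means, replication, and the gamma mixing are all assumptions): the Pearson chi-square statistic divided by the residual degrees of freedom gives the dispersion phi; phi > 1 signals overdispersion, and quasi-Poisson inference inflates Poisson standard errors by sqrt(phi).

```python
import numpy as np

rng = np.random.default_rng(11)

groups, reps = 4, 50
means = np.array([20.0, 15.0, 10.0, 5.0])        # assumed treatment means

# Overdispersed counts: gamma-mixed Poisson (variance = mean + mean^2 / 2).
lam = rng.gamma(shape=2.0, scale=means[:, None] / 2.0, size=(groups, reps))
y = rng.poisson(lam)

# For a one-factorial Poisson GLM the fitted values are the group means.
fitted = y.mean(axis=1, keepdims=True)
pearson = ((y - fitted) ** 2 / fitted).sum()     # Pearson chi-square statistic
phi = pearson / (y.size - groups)                # dispersion estimate, >> 1 here
```

With phi well above 1, naive Poisson standard errors would be far too small, producing the inflated Type I error rates the simulation study warns about.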

  16. Variability in faecal egg counts – a statistical model to achieve reliable determination of anthelmintic resistance in livestock

    DEFF Research Database (Denmark)

    Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret

    A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model, and horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying … The classification was shown to be unaffected by single outlier horses on the farms, while traditional calculations were strongly biased. The statistical model combines information between farms to distinguish between variability and genuine reduction in efficacy and can be adapted to handle FECRT data obtained from other …
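
For context, here is a sketch of the classical faecal egg count reduction calculation that such hierarchical models improve upon (herd size, count distributions, and the 90% cutoff are illustrative assumptions, not the paper's values): the percent reduction in mean egg counts after treatment, with a bootstrap lower confidence limit.

```python
import numpy as np

rng = np.random.default_rng(21)

# Pre-treatment counts: heavily overdispersed, mean ~300 EPG (assumed).
pre = rng.negative_binomial(n=1, p=1 / 301, size=60).astype(float)
# Post-treatment counts: ~95% true reduction (assumed).
post = rng.poisson(0.05 * pre)

def reduction(pre_counts, post_counts):
    """Faecal egg count reduction (%) based on group means."""
    return 100.0 * (1.0 - post_counts.mean() / pre_counts.mean())

point = reduction(pre, post)

# Paired bootstrap over horses for a 95% lower confidence limit (LCL).
boot = np.array([
    reduction(pre[i], post[i])
    for i in rng.integers(0, pre.size, size=(2000, pre.size))
])
lcl = np.percentile(boot, 2.5)
# An LCL below an assumed efficacy cutoff (e.g. 90%) would flag resistance.
```

The paper's point is that this farm-by-farm calculation is fragile, e.g. a single outlier horse can swing it, whereas a model sharing information across farms separates sampling variability from a genuine loss of efficacy.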

  17. Scalable Intersample Interpolation Architecture for High-channel-count Beamformers

    DEFF Research Database (Denmark)

    Tomov, Borislav Gueorguiev; Nikolov, Svetoslav I; Jensen, Jørgen Arendt

    2011-01-01

    Modern ultrasound scanners utilize digital beamformers that operate on sampled and quantized echo signals. Timing precision is of essence for achieving good focusing. The direct way to achieve it is through the use of high sampling rates, but that is not economical, so interpolation between echo samples is used. This paper presents a beamformer architecture that combines a band-pass filter-based interpolation algorithm with the dynamic delay-and-sum focusing of a digital beamformer. The reduction in the number of multiplications relative to a linear per-channel interpolation architecture and a band-pass per-channel interpolation architecture is 58% and 75%, respectively, for a 256-channel beamformer using 4-tap filters. The approach allows building high-channel-count beamformers while maintaining high image quality due to the use of sophisticated intersample interpolation.
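The role of intersample interpolation in delay-and-sum focusing can be shown with a toy sketch. This is the plain linear per-channel interpolation baseline the abstract compares against, not the proposed band-pass architecture, and the signals and delays are made up.

```python
def delay_and_sum(channels, delays_in_samples):
    """Sum echo channels after applying fractional-sample focusing delays.

    channels: list of sampled echo signals (lists of floats)
    delays_in_samples: one fractional delay per channel
    """
    out = 0.0
    for sig, d in zip(channels, delays_in_samples):
        i = int(d)       # integer part of the delay
        frac = d - i     # fractional part -> interpolate between two samples
        out += (1.0 - frac) * sig[i] + frac * sig[i + 1]
    return out

# two toy channels: a ramp and a constant
chans = [[0.0, 1.0, 2.0, 3.0], [5.0, 5.0, 5.0, 5.0]]
print(delay_and_sum(chans, [1.5, 0.25]))   # → 6.5
```

Each output sample costs two multiplications per channel here; the paper's point is that a shared band-pass interpolation structure cuts that multiplication count substantially at equal or better image quality.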

  18. OPTIMA A Photon Counting High-Speed Photometer

    CERN Document Server

    Straubmeier, C; Schrey, F

    2001-01-01

    OPTIMA is a small, versatile high-speed photometer which is primarily intended for time-resolved observations of young high-energy pulsars at optical wavelengths. The detector system consists of eight fiber-fed photon counters based on avalanche photodiodes, a GPS timing receiver, an integrating CCD camera to ensure the correct pointing of the telescope, and a computerized control unit. Since January 1999, OPTIMA has proved its scientific potential by measuring a very detailed lightcurve of the Crab Pulsar as well as by observing cataclysmic variable stars on very short timescales. In this article we describe the design of the detector system, focusing on the photon counting units and the software control which correlates the detected photons with the GPS timing signal.

  19. A prototype High Purity Germanium detector for high resolution gamma-ray spectroscopy at high count rates

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.J., E-mail: rjcooper@lbl.gov [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Amman, M.; Luke, P.N. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Vetter, K. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Department of Nuclear Engineering, University of California, Berkeley, CA 94720 (United States)

    2015-09-21

    Where energy resolution is paramount, High Purity Germanium (HPGe) detectors continue to provide the optimum solution for gamma-ray detection and spectroscopy. However, conventional large-volume HPGe detectors are typically limited to count rates on the order of ten thousand counts per second, limiting their effectiveness for high count rate applications. To address this limitation, we have developed a novel prototype HPGe detector designed to be capable of achieving fine energy resolution and high event throughput at count rates in excess of one million counts per second. We report here on the concept, design, and initial performance of the first prototype device.

  20. Variability in faecal egg counts – a statistical model to achieve reliable determination of anthelmintic resistance in livestock

    DEFF Research Database (Denmark)

    Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret;

    A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying … arithmetic calculations classified nine farms (14.1%) as resistant and 11 farms (17.2%) as suspect resistant. Using 10000 Monte Carlo simulated data sets, our methodology provides a reliable classification of farms into different resistance categories with a false discovery rate of 1.02%. The methodology …

  1. The joint statistics of mildly non-linear cosmological densities and slopes in count-in-cells

    CERN Document Server

    Bernardeau, Francis; Pichon, Christophe

    2015-01-01

    In the context of count-in-cells statistics, the joint probability distribution of the density in two concentric spherical shells is predicted from first principles for values of sigma of the order of one. The agreement with simulations is found to be excellent. These statistics allow us to deduce the conditional one-dimensional probability distribution function of the slope within underdense (resp. overdense) regions, or of the density for positive or negative slopes. The former conditional distribution is likely to be more robust in constraining the cosmological parameters, as the underlying dynamics is less evolved in such regions. A fiducial dark energy experiment is implemented on such counts derived from Lambda-CDM simulations.

  2. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  3. Certification Can Count: The Case of Aircraft Mechanics. Issues in Labor Statistics. Summary 02-03.

    Science.gov (United States)

    Bureau of Labor Statistics, Washington, DC.

    This document is a summary of aerospace industry technician statistics gathered by the Occupational Employment Statistics Survey for the year 2000 by the Department of Labor, Bureau of Labor Statistics. The data includes the following: (1) a comparison of wages earned by Federal Aviation Administration (FAA) certified and non-FAA certified…

  4. The Invention of Counting: The Statistical Measurement of Literacy in Nineteenth-Century England

    Science.gov (United States)

    Vincent, David

    2014-01-01

    This article examines the invention of counting literacy on a national basis in nineteenth-century Britain. Through an analysis of Registrar Generals' reports, it describes how the early statisticians wrestled with the implications of their new-found capacity to describe a nation's communications skills in a single table and how they were unable…

  6. High Reproducibility of ELISPOT Counts from Nine Different Laboratories

    DEFF Research Database (Denmark)

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem

    2015-01-01

    … laboratories. We plated PBMC at four different concentrations, 24 replicates each, in an IFN-γ ELISPOT assay with HCMV pp65 antigen. The ELISPOT plate, and an image file of the plate, were counted in nine different laboratories using ImmunoSpot® Analyzers by (A) Basic Count™, relying on subjective counting …

  7. Chi-square and Poissonian Data: Biases Even in the High-Count Regime and How to Avoid them

    CERN Document Server

    Humphrey, Philip J; Buote, David A

    2008-01-01

    We demonstrate that two approximations to the chi^2 statistic popularly employed by observational astronomers for fitting Poisson-distributed data can give rise to intrinsically biased model parameter estimates, even in the high-counts regime, unless care is taken over the parameterization of the problem. For a small number of problems, previous studies have shown that the fractional bias introduced by these approximations is often small when the counts are high. However, we show that for a broad class of problems, unless the number of data bins is far smaller than \sqrt{N_c}, where N_c is the total number of counts in the dataset, the bias is still likely to be comparable to, or even exceed, the statistical error. Conversely, we find that fits using Cash's C-statistic give comparatively unbiased parameter estimates when the counts are high. Taking into account their well-known problems in the low-count regime, we conclude that these approximate chi^2 methods should not routinely be used for fitting an arbit...
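The two fit statistics compared above are easy to write down side by side. This is a hedged sketch with illustrative counts: the data-weighted chi^2 approximation (variance estimated from the data) versus Cash's C-statistic, C = 2 Σ [m_i − n_i + n_i ln(n_i/m_i)], for observed counts n_i and model predictions m_i.

```python
import math

def chi2_data_weighted(n, m):
    """Popular chi^2 approximation with variance taken from the data."""
    return sum((ni - mi) ** 2 / ni for ni, mi in zip(n, m) if ni > 0)

def cash_c(n, m):
    """Cash C-statistic; the n_i * ln(n_i) term vanishes for n_i = 0."""
    total = 0.0
    for ni, mi in zip(n, m):
        total += mi - ni
        if ni > 0:
            total += ni * math.log(ni / mi)
    return 2.0 * total

n = [98, 105, 110, 92, 101]   # illustrative observed counts per bin
m = [100.0] * 5               # constant model
print(f"chi2 = {chi2_data_weighted(n, m):.2f}, C = {cash_c(n, m):.2f}")
```

At high counts the two statistics take similar values, but minimizing the chi^2 approximation biases the fitted parameters (the weights 1/n_i are themselves noisy), which is the paper's argument for the likelihood-based C-statistic.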

  8. Full-counting statistics and phase transition in an open quantum system of non-interacting electrons

    Science.gov (United States)

    Medvedyeva, Mariya; Kehrein, Stefan

    2014-03-01

    We develop a method for calculating the full-counting statistics for a non-interacting fermionic system coupled to memory-less reservoirs. The evolution of the system is described by the Lindblad equation. We introduce the counting field in the Lindblad equation which yields the generating function and allows us to obtain all cumulants of the charge transport. In a uniform system the cumulants of order k are independent of the system size for systems longer than k+1 sites. The counting statistics from the Lindblad approach does not take into account the interference in the reservoirs which gives a decreased value of noise in comparison to the Green function approach which describes phase coherent leads. The two methods yield the same value for the current, which is due to current conservation. The Fano factors are different (and linearly related) and allow us to distinguish between memory-less and phase coherent reservoirs. We also consider the influence of dissipation along the chain allowing for both tunneling into and out of the chain along its length. Infinitesimally small dissipation along the chain induces a quantum phase transition which manifests itself as a discontinuity in transport properties and entropy.
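The cumulant machinery of full-counting statistics can be illustrated on the simplest transfer distribution. This is not the Lindblad calculation of the abstract, just a generic sketch: for a binomial P(N) (N charges transferred in M attempts with probability p) the cumulant generating function S is differentiated numerically to recover the mean, the noise, and the Fano factor F = 1 − p.

```python
import math

# illustrative parameters: M tunneling attempts, success probability p
M, p = 50, 0.3

def S(x):
    """Cumulant generating function ln<e^{xN}> for binomial P(N)."""
    return M * math.log(1.0 - p + p * math.exp(x))

h = 1e-3
c1 = (S(h) - S(-h)) / (2 * h)               # first cumulant: mean = M p
c2 = (S(h) - 2 * S(0.0) + S(-h)) / h ** 2   # second cumulant: var = M p (1 - p)
fano = c2 / c1                               # Fano factor = 1 - p
print(f"<N> = {c1:.3f}, var = {c2:.3f}, Fano = {fano:.3f}")
```

This is the sense in which the Fano factor discriminates between transport models in the abstract: different reservoir descriptions give the same first cumulant (current) but different second cumulants (noise).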

  9. Investigation of Detector Behaviour At High Count Rates for the Purple Crow Lidar

    Science.gov (United States)

    Sica, R. J.; McCullough, E. M.; Jalali, A.; Hartery, S.; Farhani, G.; Argall, P.; Argall, S.

    2013-12-01

    Temperature measurements in the middle and upper atmosphere are an important complement to similar measurements in the lower atmosphere. Even modest-size Rayleigh-scatter lidars are capable of high quality measurements of temperature in the stratosphere (above 25 km) and lower mesosphere. The most commonly reported uncertainty, that due to counting statistics, is well understood and affects temperatures at the greatest heights (i.e. lowest signal rates). Counting statistics have a lesser effect on temperatures at the lower range of measurements, where the photocount rate is larger. However, if a lidar's dynamic range is increased by combining analog and digital counting profiles into a 'glued' profile, the gluing introduces a systematic uncertainty. In this presentation we will show the effect of the uncertainty due to gluing on our temperature measurements. The Purple Crow Lidar (PCL), located at The University of Western Ontario's Echo Base Field Station near London, Canada, has undergone considerable modifications to its transmitter (now a Litron Nd:YAG laser outputting 1 J/pulse at 532 nm with a repetition rate of 30 Hz) as well as to its data acquisition system. The PCL has retained its 2.6 m diameter liquid mercury mirror, giving the system a large power-aperture product. Such a large throughput requires simultaneous analog-digital detection to obtain Rayleigh-scatter temperatures from 25 to above 100 km. The analog and digital profiles must be combined into a single continuous profile, a process called gluing. Several excellent methods for gluing profiles have been presented, but prior to now systematic uncertainties due to the procedure have not been quantified. We will present a detailed characterization of the analog and digital counting channels, using a variety of tests which will show the effect of the gluing procedure on the retrieved temperature.
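One common form of gluing can be sketched in a few lines. This is a hypothetical minimal version, not the PCL pipeline: in an overlap region where both channels are valid, the analog profile is regressed onto the digital one, rescaled into count units, and the two are spliced.

```python
def glue(analog, digital, overlap):
    """Splice analog and digital profiles after a linear fit in the overlap bins."""
    xs = [analog[i] for i in overlap]
    ys = [digital[i] for i in overlap]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # least-squares slope/intercept: digital ~ a * analog + b
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    scaled = [a * v + b for v in analog]          # analog in digital count units
    cut = overlap[-1]                              # analog below, digital above
    return scaled[:cut] + digital[cut:]

# toy profiles (made up): signal falling with altitude bin
analog  = [100.0, 50.0, 25.0, 12.0, 6.0, 3.0]
digital = [ 90.0, 52.0, 26.0, 12.5, 6.2, 3.1]
profile = glue(analog, digital, overlap=[2, 3, 4])
print(profile)
```

The systematic uncertainty the abstract refers to enters through the fitted coefficients a and b: any bias in them propagates through the rescaled analog bins and hence into the retrieved temperatures.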

  10. Cosmological constraints with weak lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, Francois; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-01-01

    Peak statistics in weak lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. To prepare for the high precision afforded by next-generation weak lensing surveys, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w0de. In particular, we study how CAMELUS, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. We measure the abundance histogram of peaks in a mock shear catalogue of approximately 5,000 deg2 using a multiscale mass map filtering technique, and we then constrain the parameters of the mock survey using CAMELUS combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. We find that peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, indicating the need to better understand and control the model's systematics before applying it to a real survey of this size or larger. We perform a calibration of the model to remove the bias and compare results to those from the two-point correlation functions (2PCF) measured on the same field. In this case, we find the derived parameter Σ8 = σ8(Ωm/0.27)α = 0.76 (-0.03 +0.02) with α = 0.65 for peaks, while for 2PCF the values are Σ8 = 0.76 (-0.01 +0.02) and α = 0.70. We conclude that the constraining power can therefore be comparable between the two weak lensing observables in large-field surveys. Furthermore, the tilt in the σ8-Ωm degeneracy direction for peaks with respect to that of 2PCF suggests that a combined analysis would yield tighter constraints than either measure alone. As expected, w0de cannot be
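Approximate Bayesian computation, the likelihood-free inference used with CAMELUS above, reduces in its simplest rejection form to a short loop. This is a hedged toy version: the "observable" is just the mean of 50 Gaussian draws rather than a peak-count histogram, and all parameters are illustrative.

```python
import random
import statistics

random.seed(0)

# make a fake "observation" with a known true parameter
true_rate = 4.0
observed = [random.gauss(true_rate, 1.0) for _ in range(50)]
obs_summary = statistics.mean(observed)   # summary statistic of the data

accepted = []
for _ in range(20000):
    theta = random.uniform(0.0, 10.0)                    # draw from the prior
    sim = [random.gauss(theta, 1.0) for _ in range(50)]  # forward-simulate
    # accept the draw if the simulated summary is close to the observed one
    if abs(statistics.mean(sim) - obs_summary) < 0.1:
        accepted.append(theta)

posterior_mean = statistics.mean(accepted)
print(f"accepted {len(accepted)} draws, posterior mean ~ {posterior_mean:.2f}")
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is exactly what makes the approach attractive when the observable (here, peak counts) comes from a stochastic forward model.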

  11. High performance universal analog and counting photodetector for LIDAR applications

    Science.gov (United States)

    Linga, Krishna; Krutov, Joseph; Godik, Edward; Seemungal, Wayne; Shushakov, Dmitry; Shubin, V. E.

    2005-08-01

    We demonstrate the feasibility of applying the emerging technology of internal discrete amplification to create an efficient, ultra low noise, universal analog and counting photodetector for LIDAR remote sensing. Photodetectors with internal discrete amplification can operate in the linear detection mode with a gain-bandwidth product of up to 1015 and in the photon counting mode with count rates of up to 109 counts/sec. Detectors based on this mechanism could have performance parameters superior to those of conventional avalanche photodiodes and photomultiplier tubes. For silicon photodetector prototypes, measured excess noise factor is as low as 1.02 at gains greater than 100,000. This gives the photodetectors and, consequently, the LIDAR systems new capabilities that could lead to important advances in LIDAR remote sensing.

  12. Optimization of statistical methods for HpGe gamma-ray spectrometer used in wide count rate ranges

    Science.gov (United States)

    Gervino, G.; Mana, G.; Palmisano, C.

    2016-07-01

    The need to perform γ-ray measurements with HpGe detectors is common in many fields such as nuclear physics, radiochemistry, nuclear medicine and neutron activation analysis. HpGe detectors are chosen in situations where isotope identification is needed because of their excellent resolution. Our challenge is to obtain the "best" spectroscopy data possible in every measurement situation. "Best" is a combination of statistical quality (number of counts) and spectral quality (peak width and position) over a wide range of counting rates. In this framework, we applied Bayesian methods and the Ellipsoidal Nested Sampling (a multidimensional integration technique) to study the most likely distribution for the shape of HpGe spectra. In treating these experiments, the prior information suggests to model the likelihood function with a product of Poisson distributions. We present the efforts that have been made to optimize the statistical methods for HpGe detector outputs, with the aim of evaluating, to a higher order of precision, the detector efficiency, the absolute measured activity and the spectral background. Reaching a more precise knowledge of statistical and systematic uncertainties for the measured physical observables is the final goal of this research project.

  13. Actinic defect counting statistics over 1 cm2 area of EUVL mask blank

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Seongtae; Lai, Chih-Wei; Rekawa, Seno; Walton, Chris W.; Bokor, Jeffrey

    2000-02-18

    As a continuation of comparison experiments between EUV inspection and visible inspection of defects on EUVL mask blanks, we report on the result of an experiment where the EUV defect inspection tool is used to perform at-wavelength defect counting over 1 cm{sup 2} of EUVL mask blank. Initial EUV inspection found five defects over the scanned area and the subsequent optical scattering inspection was able to detect all five defects. Therefore, if there are any defects that are only detectable by EUV inspection, the density is lower than the order of unity per cm{sup 2}. An upgrade path to substantially increase the overall throughput of the EUV inspection system is also identified in the manuscript.

  14. Full counting statistics of transport electrons through a two-level quantum dot with spin–orbit coupling

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.M. [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China); Xue, H.B. [College of Physics and Optoelectronics, Taiyuan University of Technology, Taiyuan 030024 (China); Xue, N.T. [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China); Liang, J.-Q., E-mail: jqliang@sxu.edu.cn [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China)

    2015-02-15

    We study the full counting statistics of transport electrons through a semiconductor two-level quantum dot with Rashba spin–orbit (SO) coupling, which acts as a nonabelian gauge field and thus induces the electron transition between two levels along with the spin flip. By means of the quantum master equation approach, shot noise and skewness are obtained at finite temperature with two-body Coulomb interaction. We particularly demonstrate the crucial effect of SO coupling on the super-Poissonian fluctuation of transport electrons, in terms of which the SO coupling can be probed by the zero-frequency cumulants. The charge currents, by contrast, are not sensitive to the SO coupling.

  15. Finite-time full counting statistics and factorial cumulants for transport through a quantum dot with normal and superconducting leads

    Science.gov (United States)

    Droste, Stephanie; Governale, Michele

    2016-04-01

    We study the finite-time full counting statistics for subgap transport through a single-level quantum dot tunnel-coupled to one normal and one superconducting lead. In particular, we determine the factorial and the ordinary cumulants both for finite times and in the long-time limit. We find that the factorial cumulants violate the sign criterion, indicating a non-binomial distribution, even in absence of Coulomb repulsion due to the presence of superconducting correlations. At short times the cumulants exhibit oscillations which are a signature of the coherent transfer of Cooper pairs between the dot and the superconductor.
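The sign criterion mentioned above has a compact closed form in the binomial case: for a binomial transfer distribution the factorial cumulants alternate in sign, so a violation of that alternation signals a non-binomial P(N). The sketch below uses exact binomial factorial moments with illustrative parameters and converts them to the first three factorial cumulants.

```python
# illustrative parameters: M tunneling attempts, success probability p
M, p = 20, 0.3

# factorial moments f_m = E[N (N-1) ... (N-m+1)] = M! / (M-m)! * p^m
f1 = M * p
f2 = M * (M - 1) * p ** 2
f3 = M * (M - 1) * (M - 2) * p ** 3

# convert factorial moments to factorial cumulants
c1 = f1
c2 = f2 - f1 ** 2
c3 = f3 - 3 * f1 * f2 + 2 * f1 ** 3

print(c1, c2, c3)   # signs alternate: +, -, + for a binomial distribution
```

For the superconducting case in the abstract, the interesting observation is precisely that this alternation fails even without Coulomb interaction, because Cooper-pair correlations make the distribution non-binomial.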

  16. Study of Distortions in Statistics of Counts in CCD Observations using the Fano Factor

    CERN Document Server

    Afanasieva, I V

    2016-01-01

    Factors distorting the statistics of photocounts when observing low-flux objects are considered. Measurements of the Fano factor were conducted for existing CCD systems. The study allows one to assess the quality of the CCD video signal processing channel, and an optimal strategy for faint-object observations is suggested.
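The diagnostic used above is simple to state: the Fano factor F = variance/mean of the photocounts equals 1 for an ideal Poisson process, and read-out distortions push it away from 1. A hedged sketch with simulated (not real CCD) counts:

```python
import math
import random
import statistics

random.seed(7)

def poisson(lam):
    """One Poisson draw by Knuth's method (adequate for small rates)."""
    limit, k, prod = math.exp(-lam), 0, random.random()
    while prod > limit:
        prod *= random.random()
        k += 1
    return k

# simulate ideal photon counting at an illustrative mean rate of 3 counts
counts = [poisson(3.0) for _ in range(20000)]
F = statistics.variance(counts) / statistics.mean(counts)
print(f"Fano factor = {F:.3f}")   # close to 1 for pure Poisson statistics
```

In a real system, excess noise in the signal chain inflates F above 1, while effects such as nonlinearity or clipping can suppress it, so departures of the measured F from unity localize problems in the video processing channel.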

  17. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise.Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for ha

  18. The statistical distribution of the number of counted scintillation photons in digital silicon photomultipliers: model and validation.

    Science.gov (United States)

    van Dam, Herman T; Seifert, Stefan; Schaart, Dennis R

    2012-08-07

    In the design and application of scintillation detectors based on silicon photomultipliers (SiPMs), e.g. in positron emission tomography imaging, it is important to understand and quantify the non-proportionality of the SiPM response due to saturation, crosstalk and dark counts. A new type of SiPM, the so-called digital silicon photomultiplier (dSiPM), has recently been introduced. Here, we develop a model of the probability distribution of the number of fired microcells, i.e. the number of counted scintillation photons, in response to a given amount of energy deposited in a scintillator optically coupled to a dSiPM. Based on physical and functional principles, the model elucidates the statistical behavior of dSiPMs. The model takes into account the photon detection efficiency of the detector; the light yield, excess variance and time profile of the scintillator; and the crosstalk probability, dark count rate, integration time and the number of microcells of the dSiPM. Furthermore, relations for the expectation value and the variance of the number of fired cells are deduced. These relations are applied in the experimental validation of the model using a dSiPM coupled to a LSO:Ce,Ca scintillator. Finally, we propose an accurate method for the correction of energy spectra measured with dSiPM-based scintillation detectors.
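The saturation part of the response model above has a well-known closed form: with N_cells microcells and μ detected photons spread uniformly over the array, the expected number of distinct fired cells follows from binomial occupancy. The sketch below shows only this term, ignoring the crosstalk, dark counts and scintillator time profile that the full model also includes; the cell count is illustrative.

```python
import math

def expected_fired(mu, n_cells):
    """Expected number of fired microcells for mu uniformly spread photons."""
    return n_cells * (1.0 - math.exp(-mu / n_cells))

n_cells = 6400   # illustrative microcell count
for mu in (100, 1000, 10000):
    print(mu, round(expected_fired(mu, n_cells), 1))
# response is nearly linear at low mu and saturates toward n_cells at high mu
```

Inverting this relation is the basis of the energy-spectrum correction the authors propose: the measured fired-cell count is mapped back to an estimate of μ before calibration.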

  19. A distortion of very-high-redshift galaxy number counts by gravitational lensing.

    Science.gov (United States)

    Wyithe, J Stuart B; Yan, Haojing; Windhorst, Rogier A; Mao, Shude

    2011-01-13

    The observed number counts of high-redshift galaxy candidates have been used to build up a statistical description of star-forming activity at redshift z ≳ 7, when galaxies reionized the Universe. Standard models predict that a high incidence of gravitational lensing will probably distort measurements of flux and number of these earliest galaxies. The raw probability of this happening has been estimated to be ∼0.5 per cent (refs 11, 12), but can be larger owing to observational biases. Here we report that gravitational lensing is likely to dominate the observed properties of galaxies with redshifts of z ≳ 12, when the instrumental limiting magnitude is expected to be brighter than the characteristic magnitude of the galaxy sample. The number counts could be modified by an order of magnitude, with most galaxies being part of multiply imaged systems, located less than 1 arcsec from brighter foreground galaxies at z ≈ 2. This lens-induced association of high-redshift and foreground galaxies has perhaps already been observed among a sample of galaxy candidates identified at z ≈ 10.6. Future surveys will need to be designed to account for a significant gravitational lensing bias in high-redshift galaxy samples.

  20. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    Full Text Available The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We will provide a brief introduction to nonadditive entropies (characterized by indices like q, which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition to that, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, very particularly in the LHC literature.

  1. An innovative method to reduce count loss from pulse pile-up in a photon-counting pixel for high flux X-ray applications

    Science.gov (United States)

    Lee, D.; Lim, K.; Park, K.; Lee, C.; Alexander, S.; Cho, G.

    2017-03-01

    In this study, an innovative fast X-ray photon-counting pixel for high X-ray flux applications is proposed. A computed tomography system typically uses X-ray fluxes of up to 10^8 photons/mm2/sec at the detector, so a fast read-out is required in order to process individual X-ray photons. Otherwise, pulse pile-up can occur at the output of the signal processing unit. These superimposed signals can distort the number of incident X-ray photons, leading to count loss. To minimize such losses, a cross detection method was implemented in the photon-counting pixel. A maximum count rate under an X-ray tube voltage of 90 kV was acquired, reflecting the electrical test results of the proposed photon-counting pixel. A maximum count rate of 780 kcps was achieved with a conventional photon-counting pixel at a pulse processing time of 500 ns, which is the time for a pulse to return to the baseline from the initial rise. In contrast, a maximum count rate of about 8.1 Mcps was achieved with the proposed photon-counting pixel. From these results, it is clear that the maximum count rate was increased by approximately a factor of 10 by adopting the cross detection method. It is therefore an effective method to reduce count loss from pulse pile-up in a photon-counting pixel while maintaining the pulse processing time.
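The count ceiling quoted for the conventional pixel is consistent with a standard dead-time argument. The sketch below is an illustrative paralyzable dead-time model, not the paper's cross detection circuit: with pulse processing time tau, the observed rate m = n·exp(−n·tau) peaks at n = 1/tau.

```python
import math

def observed_rate(true_rate, tau):
    """Paralyzable dead-time model: observed rate for a true rate and dead time tau."""
    return true_rate * math.exp(-true_rate * tau)

tau = 500e-9   # 500 ns pulse processing time, as in the abstract
# the observed rate m(n) is maximized at n = 1/tau, giving m_max = 1/(e * tau)
peak_rate = observed_rate(1.0 / tau, tau)
print(f"max observed rate ~ {peak_rate:.0f} counts/s")
# ~7.4e5 counts/s, of the same order as the 780 kcps ceiling quoted above
```

The cross detection method attacks this ceiling not by shortening tau but by resolving pulses that overlap within it, which is why the achievable rate improves by roughly an order of magnitude at the same processing time.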

  2. Using Poisson statistics to analyze supernova remnant emission in the low counts X-ray regime

    Science.gov (United States)

    Roper, Quentin Jeffrey

    We utilize a Poisson likelihood in a maximum-likelihood statistical analysis of X-ray spectroscopic data. Specifically, we examine four extragalactic supernova remnants (SNRs). IKT 5 (SNR 0047-73.5), IKT 25 (SNR 0104-72.3), and DEM S 128 (SNR 0103-72.4) are designated as Type Ia in the literature due to their spectra and morphology. This is troublesome because of their asymmetry, a trait not usually associated with young Type Ia remnants. We present Chandra X-ray Observatory data on these three remnants and perform a maximum-likelihood analysis of their spectra. We find that the X-ray emission is dominated by interactions with the interstellar medium. In spite of this, we find a significant Fe overabundance in all three remnants. Through examination of radio, optical, and infrared data, we conclude that these three remnants are likely not "classical" Type Ia SNRs, but may be examples of so-called "prompt" Type Ia SNRs. We detect potential point sources that may be members of the progenitor systems of both DEM S 128 and IKT 5, which could suggest a new subclass of prompt Type Ia SNR: Fe-rich CC remnants. In addition, we examine IKT 18. This remnant is positionally coincident with the X-ray point source HD 5980. Due to an outburst in 1994, in which its brightness changed by 3 magnitudes (corresponding to an increase in luminosity by a factor of 16), HD 5980 was classified as a luminous blue variable star. We examine this point source and the remnant IKT 18 in the X-ray band, and find that its non-thermal photon index decreased from 2002 to 2013, corresponding to a larger proportion of more energetic X-rays, which is unexpected.

  3. Lysozyme activity and L(+)-lactic acid production in saliva in schoolchildren with high Lactobacillus counts.

    Science.gov (United States)

    Twetman, S; Dahllöf, G; Wikner, S

    1987-04-01

    Out of 374 schoolchildren, aged 13-15 yr, 42 with high counts of salivary lactobacilli (≥ 10^5) were selected for this study. Lysozyme activity in saliva and L(+)-lactic acid (LA) production after addition of glucose were determined. The mean values of lysozyme activity and LA concentration were 19.4 micrograms/ml and 1.4 mmol/l respectively. The levels of LA produced without addition of glucose were less than 0.2 mmol/l. The results showed a statistically significant (P less than 0.05) negative correlation between lysozyme activity and the levels of LA produced. The findings of this study suggest that lysozyme may be of importance in limiting acid production in saliva.

  4. Enhanced Fabrication Processes Development for High Actuator Count Deformable Mirrors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to advance manufacturing science and technology to improve yield and optical surface figure in high actuator count, high-resolution deformable mirrors...

  5. A Statistical Perspective on Highly Accelerated Testing

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Edward V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning

  6. A Statistical Perspective on Highly Accelerated Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Edward V.

    2015-02-01

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or, sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack-of-knowledge and uncertainty concerning
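The "zero-failure demonstration" claim mentioned in this abstract is usually grounded in a standard binomial success-run argument (my illustration, not taken from the report): if n units are tested and none fail, the one-sided lower confidence bound R on per-unit reliability at confidence level C solves R^n = 1 − C.

```python
# Sketch of the classic zero-failure "success run" calculation that
# underlies reliability-demonstration claims. This is the standard
# binomial argument, not the report's own analysis; it also silently
# assumes the accelerated stress is a valid proxy for normal use,
# which is exactly the assumption the report questions.

def demonstrated_reliability(n_units: int, confidence: float) -> float:
    """Lower confidence bound on reliability after n_units tests with 0 failures."""
    return (1.0 - confidence) ** (1.0 / n_units)

# Example: 22 units tested, no failures, 90% confidence -> R is about 0.90
r = demonstrated_reliability(22, 0.90)
```

Note how weak the demonstrated bound is for small samples: even a clean run of 22 units only supports roughly 90% reliability at 90% confidence, before any acceleration-model uncertainty is accounted for.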

  7. Statistical properties of high-lying chaotic eigenstates

    CERN Document Server

    Li, B; Li, Baowen; Robnik, Marko

    1995-01-01

We study the statistical properties of the high-lying chaotic eigenstates (200,000 and above) which are deep in the semiclassical regime. The system we are analyzing is the billiard system inside the region defined by the quadratic (complex) conformal map of the unit disk as introduced by Robnik (1983). We are using Heller's method of plane wave decomposition of the numerical eigenfunctions, and perform extensive statistical analysis with the following conclusions: (1) The local average probability density is in excellent agreement with the microcanonical assumption and all statistical properties are also in excellent agreement with the Gaussian random model; (2) The autocorrelation function is found to be strongly direction dependent and only after averaging over all directions agrees well with Berry's (1977) prediction; (3) Although the scars of unstable classical periodic orbits (in such ergodic regime) are expected to exist, so far we have not found any (around 200,000th state) but a scar-li...
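A toy illustration of the kind of test behind the "Gaussian random model" claim (the data here are stand-in amplitudes I generate, not the paper's billiard eigenfunctions): if wavefunction values sampled across a chaotic eigenstate behave like Gaussian random variables, their excess kurtosis should be near zero.

```python
import random

# Stand-in "amplitudes": for a real eigenstate these would be psi(x)
# sampled on a grid; here we simply draw Gaussians to show the statistic.
random.seed(7)
psi = [random.gauss(0.0, 1.0) for _ in range(100000)]

mean = sum(psi) / len(psi)
var = sum((x - mean) ** 2 for x in psi) / len(psi)
# Excess kurtosis: 0 for a Gaussian, so large deviations would reject
# the Gaussian random model for the sampled amplitudes.
kurt = sum((x - mean) ** 4 for x in psi) / (len(psi) * var ** 2) - 3.0
```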

  8. High Triglycerides Are Associated with Low Thrombocyte Counts and High VEGF in Nephropathia Epidemica.

    Science.gov (United States)

    Martynova, Ekaterina V; Valiullina, Aygul H; Gusev, Oleg A; Davidyuk, Yuriy N; Garanina, Ekaterina E; Shakirova, Venera G; Khaertynova, Ilsiyar; Anokhin, Vladimir A; Rizvanov, Albert A; Khaiboullina, Svetlana F

    2016-01-01

    Nephropathia epidemica (NE) is a mild form of hemorrhagic fever with renal syndrome. Several reports have demonstrated a severe alteration in lipoprotein metabolism. However, little is known about changes in circulating lipids in NE. The objectives of this study were to evaluate changes in serum total cholesterol, high density cholesterol (HDCL), and triglycerides. In addition to evaluation of serum cytokine activation associations, changes in lipid profile and cytokine activation were determined for gender, thrombocyte counts, and VEGF. Elevated levels of triglycerides and decreased HDCL were observed in NE, while total cholesterol did not differ from controls. High triglycerides were associated with both the lowest thrombocyte counts and high serum VEGF, as well as a high severity score. Additionally, there were higher levels of triglycerides in male than female NE patients. Low triglycerides were associated with upregulation of IFN-γ and IL-12, suggesting activation of Th1 helper cells. Furthermore, levels of IFN-γ and IL-12 were increased in patients with lower severity scores, suggesting that a Th1 type immune response is playing protective role in NE. These combined data advance the understanding of NE pathogenesis and indicate a role for high triglycerides in disease severity.

  9. Effect of finite Coulomb interaction on full counting statistics of electronic transport through single-molecule magnet

    Energy Technology Data Exchange (ETDEWEB)

    Xue Haibin, E-mail: xhb98326110@163.co [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Nie, Y.-H., E-mail: nieyh@sxu.edu.c [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Li, Z.-J.; Liang, J.-Q. [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China)

    2011-01-17

We study the full counting statistics (FCS) in a single-molecule magnet (SMM) with finite Coulomb interaction U. For finite U the FCS, differing from the U → ∞ case, shows a symmetric gate-voltage-dependence when the coupling strengths with two electrodes are interchanged, which can be observed experimentally just by reversing the bias-voltage. Moreover, we find that the effect of finite U on shot noise depends on the internal level structure of the SMM and the coupling asymmetry of the SMM with two electrodes as well. When the coupling of the SMM with the incident-electrode is stronger than that with the outgoing-electrode, the super-Poissonian shot noise in the sequential tunneling regime appears under relatively small gate-voltage and relatively large finite U, and does not for U → ∞; while it occurs at relatively large gate-voltage for the opposite coupling case. The formation mechanism of super-Poissonian shot noise can be qualitatively attributed to the competition between fast and slow transport channels.
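For readers unfamiliar with the terminology: "super-Poissonian" means the Fano factor F = Var(N)/⟨N⟩ of the count distribution exceeds 1 (F = 1 for a pure Poisson process). A hedged toy sketch, with made-up count data rather than any SMM model, comparing a Poisson-like sample with a "bursty" two-channel sample:

```python
import random

def fano(counts):
    """Fano factor: variance over mean of a list of counts."""
    m = sum(counts) / len(counts)
    v = sum((c - m) ** 2 for c in counts) / len(counts)
    return v / m

random.seed(1)
# Poisson-like counts: sums of many rare Bernoulli trials -> F close to 1
poisson_counts = [sum(random.random() < 0.02 for _ in range(500))
                  for _ in range(2000)]
# Bursty counts, mimicking competing fast/slow channels: half the time
# the fast channel triples the transferred charge -> F well above 1
bursty_counts = [c if random.random() < 0.5 else 3 * c
                 for c in poisson_counts]
```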

  10. Full counting statistics of phonon-assisted Andreev tunneling through a quantum dot coupled to normal and superconducting leads

    Science.gov (United States)

    Dong, Bing; Ding, G. H.; Lei, X. L.

    2017-01-01

    We present a theoretical investigation for the full counting statistics of the Andreev tunneling through a quantum dot (QD) embedded between superconducting (SC) and normal leads in the presence of a strong on-site electron-phonon interaction using nonequilibrium Green function method. For this purpose, we generalize the dressed tunneling approximation (DTA) recently developed in dealing with inelastic tunneling in a normal QD system to the Andreev transport issue. This method takes account of vibrational effect in evaluation of electronic tunneling self energy in comparison with other simple approaches and meanwhile allows us to derive an explicit analytical formula for the cumulant generating function at the subgap region. We then analyze the interplay of polaronic and SC proximity effects on the Andreev reflection spectrum, current-voltage characteristics, and current fluctuations of the hybrid system. Our main findings include: (1) no phonon side peaks in the linear Andreev conductance; (2) a negative differential conductance stemming from the suppressed Andreev reflection spectrum; (3) a novel inelastic resonant peak in the differential conductance due to phonon assisted Andreev reflection; (4) enhancement or suppression of shot noise for the symmetric or asymmetric tunnel-coupling system, respectively.

  11. Homeless High School Students in America: Who Counts?

    Science.gov (United States)

    Cumming, John M.; Gloeckner, Gene W.

    2012-01-01

    After interviewing homeless high school students, the research team in a Colorado school district discovered that many students had not revealed their true living conditions (homelessness) to anyone in the school district. This research team developed an anonymous survey written around the homeless categories identified in the McKinney-Vento…

  12. High impact  =  high statistical standards? Not necessarily so.

    Directory of Open Access Journals (Sweden)

    Patrizio E Tressoldi

    Full Text Available What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.

  13. High impact  =  high statistical standards? Not necessarily so.

    Science.gov (United States)

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.
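To make the recommended practices concrete, here is a minimal sketch (my own example data, not from the study) of reporting an effect size (Cohen's d) and a 95% confidence interval for a mean difference instead of a bare significance test:

```python
import math

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / sp

def mean_diff_ci95(a, b):
    """Approximate 95% CI for the difference of means (normal approximation)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)
    d = ma - mb
    return d - 1.96 * se, d + 1.96 * se

# Hypothetical measurements for two groups
treatment = [5.1, 4.9, 5.6, 5.8, 5.2, 5.4]
control   = [4.2, 4.5, 4.1, 4.8, 4.3, 4.6]
```

A CI whose lower bound excludes zero conveys the same "significance" information as a p-value, while additionally showing the magnitude and precision of the effect.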

  14. High dimensional data driven statistical mechanics.

    Science.gov (United States)

    Adachi, Yoshitaka; Sadamatsu, Sunao

    2014-11-01

In "3D4D materials science", there are five categories: (a) image acquisition, (b) processing, (c) analysis, (d) modelling, and (e) data sharing. This presentation highlights the core of these categories [1]. Analysis and modelling: A three-dimensional (3D) microstructure image contains topological features such as connectivity in addition to metric features. Such additional microstructural information seems to be useful for more precise property prediction. There are two ways for microstructure-based property prediction (Fig. 1A). One is 3D image data based modelling such as micromechanics or the crystal plasticity finite element method. The other is a machine learning approach driven by numerical microstructural features, such as an artificial neural network or Bayesian estimation method. The key is to convert the 3D image data into numerals in order to apply the dataset to property prediction. As numerical features of microstructures, grain size, number density of particles, connectivity of particles, grain boundary connectivity, stacking degree, clustering, etc. should be taken into consideration. These microstructural features are the so-called "materials genome". Among those materials genome features, we have to find out the dominant factors determining a focused property. The dominant factors are defined as "descriptor(s)" in high dimensional data driven statistical mechanics. [Fig. 1: (a) A concept of 3D4D materials science. (b) Fully-automated serial sectioning 3D microscope "Genus_3D". (c) Materials Genome Archive (JSPS).] Image acquisition: It is important for researchers to choose a 3D microscope from various microscopes depending on the length-scale of the focused microstructure. There has been a long-term request to acquire a 3D microstructure image more conveniently. Therefore a fully automated serial sectioning 3D optical microscope, "Genus_3D" (Fig. 1B), has been developed and is nowadays commercially available. A user can get a good

  15. Counting Extra Dimensions Magnetic Cherenkov Radiation from High Energy Neutrinos

    CERN Document Server

    Domokos, Gabor K; Kövesi-Domokos, S; Erdas, Andrea

    2003-01-01

    In theories which require a space of dimension d > 4, there is a natural mechanism of suppressing neutrino masses: while Standard Model fields are confined to a 3-brane, right handed neutrinos live in the bulk. Due to Kaluza-Klein excitations, the effective magnetic moments of neutrinos are enhanced. The effective magnetic moment is a monotonically growing function of the energy of the neutrino: consequently, high energy neutrinos can emit observable amounts of magnetic Cherenkov radiation. By observing the energy dependence of the magnetic Cherenkov radiation, one may be able to determine the number of compactified dimensions.

  16. Highly able pupils in Scotland: Making a curriculum change count

    Directory of Open Access Journals (Sweden)

    Sutherland Margaret

    2011-01-01

Full Text Available In line with many countries, Scotland is seeking to develop citizens fit to deal with the challenges of the 21st century (Scottish Executive, 2006). It also wants to ensure that children’s abilities and talents are recognised and extended. One way it has sought to do this is to develop a new curriculum framework - Curriculum for Excellence (CfE). CfE endeavors to provide a coordinated approach to curriculum reform for the age range 3-18. It seeks to move away from a prescriptive model towards a more teacher centred model which relies on teacher educators adapting national guidelines to meet local needs. This paper will outline the legislative context for highly able pupils in Scotland and then consider the relative merits of the new curriculum framework for this cohort of pupils. It will examine what is considered optimal curriculum provision for highly able pupils in relation to the process model of curriculum development (Stenhouse, 1975).

  17. The power of statistical tests using field trial count data of non-target organisms in enviromental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study.
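A hedged sketch of the kind of power simulation such studies run (parameter values are illustrative, not from the paper): counts on replicate field plots are drawn from an overdispersed negative binomial distribution, generated as a Poisson-gamma mixture, and power is the fraction of simulated trials in which a two-sample test detects the treatment effect.

```python
import math
import random

def nb_sample(mu, k, rng):
    """Negative-binomial draw: Poisson count with Gamma(k, mu/k) rate."""
    lam = rng.gammavariate(k, mu / k)
    # Knuth's Poisson sampler (adequate for small means)
    L, n, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return n
        n += 1

def power(mu_control, mu_treated, k, n_plots, reps, rng):
    """Fraction of simulated trials where a Welch-type z-test rejects at 5%."""
    hits = 0
    for _ in range(reps):
        a = [nb_sample(mu_control, k, rng) for _ in range(n_plots)]
        b = [nb_sample(mu_treated, k, rng) for _ in range(n_plots)]
        ma, mb = sum(a) / n_plots, sum(b) / n_plots
        va = sum((x - ma) ** 2 for x in a) / (n_plots - 1)
        vb = sum((x - mb) ** 2 for x in b) / (n_plots - 1)
        se = math.sqrt(va / n_plots + vb / n_plots) or 1e-9
        if abs(ma - mb) / se > 1.96:
            hits += 1
    return hits / reps

rng = random.Random(42)
# Power to detect a halving of mean abundance with 8 replicate plots,
# dispersion k = 2 (variance = mu + mu**2 / k, well above Poisson)
p = power(mu_control=10.0, mu_treated=5.0, k=2.0, n_plots=8, reps=400, rng=rng)
```

Because the overdispersion inflates the between-plot variance far beyond the Poisson level, the replication needed for adequate power is much larger than a Poisson-based calculation would suggest, which is one plausible source of the discrepancies the review discusses.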

  19. Cosmological constraints with weak-lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, François; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-03-01

Peak statistics in weak-lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. Next-generation galaxy surveys, with their advanced optics and large areas, will measure the cosmic weak-lensing signal with unprecedented precision. To prepare for these anticipated data sets, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and the dark-energy equation-of-state parameter w0. In particular, we study how Camelus, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. Considering peaks with a signal-to-noise ratio ≥ 1, we measure the abundance histogram in a mock shear catalogue of approximately 5000 deg² using a multiscale mass-map filtering technique. We constrain the parameters of the mock survey using Camelus combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. Peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, as measured by the width ΔΣ8 of the 1σ contour. We find Σ8 = σ8(Ωm/0.27)^α = 0.77 (+0.06/−0.05) with α = 0.75 for a flat ΛCDM model. The strong bias indicates the need to better understand and control the model systematics before applying it to a real survey of this size or larger. We perform a calibration of the model and compare results to those from the two-point correlation functions ξ± measured on the same field. We calibrate the ξ± result as well, since its contours are also biased, although not as severely as for peaks. In this case, we find for peaks Σ8 = 0.76 (+0.02/−0.03) with α = 0.65, while for the combined ξ+ and ξ- statistics the values are Σ8 = 0.76 (+0.02/−0.01) and α = 0
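A toy sketch of the peak-count observable itself (my illustration, not the Camelus model): find local maxima in a noisy "convergence map", keep those with signal-to-noise ≥ 1, and build their abundance histogram in S/N bins.

```python
import random

def local_maxima(grid):
    """Values of interior pixels strictly greater than all 8 neighbours."""
    peaks = []
    for i in range(1, len(grid) - 1):
        for j in range(1, len(grid[0]) - 1):
            v = grid[i][j]
            neigh = [grid[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)]
            if all(v > n for n in neigh):
                peaks.append(v)
    return peaks

random.seed(0)
sigma = 1.0                                 # assumed noise level of the map
grid = [[random.gauss(0.0, sigma) for _ in range(40)] for _ in range(40)]
grid[20][20] += 6.0                         # inject one strong "halo" peak

# Keep peaks with S/N >= 1 and histogram them in unit S/N bins [1,2), [2,3), ...
snr_peaks = [v / sigma for v in local_maxima(grid) if v / sigma >= 1.0]
hist = {}
for s in snr_peaks:
    hist[int(s)] = hist.get(int(s), 0) + 1
```

In the real analysis the map is a mass map reconstructed from galaxy shears and the peak function is compared to model predictions, but the counting step has this basic shape.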

  20. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics
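A minimal "bag of words" classifier in the spirit of the level described (the scenes and labels below are hypothetical, not LabelMe data): each scene is reduced to the list of object labels it contains, and a new scene is assigned to the class whose object-frequency centroid best matches it.

```python
from collections import Counter

# Tiny made-up training set: scene category -> list of scenes,
# each scene being its list of object labels.
train = {
    "kitchen": [["stove", "sink", "cup"], ["sink", "cup", "table"]],
    "street":  [["car", "road", "sign"], ["car", "person", "road"]],
}

def centroid(scenes):
    """Average object-frequency vector over a category's scenes."""
    total = Counter()
    for objs in scenes:
        total.update(objs)
    n = len(scenes)
    return {w: c / n for w, c in total.items()}

def classify(objs, centroids):
    """Pick the category whose centroid gives the new scene's objects
    the highest summed frequency."""
    def score(cen):
        return sum(cen.get(w, 0.0) for w in objs)
    return max(centroids, key=lambda k: score(centroids[k]))

centroids = {k: centroid(v) for k, v in train.items()}
label = classify(["sink", "stove", "plate"], centroids)
```

Note that this representation discards all spatial structure, which is precisely why its error pattern can differ from human observers even when overall accuracy is similar.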

  1. Effect of high and low antral follicle count in pubertal beef heifers on IVF

    Science.gov (United States)

    Pubertal heifers can be classified between those with high (n = 25) or low (n = 15) antral follicle counts (AFC). The objective of this study was to determine oocyte development and maturation (e.g. fertility) in an IVF system for high- and low-AFC heifers. From a pool of 120 heifers, 10 high- and 1...

  2. Atom-counting in High Resolution Electron Microscopy: TEM or STEM - That's the question.

    Science.gov (United States)

    Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S

    2016-10-27

    In this work, a recently developed quantitative approach based on the principles of detection theory is used in order to determine the possibilities and limitations of High Resolution Scanning Transmission Electron Microscopy (HR STEM) and HR TEM for atom-counting. So far, HR STEM has been shown to be an appropriate imaging mode to count the number of atoms in a projected atomic column. Recently, it has been demonstrated that HR TEM, when using negative spherical aberration imaging, is suitable for atom-counting as well. The capabilities of both imaging techniques are investigated and compared using the probability of error as a criterion. It is shown that for the same incoming electron dose, HR STEM outperforms HR TEM under common practice standards, i.e. when the decision is based on the probability function of the peak intensities in HR TEM and of the scattering cross-sections in HR STEM. If the atom-counting decision is based on the joint probability function of the image pixel values, the dependence of all image pixel intensities as a function of thickness should be known accurately. Under this assumption, the probability of error may decrease significantly for atom-counting in HR TEM and may, in theory, become lower as compared to HR STEM under the predicted optimal experimental settings. However, the commonly used standard for atom-counting in HR STEM leads to a high performance and has been shown to work in practice.
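A hedged sketch of the probability-of-error criterion used here (a generic detection-theory example, not the paper's image model): deciding between two hypothesized column thicknesses (n vs n+1 atoms) from one noisy measurement. For equiprobable Gaussian hypotheses with means mu0 < mu1 and common noise sigma, the minimum-error rule thresholds at the midpoint, and the error probability is Phi(−(mu1 − mu0)/(2 sigma)).

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def prob_error(mu0, mu1, sigma):
    """Probability of error for two equiprobable Gaussian hypotheses."""
    return phi(-abs(mu1 - mu0) / (2.0 * sigma))

# Example: measured cross-sections separated by 2*sigma (arbitrary units)
pe = prob_error(mu0=10.0, mu1=12.0, sigma=1.0)
```

Increasing the electron dose shrinks sigma and hence the probability of error, which is why dose-matched comparisons between HR STEM and HR TEM are the fair ones.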

  3. Statistics of High-level Scene Context

    Directory of Open Access Journals (Sweden)

    Michelle R. Greene

    2013-10-01

Full Text Available Context is critical to our ability to recognize environments and to search for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell, Torralba, Murphy & Freeman, 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed things in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human rapid scene categorization is discussed. Ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. Some objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help researchers in visual cognition design new data

  4. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points.

    Science.gov (United States)

    Achcar, J A; Martinez, E Z; Ruffino-Netto, A; Paulino, C D; Soares, P

    2008-12-01

We considered a Bayesian analysis for the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled this counting dataset considering non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis for the data is considered using Markov chain Monte Carlo methods. Simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software.
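A hedged sketch of the underlying model (maximum likelihood on simulated data rather than the paper's Bayesian/WinBUGS analysis): yearly case counts follow a Poisson process whose rate is piecewise constant with two change-points, fitted here by profiling the likelihood over a grid of change-point pairs.

```python
import math
import random

def poisson_loglik(counts, rate):
    """Poisson log-likelihood up to a constant (the log-factorial terms)."""
    return sum(-rate + c * math.log(rate) for c in counts)

def fit_two_changepoints(counts):
    """Grid search over change-point pairs; segment rates are segment means."""
    best = None
    for t1 in range(2, len(counts) - 4):
        for t2 in range(t1 + 2, len(counts) - 2):
            segs = [counts[:t1], counts[t1:t2], counts[t2:]]
            ll = sum(poisson_loglik(s, max(sum(s) / len(s), 1e-9))
                     for s in segs)
            if best is None or ll > best[0]:
                best = (ll, t1, t2)
    return best[1], best[2]

def pois(lam, rng=random):
    """Knuth's Poisson sampler (adequate for these rates)."""
    L, n, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return n
        n += 1

random.seed(3)
# 30 simulated "years": rate 20, then 60 after year 10, then 30 after year 20
counts = ([pois(20) for _ in range(10)] +
          [pois(60) for _ in range(10)] +
          [pois(30) for _ in range(10)])
t1, t2 = fit_two_changepoints(counts)
```

The Bayesian version in the paper replaces this grid search with priors on the rates and change-points and samples the posterior by Gibbs sampling, but the likelihood structure is the same.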

  5. Multivariate High Order Statistics of Measurements of the Temporal Evolution of Fission Chain-Reactions

    Energy Technology Data Exchange (ETDEWEB)

    Mattingly, J.K.

    2001-03-08

The development of high order statistical analyses applied to measurements of the temporal evolution of fission chain-reactions is described. These statistics are derived via application of Bayes' rule to conditional probabilities describing a sequence of events in a fissile system beginning with the initiation of a chain-reaction by source neutrons and ending with counting events in a collection of neutron-sensitive detectors. Two types of initiating neutron sources are considered: (1) a directly observable source introduced by the experimenter (active initiation), and (2) a source that is intrinsic to the system and is not directly observable (passive initiation). The resulting statistics describe the temporal distribution of the population of prompt neutrons in terms of the time-delays between members of a collection (an n-tuplet) of correlated detector counts, that, in turn, may be collectively correlated with a detected active source neutron emission. These developments are a unification and extension of Rossi-alpha, pulsed neutron, and neutron noise methods, each of which measures the temporal distribution of pairs of correlated events, to produce a method that measures the temporal distribution of n-tuplets of correlated counts of arbitrary dimension n. In general the technique should expand present capabilities in the analysis of neutron counting measurements.
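An illustrative sketch of the pairwise (n = 2) building block the text generalizes: a Rossi-alpha-style histogram of time delays between detector counts. The timestamps below are hypothetical (microseconds), chosen so that clustered counts, as produced by fission chains, pile up in the short-delay bins.

```python
def delay_histogram(times, window, bin_width):
    """Histogram of delays t_j - t_i (0 < delay < window) over all count pairs.

    `times` must be sorted ascending; the inner loop breaks as soon as
    the delay exceeds the coincidence window.
    """
    nbins = int(window / bin_width)
    hist = [0] * nbins
    for i, t0 in enumerate(times):
        for t1 in times[i + 1:]:
            d = t1 - t0
            if d >= window:
                break
            hist[int(d / bin_width)] += 1
    return hist

# Clustered counts around t = 0, 1.2 and 5.0 mimic chain correlations
times = [0.0, 1.2, 1.3, 5.0, 5.1, 5.15, 9.0]
hist = delay_histogram(sorted(times), window=2.0, bin_width=0.5)
```

In a Rossi-alpha analysis the excess of short delays over the flat accidental background decays as exp(−αt); the n-tuplet generalization described above replaces these pairwise delays with joint delay distributions of triplets, quadruplets, and so on.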

  6. High preoperative monocyte count/high-density lipoprotein ratio is associated with postoperative atrial fibrillation and mortality in coronary artery bypass grafting.

    Science.gov (United States)

    Saskin, Hüseyin; Serhan Ozcan, Kazim; Yilmaz, Seyhan

    2017-03-01

The monocyte to high-density lipoprotein ratio has recently emerged as an indicator of inflammation and oxidative stress. The aim of this study was to evaluate the association of the monocyte to high-density lipoprotein ratio with postoperative atrial fibrillation and mortality in coronary artery bypass grafting. Six hundred and sixty-two patients who were in sinus rhythm preoperatively and who had isolated coronary artery bypass grafting were retrospectively included in the study. Patients who had atrial fibrillation in the early postoperative period were enrolled in group 1 (n = 153); patients who remained in sinus rhythm in the early postoperative period were included in group 2 (n = 509). The clinical and demographic data of the patients, biochemical and complete blood count parameters, preoperative monocyte count/high-density lipoprotein cholesterol ratio, and operative and postoperative data were recorded. Preoperative monocyte counts (P = 0.0001), monocyte count/high-density lipoprotein cholesterol ratio (P = 0.0001) and C-reactive protein levels (P = 0.0001) were significantly increased in group 1. In the first month, 8 patients in group 1 (5.2%) and 5 patients in group 2 (1.0%) died, which was statistically significant (P = 0.003). In univariate and multivariate logistic regression analyses, an elevated preoperative monocyte count/high-density lipoprotein cholesterol ratio (P = 0.03) and C-reactive protein levels (P = 0.0001) were predictors of postoperative atrial fibrillation. Preoperative monocyte counts (P = 0.001), monocyte count/high-density lipoprotein cholesterol ratio (P = 0.0001) and the use of inotropic support (P = 0.0001) were also predictors of mortality in the early postoperative period. We have observed that a high preoperative monocyte count/high-density lipoprotein ratio was associated with postoperative atrial fibrillation and mortality after coronary artery bypass grafting.

  7. High-channel-count plasmonic filter with the metal-insulator-metal Fibonacci-sequence gratings.

    Science.gov (United States)

    Gong, Yongkang; Liu, Xueming; Wang, Leiran

    2010-02-01

    Fibonacci-sequence gratings based on metal-insulator-metal waveguides are proposed. The spectrum properties of this structure are numerically investigated by using the transfer matrix method. Numerical results demonstrate that the proposed structure can generate high-channel-count plasmonic stop bands and can find significant applications in highly integrated dense wavelength division multiplexing networks.
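A hedged toy version of the transfer matrix method mentioned here, applied to a simple 1D dielectric stack whose layer order follows a Fibonacci word (the indices and thicknesses are illustrative placeholders, not the paper's metal-insulator-metal waveguide parameters): each layer contributes a 2x2 characteristic matrix, and the product gives the stack's transmittance and reflectance.

```python
import cmath
import math

def layer_matrix(n, d, wavelength):
    """Characteristic 2x2 matrix of a lossless layer at normal incidence."""
    delta = 2 * math.pi * n * d / wavelength
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def matmul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def fibonacci_word(order):
    """S2 = "AB", S3 = "ABA", S_n = S_{n-1} + S_{n-2}."""
    a, b = "A", "AB"
    for _ in range(order - 2):
        a, b = b, b + a
    return b

def transmittance(word, wavelength, n_in=1.0, n_out=1.0):
    """(T, R) of the stack; layer parameters per letter are illustrative."""
    params = {"A": (1.5, 0.1), "B": (2.3, 0.065)}   # (index, thickness in um)
    m = [[1, 0], [0, 1]]
    for c in word:
        n, d = params[c]
        m = matmul(m, layer_matrix(n, d, wavelength))
    denom = (m[0][0] + m[0][1] * n_out) * n_in + (m[1][0] + m[1][1] * n_out)
    t = 2 * n_in / denom
    r = ((m[0][0] + m[0][1] * n_out) * n_in - (m[1][0] + m[1][1] * n_out)) / denom
    return (n_out / n_in) * abs(t) ** 2, abs(r) ** 2

word = fibonacci_word(8)
T, R = transmittance(word, wavelength=0.6)
```

Sweeping the wavelength traces out the multi-gap transmission spectrum characteristic of quasiperiodic Fibonacci stacks; the paper performs the analogous calculation for plasmonic modes in the metal-insulator-metal geometry.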

  8. Performance of Drift-Tube Detectors at High Counting Rates for High-Luminosity LHC Upgrades

    CERN Document Server

    Bittner, Bernhard; Kortner, Oliver; Kroha, Hubert; Manfredini, Alessandro; Nowak, Sebastian; Ott, Sebastian; Richter, Robert; Schwegler, Philipp; Zanzi, Daniele; Biebel, Otmar; Hertenberger, Ralf; Ruschke, Alexander; Zibell, Andre

    2016-01-01

The performance of pressurized drift-tube detectors at very high background rates has been studied at the Gamma Irradiation Facility (GIF) at CERN and in an intense 20 MeV proton beam at the Munich Van de Graaff tandem accelerator for applications in large-area precision muon tracking at high-luminosity upgrades of the Large Hadron Collider (LHC). The ATLAS muon drift-tube (MDT) chambers with 30 mm tube diameter have been designed to cope with gamma and neutron background hit rates of up to 500 Hz/square cm. Background rates of up to 14 kHz/square cm are expected at LHC upgrades. The test results with standard MDT readout electronics show that the reduction of the drift-tube diameter to 15 mm, while leaving the operating parameters unchanged, vastly increases the rate capability well beyond the requirements. The development of new small-diameter muon drift-tube (sMDT) chambers for LHC upgrades is completed. Further improvements of tracking efficiency and spatial resolution at high counting rates will be achieved with ...

  9. Use of the HPI Model 2080 pulsed neutron detector at the LANSCE complex - vulnerabilities and counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Jones, K.W. [Los Alamos National Lab., NM (United States); Browman, A. [Amparo Corp., Sante Fe, NM (United States)

    1997-01-01

The HPI Model 2080 Pulsed Neutron Detector has been used for over seven years as an area radiation monitor and dose limiter at the LANSCE accelerator complex. Operating experience and changing environments over this time have revealed several vulnerabilities (susceptibility to electrical noise, paralysis in high dose rate fields, etc.). Identified vulnerabilities have been corrected; these modifications include component replacement and circuit design changes. The data and experiments leading to these modifications will be presented and discussed. Calibration of the instrument is performed in mixed static gamma and neutron source fields. The statistical characteristics of the Geiger-Muller tubes coupled with significantly different sensitivity to gamma and neutron doses require that careful attention be paid to acceptable fluctuations in dose rate over time during calibration. The performance of the instrument has been modeled using simple Poisson statistics and the operating characteristics of the Geiger-Muller tubes. The results are in excellent agreement with measurements. The analysis and comparison with experimental data will be presented.
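
The Poisson reasoning behind such calibration tolerances can be made concrete: a counter that registers on average N = rate × time events has standard deviation √N, so the fractional statistical spread is 1/√N. A minimal sketch (the rate and interval below are illustrative, not taken from the report):

```python
import math

def poisson_fluctuation(count_rate_hz, interval_s, k=2.0):
    """Relative k-sigma fluctuation band for a Poisson counter.

    With N = rate * t expected counts, the standard deviation is sqrt(N),
    so the fractional spread of the measured rate is 1/sqrt(N).
    """
    n = count_rate_hz * interval_s
    return k / math.sqrt(n)

# e.g. 50 counts/s over a 10 s calibration interval gives N = 500,
# so the 2-sigma acceptance band is 2/sqrt(500), about 9% of the mean rate
```

Longer calibration intervals shrink the band as 1/√t, which is why acceptable dose-rate fluctuations must be specified together with the counting time.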

  10. An automatic attenuator device for x-ray detectors at high counting rate

    Science.gov (United States)

    Alvarez, J.; Paiser, E.; Capitan, M. J.

    2002-07-01

In this article we describe an attenuator device for reducing/controlling the pulse detector counting losses at a high counting rate. The electronics are based on a direct measurement of the detector dead time from the analog output signal at the end of the detection chain. Taking this parameter into account, the attenuator device decides to reduce/enhance the number of photons that arrive at the detector by inserting/extracting the necessary number of attenuation foils in the x-ray beam path. In that way the number of events in the incoming signal is reduced and the "apparent dynamic range" of the detector is increased.
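
The insert/extract decision described above amounts to a hysteresis controller driven by the measured dead-time fraction. A minimal sketch, with hypothetical thresholds and foil counts (the article does not specify them):

```python
def foils_update(current_foils, dead_time_fraction,
                 high=0.20, low=0.05, max_foils=8):
    """Decide whether to insert or extract one attenuation foil.

    Hypothetical thresholds: above `high` dead-time fraction the detector
    is losing too many counts, so insert a foil; below `low` there is
    headroom, so extract one. In between, leave the stack unchanged.
    """
    if dead_time_fraction > high and current_foils < max_foils:
        return current_foils + 1   # attenuate the beam
    if dead_time_fraction < low and current_foils > 0:
        return current_foils - 1   # let more photons through
    return current_foils
```

The gap between `low` and `high` prevents the device from oscillating when the flux sits near a threshold.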

  11. Optimal experimental design for nano-particle atom-counting from high-resolution STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De Backer, A.; De wael, A.; Gonnissen, J.; Van Aert, S., E-mail: sandra.vanaert@uantwerpen.be

    2015-04-15

In the present paper, the principles of detection theory are used to quantify the probability of error for atom-counting from high resolution scanning transmission electron microscopy (HR STEM) images. Binary and multiple hypothesis testing have been investigated in order to determine the limits to the precision with which the number of atoms in a projected atomic column can be estimated. The probability of error has been calculated when using STEM images, scattering cross-sections or peak intensities as a criterion to count atoms. Based on this analysis, we conclude that scattering cross-sections perform almost as well as images and better than peak intensities. Furthermore, the optimal STEM detector design can be derived for atom-counting using the expression for the probability of error. We show that for very thin objects LAADF is optimal and that for thicker objects the optimal inner detector angle increases.
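
A toy version of this multiple-hypothesis test can be estimated by Monte Carlo: each candidate atom count has an expected scattering cross-section, observations are noisy, and the maximum-likelihood decision picks the nearest mean. The means and noise level below are hypothetical, for illustration only:

```python
import numpy as np

def atom_count_error_prob(means, sigma, trials=20000, seed=0):
    """Monte-Carlo probability of miscounting atoms.

    `means`: hypothetical expected cross-sections for n = 1..N atoms;
    observations are Gaussian with standard deviation `sigma`.
    Decision rule: choose the hypothesis with the nearest mean
    (maximum likelihood for equal-variance Gaussians).
    """
    rng = np.random.default_rng(seed)
    means = np.asarray(means, dtype=float)
    errors = 0
    for true_idx in range(len(means)):
        obs = rng.normal(means[true_idx], sigma, trials)
        decided = np.abs(obs[:, None] - means[None, :]).argmin(axis=1)
        errors += np.count_nonzero(decided != true_idx)
    return errors / (trials * len(means))
```

As the noise grows relative to the spacing of the means, the error probability rises, which is exactly the trade-off the paper quantifies analytically.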

  12. Counting Penguins.

    Science.gov (United States)

    Perry, Mike; Kader, Gary

    1998-01-01

    Presents an activity on the simplification of penguin counting by employing the basic ideas and principles of sampling to teach students to understand and recognize its role in statistical claims. Emphasizes estimation, data analysis and interpretation, and central limit theorem. Includes a list of items for classroom discussion. (ASK)

  13. A high-throughput, multi-channel photon-counting detector with picosecond timing

    CERN Document Server

    Lapington, J S; Miller, G M; Ashton, T J R; Jarron, P; Despeisse, M; Powolny, F; Howorth, J; Milnes, J

    2009-01-01

High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchanne...

  14. Development of Fast High-Resolution Muon Drift-Tube Detectors for High Counting Rates

    CERN Document Server

    INSPIRE-00287945; Dubbert, J.; Horvat, S.; Kortner, O.; Kroha, H.; Legger, F.; Richter, R.; Adomeit, S.; Biebel, O.; Engl, A.; Hertenberger, R.; Rauscher, F.; Zibell, A.

    2011-01-01

Pressurized drift-tube chambers are efficient detectors for high-precision tracking over large areas. The Monitored Drift-Tube (MDT) chambers of the muon spectrometer of the ATLAS detector at the Large Hadron Collider (LHC) reach a spatial resolution of 35 microns and almost 100% tracking efficiency with 6 layers of 30 mm diameter drift tubes operated with an Ar:CO2 (93:7) gas mixture at 3 bar and a gas gain of 20000. The ATLAS MDT chambers are designed to cope with background counting rates due to neutrons and gamma-rays of up to about 300 kHz per tube, which will be exceeded for LHC luminosities larger than the design value of 10^34 per square cm and second. Decreasing the drift-tube diameter to 15 mm while keeping the other parameters, including the gas gain, unchanged reduces the maximum drift time from about 700 ns to 200 ns and the drift-tube occupancy by a factor of 7. New drift-tube chambers for the endcap regions of the ATLAS muon spectrometer have been designed. A prototype chamber consisting of 12 times 8 l...

  15. Review of robust multivariate statistical methods in high dimension.

    Science.gov (United States)

    Filzmoser, Peter; Todorov, Valentin

    2011-10-31

    General ideas of robust statistics, and specifically robust statistical methods for calibration and dimension reduction are discussed. The emphasis is on analyzing high-dimensional data. The discussed methods are applied using the packages chemometrics and rrcov of the statistical software environment R. It is demonstrated how the functions can be applied to real high-dimensional data from chemometrics, and how the results can be interpreted.

  16. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  17. High-speed readout solution for single-photon counting ASICs

    Science.gov (United States)

    Kmon, P.; Szczygiel, R.; Maj, P.; Grybos, P.; Kleczek, R.

    2016-02-01

We report on the analysis, simulations and measurements of both noise and high-count rate performance of a single photon counting integrated circuit called UFXC32k designed for hybrid pixel detectors for various applications in X-ray imaging. The dimensions of the UFXC32k designed in CMOS 130 nm technology are 9.63 mm × 20.15 mm. The integrated circuit core is a matrix of 128 × 256 square readout pixels with a pitch of 75 μm. Each readout pixel contains a charge sensitive amplifier (CSA), a shaper, two discriminators and two 14-bit ripple counters. The UFXC32k was bump-bonded to a silicon pixel detector with a thickness of 320 μm and characterized with the X-ray radiation source. The CSA feedback based on the Krummenacher circuit determines both the count rate performance and the noise of the readout front-end electronics. For the default setting of the CSA feedback, the measured front-end electronics dead time is 232 ns (paralyzable model) and the equivalent noise charge (ENC) is equal to 123 el. rms. For the high count rate setting of the CSA feedback, the dead time is only 101 ns and the ENC is equal to 163 el. rms.
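
The paralyzable dead-time model quoted above relates the observed count rate m to the true rate n as m = n·exp(-n·τ), with the registered rate peaking at 1/(e·τ). A minimal sketch showing why shortening τ from 232 ns to 101 ns raises the maximum throughput proportionally:

```python
import math

def observed_rate(true_rate, tau):
    """Paralyzable dead-time model: m = n * exp(-n * tau)."""
    return true_rate * math.exp(-true_rate * tau)

def max_observed_rate(tau):
    """Peak registered rate, reached at true rate n = 1/tau: m_max = 1/(e*tau)."""
    return 1.0 / (math.e * tau)

# Cutting the dead time from 232 ns to 101 ns raises the peak registered
# rate by the factor 232/101, roughly 2.3x more throughput per pixel.
```

Note that in the paralyzable model the observed rate actually falls again once the true rate exceeds 1/τ, which is why the peak value matters for detector design.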

  18. Calibration of the Accuscan II IN Vivo System for High Energy Lung Counting

    Energy Technology Data Exchange (ETDEWEB)

    Ovard R. Perry; David L. Georgeson

    2011-07-01

This report describes the April 2011 calibration of the Accuscan II HPGe In Vivo system for high energy lung counting. The source used for the calibration was a NIST-traceable lung set manufactured at the University of Cincinnati (UCLL43AMEU & UCSL43AMEU) containing Am-241 and Eu-152 with energies from 26 keV to 1408 keV. The lung set was used in conjunction with a Realistic Torso phantom. The phantom was placed on the RMC II counting table (with pins removed) between the v-ridges on the back wall of the Accuscan II counter. The top of the detector housing was positioned perpendicular to the junction of the phantom clavicle with the sternum. This position places the approximate center line of the detector housing over the center of the lungs. The energy and efficiency calibrations were performed using a Realistic Torso phantom (Appendix I) and the University of Cincinnati lung set. This report includes an overview introduction and records for the energy/FWHM and efficiency calibration including performance verification and validation counting. The Accuscan II system was successfully calibrated for high energy lung counting and verified in accordance with ANSI/HPS N13.30-1996 criteria.
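
For each photopeak, an efficiency calibration of this kind reduces to the ratio of net peak counts to the number of photons emitted by the source during the count. A generic sketch of the full-energy-peak efficiency formula (the activity, yield, and counts below are illustrative, not the report's data):

```python
def peak_efficiency(net_counts, activity_bq, gamma_yield, live_time_s):
    """Full-energy-peak efficiency from a calibration count.

    efficiency = net peak counts / (decays during live time * emission
    probability of the gamma line). Decay during the count is neglected,
    which is reasonable for long-lived sources such as Am-241.
    """
    emitted = activity_bq * live_time_s * gamma_yield
    return net_counts / emitted

# e.g. a hypothetical 1 kBq Am-241 source (59.5 keV line, yield ~0.359)
# counted for 100 s with 1000 net peak counts gives an efficiency of
# 1000 / (1000 * 100 * 0.359), i.e. a few percent
```

Repeating this at each calibration energy yields the efficiency-versus-energy curve used for subsequent in vivo measurements.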

  19. Effect of high altitude exposure on spermatogenesis and epididymal sperm count in male rats.

    Science.gov (United States)

    Gasco, M; Rubio, J; Chung, A; Villegas, L; Gonzales, G F

    2003-12-01

    The present study was designed to determine the effect of exposure to high altitude on spermatogenesis using transillumination technique and sperm count in male rats. In addition, the effect of oral intubation for intragastric administration of vehicle on testicular parameters in adult male rats in a schedule of 42 days was assessed. Male rats were exposed to Cerro de Pasco (Peru) at 4340 m for 3, 7, 14, 21, 28, 35 and 42 days resulting in a modification of the pattern of the seminiferous tubule stages. At day 3, stages I, IV-V, VI, VII and IX-XI were relatively shorter at high altitude than at sea level. At day 7, stages VIII, IX-XI, XII and XIII-XIV were reduced. At day 14, stages VII, VIII and IX-XI were reduced. At day 21 and 28, stages VIII, XII and XIII-XIV were significantly increased at high altitude. At day 35 an increase in stage XIII-XIV was observed. At day 42, stages II-III, IX-XI and XII were significantly increased at high altitude. Epididymal sperm count was significantly reduced at day 7 of exposure to high altitude and maintained low levels with respect to sea level up to 42 days. In conclusion, high altitude exposure affects spermatogenesis, particularly onset of mitosis and spermiation. This in turn affects epididymal sperm count.

  20. Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.

    Science.gov (United States)

    Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio

    2016-01-01

The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single photon detectors with avalanche photodiodes (APDs), the photon counting system with an optical fiber has become a powerful tool for detecting bioluminescence at an optical fiber end, because it allows us to fully use the merits of compactness, simple operation and high quantum efficiency of the APD detectors. This optical fiber-based system also has the potential to improve the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.

  1. Development of low read noise high conversion gain CMOS image sensor for photon counting level imaging

    Science.gov (United States)

    Seo, Min-Woong; Kawahito, Shoji; Kagawa, Keiichiro; Yasutomi, Keita

    2016-05-01

A CMOS image sensor with deep sub-electron read noise and high pixel conversion gain has been developed. Its performance is demonstrated through image outputs from an area image sensor, confirming the capability of photoelectron-counting-level imaging. To achieve high conversion gain, the proposed pixel has special structures to reduce the parasitic capacitances around the FD node. As a result, the pixel conversion gain is increased due to the optimized FD node capacitance, and the noise performance is also improved by removing two noise sources from the power supply. For the first time, high contrast images from the reset-gate-less CMOS image sensor, with less than 0.3e- rms noise level, have been generated at an extremely low light level of a few electrons per pixel. In addition, the photon-counting capability of the developed CMOS imager is demonstrated by a photoelectron-counting histogram (PCH) measurement.
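
A photoelectron-counting histogram of the kind mentioned above is commonly modeled as a Poisson mixture of Gaussians: each integer photoelectron number k occurs with Poisson weight and is smeared by the read noise, so distinct peaks remain resolvable only while the noise is well below ~0.5 e- rms. A sketch with illustrative parameters (not the sensor's measured data):

```python
import math

def pch(mu, read_noise, x, kmax=20):
    """Photoelectron-counting histogram model: Poisson-weighted Gaussians.

    mu         -- mean photoelectrons per pixel (Poisson parameter)
    read_noise -- Gaussian read noise in e- rms, centered on each integer k
    x          -- signal value (in electrons) at which to evaluate the density
    """
    total = 0.0
    norm = read_noise * math.sqrt(2 * math.pi)
    for k in range(kmax + 1):
        weight = math.exp(-mu) * mu**k / math.factorial(k)  # Poisson P(k)
        total += weight * math.exp(-(x - k) ** 2 / (2 * read_noise**2)) / norm
    return total
```

With `read_noise` around 0.3 e- (as reported for this sensor) the peaks at integer electron numbers are clearly separated; at 0.5 e- and above they merge into a featureless distribution.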

  2. A Hardware-Efficient Scalable Spike Sorting Neural Signal Processor Module for Implantable High-Channel-Count Brain Machine Interfaces.

    Science.gov (United States)

    Yang, Yuning; Boling, Sam; Mason, Andrew J

    2017-08-01

Next-generation brain machine interfaces demand a high-channel-count neural recording system to wirelessly monitor activities of thousands of neurons. A hardware efficient neural signal processor (NSP) is greatly desirable to ease the data bandwidth bottleneck for a fully implantable wireless neural recording system. This paper demonstrates a complete multichannel spike sorting NSP module that incorporates all of the necessary spike detector, feature extractor, and spike classifier blocks. To meet high-channel-count and implantability demands, each block was designed to be highly hardware efficient and scalable while sharing resources efficiently among multiple channels. To process multiple channels in parallel, scalability analysis was performed, and the utilization of each block was optimized according to its input data statistics and the power, area and/or speed of each block. Based on this analysis, a prototype 32-channel spike sorting NSP scalable module was designed and tested on an FPGA using synthesized datasets over a wide range of signal to noise ratios. The design was mapped to 130 nm CMOS to achieve 0.75 μW power and 0.023 mm^2 area consumption per channel based on post-synthesis simulation results, which permits scalability of digital processing to 690 channels on a 4×4 mm^2 electrode array.

  3. High platelet counts increase metastatic risk in huge hepatocellular carcinoma undergoing transarterial chemoembolization.

    Science.gov (United States)

    Xue, Tong-Chun; Ge, Ning-Ling; Xu, Xin; Le, Fan; Zhang, Bo-Heng; Wang, Yan-Hong

    2016-09-01

Accumulating evidence suggests platelets play critical roles in tumor metastasis. Moreover, the role of platelets in metastasis is partially correlated with inflammation. However, evidence regarding the contribution of platelets to hepatocellular carcinoma (HCC) metastasis is lacking. This study investigated the association between platelets and metastatic risk in HCC. We used huge HCC (diameter over 10 cm), a tumor subgroup with a strong inflammatory background, as a model to evaluate the potential predictive role of platelets and platelet-related biomarkers for metastasis in HCC patients undergoing transarterial chemoembolization. A logistic regression model was used to analyze risk factors for metastasis. Patients with huge HCC (n = 178) were enrolled, and 24.7% (44/178) of patients had remote metastases after treatment. Univariate analyses showed high platelet counts (P = 0.012), pretreatment platelet-to-lymphocyte ratios (pre-PLR) of 100 or more (P = 0.018) and post-PLR of 100 or more (P = 0.013) were potential risk factors for metastasis. Furthermore, multivariate analyses showed high platelet counts (odds ratio, 2.18; 95% confidence interval, 1.074-4.443; P = 0.031) and platelet-related biomarkers were independent risk factors for HCC metastasis. Particularly, the risk of metastasis in patients with high post-PLR values was significantly greater than in patients with low post-PLR values. For tumor response and survival, patients with high platelet counts had faster disease progression (P = 0.002) and worse survival. In conclusion, high platelet counts increase the metastatic risk in huge HCC undergoing chemoembolization, which supplies clinical verification of the association between high platelet counts and HCC metastasis. © 2016 The Japan Society of Hepatology.
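
The odds ratios and confidence intervals reported from such a logistic regression follow directly from the fitted coefficient: OR = exp(β), with Wald interval exp(β ± 1.96·SE). A minimal sketch (the β and SE below are back-calculated to roughly reproduce the quoted OR of 2.18, purely for illustration, not taken from the paper's output):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% Wald confidence interval from a logistic coefficient.

    Returns (OR, lower bound, upper bound), all obtained by exponentiating
    the coefficient and its z*SE band on the log-odds scale.
    """
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative: beta = ln(2.18) with a hypothetical SE of about 0.36
# yields an interval close to the (1.07, 4.44) range quoted above.
```

Because the interval is symmetric on the log scale, the published bounds let one recover the approximate standard error of the coefficient.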

  4. High energy X-ray photon counting imaging using linear accelerator and silicon strip detectors

    Science.gov (United States)

    Tian, Y.; Shimazoe, K.; Yan, X.; Ueda, O.; Ishikura, T.; Fujiwara, T.; Uesaka, M.; Ohno, M.; Tomita, H.; Yoshihara, Y.; Takahashi, H.

    2016-09-01

    A photon counting imaging detector system for high energy X-rays is developed for on-site non-destructive testing of thick objects. One-dimensional silicon strip (1 mm pitch) detectors are stacked to form a two-dimensional edge-on module. Each detector is connected to a 48-channel application specific integrated circuit (ASIC). The threshold-triggered events are recorded by a field programmable gate array based counter in each channel. The detector prototype is tested using 950 kV linear accelerator X-rays. The fast CR shaper (300 ns pulse width) of the ASIC makes it possible to deal with the high instant count rate during the 2 μs beam pulse. The preliminary imaging results of several metal and concrete samples are demonstrated.

  5. High energy X-ray photon counting imaging using linear accelerator and silicon strip detectors

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Y., E-mail: cycjty@sophie.q.t.u-tokyo.ac.jp [Department of Bioengineering, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Shimazoe, K.; Yan, X. [Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Ueda, O.; Ishikura, T. [Fuji Electric Co., Ltd., Fuji, Hino, Tokyo 191-8502 (Japan); Fujiwara, T. [National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568 (Japan); Uesaka, M.; Ohno, M. [Nuclear Professional School, the University of Tokyo, 2-22 Shirakata-shirane, Tokai, Ibaraki 319-1188 (Japan); Tomita, H. [Department of Quantum Engineering, Nagoya University, Furo, Chikusa, Nagoya 464-8603 (Japan); Yoshihara, Y. [Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Takahashi, H. [Department of Bioengineering, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan); Department of Nuclear Engineering and Management, the University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2016-09-11

    A photon counting imaging detector system for high energy X-rays is developed for on-site non-destructive testing of thick objects. One-dimensional silicon strip (1 mm pitch) detectors are stacked to form a two-dimensional edge-on module. Each detector is connected to a 48-channel application specific integrated circuit (ASIC). The threshold-triggered events are recorded by a field programmable gate array based counter in each channel. The detector prototype is tested using 950 kV linear accelerator X-rays. The fast CR shaper (300 ns pulse width) of the ASIC makes it possible to deal with the high instant count rate during the 2 μs beam pulse. The preliminary imaging results of several metal and concrete samples are demonstrated.

  6. Counting forbidden patterns in irregularly sampled time series. II. Reliability in the presence of highly irregular sampling

    Science.gov (United States)

    Sakellariou, Konstantinos; McCullough, Michael; Stemler, Thomas; Small, Michael

    2016-12-01

We are motivated by real-world data that exhibit severe sampling irregularities such as geological or paleoclimate measurements. Counting forbidden patterns has been shown to be a powerful tool towards the detection of determinism in noisy time series. They constitute a set of ordinal symbolic patterns that cannot be realised in time series generated by deterministic systems. The reliability of the estimator of the relative count of forbidden patterns from irregularly sampled data has been explored in two recent studies. In this paper, we explore highly irregular sampling frequency schemes. Using numerically generated data, we examine the reliability of the estimator when the sampling period has been drawn from exponential, Pareto and Gamma distributions of varying skewness. Our investigations demonstrate that some statistical properties of the sampling distribution are useful heuristics for assessing the estimator's reliability. We find that sampling in the presence of large chronological gaps can still yield relatively accurate estimates as long as the time series contains sufficiently many densely sampled areas. Furthermore, we show that the reliability of the estimator of forbidden patterns is poor when a large number of sampling intervals are longer than a typical correlation time of the underlying system.
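
The relative count of forbidden patterns can be computed directly from the ordinal patterns of a series: each length-m window is mapped to the permutation that sorts it, and any pattern that never occurs is "forbidden". A minimal order-3 sketch of the estimator (for regularly sampled data; the paper's contribution concerns its behavior under irregular sampling):

```python
from itertools import permutations

def ordinal_pattern_counts(x, order=3):
    """Count each ordinal pattern of the given order in a time series."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # Permutation of indices that sorts the window in ascending order
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    return counts

def relative_forbidden_count(x, order=3):
    """Fraction of the order! ordinal patterns that never occur."""
    counts = ordinal_pattern_counts(x, order)
    return sum(1 for c in counts.values() if c == 0) / len(counts)

# A strictly increasing series realises only the ascending pattern,
# so 5 of the 6 order-3 patterns are forbidden.
```

For long series from stochastic processes all patterns eventually appear, while deterministic dynamics leave some patterns permanently forbidden, which is what makes the count a determinism test.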

  7. Relationship of long-term highly active antiretroviral therapy on salivary flow rate and CD4 Count among HIV-infected patients

    Directory of Open Access Journals (Sweden)

    J Vijay Kumar

    2015-01-01

Full Text Available Objectives: To determine if long-term highly active antiretroviral therapy (HAART) alters the salivary flow rate, and to compare its relation to CD4 count for unstimulated and stimulated whole saliva. Materials and Methods: A cross-sectional study was performed on 150 individuals divided into three groups: Group I (50 human immunodeficiency virus (HIV) seropositive patients not on HAART), Group II (50 HIV-infected subjects on HAART for less than 3 years, termed short-term HAART) and Group III (50 HIV-infected subjects on HAART for 3 years or more, termed long-term HAART). The spitting method proposed by Navazesh and Kumar was used for the measurement of unstimulated and stimulated salivary flow rates. The chi-square test and analysis of variance (ANOVA) were used for statistical analysis. Results: The mean CD4 count was 424.78 ± 187.03, 497.82 ± 206.11 and 537.6 ± 264.00 in the respective groups. The majority of the patients in all the groups had a CD4 count between 401 and 600. Both unstimulated and stimulated whole salivary (UWS and SWS) flow rates in Group I were found to be significantly higher than in Group II (P < 0.05). Unstimulated salivary flow rates between Group II and III subjects were also found to be statistically significant (P < 0.05). ANOVA performed between CD4 count and unstimulated and stimulated whole saliva in each group demonstrated a statistically significant relationship in Group II (P < 0.05). No significant relationship was found between CD4 count and stimulated whole saliva in any group. Conclusion: The reduction in CD4 cell counts was significantly associated with the salivary flow rates of HIV-infected individuals on long-term HAART.

  8. A high-throughput, multi-channel photon-counting detector with picosecond timing

    Science.gov (United States)

    Lapington, J. S.; Fraser, G. W.; Miller, G. M.; Ashton, T. J. R.; Jarron, P.; Despeisse, M.; Powolny, F.; Howorth, J.; Milnes, J.

    2009-06-01

High-throughput photon counting with high time resolution is a niche application area where vacuum tubes can still outperform solid-state devices. Applications in the life sciences utilizing time-resolved spectroscopies, particularly in the growing field of proteomics, will benefit greatly from performance enhancements in event timing and detector throughput. The HiContent project is a collaboration between the University of Leicester Space Research Centre, the Microelectronics Group at CERN, Photek Ltd., and end-users at the Gray Cancer Institute and the University of Manchester. The goal is to develop a detector system specifically designed for optical proteomics, capable of high content (multi-parametric) analysis at high throughput. The HiContent detector system is being developed to exploit this niche market. It combines multi-channel, high time resolution photon counting in a single miniaturized detector system with integrated electronics. The combination of enabling technologies: small pore microchannel plate devices with very high time resolution, and high-speed multi-channel ASIC electronics developed for the LHC at CERN, provides the necessary building blocks for a high-throughput detector system with up to 1024 parallel counting channels and 20 ps time resolution. We describe the detector and electronic design, discuss the current status of the HiContent project and present the results from a 64-channel prototype system. In the absence of an operational detector, we present measurements of the electronics performance using a pulse generator to simulate detector events. Event timing results from the NINO high-speed front-end ASIC captured using a fast digital oscilloscope are compared with data taken with the proposed electronic configuration which uses the multi-channel HPTDC timing ASIC.

  9. High spatial and temporal resolution photon/electron counting detector for synchrotron radiation research

    Science.gov (United States)

    Tremsin, A. S.; Lebedev, G. V.; Siegmund, O. H. W.; Vallerga, J. V.; Hull, J. S.; McPhate, J. B.; Jozwiak, C.; Chen, Y.; Guo, J. H.; Shen, Z. X.; Hussain, Z.

    2007-10-01

This paper reports on the development of a high resolution electron/photon/ion imaging system which detects events with a timing accuracy of <160 ps FWHM and a two-dimensional spatial accuracy of ~50 μm FWHM. The event counting detector uses microchannel plates for signal amplification and can sustain counting rates exceeding 1.5 MHz for evenly distributed events (0.4 MHz with 10% dead time for randomly distributed events). The detector combined with a time-of-flight angular resolved photoelectron energy analyzer was tested at a synchrotron beamline. The results of these measurements illustrate the unique capabilities of the analytical system, allowing simultaneous imaging of photoelectrons in momentum space and measurement of the energy spectrum, as well as filtering the data in user defined temporal and/or spatial windows.

  10. Photon-counting digital radiography using high-pressure xenon filled detectors

    CERN Document Server

    Li, Maozhen; Johns, P C

    2001-01-01

Digital radiography overcomes many of the limitations of the traditional screen/film system. Further enhancements of the digital radiography image would be possible if the X-ray image receptor could measure the energy of individual photons instead of simply integrating their energy, as is the case at present. A prototype photon counting scanned projection radiography system has been constructed, which combines a Gas Electron Multiplier (GEM) and a Gas Microstrip Detector (GMD) using Xe:CH4 (90:10) at high pressure. With the gain contribution from the GEM, the GMD can be operated at lower and safer voltages, making the imaging system more reliable. Good energy resolution, and spatial resolution comparable to that of screen/film, have been demonstrated for the GEM/GMD hybrid imaging system in photon counting mode for X-ray spectra up to 50 kV.

  11. Application of the Cluster Counting/Timing techniques to improve the performances of high transparency Drift Chamber for modern HEP experiments

    Science.gov (United States)

    Chiarello, G.; Chiri, C.; Cocciolo, G.; Corvaglia, A.; Grancagnolo, F.; Panareo, M.; Pepino, A.; Renga, F.; Tassielli, G. F.; Voena, C.

    2017-07-01

Ultra-low mass and high granularity Drift Chambers seem to be the better choice for modern HEP experiments to achieve good momentum resolution for charged particles. We present how, in a Helium-based gas mixture, by counting and measuring the arrival time of each individual ionization cluster and by using statistical tools, it is possible to reconstruct a bias-free estimate of the impact parameter and a more discriminant Particle Identification.

  12. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  13. Low-noise multichannel ASIC for high count rate X-ray diffractometry applications

    Energy Technology Data Exchange (ETDEWEB)

    Szczygiel, R. [AGH University of Science and Technology, Department of Measurement and Instrumentation, al. Mickiewicza 30, Krakow (Poland)], E-mail: robert.szczygiel@agh.edu.pl; Grybos, P.; Maj, P. [AGH University of Science and Technology, Department of Measurement and Instrumentation, al. Mickiewicza 30, Krakow (Poland); Tsukiyama, A.; Matsushita, K.; Taguchi, T. [Rigaku Corporation, 3-9-12 Matsubara-cho, Akishima-shi, Tokyo (Japan)

    2009-08-01

    RG64 is a 64-channel ASIC designed for the silicon strip detector readout and optimized for high count rate X-ray imaging applications. In this paper we report on the test results referring to the RG64 noise level, channel uniformity and the operation with a high rate of input signals. The parameters of the RG64-based diffractometry system are compared with the ones based on the scintillation counter. Diffractometry measurement results with silicon strip detectors of different strip lengths and strip pitch are also presented.

  14. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained.

  15. Complete blood cell count in psittaciformes by using high-throughput image cytometry: a pilot study.

    Science.gov (United States)

    Beaufrère, Hugues; Ammersbach, Mélanie; Tully, Thomas N

    2013-09-01

    The avian hemogram is usually performed in veterinary diagnostic laboratories by using manual cell counting techniques and differential counts determined by light microscopy. There is no standard automated technique for avian blood cell count and differentiation to date. These shortcomings in birds are primarily because erythrocytes and thrombocytes are nucleated, which precludes the use of automated analyzers programmed to perform mammal complete blood cell counts. In addition, there is no standard avian antibody panel, which would allow cell differentiation by immunophenotyping across all commonly seen bird species. We report an alternative hematologic approach for quantification and differentiation of avian blood cells by using high-throughput image cytometry on blood smears in psittacine bird species. A pilot study was designed with 70 blood smears of different psittacine bird species stained with a Wright-Giemsa stain. The slides were scanned at 0.23 μm/pixel. The open-source software packages CellProfiler and CellProfiler Analyst were used to analyze and sort each cell by image cytometry. A "pipeline" was constructed in CellProfiler by using different modules to identify and export hundreds of measures per cell for shape, intensity, and texture. Rules for classifying the different blood cell phenotypes were then determined from these measurements by iterative feedback and machine learning using CellProfiler Analyst. Although this approach shows promise, avian Leukopet results could not be duplicated when using this technique as is. Further studies and more standardized prospective investigations may be needed to refine the "pipeline" strategy and the machine learning algorithm.
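
The classification step described above, per-cell feature vectors sorted into phenotypes, can be sketched in miniature. The features, values, and the nearest-centroid rule below are illustrative stand-ins, not CellProfiler Analyst's actual machine-learning classifier:

```python
# Minimal sketch of feature-based cell classification. The feature vectors
# (area, mean stain intensity, texture score) and the two cell classes are
# synthetic stand-ins for CellProfiler's per-cell measurements.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-cell features; all numbers are invented for illustration.
erythrocytes = rng.normal([55.0, 0.80, 0.20], [4.0, 0.05, 0.04], size=(200, 3))
leukocytes   = rng.normal([90.0, 0.55, 0.60], [6.0, 0.05, 0.05], size=(200, 3))

centroids = {
    "erythrocyte": erythrocytes.mean(axis=0),
    "leukocyte": leukocytes.mean(axis=0),
}

def classify(cells, centroids):
    # Nearest-centroid rule: label each cell with the closest class centroid.
    names = list(centroids)
    d = np.stack([np.linalg.norm(cells - c, axis=1) for c in centroids.values()])
    return [names[i] for i in d.argmin(axis=0)]

labels = classify(np.vstack([erythrocytes[:5], leukocytes[:5]]), centroids)
print(labels)
```

A real pipeline would replace the hand-picked centroids with a trained classifier over hundreds of features, but the mapping "measure features, then assign the nearest phenotype" is the same.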

  16. High-speed counting and sizing of cells in an impedance flow microcytometer with compact electronic instrumentation

    DEFF Research Database (Denmark)

    Castillo-Fernandez, Oscar; Rodriguez-Trujíllo, Romén; Gomila, Gabriel

    2014-01-01

    Here we describe a high-throughput impedance flow cytometer on a chip. This device was built using compact and inexpensive electronic instrumentation. The system was used to count and size a mixed cell sample containing red blood cells and white blood cells. It demonstrated a counting capacity of...

  17. Count response model for the CMB spots

    CERN Document Server

    Giovannini, Massimo

    2010-01-01

    The statistics of the curvature quanta generated during a stage of inflationary expansion is used to derive a count response model for the large-scale phonons determining, in the concordance lore, the warmer and the cooler spots of the large-scale temperature inhomogeneities. The multiplicity distributions for the counting statistics are shown to be generically overdispersed in comparison with conventional Poissonian regressions. The generalized count response model deduced hereunder accommodates an excess of correlations in the regime of high multiplicities and prompts dedicated analyses with forthcoming data collected by instruments of high angular resolution and high sensitivity to temperature variations per pixel.
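
The overdispersion contrasted with Poissonian behavior in this abstract can be illustrated with a toy simulation (not the paper's count response model): a negative binomial sample has variance well above its mean, while a Poisson sample's variance tracks its mean:

```python
# Toy illustration of overdispersion: both samples have mean ~20, but the
# negative binomial (n=5, p=0.2: mean n(1-p)/p = 20, variance n(1-p)/p^2 = 100)
# is strongly overdispersed relative to the Poisson.
import numpy as np

rng = np.random.default_rng(0)

poisson = rng.poisson(lam=20.0, size=200_000)
negbin = rng.negative_binomial(n=5, p=0.2, size=200_000)

print(poisson.mean(), poisson.var())   # variance close to the mean
print(negbin.mean(), negbin.var())     # variance well above the mean
```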

  18. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  19. Development of a high-performance multichannel system for time-correlated single photon counting

    Science.gov (United States)

    Peronio, P.; Cominelli, A.; Acconcia, G.; Rech, I.; Ghioni, M.

    2017-05-01

    Time-Correlated Single Photon Counting (TCSPC) is one of the most effective techniques for measuring weak and fast optical signals. It outperforms traditional "analog" techniques due to its high sensitivity along with high temporal resolution. Despite those significant advantages, a main drawback still exists: the long acquisition time needed to perform a measurement. In past years many TCSPC systems have been developed with ever higher numbers of channels, aimed at dealing with that limitation. Nevertheless, modern systems suffer from a strong trade-off between parallelism level and performance: the higher the number of channels, the poorer the performance. In this work we present the design of a 32x32 TCSPC system intended to overcome the existing trade-off. To this aim, different technologies have been employed to get the best performance from both detectors and sensing circuits. The exploitation of different technologies will be enabled by Through Silicon Vias (TSVs), which will be investigated as a possible solution for connecting the detectors to the sensing circuits. When dealing with a high number of channels, the count rate is inevitably set by the affordable throughput to the external PC. We targeted a throughput of 10 Gb/s, which is beyond the state of the art, and designed the number of TCSPC channels accordingly. A dynamic-routing logic will connect the detectors to a smaller number of acquisition chains.
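
The channel-count sizing argument can be made concrete with a back-of-envelope budget. The 48-bit event word below is an assumption for illustration; the actual word format of the system described is not given here:

```python
# Back-of-envelope throughput budget for a multichannel TCSPC system.
# Assumption: each detected photon is shipped as a 48-bit word
# (timestamp + channel address + padding); the real format may differ.
link_bps = 10e9          # target link throughput, bits/s (from the abstract)
bits_per_event = 48      # assumed event word size
channels = 1024          # a 32x32 detector array

max_events_per_s = link_bps / bits_per_event       # ~2.1e8 events/s total
per_channel_cps = max_events_per_s / channels      # ~2e5 cps per channel
print(f"{max_events_per_s:.3g} events/s total, {per_channel_cps:.3g} cps/channel")
```

This is why a dynamic-routing stage in front of fewer acquisition chains is attractive: the link, not the detector array, caps the sustainable aggregate count rate.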

  20. Gravitational wave source counts at high redshift and in models with extra dimensions

    Science.gov (United States)

    García-Bellido, Juan; Nesseris, Savvas; Trashorras, Manuel

    2016-07-01

    Gravitational wave (GW) source counts have recently been shown to be able to test how gravitational radiation propagates with the distance from the source. Here, we extend this formalism to cosmological scales, i.e. the high redshift regime, and we discuss the complications of applying this methodology to high redshift sources. We also allow for models with compactified extra dimensions as in the Kaluza-Klein model. Furthermore, we also consider the case of intermediate redshifts, i.e. 0 < z ≲ 1, where we show it is possible to find an analytical approximation for the source counts dN/d(S/N). This can be done in terms of cosmological parameters, such as the matter density Ωm,0 of the cosmological constant model, or the cosmographic parameters for a general dark energy model. Our analysis is as general as possible, but it depends on two important factors: a source model for the black hole binary mergers and the GW source to galaxy bias. This methodology also allows us to obtain the higher order corrections of the source counts in terms of the signal-to-noise S/N. We then forecast the sensitivity of future observations in constraining GW physics and also the underlying cosmology by simulating sources distributed over a finite range of signal-to-noise, with the number of sources ranging from 10 to 500 as expected from future detectors. We find that with 500 events it will be possible to constrain the present matter density parameter Ωm,0 to within a few percent, with the precision growing fast with the number of events. In the case of extra dimensions we find that, depending on the degeneracies of the model, with 500 events it may be possible to place stringent limits on the existence of extra dimensions if the aforementioned degeneracies can be broken.
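
As a sanity check on the counting formalism, the classic Euclidean scaling N(>S/N) ∝ (S/N)^-3 (sources uniform in a static volume, amplitude falling as 1/distance) can be recovered by Monte Carlo. This toy model ignores cosmology entirely and is not the paper's high-redshift computation:

```python
# Toy check of the Euclidean source-count slope: uniform sources in a sphere,
# S/N ~ 1/distance, so the cumulative counts above threshold t scale as t^-3.
import numpy as np

rng = np.random.default_rng(2)

n_src = 200_000
# Uniform in a unit sphere: radial density ~ r^2, so draw r = u^(1/3).
r = rng.random(n_src) ** (1.0 / 3.0)
snr = 1.0 / r                      # signal-to-noise falls off as 1/distance

thresholds = np.array([2.0, 4.0, 8.0])
counts = np.array([(snr > t).sum() for t in thresholds])

slope = np.polyfit(np.log(thresholds), np.log(counts), 1)[0]
print(slope)   # expected close to -3
```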

  1. High-Dimensional Statistical Learning: Roots, Justifications, and Potential Machineries

    Science.gov (United States)

    Zollanvari, Amin

    2015-01-01

    High-dimensional data generally refer to data in which the number of variables is larger than the sample size. Analyzing such datasets poses great challenges for classical statistical learning because the finite-sample performance of methods developed within classical statistical learning does not live up to classical asymptotic premises in which the sample size unboundedly grows for a fixed dimensionality of observations. Much work has been done in developing mathematical–statistical techniques for analyzing high-dimensional data. Despite remarkable progress in this field, many practitioners still utilize classical methods for analyzing such datasets. This state of affairs can be attributed, in part, to a lack of knowledge and, in part, to the ready-to-use computational and statistical software packages that are well developed for classical techniques. Moreover, many scientists working in a specific field of high-dimensional statistical learning are either not aware of other existing machineries in the field or are not willing to try them out. The primary goal in this work is to bring together various machineries of high-dimensional analysis, give an overview of the important results, and present the operating conditions upon which they are grounded. When appropriate, readers are referred to relevant review articles for more information on a specific subject. PMID:27081307
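
A minimal demonstration of why p > n breaks classical machinery: with more variables than samples the sample covariance matrix is singular, so any method that inverts it (e.g. classical linear discriminant analysis) fails outright:

```python
# With n = 30 samples of p = 100 variables, the sample covariance has rank
# at most n - 1 = 29, far below p, so it cannot be inverted.
import numpy as np

rng = np.random.default_rng(3)

n, p = 30, 100                     # fewer samples than variables
x = rng.normal(size=(n, p))
cov = np.cov(x, rowvar=False)      # p x p sample covariance

rank = np.linalg.matrix_rank(cov)
print(rank, p)                     # rank is far below the dimensionality
```

Regularized or structured estimators (shrinkage, sparsity assumptions) exist precisely to work around this rank deficiency.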

  2. High-Dimensional Statistical Learning: Roots, Justifications, and Potential Machineries.

    Science.gov (United States)

    Zollanvari, Amin

    2015-01-01

    High-dimensional data generally refer to data in which the number of variables is larger than the sample size. Analyzing such datasets poses great challenges for classical statistical learning because the finite-sample performance of methods developed within classical statistical learning does not live up to classical asymptotic premises in which the sample size unboundedly grows for a fixed dimensionality of observations. Much work has been done in developing mathematical-statistical techniques for analyzing high-dimensional data. Despite remarkable progress in this field, many practitioners still utilize classical methods for analyzing such datasets. This state of affairs can be attributed, in part, to a lack of knowledge and, in part, to the ready-to-use computational and statistical software packages that are well developed for classical techniques. Moreover, many scientists working in a specific field of high-dimensional statistical learning are either not aware of other existing machineries in the field or are not willing to try them out. The primary goal in this work is to bring together various machineries of high-dimensional analysis, give an overview of the important results, and present the operating conditions upon which they are grounded. When appropriate, readers are referred to relevant review articles for more information on a specific subject.

  3. Equilibrium Statistical-Thermal Models in High-Energy Physics

    CERN Document Server

    Tawfik, Abdel Nasser

    2014-01-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics, that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe formulated in 1948 an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process that likely reflects many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analysed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns to be quite often that many-par...

  4. High redshift galaxies in the ALHAMBRA survey . I. Selection method and number counts based on redshift PDFs

    Science.gov (United States)

    Viironen, K.; Marín-Franch, A.; López-Sanjuan, C.; Varela, J.; Chaves-Montero, J.; Cristóbal-Hornillos, D.; Molino, A.; Fernández-Soto, A.; Vilella-Rojo, G.; Ascaso, B.; Cenarro, A. J.; Cerviño, M.; Cepa, J.; Ederoclite, A.; Márquez, I.; Masegosa, J.; Moles, M.; Oteo, I.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Castander, J. F.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Context. Most observational results on the high redshift restframe UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-α selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high redshift galaxies in the ALHAMBRA survey data. Aims: Our aim is to develop a less biased methodology than the traditional dropout technique to study the high redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we especially aim at contributing to the study of the brightest, least frequent, high redshift galaxies. Methods: The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with high integrated probability of being within a given redshift interval. However, reaching both a complete and clean sample with this method is challenging. Hence, a method to derive statistical properties by summing the zPDFs of all the galaxies in the redshift bin of interest is introduced. Results: Using this methodology we derive the galaxy rest frame UV number counts in five redshift bins centred at z = 2.5,3.0,3.5,4.0, and 4.5, being complete up to the limiting magnitude at mUV(AB) = 24, where mUV refers to the first ALHAMBRA filter redwards of the Ly-α line. With the wide field ALHAMBRA data we especially contribute to the study of the brightest ends of these counts, accurately sampling the surface densities down to mUV(AB) = 21-22. Conclusions: We show that using the zPDFs it is easy to select a very clean sample of high redshift galaxies. 
We also show that it is better to do statistical analysis of the properties of galaxies using a probabilistic approach, which takes into account both the incompleteness and contamination issues in a
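
The two ideas in this abstract, hard selection by integrated probability versus statistical counts from summed zPDFs, can be sketched on synthetic Gaussian redshift PDFs (illustrative only, not ALHAMBRA data):

```python
# (1) "Clean" sample: keep galaxies whose zPDF integrates to a high probability
#     inside the target redshift bin.
# (2) Statistical count: sum the integrated probabilities of all galaxies,
#     which accounts for incompleteness and contamination softly.
import numpy as np

z_grid = np.linspace(0.0, 6.0, 601)
dz = z_grid[1] - z_grid[0]

rng = np.random.default_rng(4)
# Synthetic zPDFs: one Gaussian per galaxy, normalized on the grid.
true_z = rng.uniform(0.5, 5.5, size=1000)
sigma = 0.03 * (1 + true_z)        # assumed photo-z-like precision
pdfs = np.exp(-0.5 * ((z_grid - true_z[:, None]) / sigma[:, None]) ** 2)
pdfs /= pdfs.sum(axis=1, keepdims=True) * dz

z_lo, z_hi = 3.25, 3.75            # target bin centred at z = 3.5
in_bin = (z_grid >= z_lo) & (z_grid < z_hi)
p_bin = pdfs[:, in_bin].sum(axis=1) * dz    # integrated probability per galaxy

clean_sample = np.where(p_bin > 0.8)[0]     # hard selection: clean, incomplete
statistical_count = p_bin.sum()             # soft count: sums all zPDFs

print(len(clean_sample), statistical_count)
```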

  5. Gravitational wave source counts at high redshift and in models with extra dimensions

    CERN Document Server

    García-Bellido, Juan; Trashorras, Manuel

    2016-01-01

    Gravitational wave (GW) source counts have been recently shown to be able to test how gravitational radiation propagates with the distance from the source. Here, we extend this formalism to cosmological scales, i.e. the high redshift regime, and we also allow for models with large or compactified extra dimensions like in the Kaluza-Klein (KK) model. We found that in the high redshift regime one would potentially expect two windows where observations above the minimum signal-to-noise threshold can be made, assuming there are no higher order corrections in the redshift dependence of the signal-to-noise $S/N(z)$ for the expected prediction. Furthermore, we also considered the case of intermediate redshifts, i.e. $0 < z \lesssim 1$, where we show it is possible to find an analytical approximation for the source counts $\frac{dN}{d(S/N)}$ in terms of the cosmological parameters, like the matter density $\Omega_{m,0}$ in the cosmological constant model and also the cosmographic parameters $(q_0,j_0,s_0)$ for a general ...

  6. Detecting liquid threats with x-ray diffraction imaging (XDi) using a hybrid approach to navigate trade-offs between photon count statistics and spatial resolution

    Science.gov (United States)

    Skatter, Sondre; Fritsch, Sebastian; Schlomka, Jens-Peter

    2016-05-01

    The performance limits were explored for an X-ray Diffraction based explosives detection system for baggage scanning. This XDi system offers 4D imaging that comprises three spatial dimensions with voxel sizes on the order of ~(0.5 cm)³, and one spectral dimension for material discrimination. Because only a very small number of photons are observed for an individual voxel, material discrimination cannot work reliably at the voxel level. Therefore, an initial 3D reconstruction is performed, which allows the identification of objects of interest. Combining all the measured photons that scattered within an object, more reliable spectra are determined on the object level. As a case study we looked at two liquid materials, one threat and one innocuous, with very similar spectral characteristics, but with 15% difference in electron density. Simulations showed that Poisson statistics alone reduce the material discrimination performance to undesirable levels when the photon counts drop to 250. When additional, uncontrolled variation sources are considered, the photon count plays a less dominant role in detection performance, but limits the performance also for photon counts of 500 and higher. Experimental data confirmed the presence of such non-Poisson variation sources also in the XDi prototype system, which suggests that the present system can still be improved without necessarily increasing the photon flux, but by better controlling and accounting for these variation sources. When the classification algorithm was allowed to use spectral differences in the experimental data, the discrimination between the two materials improved significantly, proving the potential of X-ray diffraction also for liquid materials.
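
The Poisson-statistics limit quoted above can be reproduced with a toy two-material discrimination experiment: with expected counts differing by 15%, a simple midpoint threshold misclassifies often at ~250 photons and rarely at ~1000. The classifier and numbers are illustrative, not the XDi algorithm:

```python
# Toy two-material discrimination under pure Poisson statistics.
# Expected counts differ by 15%; classify by a midpoint threshold and
# estimate the misclassification rate as a function of photon count.
import numpy as np

rng = np.random.default_rng(5)

def error_rate(mean_counts, trials=20_000):
    mu_a, mu_b = mean_counts, 1.15 * mean_counts
    thresh = 0.5 * (mu_a + mu_b)
    a = rng.poisson(mu_a, trials)     # material A observations
    b = rng.poisson(mu_b, trials)     # material B observations
    # A misclassified when above threshold; B misclassified when below.
    return 0.5 * ((a >= thresh).mean() + (b < thresh).mean())

print(error_rate(250), error_rate(1000))   # error shrinks as counts grow
```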

  7. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection.

    Directory of Open Access Journals (Sweden)

    Priya Choudhry

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays.
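
At the core of any such counter, after thresholding or edge detection, is a connected-component count. A pure-Python sketch on a toy binary colony mask (not the ImageJ or CellProfiler implementation):

```python
# Count 4-connected blobs of 1s in a 2D binary mask by flood fill.
# This is the counting step only; real pipelines first segment the image
# (edge detection, thresholding) to produce such a mask.
from collections import deque

def count_colonies(mask):
    """Return the number of 4-connected components of 1s in a 2D grid."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    colonies = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                colonies += 1                       # new blob found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                        # flood-fill the blob
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return colonies

plate = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_colonies(plate))   # 3 separate colonies
```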

  8. Extragalactic millimeter-wave point-source catalog, number counts and statistics from 771 deg² of the SPT-SZ survey

    Energy Technology Data Exchange (ETDEWEB)

    Mocanu, L. M.; Crawford, T. M.; Benson, B. A.; Bleem, L. E.; Carlstrom, J. E.; Chang, C. L.; Crites, A. T. [Kavli Institute for Cosmological Physics, University of Chicago, Chicago, IL 60637 (United States); Vieira, J. D. [California Institute of Technology, Pasadena, CA 91125 (United States); Aird, K. A. [University of Chicago, Chicago, IL 60637 (United States); Aravena, M. [European Southern Observatory, Alonso de Córdova 3107, Vitacura Santiago (Chile); Austermann, J. E.; Everett, W. B.; Halverson, N. W. [Department of Astrophysical and Planetary Sciences and Department of Physics, University of Colorado, Boulder, CO 80309 (United States); Béthermin, M. [Laboratoire AIM-Paris-Saclay, CEA/DSM/Irfu-CNRS-Université Paris Diderot, CEA-Saclay, Orme des Merisiers, F-91191 Gif-sur-Yvette (France); Bothwell, M. [Cavendish Laboratory, University of Cambridge, 19 J.J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Chapman, S. [Department of Physics and Atmospheric Science, Dalhousie University, Halifax NS B3H 3J5 (Canada); Cho, H.-M. [NIST Quantum Devices Group, Boulder, CO 80305 (United States); De Haan, T.; Dobbs, M. A. [Department of Physics, McGill University, Montreal, Quebec H3A 2T8 (Canada); George, E. M., E-mail: lmocanu@uchicago.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States); and others

    2013-12-10

    We present a point-source catalog from 771 deg² of the South Pole Telescope Sunyaev-Zel'dovich survey at 95, 150, and 220 GHz. We detect 1545 sources above 4.5σ significance in at least one band. Based on their relative brightness between survey bands, we classify the sources into two populations, one dominated by synchrotron emission from active galactic nuclei, and one dominated by thermal emission from dust-enshrouded star-forming galaxies. We find 1238 synchrotron and 307 dusty sources. We cross-match all sources against external catalogs and find 189 unidentified synchrotron sources and 189 unidentified dusty sources. The dusty sources without counterparts are good candidates for high-redshift, strongly lensed submillimeter galaxies. We derive number counts for each population from 1 Jy down to roughly 11, 4, and 11 mJy at 95, 150, and 220 GHz. We compare these counts with galaxy population models and find that none of the models we consider for either population provide a good fit to the measured counts in all three bands. The disparities imply that these measurements will be an important input to the next generation of millimeter-wave extragalactic source population models.

  9. Mu-Spec - A High Performance Ultra-Compact Photon Counting spectrometer for Space Submillimeter Astronomy

    Science.gov (United States)

    Moseley, H.; Hsieh, W.-T.; Stevenson, T.; Wollack, E.; Brown, A.; Benford, D.; Sadleir; U-Yen, I.; Ehsan, N.; Zmuidzinas, J.; Bradford, M.

    2011-01-01

    We have designed and are testing elements of a fully integrated submillimeter spectrometer based on superconducting microstrip technology. The instrument can offer resolving power R of approximately 1500, and its high frequency cutoff is set by the gap of available high performance superconductors. All functions of the spectrometer are integrated: light is coupled to the microstrip circuit with a planar antenna, spectral discrimination is achieved using a synthetic grating, orders are separated using a planar filter, and light is detected using photon-counting MKID detectors. This spectrometer promises to revolutionize submillimeter spectroscopy from space. It replaces instruments with a scale of 1 m with a spectrometer on a 10 cm Si wafer. The reduction in mass and volume promises a much higher performance system within the available resources of a space mission. We will describe the system and the performance of the components that have been fabricated and tested.

  10. Scaling analysis of high-frequency time series of gamma-ray counts

    Science.gov (United States)

    Barbosa, Susana; Azevedo, Eduardo

    2017-04-01

    Gamma radiation is being monitored in a dedicated campaign set up at the Eastern North Atlantic (ENA) facility located on Graciosa island (Azores), a fixed site of the Atmospheric Radiation Measurement (ARM) programme, established and supported by the Department of Energy (DOE) of the United States of America with the collaboration of the Government of the Autonomous Region of the Azores and the University of the Azores. The temporal variability of gamma radiation is mainly determined by the time-varying concentration of radon progeny, which in turn is influenced by meteorological conditions and precipitation scavenging. The resulting time series of high-frequency (1-minute) gamma-ray counts therefore displays a complex temporal structure on multiple time scales, including long-range dependent behavior. This work addresses the scaling properties of the time series of gamma-ray counts from the ENA site (data freely available from the ARM data archive) using both wavelet and model-based methods for the estimation of the scaling exponent. The time series is dominated by sharp peaks associated with events of strong precipitation. The effect of these peaks on the estimation of the scaling exponent, as well as the effect of temporal aggregation (1-minute versus 15-minute aggregated data), is further addressed.
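
One standard model-free estimator of the scaling exponent is the aggregated-variance method: block-average the series at scale m and regress log Var against log m. The sketch below uses synthetic uncorrelated counts (slope near -1), not the ENA data; long-range dependence would flatten the slope:

```python
# Aggregated-variance scaling estimate on synthetic uncorrelated counts.
# For short-range-dependent data, Var(block mean) ~ m^-1; long-range
# dependence gives a slope 2H - 2 > -1 (H is the Hurst exponent).
import numpy as np

rng = np.random.default_rng(6)
x = rng.poisson(100.0, size=2 ** 16).astype(float)   # toy 1-minute counts

ms = [4, 16, 64, 256]
variances = []
for m in ms:
    blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
    variances.append(blocks.var())

slope = np.polyfit(np.log(ms), np.log(variances), 1)[0]
print(slope)   # near -1 for white noise; flatter signals long-range dependence
```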

  11. A model of the high count rate performance of NaI(Tl)-based PET detectors

    Energy Technology Data Exchange (ETDEWEB)

    Wear, J.A.; Karp, J.S.; Freifelder, R. [Univ. of Pennsylvania, Philadelphia, PA (United States). Dept. of Radiology; Mankoff, D.A. [Univ. of Washington, Seattle, WA (United States). Dept. of Radiology; Muehllehner, G. [UGM Medical Systems, Philadelphia, PA (United States)

    1998-06-01

    A detailed model of the response of large-area NaI(Tl) detectors used in PET and their triggering and data acquisition electronics has been developed. This allows one to examine the limitations of the imaging system's performance due to degradation in the detector performance from light pile-up and deadtime from triggering and event processing. Comparisons of simulation results to measurements from the HEAD PENN-PET scanner have been performed to validate the Monte Carlo model. The model was then used to predict improvements in the high count rate performance of the HEAD PENN-PET scanner using different signal integration times, light response functions, and detectors.
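
For orientation, the two textbook analytic deadtime models that such detailed Monte Carlo models refine are easy to state; the 240 ns deadtime below is an assumed example, not the PENN-PET value:

```python
# Standard analytic deadtime models: observed rate m versus true rate n
# for a fixed deadtime tau. The tau value here is an illustrative assumption.
import math

def nonparalyzable(n, tau):
    return n / (1.0 + n * tau)      # events arriving during deadtime are lost

def paralyzable(n, tau):
    return n * math.exp(-n * tau)   # events during deadtime also extend it

tau = 240e-9                         # assumed 240 ns processing deadtime
for n in (1e5, 1e6, 5e6):
    print(n, nonparalyzable(n, tau), paralyzable(n, tau))
```

Both models predict growing losses at high true rates, with the paralyzable detector losing more; a full simulation like the one described adds light pile-up and trigger behavior on top of this.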

  12. Two dimensional localization of electrons and positrons under high counting rate

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, A.F.; Anjos, J.C.; Sanchez-Hernandez, A. [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Pepe, I.M.; Barros, N. [Bahia Univ., Salvador, BA (Brazil). Inst. de Fisica

    1997-12-01

    The construction of two wire chambers for the experiment E831 at Fermilab is reported. Each chamber includes three wire planes - one anode and two orthogonal cathodes - in which the wires operate as independent proportional counters. One of the chambers is rotated with respect to the other, so that four position coordinates may be encoded for a charged particle crossing both chambers. Spatial resolution is determined by the wire pitch: 1 mm for cathodes, 2 mm for anodes. 320 electronic channels are involved in the detection system readout. Global counting rates in excess of 10⁷ events per second have been measured, while the average electron-positron beam intensity may be as high as 3 × 10⁷ events per second. (author) 5 refs., 9 figs.

  13. Nuclear photonics at ultra-high counting rates and higher multipole excitations

    Energy Technology Data Exchange (ETDEWEB)

    Thirolf, P. G.; Habs, D.; Filipescu, D.; Gernhaeuser, R.; Guenther, M. M.; Jentschel, M.; Marginean, N.; Pietralla, N. [Fakultaet f. Physik, Ludwig-Maximilians-Universitaet Muenchen, Garching (Germany); Fakultaet f. Physik, Ludwig-Maximilians-Universitaet Muenchen, Garching, Germany and Max-Planck-Institute f. Quantum Optics, Garching (Germany); IFIN-HH, Bucharest-Magurele (Romania); Physik Department E12,Technische Universitaet Muenchen, Garching (Germany); Max-Planck-Institute f. Quantum Optics, Garching (Germany); Institut Laue-Langevin, Grenoble (France); Physik Department E12,Technische Universitaet Muenchen, Garching (Germany); Institut f. Kernphysik, Technische Universitaet Darmstadt (Germany)

    2012-07-09

    Next-generation γ beams from laser Compton-backscattering facilities like ELI-NP (Bucharest) or MEGa-Ray (Livermore) will drastically exceed the photon flux presently available at existing facilities, reaching or even exceeding 10¹³ γ/s. The beam structure as presently foreseen for MEGa-Ray and ELI-NP builds upon macro-pulses (~120 Hz) for the electron beam, accelerated with X-band technology at 11.5 GHz, resulting in a micro structure of 87 ps distance between the electron pulses acting as mirrors for a counterpropagating intense laser. In total, every 8.3 ms a γ pulse series with a duration of about 100 ns will impinge on the target, resulting in an instantaneous photon flux of about 10¹⁸ γ/s, thus introducing major challenges in view of pile-up. Novel γ optics will be applied to monochromatize the γ beam to ultimately ΔE/E ≈ 10⁻⁶. Thus level-selective spectroscopy of higher multipole excitations will become accessible with good contrast for the first time. Fast-responding γ detectors, e.g. based on advanced scintillator technology such as LaBr₃(Ce), allow for measurements at count rates as high as 10⁶-10⁷ γ/s without significant drop of performance. Data handling adapted to the beam conditions could be performed by fast digitizing electronics, able to sample data traces during the micro-pulse duration, while the subsequent macro-pulse gap of ca. 8 ms leaves ample time for data readout. A ball of LaBr₃ detectors with digital readout appears to be best suited for this novel type of nuclear photonics at ultra-high counting rates.
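
The quoted fluxes are mutually consistent, which a one-line duty-cycle calculation confirms: a ~100 ns pulse train every 8.3 ms at ~10^18 γ/s instantaneous averages to roughly the stated 10^13 γ/s:

```python
# Duty-cycle consistency check of the beam parameters quoted above.
pulse_duration = 100e-9     # s, gamma pulse train per macro-pulse
pulse_period = 8.3e-3       # s, macro-pulse spacing (~120 Hz)
instantaneous_flux = 1e18   # gamma/s during the train

duty_cycle = pulse_duration / pulse_period
average_flux = instantaneous_flux * duty_cycle
print(average_flux)         # ~1.2e13 gamma/s, matching the quoted average
```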

  14. Experimental characterization of the COndensation PArticle counting System for high altitude aircraft-borne application [Discussion paper

    OpenAIRE

    Weigel, Ralf; Hermann, Markus; Curtius, Joachim; Voigt, Christiane; Walter, Saskia; Böttger, Thomas; Lepukhov, Boris; Belyaev, Gennady; Borrmann, Stephan

    2008-01-01

    This study aims at a detailed characterization of an ultra-fine aerosol particle counting system for operation on board the Russian high altitude research aircraft M-55 "Geophysica" (maximum ceiling of 21 km). The COndensation PArticle counting System (COPAS) consists of an aerosol inlet and two dual-channel continuous flow Condensation Particle Counters (CPCs). The aerosol inlet, adapted for COPAS measurements on board the M-55 "Geophysica", is described concerning aspiration, transmissio...

  15. Management practices associated with low, medium, and high somatic cell counts in bulk milk.

    Science.gov (United States)

    Barkema, H W; Schukken, Y H; Lam, T J; Beiboer, M L; Benedictus, G; Brand, A

    1998-07-01

    Management practices associated with bulk milk somatic cell counts (SCC) were studied for 201 dairy herds grouped into three categories according to bulk milk SCC. The cumulative production of fat-corrected milk over 305 d of lactation and category for bulk milk SCC were highly correlated; herds within the low category had the highest milk production. Differences in bulk milk SCC among the categories were well explained by the management practices studied, not only for the difference between the high (250,000 to 400,000) and low categories. Practices such as teat disinfection and antibiotic treatment of clinical mastitis were also found to be important in explaining the difference between herds in the medium and low categories for bulk milk SCC. More attention was paid to hygiene for herds in the low category than for herds in the medium or high category. Supplementation of the diet with minerals occurred more frequently for cows in the low category for bulk milk SCC than for cows in the medium and high categories.

  16. Experimental characterization of the COndensation PArticle counting System for high altitude aircraft-borne application

    Directory of Open Access Journals (Sweden)

    S. Borrmann

    2009-06-01

Full Text Available A characterization of the ultra-fine aerosol particle counter COPAS (COndensation PArticle counting System) for operation on board the Russian high altitude research aircraft M-55 Geophysika is presented. The COPAS instrument consists of an aerosol inlet and two dual-channel continuous flow Condensation Particle Counters (CPCs) operated with the chlorofluorocarbon FC-43. It operates at pressures between 400 and 50 hPa for aerosol detection in the particle diameter (dp) range from 6 nm up to 1 μm. The aerosol inlet, designed for the M-55, is characterized with respect to aspiration, transmission, and transport losses. The experimental characterization of counting efficiencies of three CPCs yields dp50 (50% detection particle diameter) of 6 nm, 11 nm, and 15 nm at temperature differences (ΔT) between saturator and condenser of 17°C, 30°C, and 33°C, respectively. Non-volatile particles are quantified with a fourth CPC with dp50=11 nm. It includes an aerosol heating line (250°C) to evaporate H2SO4-H2O particles of 11 nm<dp<200 nm at pressures between 70 and 300 hPa. An instrumental in-flight inter-comparison of the different COPAS CPCs yields correlation coefficients of 0.996 and 0.985. The particle emission index for the M-55 in the range of 1.4–8.4×10^16 kg^−1 fuel burned has been estimated based on measurements of the Geophysika's own exhaust.
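The dp50 figures above are the diameters at which a measured counting-efficiency curve crosses 50%. A minimal sketch of reading dp50 off such a curve by linear interpolation (the efficiency values below are hypothetical, not COPAS data):

```python
import numpy as np

def dp50(diameters_nm, efficiencies):
    """Interpolate the 50% detection diameter (dp50) from a measured
    counting-efficiency curve (efficiency rising with diameter)."""
    d = np.asarray(diameters_nm, dtype=float)
    e = np.asarray(efficiencies, dtype=float)
    order = np.argsort(d)
    # np.interp needs the x-coordinate (here: efficiency) to be increasing
    return float(np.interp(0.5, e[order], d[order]))

# Hypothetical efficiency scan for one CPC channel
d_nm = [4, 6, 8, 11, 15, 20, 30]
eff  = [0.05, 0.28, 0.50, 0.78, 0.92, 0.97, 1.00]
print(round(dp50(d_nm, eff), 1))  # -> 8.0
```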

  17. High level tritiated water monitoring by Bremsstrahlung counting using a silicon drift detector

    Energy Technology Data Exchange (ETDEWEB)

    Niemes, S.; Sturm, M.; Michling, R.; Bornschein, B. [Institute for Technical Physics - ITEP, Tritium Laboratory Karlsruhe - TLK, Karlsruhe Institute of Technology - KIT, Karlsruhe (Germany)

    2015-03-15

β-ray induced X-ray spectrometry (BIXS) is a promising technique for monitoring the tritium concentration in the fuel cycle of a fusion reactor. For in-situ measurements of high level tritiated water by Bremsstrahlung counting, the characteristics of a low-noise silicon drift detector (SDD) have been examined at the Tritium Laboratory Karlsruhe (TLK). In static measurements with constant sample volume and tritium concentration, the Bremsstrahlung spectra of tritiated water samples in a concentration range of 0.02 to 15 MBq/ml have been obtained. The volume has been kept constant at 5 cm{sup 3}. The observed spectra are well above the noise threshold. In addition to X-rays induced by β-rays, the spectra feature X-ray fluorescence peaks of the surrounding materials. No indications of memory effects have been observed. A linear relation between the X-ray intensity and the tritium concentration was obtained, and the lower detection limit of the setup has been determined to be 1 MBq ml{sup -1}, as assessed by the Currie criterion. In addition, the spectra obtained experimentally could be reproduced in good agreement by Monte-Carlo simulations using the GEANT4 toolkit. It was found that the present detection system is applicable to non-invasive measurements of high-level tritiated water and that the SDD is a convenient tool for detecting the low energy Bremsstrahlung X-rays. (authors)
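For context, the Currie criterion mentioned above sets an a-priori detection limit of roughly 2.71 + 4.65·sqrt(B) counts for a background of B counts, which a calibration slope converts to a concentration. A minimal sketch with hypothetical counts and slope (not the TLK values):

```python
import math

def currie_detection_limit(background_counts, slope_counts_per_MBq_ml):
    """Currie a-priori detection limit for a paired counting measurement:
    L_D = 2.71 + 4.65*sqrt(B) counts, converted to a concentration
    through a linear calibration slope (counts per MBq/ml)."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / slope_counts_per_MBq_ml

# Hypothetical numbers: 400 background counts, 100 net counts per MBq/ml
print(round(currie_detection_limit(400, 100), 2))  # -> 0.96
```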

  18. Optimization of the ATLAS (s)MDT readout electronics for high counting rates

    Energy Technology Data Exchange (ETDEWEB)

    Kortner, Oliver; Kroha, Hubert; Nowak, Sebastian; Schmidt-Sommerfeld, Korbinian [Max-Planck-Institut fuer Physik (Werner-Heisenberg-Institut), Foehringer Ring 6, 80805 Muenchen (Germany)

    2016-07-01

In the ATLAS muon spectrometer, Monitored Drift Tube (MDT) chambers are used for precise muon track measurement. For the high background rates expected at the HL-LHC, which are mainly due to neutrons and photons produced by interactions of the proton collision products in the detector and shielding, new small-diameter muon drift tube (sMDT) chambers with half the drift tube diameter of the MDT chambers and ten times higher rate capability have been developed. The standard MDT readout electronics uses bipolar shaping in front of a discriminator. This shaping leads to an undershoot of equal charge but opposite polarity following each pulse. With increasing count rates, the probability that a subsequent pulse arrives within this undershoot also increases; such signal pile-up deteriorates the signal pulses and leads to losses in muon efficiency and drift tube spatial resolution. In order to mitigate these effects, new readout electronics with active baseline restoration (BLR) is under development. Discrete prototype electronics with BLR functionality has been tested in laboratory measurements, during muon beam time measurements at CERN, and under high γ-irradiation rates at CERN's Gamma Irradiation Facility. Results of these measurements are presented.
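A rough way to see why the bipolar undershoot matters at high counting rates: for Poisson-distributed hits at rate r, the probability that a hit lands within the undershoot window (duration T) of its predecessor is 1 − exp(−rT). A sketch with a hypothetical undershoot duration (not the actual shaping constants of the MDT front-end):

```python
import math

def undershoot_hit_fraction(rate_hz, undershoot_s):
    """Fraction of hits arriving within the undershoot of the previous
    pulse, for Poisson-distributed hits: 1 - exp(-rate * duration)."""
    return 1.0 - math.exp(-rate_hz * undershoot_s)

# Hypothetical 500 ns undershoot at background rates from 100 kHz to 1.4 MHz
for rate in (1e5, 5e5, 1.4e6):
    print(f"{rate:.0e} Hz -> {undershoot_hit_fraction(rate, 500e-9):.1%} of hits affected")
```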

  19. Non-Markovian spin-resolved counting statistics and an anomalous relation between autocorrelations and cross correlations in a three-terminal quantum dot

    Science.gov (United States)

    Luo, JunYan; Yan, Yiying; Huang, Yixiao; Yu, Li; He, Xiao-Ling; Jiao, HuJun

    2017-01-01

    We investigate the noise correlations of spin and charge currents through an electron spin resonance (ESR)-pumped quantum dot, which is tunnel coupled to three electrodes maintained at an equivalent chemical potential. A recursive scheme is employed with inclusion of the spin degrees of freedom to account for the spin-resolved counting statistics in the presence of non-Markovian effects due to coupling with a dissipative heat bath. For symmetric spin-up and spin-down tunneling rates, an ESR-induced spin flip mechanism generates a pure spin current without an accompanying net charge current. The stochastic tunneling of spin carriers, however, produces universal shot noises of both charge and spin currents, revealing the effective charge and spin units of quasiparticles in transport. In the case of very asymmetric tunneling rates for opposite spins, an anomalous relationship between noise autocorrelations and cross correlations is revealed, where super-Poissonian autocorrelation is observed in spite of a negative cross correlation. Remarkably, with strong dissipation strength, non-Markovian memory effects give rise to a positive cross correlation of the charge current in the absence of a super-Poissonian autocorrelation. These unique noise features may offer essential methods for exploiting internal spin dynamics and various quasiparticle tunneling processes in mesoscopic transport.

  20. Effect of high and low antral follicle count in pubertal beef heifers on in vitro fertilization (IVF)

    Science.gov (United States)

Pubertal heifers can be classified as those with high (≥25) and low (≤15) antral follicle counts (AFC). The objective of this study was to determine oocyte development and maturation (e.g., fertility) in an in vitro fertilization (IVF) system for high and low AFC heifers. From a pool of 120...

  1. Terascale Physics Opportunities at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    CERN Document Server

    Adams, T; Bugel, L; Camilleri, L; Conrad, J M; De Gouvêa, A; Fisher, P H; Formaggio, J A; Jenkins, J; Karagiorgi, G; Kobilarcik, T R; Kopp, S; Kyle, G; Loinaz, W A; Mason, D A; Milner, R; Moore, R; Morfín, J G; Nakamura, M; Naples, D; Nienaber, P; Olness, F I; Owens, J F; Pate, S F; Pronin, A; Seligman, W G; Shaevitz, M H; Schellman, H; Schienbein, I; Syphers, M J; Tait, T M P; Takeuchi, T; Tan, C Y; Van de Water, R G; Yamamoto, R K; Yu, J Y

    2008-01-01

This article presents the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering on Glass). This experiment uses a Tevatron-based neutrino beam to obtain over an order of magnitude higher statistics than presently available for the purely weak processes ...

  2. A relationship between CD4 count and oral manifestations of human immunodeficiency virus-infected patients on highly active antiretroviral therapy in urban population

    Science.gov (United States)

    Satyakiran, Gadavalli Vera Venkata; Bavle, Radhika Manoj; Alexander, Glory; Rao, Saritha; Venugopal, Reshma; Hosthor, Sreelatha S

    2016-01-01

Introduction: Human immunodeficiency virus (HIV) infection gradually destroys the body's immune system, which makes it harder for the body to fight infections. HIV infection causes a quantitative and qualitative depletion of the CD4 lymphocyte count, which increases the risk of opportunistic infections. Thus, CD4 count is one of the key factors in determining both the urgency of highly active antiretroviral therapy (HAART) initiation and the need for prophylaxis against opportunistic infections. Aim: This study aims to evaluate the prevalence of and variations in the oral manifestations of HIV/acquired immune deficiency syndrome patients on HAART in an urban population and their association with CD4 count. Materials and Methods: A study was conducted by screening eighty HIV-positive patients in an urban location. Both adult and pediatric patients were screened for oral manifestations, and CD4 counts were evaluated simultaneously. Patients under HAART with HIV infection of variable duration were considered. Statistical Analysis: Measures of central tendency were used to analyse the data. Results: HIV infection destroys the immune system of an individual, making the patient susceptible to various infections and malignancies. With the advent of antiretroviral therapy, the scenario has changed drastically. We observed that patients with CD4 counts between 164 and 1286 show relatively few oral manifestations. Long-term HAART causes pigmentation, xerostomia and angular cheilitis but is taken up quite well by the patients. Conclusion: In this study, eighty patients on HAART from an urban population showed very minimal oral findings because of good accessibility to treatment and awareness about HIV infection. The patients on long-standing HAART also showed minimal oral manifestations such as pigmentation and xerostomia. Hence, we conclude that the recognition, significance and treatment of these lesions in patients with HIV ...

  3. Statistics of High Atwood Number Turbulent Mixing Layers

    Science.gov (United States)

    Baltzer, Jon; Livescu, Daniel

    2015-11-01

    The statistical properties of incompressible shear-driven planar mixing layers between two miscible streams of fluids with different densities are investigated by means of Direct Numerical Simulations. The simulations begin from a thin interface perturbed by a thin broadband random disturbance, and the mixing layers are allowed to develop to self-similar states. The temporal simulations are performed in unprecedented domain sizes, with grid sizes up to 6144 x 2048 x 1536, which allows turbulent structures to grow and merge naturally. This allows the flow to reach states far-removed from the initial disturbances, thereby enabling high-quality statistics to be obtained for higher moments, pdfs, and other quantities critical to developing closure models. A wide range of Atwood numbers are explored, ranging from nearly constant density to At=0.87. The consequences of increasing the density contrast are investigated for global quantities, such as growth rates, and asymmetries that form in statistical profiles. Additional simulations in smaller domains are performed to study the effects of domain size.

  4. Improving EWMA Plans for Detecting Unusual Increases in Poisson Counts

    Directory of Open Access Journals (Sweden)

    R. S. Sparks

    2009-01-01

An adaptive exponentially weighted moving average (EWMA) plan is developed for signalling unusually high incidence when monitoring a time series of nonhomogeneous daily disease counts. A Poisson transitional regression model is used to fit the background/expected trend in counts and provides "one-day-ahead" forecasts of the next day's count. Departures of counts from their forecasts are monitored. The paper outlines an approach for improving early outbreak detection by dynamically adjusting the exponential weights to be efficient at signalling persistent local increases. We emphasise outbreak signals in steady-state situations; that is, changes that occur after the EWMA statistic has run through several in-control counts.
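The monitoring recursion described above (an EWMA of the departures of counts from their one-day-ahead forecasts) can be sketched as follows. This is a minimal fixed-weight illustration with constant forecasts and standardized Poisson residuals, not the authors' adaptive-weight plan; the counts are made up:

```python
import math

def ewma_poisson_monitor(counts, forecasts, lam=0.3, L=3.0):
    """One-sided EWMA on standardized Poisson residuals.
    counts: observed daily counts; forecasts: one-day-ahead expected
    counts from a background model. Returns indices of alarm days."""
    z, alarms = 0.0, []
    for t, (y, mu) in enumerate(zip(counts, forecasts)):
        resid = (y - mu) / math.sqrt(mu)          # Poisson: variance == mean
        z = lam * resid + (1.0 - lam) * z
        sigma_z = math.sqrt(lam / (2.0 - lam))    # asymptotic EWMA std dev
        if z > L * sigma_z:
            alarms.append(t)
            z = 0.0                               # reset after a signal
    return alarms

# Hypothetical outbreak on the last three days of a flat background
obs = [10, 9, 11, 10, 12, 9, 10, 22, 25, 28]
exp = [10.0] * len(obs)
print(ewma_poisson_monitor(obs, exp))  # -> [8, 9]
```

Note how the EWMA needs two elevated days before signalling: the day-7 jump alone leaves the statistic just under the threshold, which is the smoothing behaviour the adaptive weights in the paper are designed to speed up.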

  5. A Near-Infrared Photon Counting Camera for High Sensitivity Astronomical Observation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is a Near Infrared Photon-Counting Sensor (NIRPCS), an imaging device with sufficient sensitivity to capture the spectral signatures, in the...

  6. High-Sensitivity Semiconductor Photocathodes for Space-Born UV Photon-Counting and Imaging Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Many UV photon-counting and imaging applications, including space-borne astronomy, missile tracking and guidance, UV spectroscopy for chemical/biological...

  7. Highly Sensitive Photon Counting Detectors for Deep Space Optical Communications Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A new type of a photon-counting photodetector is proposed to advance the state-of the-art in deep space optical communications technology. The proposed detector...

  8. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG.

    Science.gov (United States)

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Sertac Artan, N; Froemke, Robert C; Viventi, Jonathan

    2017-04-01

    High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high bandwidth front-end electronics. In the present study, we investigated the use of a small, light weight, and low cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. We recorded both acute and chronic μECoG signal from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals were not significantly different. Recordings from an implanted system showed AERs that were detectable and decodable for 52 d. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced overall SNR. We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but also resulted in a small, light headstage that can
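The current-to-voltage reconstruction described above can be illustrated with a deliberately simplified electrode model; the series resistance and capacitance values below are assumptions for illustration, not the paper's fitted impedance model:

```python
import numpy as np

def reconstruct_potential(current, fs, r_ohm, c_farad):
    """Reconstruct the electrode potential from the sensed current,
    assuming a simple series R + C electrode model:
    v(t) = R*i(t) + (1/C) * integral of i dt."""
    i = np.asarray(current, dtype=float)
    integral = np.cumsum(i) / fs          # running integral of the current
    return r_ohm * i + integral / c_farad

# Hypothetical values: 1 nA step current, 1 kHz sampling, R = 100 kOhm, C = 10 nF
v = reconstruct_potential(np.full(3, 1e-9), fs=1000.0, r_ohm=1e5, c_farad=1e-8)
print(v)  # ramps as the capacitance integrates the current
```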

  9. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG

    Science.gov (United States)

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Sertac Artan, N.; Froemke, Robert C.; Viventi, Jonathan

    2017-04-01

    Objective. High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high bandwidth front-end electronics. In the present study, we investigated the use of a small, light weight, and low cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. Approach. We recorded both acute and chronic μECoG signal from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. Main results. The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals were not significantly different. Recordings from an implanted system showed AERs that were detectable and decodable for 52 d. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced overall SNR. Significance. We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but

  10. Construction and Test of Muon Drift Tube Chambers for High Counting Rates

    CERN Document Server

    Schwegler, Philipp; Dubbert, Jörg

    2010-01-01

    Since the start of operation of the Large Hadron Collider (LHC) at CERN on 20 November 2009, the instantaneous luminosity is steadily increasing. The muon spectrometer of the ATLAS detector at the LHC is instrumented with trigger and precision tracking chambers in a toroidal magnetic field. Monitored Drift-Tube (MDT) chambers are employed as precision tracking chambers, complemented by Cathode Strip Chambers (CSC) in the very forward region where the background counting rate due to neutrons and γ's produced in shielding material and detector components is too high for the MDT chambers. After several upgrades of the CERN accelerator system over the coming decade, the instantaneous luminosity is expected to be raised to about five times the LHC design luminosity. This necessitates replacement of the muon chambers in the regions with the highest background radiation rates in the so-called Small Wheels, which constitute the innermost layers of the muon spectrometer end-caps, by new detectors with higher rate cap...

  11. 32-channel time-correlated-single-photon-counting system for high-throughput lifetime imaging

    Science.gov (United States)

    Peronio, P.; Labanca, I.; Acconcia, G.; Ruggeri, A.; Lavdas, A. A.; Hicks, A. A.; Pramstaller, P. P.; Ghioni, M.; Rech, I.

    2017-08-01

Time-Correlated Single Photon Counting (TCSPC) is a very efficient technique for measuring weak and fast optical signals, but it is mainly limited by the relatively "long" measurement time. Multichannel systems have been developed in recent years aiming to overcome this limitation by managing several detectors or TCSPC devices in parallel. Nevertheless, if we look at state-of-the-art systems, there is still a strong trade-off between the parallelism level and performance: the higher the number of channels, the poorer the performance. In 2013, we presented a complete and compact 32 × 1 TCSPC system, composed of an array of 32 single-photon avalanche diodes connected to 32 time-to-amplitude converters, which showed that it was possible to overcome the existing trade-off. In this paper, we present an evolution of the previous work, conceived for high-throughput fluorescence lifetime imaging microscopy. This application can be addressed by the new system thanks to a centralized logic, fast data management and an interface to a microscope. The newly conceived hardware structure is presented, as well as the firmware developed to manage the operation of the module. Finally, preliminary results, obtained from the practical application of the technology, are shown to validate the developed system.

  12. Preamplifier development for high count-rate, large dynamic range readout of inorganic scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Keshelashvili, Irakli; Erni, Werner; Steinacher, Michael; Krusche, Bernd; Collaboration: PANDA-Collaboration

    2013-07-01

Electromagnetic calorimeters are a central component of many experiments in nuclear and particle physics. Modern "trigger-less" detectors run at very high count rates and require good time and energy resolution as well as a large dynamic range. In addition, photosensors and preamplifiers must work in hostile environments (magnetic fields). Due to the latter constraint, mainly Avalanche Photo Diodes (APDs), Vacuum Photo Triodes (VPTs), and Vacuum Photo Tetrodes (VPTTs) are used. A disadvantage is their low gain, which together with the other requirements is a challenge for the preamplifier design. Our group has developed a special Low Noise / Low Power (LNP) preamplifier for this purpose. It will be used to equip the PANDA EMC forward end-cap (dynamic range 15'000, rate 1 MHz), where the PWO II crystals and preamplifiers have to run in an environment cooled down to -25{sup o}C. A further application is the upgrade of the Crystal Barrel detector at the Bonn ELSA accelerator with APD readout, for which temperature compensation of the APD gain and good time resolution are necessary. The development and all post-mass-production test procedures, carried out by our group at Basel University over the past several years, will be reported.

  13. Depth imaging in highly scattering underwater environments using time-correlated single-photon counting

    Science.gov (United States)

    Maccarone, Aurora; McCarthy, Aongus; Halimi, Abderrahim; Tobin, Rachael; Wallace, Andy M.; Petillot, Yvan; McLaughlin, Steve; Buller, Gerald S.

    2016-10-01

This paper presents an optical depth imaging system optimized for highly scattering environments such as underwater. The system is based on the time-correlated single-photon counting (TCSPC) technique and the time-of-flight approach. Laboratory-based measurements demonstrate the potential of underwater depth imaging, with specific attention given to environments with a high level of scattering. The optical system comprised a monostatic transceiver unit, a fiber-coupled supercontinuum laser source with a wavelength-tunable acousto-optic tunable filter (AOTF), and a fiber-coupled single-element silicon single-photon avalanche diode (SPAD) detector. In the optical system, the transmit and receive channels in the transceiver unit were overlapped in a coaxial optical configuration. The targets were placed in a 1.75 m long tank and raster scanned using two galvo-mirrors. Laboratory-based experiments demonstrate depth profiling performed with up to nine attenuation lengths between the transceiver and target. All of the measurements were taken with an average laser power of less than 1 mW. Initially, the data was processed using a straightforward pixel-wise cross-correlation of the return timing signal with the system instrumental timing response. More advanced algorithms were then used to process these cross-correlation results. These results illustrate the potential for reconstructing images in highly scattering environments and for investigating much shorter acquisition times. These algorithms exploit the sparseness of the data under the Discrete Cosine Transform (DCT) and the correlation between adjacent pixels to restore the depth and reflectivity images.
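The pixel-wise cross-correlation step described above can be sketched as follows: correlate the photon-timing histogram with the instrument response, take the lag of the peak as the round-trip time, and convert to range using the speed of light in the medium. The instrument response and histogram below are synthetic, and water's refractive index is taken as 1.33:

```python
import numpy as np

def depth_from_histogram(hist, irf, bin_s, n_medium=1.33):
    """Pixel-wise depth estimate: cross-correlate the timing histogram
    with the instrument response function (IRF), take the lag of the
    correlation peak, and convert round-trip time to one-way range."""
    corr = np.correlate(hist, irf, mode="full")
    lag = int(np.argmax(corr)) - (len(irf) - 1)   # delay of hist vs irf, in bins
    tof = lag * bin_s
    c = 299792458.0
    return 0.5 * tof * c / n_medium

# Hypothetical: narrow IRF, return delayed by 12 bins of 10 ps, small noise
irf = np.array([0.0, 1.0, 4.0, 1.0, 0.0])
hist = np.zeros(64)
hist[12:17] = irf + np.array([0.1, 0.2, 0.1, 0.3, 0.2])  # noisy delayed copy
print(round(depth_from_histogram(hist, irf, 10e-12), 4))  # -> 0.0135 (about 1.35 cm)
```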

  14. Statistical Mechanics of Optimal Convex Inference in High Dimensions

    Science.gov (United States)

    Advani, Madhu; Ganguli, Surya

    2016-07-01

    A fundamental problem in modern high-dimensional data analysis involves efficiently inferring a set of P unknown model parameters governing the relationship between the inputs and outputs of N noisy measurements. Various methods have been proposed to regress the outputs against the inputs to recover the P parameters. What are fundamental limits on the accuracy of regression, given finite signal-to-noise ratios, limited measurements, prior information, and computational tractability requirements? How can we optimally combine prior information with measurements to achieve these limits? Classical statistics gives incisive answers to these questions as the measurement density α =(N /P )→∞ . However, these classical results are not relevant to modern high-dimensional inference problems, which instead occur at finite α . We employ replica theory to answer these questions for a class of inference algorithms, known in the statistics literature as M-estimators. These algorithms attempt to recover the P model parameters by solving an optimization problem involving minimizing the sum of a loss function that penalizes deviations between the data and model predictions, and a regularizer that leverages prior information about model parameters. Widely cherished algorithms like maximum likelihood (ML) and maximum-a posteriori (MAP) inference arise as special cases of M-estimators. Our analysis uncovers fundamental limits on the inference accuracy of a subclass of M-estimators corresponding to computationally tractable convex optimization problems. These limits generalize classical statistical theorems like the Cramer-Rao bound to the high-dimensional setting with prior information. We further discover the optimal M-estimator for log-concave signal and noise distributions; we demonstrate that it can achieve our high-dimensional limits on inference accuracy, while ML and MAP cannot. Intriguingly, in high dimensions, these optimal algorithms become computationally simpler than
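As a concrete special case of the M-estimators discussed above, squared loss with a quadratic regularizer (ridge regression, i.e. MAP inference under Gaussian noise and a Gaussian prior) has a closed-form minimizer. A minimal sketch on synthetic data with N = 20 measurements of P = 5 parameters:

```python
import numpy as np

def m_estimate_ridge(X, y, lam=1.0):
    """M-estimation with squared loss and quadratic regularizer:
    argmin_w  sum_i (y_i - x_i . w)^2 / 2 + lam * ||w||^2 / 2,
    solved via the normal equations (X^T X + lam I) w = X^T y."""
    P = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(P), X.T @ y)

# Synthetic problem: noisy linear measurements of 5 unknown weights
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
X = rng.normal(size=(20, 5))
y = X @ w_true + 0.1 * rng.normal(size=20)
w_hat = m_estimate_ridge(X, y, lam=0.5)
print(f"max coefficient error: {float(np.abs(w_hat - w_true).max()):.3f}")
```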

  15. A high resolution laser ranging system based on time-correlated single-photon counting technology

    Science.gov (United States)

    Yang, Yixin; Wang, Huanqin; Huang, Zhe; Cao, Yangyang; Gui, Huaqiao

    2014-12-01

Laser ranging has become an important method both for distance measurements and for the acquisition of three-dimensional (3D) images. In this paper, a laser ranging system based on Time-Correlated Single-Photon Counting (TCSPC) technology is developed. A Geiger-mode avalanche photodiode (G-APD), which can detect single-photon events, is used to capture the weak light scattered from the long-range target. In order to improve the ranging resolution of the TCSPC-based measurement system, a high-repetition-frequency, sub-nanosecond narrow-pulse generator circuit based on the avalanche effect of an RF BJT was designed and applied as the light source. Moreover, several optical design optimizations were made to improve the system signal-to-noise ratio (SNR), including using a special aspherical lens as the projecting lens and adopting a telephoto camera lens with a small view angle and short depth of field in front of the detector. Experimental tests evaluating the performance of the laser ranging system are described. For echo signal analysis, three different algorithms were introduced, of which the cross-correlation algorithm was demonstrated to be the most effective at determining the round-trip time to a target, even from histograms with a significant amount of background noise photons. It was found that centimeter ranging resolution can be achieved thanks to the use of a Time-to-Digital Converter (TDC) with picosecond resolution and the cross-correlation algorithm. The proposed laser ranging system has the advantages of high range resolution, short response time and simple structure, with potential applications in 3D object recognition, computer vision, reverse engineering and virtual reality.
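The link between picosecond TDC bins and centimeter range resolution follows directly from d = c·Δt/2. A one-liner check (the 65 ps bin width is an assumed figure, not taken from the paper):

```python
C_VACUUM = 299792458.0  # speed of light, m/s

def range_resolution(bin_s):
    """One-way range quantization step for a round-trip time-of-flight
    measurement with timing bins of width bin_s: d = c * dt / 2."""
    return 0.5 * C_VACUUM * bin_s

print(f"{range_resolution(65e-12) * 100:.2f} cm")  # -> 0.97 cm
```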

  16. Statistical analysis of highly correlated systems in biology and physics

    Science.gov (United States)

    Martin, Hector Garcia

In this dissertation, I present my work on the statistical study of highly correlated systems in three fields of science: ecology, microbial ecology and physics. I propose an explanation for how the highly correlated distribution of species individuals, and an abundance distribution commonly observed in ecological systems, give rise to a power law dependence between a given area and the number of unique species it harbors. This is one of the oldest known ecological patterns: the power-law Species Area Rule. As a natural extension of my studies in ecology, I have undertaken both theoretical research and field work in the developing field of microbial ecology. In particular, I participated in a multidisciplinary study of the impact of microbes on the formation of macroscopic calcium carbonate terraces at Yellowstone National Park Hot Springs. I have used ecological techniques to characterize the biodiversity of our study site and developed a new bootstrap method for extracting abundance information from clone libraries. This has singled out the most abundant microorganisms and paved the way for future studies of the possible non-passive role of microorganisms in carbonate precipitation. The third part of my thesis uses statistical techniques to explore the correlations in rotating Bose-Einstein condensates. I have used finite difference techniques to solve the Gross-Pitaevskii equation in order to obtain the structure of a vortex in a lattice. Surprisingly, I have found that, in order to understand this structure, it is necessary to add a correction to the Gross-Pitaevskii equation which introduces a dependence on the particle scattering length. I have also used Path Integral Monte Carlo techniques to explore the limit of rapid rotations, where the Gross-Pitaevskii equation is no longer valid. Interestingly, the Gross-Pitaevskii equation seems to be valid for much higher densities than expected if properly renormalized. I show that, in accord with the prediction of ...

  17. In vitro fertilization (IVF) from low or high antral follicle count pubertal beef heifers using semi-defined culture conditions

    Science.gov (United States)

    Antral follicle counts (AFC) vary among pubertal beef heifers. Our objective was to compare the in vitro maturation and fertilization of oocytes collected from low and high AFC heifers. Previously we reported results using serum-based IVF media and in this study report results using semi-defined m...

  18. In vitro fertilization (IVF) using semi-defined culture conditions from low or high antral follicle count pubertal beef heifers

    Science.gov (United States)

    To compare the in vitro fertilization (IVF) and production (IVP) of embryos from low and high antral follicle count (AFC) heifers, AFC were determined on 106 heifers using transrectal ultrasonography. Ten heifers with the lowest AFC (avg. 13.2) and 10 heifers with the highest AFC (avg. 27.4) with ev...

  19. High-speed single photon counting read out electronics for a digital detection system for clinical synchrotron radiation mammography

    Science.gov (United States)

    Bergamaschi, A.; Arfelli, F.; Dreossi, D.; Longo, R.; Olivo, A.; Pani, S.; Rigon, L.; Vallazza, E.; Venanzi, C.; Castelli, E.

    2004-02-01

    The SYRMEP beam line at Elettra in Trieste is currently being upgraded for mammographic examinations on patients. At the same time, a digital detection system suitable for in-vivo breast imaging is under development; it consists of a silicon laminar detector array operating in single photon counting mode. The duration of a clinical examination should not exceed a few seconds. Fast read-out electronics is therefore necessary to avoid losses in image contrast in the presence of high counting rates. A custom ASIC working with 100% efficiency for rates up to 100 kHz per pixel has been designed and tested, and other solutions based on commercially available ASICs are currently under test. Several detector prototypes have been assembled, and images of mammographic test objects have been acquired. Image quality, efficiency and contrast losses have been evaluated in all cases as a function of the counting rate.
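
    The contrast losses at high counting rates described above stem from detector dead time. A minimal sketch of the standard non-paralyzable dead-time model (not taken from this record; the dead time and rates below are illustrative assumptions):

```python
# Non-paralyzable dead-time model: a detector busy for tau seconds after
# each accepted hit records m = n / (1 + n*tau) counts/s for true rate n.
# The 1 us dead time and the rates are illustrative, not SYRMEP values.

def measured_rate(true_rate_hz: float, dead_time_s: float) -> float:
    """Observed count rate for a non-paralyzable counting channel."""
    return true_rate_hz / (1.0 + true_rate_hz * dead_time_s)

def fractional_loss(true_rate_hz: float, dead_time_s: float) -> float:
    """Fraction of true counts lost at a given rate."""
    return 1.0 - measured_rate(true_rate_hz, dead_time_s) / true_rate_hz

if __name__ == "__main__":
    tau = 1e-6  # 1 us dead time (illustrative)
    for rate in (1e4, 1e5, 1e6):
        print(f"true {rate:>9.0f} Hz -> measured "
              f"{measured_rate(rate, tau):>9.0f} Hz "
              f"({100 * fractional_loss(rate, tau):.1f}% lost)")
```

As the true rate approaches 1/tau, the lost fraction grows, which is why fast return-to-baseline electronics matter for image contrast.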

  20. Renal stone characterization using high resolution imaging mode on a photon counting detector CT system

    Science.gov (United States)

    Ferrero, A.; Gutjahr, R.; Henning, A.; Kappler, S.; Halaweish, A.; Abdurakhimova, D.; Peterson, Z.; Montoya, J.; Leng, S.; McCollough, C.

    2017-03-01

    In addition to the standard-resolution (SR) acquisition mode, a high-resolution (HR) mode is available on a research photon-counting-detector (PCD) whole-body CT system. In the HR mode each detector pixel consists of a 2×2 array of 0.225 mm × 0.225 mm subpixel elements, in contrast to the SR mode, which uses a 4×4 array of the same subelements; this results in 0.25 mm isotropic resolution at iso-center for the HR mode. In this study, we quantified ex vivo the capabilities of the HR mode to characterize renal stones in terms of morphology and mineral composition. Forty pure stones - 10 uric acid (UA), 10 cystine (CYS), 10 calcium oxalate monohydrate (COM) and 10 apatite (APA) - and 14 mixed stones were placed in a 20 cm water phantom and scanned in HR mode, at a radiation dose matched to that of routine dual-energy stone exams. Data from micro CT provided a reference for the quantification of morphology and mineral composition of the mixed stones. The area under the ROC curve was 1.0 for discriminating UA from CYS, 0.89 for CYS vs COM and 0.84 for COM vs APA. The root mean square error (RMSE) of the percent UA in mixed stones was 11.0% with a medium-sharp kernel and 15.6% with the sharpest kernel. The HR mode showed qualitatively accurate characterization of stone morphology relative to micro CT.

  1. Counting carbohydrates

    Science.gov (United States)

    Carb counting; Carbohydrate-controlled diet; Diabetic diet; Diabetes-counting carbohydrates ... Many foods contain carbohydrates (carbs), including: Fruit and fruit juice Cereal, bread, pasta, and rice Milk and milk products, soy milk Beans, legumes, ...

  2. Seal Counts

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Database of seal counts from aerial photography. Counts by image, site, species, and date are stored in the database along with information on entanglements and...

  3. Platelet Count

    Science.gov (United States)

    ... their spleen removed surgically Use of birth control pills (oral contraceptives) Some conditions may cause a temporary (transitory) increased ... increased platelet counts include estrogen and birth control pills (oral contraceptives). Mildly decreased platelet counts may be seen in ...

  4. Topics in statistical data analysis for high-energy physics

    CERN Document Server

    Cowan, G

    2013-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of High Energy Physics (HEP) data: Bayesian statistics and multivariate methods. In the Bayesian approach we extend the interpretation of probability to cover not only the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in HEP in recent years: the boosted decision tree (BDT).

  5. Birth statistics of high birth weight infants (macrosomia) in Korea

    Directory of Open Access Journals (Sweden)

    Byung-Ho Kang

    2012-08-01

    Purpose: The authors analyzed the trend from the birth-related statistics of high birth weight infants (HBWIs) over 50 years in Korea, from 1960 to 2010. Methods: We used 2 data sources, namely, hospital units (1960s to 1990s) and Statistics Korea (1993 to 2010). The analyses include the incidence of HBWIs, birth weight distribution, sex ratio, and the relationship of HBWI to maternal age. Results: The hospital unit data indicated the incidence of HBWI as 3 to 7% in the 1960s and 1970s and 4 to 7% in the 1980s and 1990s. Data from Statistics Korea indicated the percentages of HBWIs among total live births decreased over the years: 6.7% (1993), 6.3% (1995), 5.1% (2000), 4.5% (2005), and 3.5% (2010). Among HBWIs, the birth weight ranges and their percentages were 4.0 to 4.4 kg (90.3%), 4.5 to 4.9 kg (8.8%), 5.0 to 5.4 kg (0.8%), 5.5 to 5.9 kg (0.1%), and >6.0 kg (0.0%) in 2000, but were 92.2%, 7.2%, 0.6%, 0.0%, and 0.0% in 2009. The male to female ratio of HBWIs was 1.89 in 1993 and 1.84 in 2010. In 2010, the mother's age distribution correlated with low (4.9%), normal (91.0%), and high birth weights (3.6%): an increase in mother's age resulted in an increase in the frequency of low birth weight infants (LBWIs) and HBWIs. Conclusion: The incidence of HBWIs for the past 50 years has been dropping in Korea. The older the mother, the higher was the risk of a HBWI and LBWI. We hope that these findings will be utilized as basic data to aid those managing HBWIs.

  6. High white blood cell count at diagnosis of childhood acute lymphoblastic leukaemia

    DEFF Research Database (Denmark)

    Vaitkeviciene, Goda; Forestier, Erik; Hellebostad, Marit;

    2011-01-01

    Prognostic impact of peripheral blood white blood cell count (WBC) at the diagnosis of childhood acute lymphoblastic leukaemia (ALL) was evaluated in a population-based consecutive series of 2666 children aged 1-15 treated for ALL between 1992 and 2008 in the five Nordic countries (Denmark, Finland...

  7. A proteomic perspective on the changes in milk proteins due to high somatic cell count

    NARCIS (Netherlands)

    Zhang, L.; Boeren, J.A.; Hooijdonk, van A.C.M.; Vervoort, J.J.M.; Hettinga, K.A.

    2015-01-01

    Although cows with subclinical mastitis have no difference in the appearance of their milk, milk composition and milk quality are altered because of the inflammation. To know the changes in milk quality with different somatic cell count (SCC) levels, 5 pooled bovine milk samples with SCC from 10^5 to

  8. Low preoperative platelet counts predict a high mortality after partial hepatectomy in patients with hepatocellular carcinoma

    Institute of Scientific and Technical Information of China (English)

    Kazuhiro Kaneko; Yoshio Shirai; Toshifumi Wakai; Naoyuki Yokoyama; Kohei Akazawa; Katsuyoshi Hatakeyama

    2005-01-01

    AIM: To assess the validity of our selection criteria for hepatectomy procedures based on the indocyanine green disappearance rate (KICG), and to unveil the factors affecting post-hepatectomy mortality in patients with hepatocellular carcinoma (HCC). METHODS: A retrospective analysis of 198 consecutive patients with HCC who underwent partial hepatectomies in the past 14 years was conducted. The selection criteria for hepatectomy procedures during the study period were KICG ≥ 0.12 for hemihepatectomy, KICG ≥ 0.10 for bisegmentectomy, KICG ≥ 0.08 for monosegmentectomy, and KICG ≥ 0.06 for nonanatomic hepatectomy. The hepatectomies were categorized into three types: major hepatectomy (hemihepatectomy or a more extensive procedure), bisegmentectomy, and limited hepatectomy. Univariate (Fisher's exact test) and multivariate (logistic regression model) analyses were used. RESULTS: Postoperative mortality was 5% after major hepatectomy, 3% after bisegmentectomy, and 3% after limited hepatectomy. The three percentages were comparable (P = 0.876). A platelet count of ≤ 10×10^4/μL was the strongest independent factor for postoperative mortality on univariate (P = 0.001) and multivariate (risk ratio, 12.5; P = 0.029) analyses. No patient with a platelet count of > 7.3×10^4/μL died of postoperative morbidity, whereas 25% (6/24 patients) of patients with a platelet count of ≤ 7.3×10^4/μL died (P < 0.001). CONCLUSION: The selection criteria for hepatectomy procedures based on KICG are generally considered valid, because of the acceptable morbidity and mortality with these criteria. The preoperative platelet count independently affects morbidity and mortality after hepatectomy, suggesting that a combination of KICG and platelet count would further reduce postoperative mortality.
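
    The KICG selection criteria quoted in this record form a simple threshold rule. A hedged sketch (thresholds transcribed from the abstract; the function name and return strings are illustrative, not from the paper):

```python
# Most extensive hepatectomy permitted for a given indocyanine green
# disappearance rate (KICG), per the thresholds quoted in the abstract.
# Naming and return strings are illustrative, not the authors' own code.

def permitted_procedure(kicg: float) -> str:
    """Map a KICG value to the most extensive permitted procedure."""
    if kicg >= 0.12:
        return "hemihepatectomy or more extensive"
    if kicg >= 0.10:
        return "bisegmentectomy"
    if kicg >= 0.08:
        return "monosegmentectomy"
    if kicg >= 0.06:
        return "nonanatomic hepatectomy"
    return "no resection under these criteria"

if __name__ == "__main__":
    for k in (0.13, 0.11, 0.09, 0.07, 0.05):
        print(f"KICG = {k:.2f}: {permitted_procedure(k)}")
```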

  9. Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Christopher, E-mail: Christopher.Kurz@physik.uni-muenchen.de; Bauer, Julia [Heidelberg Ion-Beam Therapy Center and Department of Radiation Oncology, Heidelberg University Hospital, Heidelberg 69120 (Germany); Conti, Maurizio; Guérin, Laura; Eriksson, Lars [Siemens Healthcare Molecular Imaging, Knoxville, Tennessee 37932 (United States); Parodi, Katia [Heidelberg Ion-Beam Therapy Center and Department of Radiation Oncology, Heidelberg University Hospital, Heidelberg 69120, Germany and Department of Experimental Physics – Medical Physics, Ludwig-Maximilians-University, Munich 85748 (Germany)

    2015-07-15

    Purpose: External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β+-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the PET/CT device used and at identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. Methods: The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patient-like activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated list-mode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended ...

  10. FROST: a low-noise high-rate photon counting ASIC for X-ray applications

    Energy Technology Data Exchange (ETDEWEB)

    Prest, M. E-mail: prest@ts.infn.it; Vallazza, E.; Chiavacci, M.; Mariani, R.; Motto, S.; Neri, M.; Scantamburlo, N.; Arfelli, F.; Conighi, A.; Longo, R.; Olivo, A.; Pani, S.; Poropat, P.; Rashevsky, A.; Rigon, L.; Tromba, G.; Castelli, E

    2001-04-01

    FRONTier RADiography is an R&D project to assess the feasibility of digital mammography with synchrotron radiation at the ELETTRA Light Source in Trieste. In order to reach an acceptable duration of the exam, a fast, low-noise photon-counting ASIC, the Frontrad ReadOut SysTem (FROST), has been developed in collaboration with Aurelia Microelettronica. It is a multichannel counting system, each channel being made of a low-noise charge-sensitive preamplifier optimized for the X-ray energy range (10-100 keV), a CR-RC^2 shaper, a discriminator and a 16-bit counter. In order to set the discriminator threshold, a global 6-bit DAC and a local (per-channel) 3-bit DAC have been implemented within the ASIC. We report on the measurements done with the 8-channel prototype chip and the comparison with simulation results.

  11. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    Statistical inference for sparse signals or low-rank matrices in high-dimensional settings is of significant interest in a range of contemporary applications. It has attracted significant recent attention in many fields including statistics, applied mathematics and electrical engineering. In this thesis, we consider several problems, including sparse signal recovery (compressed sensing under restricted isometry) and low-rank matrix recovery (matrix recovery via rank-one projections and structured matrix completion). The first part of the thesis discusses compressed sensing and affine rank minimization in both noiseless and noisy cases and establishes sharp restricted isometry conditions for sparse signal and low-rank matrix recovery. The analysis relies on a key technical tool which represents points in a polytope by convex combinations of sparse vectors. The technique is elementary but leads to sharp results. It is shown that, in compressed sensing, the restricted isometry condition δ_k^A < 1/3 guarantees exact recovery of all k-sparse signals by ℓ1 minimization, with an analogous condition for exact recovery of low-rank matrices by nuclear norm minimization; moreover, for any ε > 0, the condition δ_k^A < 1/3 + ε is not sufficient, so the bound is sharp. The thesis then introduces a rank-one projection model for low-rank matrix recovery and proposes a constrained nuclear norm minimization method for stable recovery of low-rank matrices in the noisy case. The procedure is adaptive to the rank and robust against small perturbations. Both upper and lower bounds for the estimation accuracy under the Frobenius norm loss are obtained. The proposed estimator is shown to be rate-optimal under certain conditions. The estimator is easy to implement via convex programming and performs well numerically. The techniques and main results developed in the chapter also have implications for other related statistical problems. An application to estimation of spiked covariance matrices from one-dimensional random projections is considered. The results demonstrate that it is still possible to accurately estimate the covariance matrix of a high-dimensional distribution based only on one-dimensional projections. For the third part of the thesis, we consider another setting of low-rank matrix completion. Current literature ...

  12. High-Statistics β+/EC-Decay Study of 122Xe

    Science.gov (United States)

    Jigmeddorj, Badamsambuu; S1292 Collaboration

    2016-09-01

    The Xe isotopes are centrally located in the Z > 50, N < 82 region, with vibrational structure influenced by proton subshell gaps, perhaps leading to shape coexistence that could give rise to strong E0 transitions. Recent work on 124Xe has established nearly identical quadrupole collectivity for the pairing vibrational 0_3^+ band and the ground state band. However, in 122Xe, the 0_3^+ state has not been firmly identified. A high-statistics 122Cs β+/EC decay experiment to obtain detailed spectroscopic data for low-spin states was performed at the TRIUMF-ISAC facility using the 8π γ-ray spectrometer and its auxiliary detectors, including PACES, an array of five Si(Li) detectors for conversion electron spectroscopy. The decay scheme has been considerably extended through a γ-γ coincidence analysis, and 0+ states have been identified via γ-γ angular correlations. This work was supported by the Natural Sciences and Engineering Research Council of Canada and the National Research Council of Canada.

  13. Statistical-noise reduction in correlation analysis of high-energy nuclear collisions with event-mixing

    CERN Document Server

    Ray, R L

    2016-01-01

    The error propagation and statistical-noise reduction method of Reid and Trainor for two-point correlation applications in high-energy collisions is extended to include particle-pair references constructed by mixing two particles from all event-pair combinations within event subsets of arbitrary size. The Reid-Trainor method is also applied to other particle-pair mixing algorithms commonly used in correlation analysis of particle production from high-energy nuclear collisions. The statistical-noise reduction, inherent in the Reid-Trainor event-mixing procedure, is shown to occur for these other event-mixing algorithms as well. Monte Carlo simulation results are presented which verify the predicted degree of noise reduction. In each case the final errors are determined by the bin-wise particle-pair number, rather than by the bin-wise single-particle count.

  14. Statistical-noise reduction in correlation analysis of high-energy nuclear collisions with event-mixing

    Energy Technology Data Exchange (ETDEWEB)

    Ray, R.L., E-mail: ray@physics.utexas.edu; Bhattarai, P.

    2016-06-11

    The error propagation and statistical-noise reduction method of Reid and Trainor for two-point correlation applications in high-energy collisions is extended to include particle-pair references constructed by mixing two particles from all event-pair combinations within event subsets of arbitrary size. The Reid–Trainor method is also applied to other particle-pair mixing algorithms commonly used in correlation analysis of particle production from high-energy nuclear collisions. The statistical-noise reduction, inherent in the Reid–Trainor event-mixing procedure, is shown to occur for these other event-mixing algorithms as well. Monte Carlo simulation results are presented which verify the predicted degree of noise reduction. In each case the final errors are determined by the bin-wise particle-pair number, rather than by the bin-wise single-particle count.

  15. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    Science.gov (United States)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high-throughput, end-to-end post-fabrication processing of high-performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far-ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), which are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  17. High Throughput Method of Extracting and Counting Strontium-90 in Urine

    Energy Technology Data Exchange (ETDEWEB)

    Shkrob, I. [Argonne National Lab. (ANL), Argonne, IL (United States); Kaminski, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Mertz, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Hawkins, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Dietz, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Tisch, A. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    A method has been developed for the rapid extraction of Sr-90 from the urine of individuals exposed to radiation in a terrorist attack. The method employs two chromatographic ion-exchange materials: Diphonix resin and Sr resin, both of which are commercially available. The Diphonix resin reduces the alkali ion concentrations below 10 mM, and the Sr resin concentrates and decontaminates strontium-90. Experimental and calculational data are given for a variety of test conditions. On the basis of these results, a flowsheet has been developed for the rapid concentration and extraction of Sr-90 from human urine samples for subsequent beta-counting.

  18. A new instrument for high statistics measurement of photomultiplier characteristics

    CERN Document Server

    Bozza, C; Costa, M; Di Capua, F; Kulikovskiy, V; Mele, R; Migliozzi, P; Mollo, C M; Pellegrino, C; Riccobene, G; Vivolo, D

    2016-01-01

    Since the early days of experimental particle physics, photomultipliers (PMTs) have played an important role in detector design. Thanks to their capability of fast photon counting, PMTs are extensively used in the new generation of astroparticle physics experiments, such as air, ice and water Cherenkov detectors. The use of PMTs of 3-inch or smaller diameter was made possible by the capability of building detectors with a large photocathode area distributed over a sustainable number of channels. Small-size PMTs (≤ 3 inches) show little sensitivity to the Earth's magnetic field, small transit time and stable transit time spread; the price per unit photocathode area is lower than that of the large-area PMTs typically used so far in such applications. In this paper we report on the design and performance of a new instrument for the mass characterisation of PMTs (from 1-inch to 3-inch size), capable of calibrating hundreds of PMTs per day and providing measurements of dark counts, signal amplitude, late-, ...

  19. COPD in HIV-Infected Patients: CD4 Cell Count Highly Correlated

    Science.gov (United States)

    Guillouet-de-Salvador, Francine; Valerio, Laure; Puglièse, Pascal; Naqvi, Alissa; Durant, Jacques; Demonchy, Elisa; Perbost, Isabelle; Cua, Eric; Marquette, Charles-Hugo; Roger, Pierre-Marie

    2017-01-01

    Background: COPD is a frequent and significant cause of respiratory morbidity in HIV-infected patients despite the control of HIV. We aimed to analyze the factors correlated with COPD in this population, to evaluate the existence of specific indicators of vulnerability. Methods and Findings: 623 HIV-infected outpatients were enrolled during one year. This population was characterised by a dedicated questionnaire and electronic patient records. COPD screening was performed according to recommended spirometric criteria. The prevalence of COPD was 9.0%. Age and smoking were independently correlated with COPD (OR, 1.61 per 10-year increase, P = 0.007; OR, 1.28 per 10 pack-year increase, P = 0.003, respectively). Body mass index (BMI) and CD4 cell count were independently and negatively correlated with COPD (OR, 0.78, P ...). Conclusions: ... tobacco-smoking and respiratory complaints, with a particular concern toward patients with a profound CD4 cell count defect. PMID:28056048

  20. CLARO: an ASIC for high rate single photon counting with multi-anode photomultipliers

    Science.gov (United States)

    Baszczyk, M.; Carniti, P.; Cassina, L.; Cotta Ramusino, A.; Dorosz, P.; Fiorini, M.; Gotti, C.; Kucewicz, W.; Malaguti, R.; Pessina, G.

    2017-08-01

    The CLARO is a radiation-hard 8-channel ASIC designed for single photon counting with multi-anode photomultiplier tubes. Each channel outputs a digital pulse when the input signal from the photomultiplier crosses a configurable threshold. The fast return to baseline, typically within 25 ns and below 50 ns in all conditions, allows counting up to 10^7 hits/s on each channel, with a power consumption of about 1 mW per channel. The ASIC presented here is a much improved version of the first 4-channel prototype. The threshold can be precisely set in a wide range, between 30 ke- (5 fC) and 16 Me- (2.6 pC). The noise of the amplifier with a 10 pF input capacitance is 3.5 ke- (0.6 fC) RMS. All settings are stored in a 128-bit configuration and status register, protected against soft errors with triple modular redundancy. The paper describes the design of the ASIC at transistor level and demonstrates its performance on the test bench.

  1. Double-counting challenges the accuracy of high-latitude methane inventories

    Science.gov (United States)

    Thornton, Brett F.; Wik, Martin; Crill, Patrick M.

    2016-12-01

    Quantification of the present and future contribution to atmospheric methane (CH4) from lakes, wetlands, fluvial systems, and, potentially, coastal waters remains an important unfinished task for balancing the global CH4 budget. Discriminating between these sources is crucial, especially across climate-sensitive Arctic and subarctic landscapes and waters. Yet basic underlying uncertainties remain, in such areas as total wetland area and definitions of wetlands, which can lead to conflation of wetlands and small ponds in regional studies. We discuss how in situ sampling choices, remote sensing limitations, and isotopic signature overlaps can lead to unintentional double-counting of CH4 emissions, and propose that this double-counting can explain why a pan-Arctic bottom-up estimate from published sources, 59.7 Tg yr^-1 (range 36.9-89.4 Tg yr^-1), greatly exceeds the most recent top-down inverse-modeled estimate of the pan-Arctic CH4 budget (23 ± 5 Tg yr^-1).

  2. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    Science.gov (United States)

    Lu, Wei; Krainak, Michael A.; Yang, Guan; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise free-running high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: (1) a 2×8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and (2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the two-detected-photon threshold and logic AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing and background noise effects without resorting to other methods such as time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We compare their performances using both the two-detected-photon threshold and coincidence methods.
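
    Coincidence gating suppresses dark counts because uncorrelated noise hits on two channels rarely fall inside the same narrow window. A back-of-envelope sketch using the standard accidental-coincidence formula (the rates and window width below are illustrative assumptions, not measured values from this record):

```python
# Accidental (false) coincidences between two channels with uncorrelated
# background rates r1, r2 and coincidence window tau: R_acc ~= 2 * r1 * r2 * tau.
# The 100 kHz dark rates and 1 ns window are illustrative assumptions.

def accidental_rate(r1_hz: float, r2_hz: float, window_s: float) -> float:
    """Rate of random two-channel coincidences within the given window."""
    return 2.0 * r1_hz * r2_hz * window_s

if __name__ == "__main__":
    dark = 1e5   # 100 kHz dark counts per channel (illustrative)
    tau = 1e-9   # 1 ns coincidence window (illustrative)
    print(f"accidental coincidence rate: {accidental_rate(dark, dark, tau):.1f} Hz")
```

Even with 100 kHz of dark counts per channel, a 1 ns AND-gate window reduces the random background to tens of hertz, which is the effect the record describes.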

  3. HEPS-BPIX, a single photon counting pixel detector with a high frame rate for the HEPS project

    Science.gov (United States)

    Wei, Wei; Zhang, Jie; Ning, Zhe; Lu, Yunpeng; Fan, Lei; Li, Huaishen; Jiang, Xiaoshan; Lan, Allan K.; Ouyang, Qun; Wang, Zheng; Zhu, Kejun; Chen, Yuanbo; Liu, Peng

    2016-11-01

    China's next generation light source, named the High Energy Photon Source (HEPS), is currently under construction. HEPS-BPIX (HEPS-Beijing PIXel) is a dedicated pixel readout chip that operates in single photon counting mode for X-ray applications in HEPS. Designed using CMOS 0.13 μm technology, the chip contains a matrix of 104×72 pixels. Each pixel measures 150 μm×150 μm and has a counting depth of 20 bits. A bump-bonded prototyping detector module with a 300-μm thick silicon sensor was tested in the beamline of Beijing Synchrotron Radiation Facility. A fast stream of X-ray images was demonstrated, and a frame rate of 1.2 kHz was proven, with a negligible dead time. The test results showed an equivalent noise charge of 115 e- rms after bump bonding and a threshold dispersion of 55 e- rms after calibration.

  4. The naive CD4+ count in HIV-1-infected patients at time of initiation of highly active antiretroviral therapy is strongly associated with the level of immunological recovery

    DEFF Research Database (Denmark)

    Michael, OG; Kirk, O; Mathiesen, Lars Reinhardt

    2002-01-01

    Current antiretroviral therapy can induce considerable, sustained viral suppression followed by immunological recovery, in which naive CD4 + cells are important. Long-term immunological recovery was investigated during the first 3 y of highly active antiretroviral therapy (HAART) in 210 HIV-1...... was sustained. There was no association between plasma viral load and the increase in naive CD4 + cell count. Importantly, baseline naive CD4 + cell count was significantly associated with the change in naive CD4 + cell count, suggesting that the naive cell count at baseline does influence the immunological...

  5. High resolution micro-CT of low attenuating organic materials using large area photon-counting detector

    Science.gov (United States)

    Kumpová, I.; Vavřík, D.; Fíla, T.; Koudelka, P.; Jandejsek, I.; Jakůbek, J.; Kytýř, D.; Zlámal, P.; Vopálenský, M.; Gantar, A.

    2016-02-01

    To overcome certain limitations of contemporary materials used for bone tissue engineering, such as inflammatory response after implantation, a whole new class of materials based on polysaccharide compounds is being developed. Here, nanoparticulate bioactive glass reinforced gellan-gum (GG-BAG) has recently been proposed for the production of bone scaffolds. This material offers promising biocompatibility properties, including bioactivity and biodegradability, with the possibility of producing scaffolds with directly controlled microgeometry. However, to utilize such a scaffold with application-optimized properties, large sets of complex numerical simulations using the real microgeometry of the material have to be carried out during the development process. Because GG-BAG is a material with intrinsically very low attenuation to X-rays, its radiographical imaging, including tomographical scanning and reconstruction, at the resolution required by numerical simulations can be a very challenging task. In this paper, we present a study on X-ray imaging of GG-BAG samples. High-resolution volumetric images of the investigated specimens were generated on the basis of micro-CT measurements using a large-area flat-panel detector and a large-area photon-counting detector. The photon-counting detector was composed of a 10×1 matrix of Timepix edgeless silicon pixelated detectors with tiling based on overlaying rows (i.e. assembled so that no gap is present between individual rows of detectors). We compare the results from both detectors with scanning electron microscopy on selected slices in the transversal plane. It has been shown that the photon-counting detector can provide approx. 3× better resolution of details in low-attenuating materials than the integrating flat-panel detectors.
We demonstrate that employing a large area photon-counting detector is a good choice for imaging low-attenuating materials at a resolution sufficient for numerical simulations.

  6. Multiplicity Counting

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-01

This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: the effective 240Pu (240Pu-eff) mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); an introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
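
The singles, doubles, and triples mentioned above are, at their core, the first three reduced factorial moments of the measured multiplicity distribution. A minimal sketch on a hypothetical distribution, deliberately omitting the gate-fraction, efficiency, and deadtime corrections of the point model:

```python
# Sketch: singles/doubles/triples as reduced factorial moments of a
# measured multiplicity distribution P(n). Point-model corrections
# (gate fractions, efficiency, deadtime) are omitted for clarity.

def factorial_moments(P):
    """First three reduced factorial moments of P(n)."""
    m1 = sum(n * p for n, p in enumerate(P))                          # singles
    m2 = sum(n * (n - 1) / 2 * p for n, p in enumerate(P))            # doubles
    m3 = sum(n * (n - 1) * (n - 2) / 6 * p for n, p in enumerate(P))  # triples
    return m1, m2, m3

# Hypothetical multiplicity distribution (probabilities sum to 1)
P = [0.55, 0.25, 0.12, 0.05, 0.03]
S, D, T = factorial_moments(P)
```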

  7. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution, depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen­ ...

  8. Reticulocyte count

    Science.gov (United States)

    ... radiation therapy, or infection) Cirrhosis of the liver Anemia caused by low iron levels, or low levels of vitamin B12 or folate Chronic kidney disease Reticulocyte count may be higher during pregnancy.

  9. Effect of electron count and chemical complexity in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor

    Science.gov (United States)

    von Rohr, Fabian; Winiarski, Michał J.; Tao, Jing; Klimczuk, Tomasz; Cava, Robert Joseph

    2016-11-01

    High-entropy alloys are made from random mixtures of principal elements on simple lattices, stabilized by a high mixing entropy. The recently discovered body-centered cubic (BCC) Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor appears to display properties of both simple crystalline intermetallics and amorphous materials; e.g., it has a well-defined superconducting transition along with an exceptional robustness against disorder. Here we show that the valence electron count dependence of the superconducting transition temperature in the high-entropy alloy falls between those of analogous simple solid solutions and amorphous materials and test the effect of alloy complexity on the superconductivity. We propose high-entropy alloys as excellent intermediate systems for studying superconductivity as it evolves between crystalline and amorphous materials.

  10. Use of domestic detergents in the California mastitis test for high somatic cell counts in milk.

    Science.gov (United States)

    Leach, K A; Green, M J; Breen, J E; Huxley, J N; Macaulay, R; Newton, H T; Bradley, A J

    2008-11-08

The California mastitis test (CMT) is used on farms to identify subclinical mastitis by an indirect estimation of the somatic cell count (SCC) in milk. Four commercially available detergents were compared with a bespoke CMT fluid for their ability to detect milk samples with an SCC above 200,000 cells/ml; differences between the interpretations of the test results by eight operators were also investigated. The sensitivity and specificity of the test were affected by the type of detergent and by the operators' interpretations. When used by the most sensitive operator, suitably diluted Fairy Liquid performed almost identically to the CMT fluid in identifying milk samples with more than 200,000 cells/ml. The average sensitivities achieved by the eight operators for detecting this threshold were 82 per cent for Fairy Liquid and 84 per cent for the CMT fluid, and the specificities were 93 and 91 per cent respectively. The other detergents contained lower levels of anionic surfactants and were less sensitive but similarly specific.

  11. Use of fractional packet counting for high dynamic range imaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Nascetti, A., E-mail: augusto.nascetti@uniroma1.it [Department of Aerospace and Astronautics Engineering, University of Rome ' La Sapienza' , Rome (Italy); Valerio, P., E-mail: valeriop@die.uniroma1.it [Department of Electronic Engineering, University of Rome ' La Sapienza' , Rome (Italy)

    2011-08-21

    An asynchronous self-reset with residue conversion scheme for the readout electronics of an image sensor, further referred to as Fractional Packet Counting (FPC), is proposed. The basic concept of the FPC is to increase the resolution of the conversion both by using a switched integrator and by quantifying its output at the end of the signal integration time. A circuit implementing this principle for CT applications is proposed and simulated. In particular, in the proposed circuit a constant relative resolution is used: this means to use floating point representation with a constant number of significant bits. Simulations show that a dynamic range of 117 dB is achieved, working at 2 kHz frequency. The detectable signal range goes from 24 fA to {approx}400nA. The simulation results have been used to develop a mathematical model for the SNR accounting the different noise sources. The model shows that the floating point representation has no visible impact on the SNR of the circuit.
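
The "constant relative resolution" idea above, a floating-point representation with a fixed number of significant bits, can be sketched in a few lines. The function name and the 10-bit mantissa are illustrative assumptions, not the circuit's actual parameters:

```python
import math

def quantize_constant_relative(x, sig_bits):
    """Quantize x keeping a fixed number of significant bits, i.e. a
    constant *relative* resolution (floating-point style), as opposed
    to the constant *absolute* step of a plain counter."""
    if x == 0:
        return 0.0
    e = math.floor(math.log2(abs(x)))     # scale (the "packet" exponent)
    step = 2.0 ** (e - sig_bits + 1)      # LSB size at this scale
    return round(x / step) * step

# The relative error stays below 2**-sig_bits across the whole range,
# which is how a modest word length can span a very wide dynamic range.
for x in (24e-15, 1e-12, 1e-9, 400e-9):
    q = quantize_constant_relative(x, 10)
    assert abs(q - x) / x < 2 ** -10
```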

  12. Microbiological quality and somatic cell count in bulk milk of dromedary camels (Camelus dromedarius): descriptive statistics, correlations, and factors of variation.

    Science.gov (United States)

    Nagy, P; Faye, B; Marko, O; Thomas, S; Wernery, U; Juhasz, J

    2013-09-01

    The objectives of the present study were to monitor the microbiological quality and somatic cell count (SCC) of bulk tank milk at the world's first large-scale camel dairy farm for a 2-yr period, to compare the results of 2 methods for the enumeration of SCC, to evaluate correlation among milk quality indicators, and to determine the effect of specific factors (year, season, stage of lactation, and level of production) on milk quality indicators. The study was conducted from January 2008 to January 2010. Total viable count (TVC), coliform count (CC), California Mastitis Test (CMT) score, and SCC were determined from daily bulk milk samples. Somatic cell count was measured by using a direct microscopic method and with an automatic cell counter. In addition, production parameters [total daily milk production (TDM, kg), number of milking camels (NMC), average milk per camel (AMC, kg)] and stage of lactation (average postpartum days, PPD) were recorded for each test day. A strong correlation (r=0.33) was found between the 2 methods for SCC enumeration; however, values derived using the microscopic method were higher. The geometric means of SCC and TVC were 394×10(3) cells/mL and 5,157 cfu/mL during the observation period, respectively. Somatic cell count was >500×10(3) cells/mL on 14.6% (106/725) and TVC was >10×10(3) cfu/mL on 4.0% (30/742) of the test days. Both milk quality indicators had a distinct seasonal pattern. For log SCC, the mean was lowest in summer and highest in autumn. The seasonal pattern of log TVC was slightly different, with the lowest values being recorded during the spring. The monthly mean TVC pattern showed a clear difference between years. Coliform count was <10 cfu/mL in most of the samples (709/742, 95.6%). A positive correlation was found between log SCC and log TVC (r=0.32), between log SCC and CMT score (r=0.26), and between log TVC and CC in yr 1 (r=0.30). All production parameters and stage of lactation showed strong seasonal
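
The geometric means quoted for SCC and TVC are the usual summary for such right-skewed count data: average in log space, then transform back. A minimal sketch with invented daily bulk-milk values:

```python
import math

# Invented daily bulk-milk somatic cell counts (cells/mL)
scc = [250e3, 400e3, 320e3, 600e3, 380e3]

# Geometric mean: mean of the logs, exponentiated -- less sensitive
# to occasional very high counts than the arithmetic mean.
gmean = math.exp(sum(math.log(x) for x in scc) / len(scc))
```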

  13. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

Full Text Available Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, which are the second cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher order statistics have been an extensive field of research as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third moment skewness and fourth moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to those emboli. Applied to real signals, the detection of microemboli through the skewness and kurtosis signals outperformed detection through standard methods. The sensitivity and specificity reached 78% and 91% for the skewness detector, and 80% and 90% for the kurtosis detector.
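
The windowed-moment idea can be sketched briefly: on embolus-free stretches the moments stay near Gaussian values (skewness ~ 0, kurtosis ~ 3), while a spike in the energy signal produces a clear kurtosis peak. All signal values here are synthetic, not Doppler data:

```python
import math
import random

def moments(window):
    """Windowed skewness (3rd moment) and kurtosis (4th moment)."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    skew = sum((x - mean) ** 3 for x in window) / (n * var ** 1.5)
    kurt = sum((x - mean) ** 4 for x in window) / (n * var ** 2)
    return skew, kurt

random.seed(0)
energy = [random.gauss(1.0, 0.05) for _ in range(200)]
energy[120] += 2.0          # synthetic "embolic" spike in the energy signal

# Slide a window over the signal; windows containing the spike show a
# strong kurtosis peak, embolus-free windows stay near kurtosis ~ 3.
W = 40
kurts = [moments(energy[i:i + W])[1] for i in range(len(energy) - W)]
peak = max(range(len(kurts)), key=lambda i: kurts[i])   # detector output
```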

  14. High Numerates Count Icons and Low Numerates Process Large Areas in Pictographs: Results of an Eye-Tracking Study.

    Science.gov (United States)

    Kreuzmair, Christina; Siegrist, Michael; Keller, Carmen

    2016-08-01

In two experiments, we used an eye-tracker to investigate how numeracy influences individuals' processing of information in pictographs. In two conditions, participants from the general population were presented with a scenario depicting the risk of having cancer and were asked to indicate their perceived risk. The risk level was high (63%) in experiment 1 (N = 70) and low (6%) in experiment 2 (N = 69). In the default condition, participants were free to use their default strategy for information processing. In the guiding-toward-the-number condition, they were prompted to count icons in the pictograph by answering with an explicit number. We used eye-tracking parameters related to the distance between sequential fixations to analyze participants' strategies for processing numerical information. In the default condition, the higher the numeracy was, the shorter the distances traversed in the pictograph were, indicating that participants counted the icons. People lower in numeracy performed more large-area processing by comparing highlighted and nonhighlighted parts of the pictograph. In the guiding-toward-the-number condition, participants used short distances regardless of their numeracy, supporting the notion that short distances represent counting. Despite the different default processing strategies, participants processed the pictograph with a similar depth and derived similar risk perceptions. The results show that pictographs are beneficial for communicating medical risk. Pictographs make the gist salient by making the part-to-whole relationship visually available, and they facilitate low numerates' non-numeric processing of numerical information. At the same time, pictographs allow high numerates to numerically process and rely on the number depicted in the pictograph.

  15. High Performance Negative Feedback Near Infrared Single Photon Counting Detectors & Arrays Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Amplification Technologies Inc ("ATI") proposes to develop the enabling material and device technology for the design of ultra low noise, high gain and high speed...

  16. Technical feasibility proof for high-resolution low-dose photon-counting CT of the breast

    Energy Technology Data Exchange (ETDEWEB)

    Kalender, Willi A.; Kolditz, Daniel; Lueck, Ferdinand [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); CT Imaging GmbH, Erlangen (Germany); Steiding, Christian [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); CT Imaging GmbH, Erlangen (Germany); University Hospital of Erlangen, Institute of Radiology, Erlangen (Germany); Ruth, Veikko; Roessler, Ann-Christin [University of Erlangen-Nuernberg, Institute of Medical Physics (IMP), Erlangen (Germany); Wenkel, Evelyn [University Hospital of Erlangen, Institute of Radiology, Erlangen (Germany)

    2017-03-15

    X-ray computed tomography (CT) has been proposed and evaluated multiple times as a potentially alternative method for breast imaging. All efforts shown so far have been criticized and partly disapproved because of their limited spatial resolution and higher patient dose when compared to mammography. Our concept for a dedicated breast CT (BCT) scanner therefore aimed at novel apparatus and detector design to provide high spatial resolution of about 100 μm and average glandular dose (AGD) levels of 5 mGy or below. Photon-counting technology was considered as a solution to reach these goals. The complete concept was previously evaluated and confirmed by simulations and basic experiments on laboratory setups. We here present measurements of dose, technical image quality parameters and surgical specimen results on such a scanner. For comparison purposes, the specimens were also imaged with digital mammography (DM) and breast tomosynthesis (BT) apparatus. Results show that photon-counting BCT (pcBCT) at 5 mGy AGD offers sufficiently high 3D spatial resolution for reliable detectability of calcifications and soft tissue delineation. (orig.)

  17. An investigation of the trade-off between the count level and image quality in myocardial perfusion SPECT using simulated images: the effects of statistical noise and object variability on defect detectability

    Energy Technology Data Exchange (ETDEWEB)

    He Xin; Links, Jonathan M; Frey, Eric C [Department of Radiology, Division of Medical Imaging Physics, Johns Hopkins School of Medicine, Baltimore MD 21287 (United States)

    2010-09-07

    Quantum noise as well as anatomic and uptake variability in patient populations limits observer performance on a defect detection task in myocardial perfusion SPECT (MPS). The goal of this study was to investigate the relative importance of these two effects by varying acquisition time, which determines the count level, and assessing the change in performance on a myocardial perfusion (MP) defect detection task using both mathematical and human observers. We generated ten sets of projections of a simulated patient population with count levels ranging from 1/128 to around 15 times a typical clinical count level to simulate different levels of quantum noise. For the simulated population we modeled variations in patient, heart and defect size, heart orientation and shape, defect location, organ uptake ratio, etc. The projection data were reconstructed using the OS-EM algorithm with no compensation or with attenuation, detector response and scatter compensation (ADS). The images were then post-filtered and reoriented to generate short-axis slices. A channelized Hotelling observer (CHO) was applied to the short-axis images, and the area under the receiver operating characteristics (ROC) curve (AUC) was computed. For each noise level and reconstruction method, we optimized the number of iterations and cutoff frequencies of the Butterworth filter to maximize the AUC. Using the images obtained with the optimal iteration and cutoff frequency and ADS compensation, we performed human observer studies for four count levels to validate the CHO results. Both CHO and human observer studies demonstrated that observer performance was dependent on the relative magnitude of the quantum noise and the patient variation. When the count level was high, the patient variation dominated, and the AUC increased very slowly with changes in the count level for the same level of anatomic variability. 
When the count level was low, however, quantum noise dominated, and changes in the count level
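
A channelized Hotelling observer of the kind used in this study reduces each image to a small vector of channel outputs and applies the Hotelling template w = S⁻¹(mean_signal − mean_noise); the AUC can then be estimated from the pairwise ordering of the test statistics (Mann-Whitney). A two-channel sketch on invented data, not the study's images:

```python
import random

random.seed(1)

def mean2(v):
    n = len(v)
    return (sum(x for x, _ in v) / n, sum(y for _, y in v) / n)

def pooled_cov2(a, b):
    """Pooled 2x2 covariance (sxx, syy, sxy) of two samples of 2-vectors."""
    def cov(v):
        m, n = mean2(v), len(v)
        sxx = sum((x - m[0]) ** 2 for x, _ in v) / n
        syy = sum((y - m[1]) ** 2 for _, y in v) / n
        sxy = sum((x - m[0]) * (y - m[1]) for x, y in v) / n
        return sxx, syy, sxy
    (axx, ayy, axy), (bxx, byy, bxy) = cov(a), cov(b)
    return (axx + bxx) / 2, (ayy + byy) / 2, (axy + bxy) / 2

# Invented 2-channel outputs for signal-present and signal-absent images
signal = [(1.0 + random.gauss(0, 0.5), 0.5 + random.gauss(0, 0.5)) for _ in range(300)]
noise = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(300)]

# Hotelling template w = S^-1 (mean_signal - mean_noise); 2x2 inverse in closed form
sxx, syy, sxy = pooled_cov2(signal, noise)
det = sxx * syy - sxy * sxy
dm = (mean2(signal)[0] - mean2(noise)[0], mean2(signal)[1] - mean2(noise)[1])
w = ((syy * dm[0] - sxy * dm[1]) / det, (sxx * dm[1] - sxy * dm[0]) / det)

# Test statistic t = w . v for each image; AUC = P(t_signal > t_noise)
t_s = [w[0] * x + w[1] * y for x, y in signal]
t_n = [w[0] * x + w[1] * y for x, y in noise]
auc = sum(1 for a in t_s for b in t_n if a > b) / (len(t_s) * len(t_n))
```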

  18. An investigation of the trade-off between the count level and image quality in myocardial perfusion SPECT using simulated images: the effects of statistical noise and object variability on defect detectability

    Science.gov (United States)

    He, Xin; Links, Jonathan M.; Frey, Eric C.

    2010-09-01

    Quantum noise as well as anatomic and uptake variability in patient populations limits observer performance on a defect detection task in myocardial perfusion SPECT (MPS). The goal of this study was to investigate the relative importance of these two effects by varying acquisition time, which determines the count level, and assessing the change in performance on a myocardial perfusion (MP) defect detection task using both mathematical and human observers. We generated ten sets of projections of a simulated patient population with count levels ranging from 1/128 to around 15 times a typical clinical count level to simulate different levels of quantum noise. For the simulated population we modeled variations in patient, heart and defect size, heart orientation and shape, defect location, organ uptake ratio, etc. The projection data were reconstructed using the OS-EM algorithm with no compensation or with attenuation, detector response and scatter compensation (ADS). The images were then post-filtered and reoriented to generate short-axis slices. A channelized Hotelling observer (CHO) was applied to the short-axis images, and the area under the receiver operating characteristics (ROC) curve (AUC) was computed. For each noise level and reconstruction method, we optimized the number of iterations and cutoff frequencies of the Butterworth filter to maximize the AUC. Using the images obtained with the optimal iteration and cutoff frequency and ADS compensation, we performed human observer studies for four count levels to validate the CHO results. Both CHO and human observer studies demonstrated that observer performance was dependent on the relative magnitude of the quantum noise and the patient variation. When the count level was high, the patient variation dominated, and the AUC increased very slowly with changes in the count level for the same level of anatomic variability. 
When the count level was low, however, quantum noise dominated, and changes in the count level

  19. Nonequilibrium quantum transport coefficients and transient dynamics of full counting statistics in the strong-coupling and non-Markovian regimes

    Science.gov (United States)

    Cerrillo, Javier; Buser, Maximilian; Brandes, Tobias

    2016-12-01

    Nonequilibrium transport properties of quantum systems have recently become experimentally accessible in a number of platforms in so-called full-counting experiments that measure transient and steady-state nonequilibrium transport dynamics. We show that the effect of the measurement back-action can be exploited to establish general relationships between transport coefficients in the transient regime which take the form of fluctuation-dissipation theorems in the steady state. This result becomes most conspicuous in the transient dynamics of open quantum systems under strong coupling to non-Markovian environments in nonequilibrium settings. In order to explore this regime, a new simulation method based on a hierarchy of equations of motion has been developed. We instantiate our proposal with the study of energetic conductance between two baths connected via a few-level system.

  20. Neutrino Oscillation Parameters After High Statistics KamLAND Results

    CERN Document Server

    Bandyopadhyay, Abhijit; Goswami, Srubabati; Petcov, S T; Roy, D P

    2008-01-01

    We perform a re-analysis to assess the impact of the results of the Borexino experiment and the recent 2.8 kTy KamLAND data on the solar neutrino oscillation parameters. The current Borexino results are found to have no impact on the allowed solar neutrino parameter space. The new KamLAND data cause a significant reduction of the allowed range of $\Delta m^2_{21}$, determining it with an unprecedented precision of 8.3% at 3$\sigma$. The precision of $\Delta m^2_{21}$ is controlled practically by the KamLAND data alone. Inclusion of the new KamLAND results also improves the upper bound on $\sin^2\theta_{12}$, but the precision of this parameter continues to be controlled by the solar data. The third mixing angle is constrained to be $\sin^2\theta_{13} < 0.063$ at $3\sigma$ from a combined fit to the solar, KamLAND, atmospheric and CHOOZ results. We also address the issue of how much further reduction of the allowed ranges of $\Delta m^2_{21}$ and $\sin^2\theta_{12}$ is possible with increased statistics from KamLAND. We ...

  1. Understanding Blood Counts

    Science.gov (United States)

    ... Blood cell counts give ... your blood that's occupied by red cells. Normal blood counts fall within a range ...

  2. White Blood Cell Count

    Science.gov (United States)

    ... Count; Leukocyte Count; White Count. Formal name: White Blood Cell Count. Related tests: Complete Blood Count, Blood Smear, ...

  3. EcoCount

    Directory of Open Access Journals (Sweden)

    Phillip P. Allen

    2014-05-01

    Full Text Available Techniques that analyze biological remains from sediment sequences for environmental reconstructions are well established and widely used. Yet, identifying, counting, and recording biological evidence such as pollen grains remain a highly skilled, demanding, and time-consuming task. Standard procedure requires the classification and recording of between 300 and 500 pollen grains from each representative sample. Recording the data from a pollen count requires significant effort and focused resources from the palynologist. However, when an adaptation to the recording procedure is utilized, efficiency and time economy improve. We describe EcoCount, which represents a development in environmental data recording procedure. EcoCount is a voice activated fully customizable digital count sheet that allows the investigator to continuously interact with a field of view during the data recording. Continuous viewing allows the palynologist the opportunity to remain engaged with the essential task, identification, for longer, making pollen counting more efficient and economical. EcoCount is a versatile software package that can be used to record a variety of environmental evidence and can be installed onto different computer platforms, making the adoption by users and laboratories simple and inexpensive. The user-friendly format of EcoCount allows any novice to be competent and functional in a very short time.

  4. Very High Gain and Low Noise Near Infrared Single Photon Counting Detectors and Arrays Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Amplification Technologies Inc ("ATI") proposes to develop the enabling material and device technology for the design of ultra low noise, high gain and low...

  5. High Performance Negative Feedback Near Infrared Single Photon Counting Detectors & Arrays Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Amplification Technologies Inc ("ATI") proposes to develop the enabling material and device technology for the design of ultra low noise, high gain and...

  6. Counting Populations

    Science.gov (United States)

    Damonte, Kathleen

    2004-01-01

    Scientists use sampling to get an estimate of things they cannot easily count. A population is made up of all the organisms of one species living together in one place at the same time. All of the people living together in one town are considered a population. All of the grasshoppers living in a field are a population. Scientists keep track of the…

  7. Characterization of a photon counting EMCCD for space-based high contrast imaging spectroscopy of extrasolar planets

    CERN Document Server

    Wilkins, Ashlee N; Norton, Timothy J; Rauscher, Bernard J; Rothe, Johannes F; Malatesta, Michael; Hilton, George M; Bubeck, James R; Grady, Carol A; Lindler, Don J

    2014-01-01

    We present the progress of characterization of a low-noise, photon counting Electron Multiplying Charge-Coupled Device (EMCCD) operating in optical wavelengths and demonstrate possible solutions to the problems of Clock-Induced Charge (CIC) and other trapped charge through sub-bandgap illumination. Such a detector will be vital to the feasibility of future space-based direct imaging and spectroscopy missions for exoplanet characterization, and is scheduled to fly on-board the AFTA-WFIRST mission. The 512$\times$512 EMCCD is an e2v detector housed and clocked by a Nüvü Cameras controller. Through a multiplication gain register, this detector produces as many as 5000 electrons for a single, incident-photon-induced photoelectron produced in the detector, enabling single photon counting operation with read noise and dark current orders of magnitude below those of standard CCDs. With the extremely high contrasts (Earth-to-Sun flux ratio is $\sim$ 10$^{-10}$) and extremely faint targets (an Earth analog would m...

  8. A High School Statistics Class Investigates the Death Penalty

    Science.gov (United States)

    Brelias, Anastasia

    2015-01-01

    Recommendations for reforming high school mathematics curricula emphasize the importance of engaging students in mathematical investigations of societal issues (CCSSI [Common Core State Standards Initiative] 2010; NCTM [National Council of Teachers of Mathematics] 2000). Proponents argue that these investigations can positively influence students'…

  9. Time and position resolution of high granularity, high counting rate MRPC for the inner zone of the CBM-TOF wall

    CERN Document Server

    Petriş, M; Caragheorgheopol, G.; Deppner, I.; Frühauf, J.; Herrmann, N.; Kiš, M.; Loizeau, P-A.; Petrovici, M.; Rǎdulescu, L.; Simion, V.; Simon, C.

    2016-01-01

    Multi-gap RPC prototypes with readout on a multi-strip electrode were developed for the small polar angle region of the CBM-TOF subdetector, the most demanding zone in terms of granularity and counting rate. The prototypes are based on low resistivity ($\sim$10$^{10}$ $\Omega$cm) glass electrodes for operation in a high counting rate environment. The strip width/pitch size was chosen so as to fulfill the impedance matching with the front-end electronics and the granularity requirements of the innermost zone of the CBM-TOF wall. The in-beam tests using secondary particles produced in heavy ion collisions on a Pb target at SIS18 - GSI Darmstadt and SPS - CERN were focused on the performance of the prototype in conditions similar to the ones expected at SIS100/FAIR. An efficiency larger than 98\% and a system time resolution on the order of 70-80 ps were obtained in a high counting rate and high multiplicity environment.

  10. Time and position resolution of high granularity, high counting rate MRPC for the inner zone of the CBM-TOF wall

    Science.gov (United States)

    Petriş, M.; Bartoş, D.; Caragheorgheopol, G.; Deppner, I.; Frühauf, J.; Herrmann, N.; Kiš, M.; Loizeau, P.-A.; Petrovici, M.; Rădulescu, L.; Simion, V.; Simon, C.

    2016-09-01

    Multi-gap RPC prototypes with a multi-strip-electrode readout were developed for the small polar angle region of the CBM-TOF subdetector, the most demanding zone in terms of granularity and counting rate. The prototypes are based on low resistivity (~10^10 Ω·cm) glass electrodes for operation in a high counting rate environment. The strip width/pitch size was chosen so as to fulfill the impedance matching with the front-end electronics and the granularity requirements of the innermost zone of the CBM-TOF wall. The in-beam tests using secondary particles produced in heavy ion collisions on a Pb target at SIS18—GSI Darmstadt and SPS—CERN were focused on the performance of the prototypes in conditions similar to the ones expected at SIS100/FAIR. An efficiency larger than 98% and a system time resolution on the order of 70-80 ps were obtained in a high counting rate and high multiplicity environment.

  11. Statistical Machine Learning for Structured and High Dimensional Data

    Science.gov (United States)

    2014-09-17

    estimation robustness, we exploit nonparametric rank-based correlation coefficient estimators, including Spearman's rho and Kendall's tau. In high...

  12. High current CD4+ T cell count predicts suboptimal adherence to antiretroviral therapy

    NARCIS (Netherlands)

    Pasternak, A.O.; de Bruin, M.; Bakker, M.; Berkhout, B.; Prins, J.M.

    2015-01-01

    High levels of adherence to antiretroviral therapy (ART) are necessary for achieving and maintaining optimal virological suppression, as suboptimal adherence leads to therapy failure and disease progression. It is well known that adherence to ART predicts therapy response, but it is unclear whether

  13. Multi-wire proportional chambers with a central hole and high counting-rate capability

    NARCIS (Netherlands)

    Volkerts, M; Bakker, A; Kalantar-Nayestanaki, N; Fraiquin, H; Eads, A; Rinckel, T; Solberg, K

    1999-01-01

    A set of two multi-wire proportional chambers with a central hole has been designed and built in a KVI-IUCF collaboration. These chambers, used for detecting charged particles with atomic masses up to A = 4 and energies up to 200 MeV, are highly efficient with efficiencies of 97-98% per plane at a c

  14. Competencies That Count: Strategies for Assessing High-Performance Skills. LAB Working Paper No. 2.

    Science.gov (United States)

    Allen, Lili

    This guide provides a "road map" to the various ways that schools and employers assess high-performance competencies, such as problem solving, information management, and communication and negotiation skills. The guide begins with a brief analysis of why it is important to assess these skills in light of the current standards environment…

  15. A high detection probability method for Gm-APD photon counting laser radar

    Science.gov (United States)

    Zhang, Zi-jing; Zhao, Yuan; Zhang, Yong; Wu, Long; Su, Jian-zhong

    2013-08-01

    Since the Geiger-mode Avalanche Photodiode (GmAPD) was applied in laser radar systems, system performance has been enhanced thanks to the ultra-high sensitivity of the GmAPD, which can respond to even a single photon. However, background noise makes the ultra-sensitive GmAPD produce false alarms, which severely impacts the detection performance of GmAPD-based laser radar and has become an urgent problem to be solved. To address this problem, a few-times-accumulated two-GmAPD strategy is proposed in this paper. Finally, an experimental measurement was made under the background noise of a sunny day. The results show that the few-times-accumulated two-GmAPD strategy can improve the detection probability, reduce the false alarm probability, and obtain a clear 3D image of the target.
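
The benefit of requiring a coincidence between two GmAPDs and then accumulating over a few gates follows from elementary probability; the per-gate firing probabilities below are invented for illustration:

```python
# Hypothetical per-gate firing probabilities (illustrative values only)
p_signal = 0.6   # a gate with a real return fires one GmAPD
p_noise = 0.05   # background alone fires one GmAPD

# Requiring both GmAPDs to fire in the same gate suppresses false
# alarms quadratically, at the cost of some detection probability:
pd_coinc = p_signal ** 2     # 0.36
pfa_coinc = p_noise ** 2     # 0.0025

# Accumulating N gates and declaring a detection on >= 1 coincidence
# restores the detection probability while false alarms stay low:
N = 10
pd_acc = 1 - (1 - pd_coinc) ** N
pfa_acc = 1 - (1 - pfa_coinc) ** N
```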

  16. Counting highly cited papers for university research assessment: conceptual and technical issues.

    Science.gov (United States)

    Rodríguez-Navarro, Alonso

    2012-01-01

A Kuhnian approach to research assessment requires us to consider that the important scientific breakthroughs that drive scientific progress are infrequent and that the progress of science does not depend on normal research. Consequently, indicators of research performance based on the total number of papers do not accurately measure scientific progress. Similarly, those universities with the best reputations in terms of scientific progress differ widely from other universities in terms of the scale of investments made in research and in the higher concentrations of outstanding scientists present, but less so in terms of the total number of papers or citations. This study argues that indicators for the 1% high-citation tail of the citation distribution reveal the contribution of universities to the progress of science and provide quantifiable justification for the large investments in research made by elite research universities. In this tail, which follows a power law, the number of the less frequent, highly cited important breakthroughs can be predicted from the frequencies of papers in the upper part of the tail. This study quantifies the false impression of excellence produced by multinational papers, and by other types of papers that do not contribute to the progress of science. Many of these papers are concentrated in and dominate lists of highly cited papers, especially in lower-ranked universities. The h-index obscures the differences between higher- and lower-ranked universities because the proportion of h-core papers in the 1% high-citation tail is not proportional to the value of the h-index.
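The extrapolation idea — predicting the count of rare, very highly cited papers from the upper tail — can be sketched under a pure power-law assumption; all thresholds and counts below are hypothetical:

```python
from math import log

def powerlaw_extrapolate(c1, n1, c2, n2, c_target):
    """Fit N(>=c) = A * c**(-gamma) through two observed tail points and
    extrapolate the expected paper count above a higher citation threshold."""
    gamma = log(n1 / n2) / log(c2 / c1)
    A = n1 * c1 ** gamma
    return A * c_target ** (-gamma)

# Hypothetical tail: 400 papers with >=100 citations, 100 with >=200 citations
n_pred = powerlaw_extrapolate(100, 400, 200, 100, 800)
print(n_pred)  # expected number of papers with >=800 citations
```

Here the two observed points imply gamma = 2, so the model predicts 6.25 papers above the 800-citation threshold; the point is only that tail frequencies at moderate thresholds fix the extrapolation.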

  17. Statistical validation of high-dimensional models of growing networks

    CERN Document Server

    Medo, Matus

    2013-01-01

    The abundance of models of complex networks and the current insufficient validation standards make it difficult to judge which models are strongly supported by data and which are not. We focus here on likelihood maximization methods for models of growing networks with many parameters and compare their performance on artificial and real datasets. While high dimensionality of the parameter space harms the performance of direct likelihood maximization on artificial data, this can be improved by introducing a suitable penalization term. Likelihood maximization on real data shows that the presented approach is able to discriminate among available network models. To make large-scale datasets accessible to this kind of analysis, we propose a subset sampling technique and show that it yields substantial model evidence in a fraction of time necessary for the analysis of the complete data.

  18. High-Voltage Clock Driver for Photon-Counting CCD Characterization

    Science.gov (United States)

    Baker, Robert

    2013-01-01

A document discusses the CCD97 from e2v technologies as it is being evaluated at Goddard Space Flight Center's Detector Characterization Laboratory (DCL) for possible use in ultra-low background noise space astronomy applications, such as the Terrestrial Planet Finder Coronagraph (TPF-C). The CCD97 includes a photon-counting mode where the equivalent output noise is less than one electron. Use of this mode requires a clock signal at a voltage level greater than the level achievable by the existing CCD (charge-coupled-device) electronics. A high-voltage waveform generator has been developed in code 660/601 to support the CCD97 evaluation. The unit generates the required clock waveforms at voltage levels from -20 to +50 V. It handles standard and arbitrary waveforms and supports pixel rates from 50 to 500 kHz. The system is designed to interface with existing Leach CCD electronics.

  19. Resistance to penicillin of Staphylococcus aureus isolates from cows with high somatic cell counts in organic and conventional dairy herds in Denmark

    Directory of Open Access Journals (Sweden)

    Vaarst Mette

    2006-11-01

Background: Quarter milk samples from cows with high risk of intramammary infection were examined to determine the prevalence of Staphylococcus aureus (SA) and penicillin-resistant SA (SAr) in conventional and organic dairy herds and herds converting to organic farming in a combined longitudinal and cross-sectional study. Methods: 20 conventional herds, 18 organic herds that converted before 1995, and 19 herds converting to organic farming in 1999 or 2000 were included in the study. Herds converting to organic farming were sampled three times one year apart; the other herds were sampled once. Risk of infection was estimated based on somatic cell count, milk production, breed, age and lactation stage. Results: The high-risk cows represented about 49% of the cows in the herds. The overall prevalence of SA and SAr among these cows was 29% (95% confidence interval: 24%–34%) and 4% (95% confidence interval: 2%–5%), respectively. The prevalence of penicillin resistance among SA-infected cows was 12% (95% confidence interval: 6%–19%) when calculated from the first herd visits. No statistically significant differences were observed in the prevalence of SAr or the proportion of isolates resistant to penicillin between herd groups. Conclusion: The proportion of isolates resistant to penicillin was low compared to studies in other countries except Norway and Sweden. Based on the low prevalence of penicillin resistance of SA, penicillin should still be the first choice of antimicrobial agent for treatment of bovine intramammary infection in Denmark.
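The reported interval for the 29% prevalence is consistent with a normal-approximation binomial confidence interval; a sketch with a hypothetical effective sample size of 316 high-risk cows (the exact denominator is not restated in the abstract):

```python
from math import sqrt

def wald_ci(p_hat, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    half = z * sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

# Hypothetical sample size chosen so the interval matches the reported one
lo, hi = wald_ci(0.29, 316)
print(f"{lo:.2f}-{hi:.2f}")  # 0.24-0.34, matching the reported 24%-34%
```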

  20. [Sperm count and seminal biochemistry of high altitude inhabitants and patients with chronic altitude sickness].

    Science.gov (United States)

    García-Hjarles, M A

    1989-04-01

Semen analysis was performed in 9 healthy adult males from sea level (150 m), aged 19-32 years, and in 15 healthy males native to high altitude (NA), 9 from Cerro de Pasco (4,300 m) and 6 from Morococha (4,540 m), aged 19-45 years. Five patients with chronic mountain sickness (MMC), aged 23 to 52 years, were also studied. Volume and motility were similar in NA and MMC; both were below the values of sea-level subjects, but still in the normal range. The number of spermatozoa per ml was lower at sea level than in NA and MMC, although the total number was higher at sea level due to the higher semen volume. Fructose at sea level was 356 +/- 53 mg/100 ml (mean +/- S.E.), similar to NA (237 +/- 45), whereas in MMC it was significantly lower (142 +/- 60). Citric acid was lower at sea level than in NA and MMC. Na, K and Cl were similar among the three groups. The lower concentration of fructose in MMC parallels the decreased testicular function already found in this group. However, it is worth pointing out that fertility is preserved in all the groups. The normal reproductive function in MMC argues against the concept that this process occurs as a consequence of environmental disadaptation.

  1. Intraoperative detection of ¹⁸F-FDG-avid tissue sites using the increased probe counting efficiency of the K-alpha probe design and variance-based statistical analysis with the three-sigma criteria.

    Science.gov (United States)

    Povoski, Stephen P; Chapman, Gregg J; Murrey, Douglas A; Lee, Robert; Martin, Edward W; Hall, Nathan C

    2013-03-04

Intraoperative detection of (18)F-FDG-avid tissue sites during (18)F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of (18)F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Of 58 patients undergoing (18)F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine (18)F-FDG-avid tissue sites (from among seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging; each underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (the K-alpha probe and two commercially available PET-probe systems) and was then surgically excised. The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2-15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0-2.1) and 1.0 (± 0, range 1.0-1.0), respectively, for the two commercially available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of (18)F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, with the three-sigma method being significantly better than the ratiometric method for determining probe positivity for the K-alpha probe (P = 0.05). Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical
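The two probe-positivity rules compared above can be sketched numerically. A minimal sketch assuming Poisson counting statistics, hypothetical count values, and a hypothetical ratiometric cutoff of 1.5 (the study does not restate its exact cutoff):

```python
from math import sqrt

def three_sigma_positive(target_counts, bkg_counts):
    """Three-sigma criterion: positive if the target count exceeds the mean
    background by 3 Poisson standard deviations (sigma = sqrt(mean))."""
    mean_bkg = sum(bkg_counts) / len(bkg_counts)
    return target_counts > mean_bkg + 3 * sqrt(mean_bkg)

def ratiometric_positive(target_counts, bkg_counts, threshold=1.5):
    """Fixed target-to-background ratio criterion (cutoff is an assumption)."""
    mean_bkg = sum(bkg_counts) / len(bkg_counts)
    return target_counts / mean_bkg >= threshold

bkg = [100, 110, 90, 105, 95]            # hypothetical background counts
print(three_sigma_positive(131, bkg))    # 131 > 100 + 3*10 -> True
print(ratiometric_positive(131, bkg))    # T/B = 1.31 < 1.5 -> False
```

The same low-contrast site is called positive by the statistical threshold but missed by the fixed-ratio rule, illustrating why the two methods can disagree.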

  2. A high-resolution imaging technique using a whole-body, research photon counting detector CT system

    Science.gov (United States)

    Leng, S.; Yu, Z.; Halaweish, A.; Kappler, S.; Hahn, K.; Henning, A.; Li, Z.; Lane, J.; Levin, D. L.; Jorgensen, S.; Ritman, E.; McCollough, C.

    2016-03-01

    A high-resolution (HR) data collection mode has been introduced to a whole-body, research photon-counting-detector CT system installed in our laboratory. In this mode, 64 rows of 0.45 mm x 0.45 mm detector pixels were used, which corresponded to a pixel size of 0.25 mm x 0.25 mm at the iso-center. Spatial resolution of this HR mode was quantified by measuring the MTF from a scan of a 50 micron wire phantom. An anthropomorphic lung phantom, cadaveric swine lung, temporal bone and heart specimens were scanned using the HR mode, and image quality was subjectively assessed by two experienced radiologists. High spatial resolution of the HR mode was evidenced by the MTF measurement, with 15 lp/cm and 20 lp/cm at 10% and 2% modulation. Images from anthropomorphic phantom and cadaveric specimens showed clear delineation of small structures, such as lung vessels, lung nodules, temporal bone structures, and coronary arteries. Temporal bone images showed critical anatomy (i.e. stapes superstructure) that was clearly visible in the PCD system. These results demonstrated the potential application of this imaging mode in lung, temporal bone, and vascular imaging. Other clinical applications that require high spatial resolution, such as musculoskeletal imaging, may also benefit from this high resolution mode.
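The quoted pixel sizes imply the scanner's geometric magnification, and the iso-center sampling pitch sets the Nyquist limit on resolvable frequency. A back-of-the-envelope check (interpreting the 0.45 mm/0.25 mm ratio as geometric magnification is our assumption):

```python
det_pixel_mm = 0.45    # detector pixel size
iso_pixel_mm = 0.25    # effective pixel size at iso-center

# Ratio of source-detector to source-iso distance implied by the two pitches
magnification = det_pixel_mm / iso_pixel_mm
print(magnification)   # 1.8

# Nyquist frequency of the iso-center sampling, in line pairs per cm
nyquist_lp_per_cm = 10.0 / (2 * iso_pixel_mm)   # 10 mm per cm
print(nyquist_lp_per_cm)  # 20.0, consistent with ~2% MTF quoted at 20 lp/cm
```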

  3. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  4. Identification and characterization of elevated microbial counts in bulk tank raw milk.

    Science.gov (United States)

    Hayes, M C; Ralyea, R D; Murphy, S C; Carey, N R; Scarlett, J M; Boor, K J

    2001-01-01

    The bacterial composition of bulk tank milk from 13 farms was examined over a 2-wk period to characterize sudden elevations in the total bacterial count referred to as "spikes." Bulk tank milk samples collected at each pick-up were analyzed for standard plate count, Petrifilm aerobic count, somatic cell count, gram-negative organisms, and streptococci. Twenty standard plate count spikes were observed: 12 associated with streptococci, 4 associated with gram-negative organisms, 2 associated with streptococci and gram-negative organisms, and 2 that were not definitively characterized. Spikes ranged from 14,000 to 600,000 cfu/ml. Streptococcus uberis was isolated as the predominant organism from 11 spikes, and Escherichia coli was isolated from 4 spikes. Statistical analysis of total bacterial counts indicated a high correlation (r = 0.94) between standard plate counts and Petrifilm aerobic count. Regression analysis of standard plate counts and Petrifilm aerobic counts yielded the equation log10 (standard plate count) = 0.73 + 0.85log10 (Petrifilm aerobic count), indicating that the correlation, although strong, is not one to one. In a related pilot study, triplicate bulk tank milk samples were collected and analyzed for total bacterial count and presumptive streptococcus, gram-negative, and staphylococcus counts. Two-way ANOVA of these triplicate data indicated a lack of significant variation among the triplicate samples, suggesting that one sample can reliably gauge the microbial status of the entire bulk tank.
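The reported regression can be applied directly to convert a Petrifilm aerobic count into a predicted standard plate count; the input count below is hypothetical:

```python
from math import log10

def predicted_spc(pac):
    """Predict the standard plate count (cfu/ml) from the Petrifilm aerobic
    count using the reported regression:
    log10(SPC) = 0.73 + 0.85 * log10(PAC)."""
    return 10 ** (0.73 + 0.85 * log10(pac))

# A hypothetical Petrifilm count of 10,000 cfu/ml
print(predicted_spc(10_000))  # ~13,490 cfu/ml
```

The slope of 0.85 (rather than 1.0) is what the abstract means by "strong but not one to one": on the log scale, differences between the two methods grow with the count.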

  5. Behavioral and cellular consequences of high-electrode count Utah Arrays chronically implanted in rat sciatic nerve

    Science.gov (United States)

    Wark, H. A. C.; Mathews, K. S.; Normann, R. A.; Fernandez, E.

    2014-08-01

Objective. Before peripheral nerve electrodes can be used for the restoration of sensory and motor functions in patients with neurological disorders, the behavioral and histological consequences of these devices must be investigated. These indices of biocompatibility can be defined in terms of desired functional outcomes; for example, a device may be considered for use as a therapeutic intervention if the implanted subject retains functional neurons post-implantation even in the presence of a foreign body response. The consequences of an indwelling device may remain localized to cellular responses at the device-tissue interface, such as fibrotic encapsulation of the device, or they may affect the animal more globally, such as impacting behavioral or sensorimotor functions. The objective of this study was to investigate the overall consequences of implantation of high-electrode count intrafascicular peripheral nerve arrays, High Density Utah Slanted Electrode Arrays (HD-USEAs; 25 electrodes mm-2). Approach. HD-USEAs were implanted in rat sciatic nerves for one and two month periods. We monitored wheel running, noxious sensory paw withdrawal reflexes, footprints, nerve morphology and macrophage presence at the tissue-device interface. In addition, we used a novel approach to contain the arrays in actively behaving animals that consisted of an organic nerve wrap. A total of 500 electrodes were implanted across all ten animals. Main results. The results demonstrated that chronic implantation (⩽8 weeks) of HD-USEAs into peripheral nerves can evoke behavioral deficits that recover over time. Morphology of the nerve distal to the implantation site showed variable signs of nerve fiber degeneration and regeneration. Cytology adjacent to the device-tissue interface also showed a variable response, with some electrodes having many macrophages surrounding the electrodes, while other electrodes had few or no macrophages present. This variability was also seen along the length

  6. Counting Possibilia

    Directory of Open Access Journals (Sweden)

    Alfredo Tomasetta

    2010-06-01

Timothy Williamson supports the thesis that every possible entity necessarily exists, and so he needs to explain how a possible son of Wittgenstein's, for example, exists in our world: he exists as a merely possible object (MPO), a pure locus of potential. Williamson presents a short argument for the existence of MPOs: how many knives can be made by fitting together two blades and two handles? Four: at most two are concrete objects, the others being merely possible knives and merely possible objects. This paper defends the idea that one can avoid reference and ontological commitment to MPOs. My proposal is that MPOs can be dispensed with by using the notion of rules of knife-making. I first present a solution according to which we count lists of instructions - selected by the rules - describing physical combinations between components. This account, however, has its own difficulties, and I eventually suggest that one can find a way out by admitting possible worlds, entities which are more commonly accepted - at least by philosophers - than MPOs. I maintain that, in answering Williamson's questions, we count classes of physically possible worlds in which the same instance of a general rule is applied.

  7. Statistical analysis for count data: applications to the use of healthcare services

    Directory of Open Access Journals (Sweden)

    Aarón Salinas-Rodríguez

    2009-10-01

OBJECTIVE: To describe some of the statistical models for the study of count variables in the context of the use of health services. MATERIAL AND METHODS: Based on the Seguro Popular Evaluation Survey (2005-2006), we estimated the effect of Seguro Popular on the frequency of use of outpatient health services, using Poisson, negative binomial, zero-inflated negative binomial and hurdle negative binomial regression models. We used the Akaike Information Criterion (AIC) to define the best model. RESULTS: The best statistical approach for modelling the use of health services was the hurdle model, taking into account both its theoretical assumptions and the AIC value. DISCUSSION: The modelling of count data requires models that account for dispersion; in the presence of an excess of zeros, the hurdle model is an appropriate option.
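The AIC-based model choice described above reduces to comparing 2k - 2 ln L across fitted candidates and keeping the minimum. A minimal sketch with hypothetical log-likelihoods and parameter counts (not the survey's actual values):

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: 2k - 2 ln L, lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fitted log-likelihoods (and parameter counts) for four models
fits = {
    "Poisson":           (-5210.4, 5),
    "negative binomial": (-4980.7, 6),
    "zero-inflated NB":  (-4975.2, 8),
    "hurdle NB":         (-4962.9, 8),
}
best = min(fits, key=lambda m: aic(*fits[m]))
print(best)  # hurdle NB: its fit gain outweighs the extra parameters
```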

  8. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random-effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. It is therefore of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions, which are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results: The instruments under study
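The logarithmic score mentioned above evaluates a model's leave-one-out predictive distribution at the held-out observation. A minimal sketch using a Poisson predictive distribution and hypothetical rates (the trial itself compares richer GLMM predictives):

```python
from math import exp, log, factorial

def log_score(y_obs, rate):
    """Negative log predictive density of a Poisson(rate) distribution at
    the held-out count y_obs; lower scores indicate better prediction."""
    p = exp(-rate) * rate ** y_obs / factorial(y_obs)
    return -log(p)

# Two candidate predictive rates for a held-out count of 3 vertigo attacks
print(log_score(3, 3.0) < log_score(3, 8.0))  # True: the closer rate scores better
```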

  9. Risk of discontinuation of nevirapine due to toxicities in antiretroviral-naive and -experienced HIV-infected patients with high and low CD4+ T-cell counts

    DEFF Research Database (Denmark)

    Mocroft, Amanda; Staszewski, Schlomo; Weber, Rainer;

    2007-01-01

It is unknown whether the increased risk of toxicities in antiretroviral-naive HIV-infected patients initiating nevirapine-based (NVPc) combination antiretroviral therapy (cART) with high CD4+ T-cell counts is also observed when NVPc is initiated in cART-experienced patients.

  10. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number
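The automatic differentiation underlying such frameworks can be illustrated with forward-mode dual numbers, which propagate a value and its derivative together through arithmetic. This is a toy sketch only; ADMB itself uses a far more sophisticated (reverse-mode) implementation suited to many parameters:

```python
class Dual:
    """Minimal forward-mode automatic-differentiation value: carries f(x)
    and f'(x) simultaneously through + and *."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

x = Dual(3.0, 1.0)        # seed dx/dx = 1
y = x * x + 2 * x + 1     # f(x) = x^2 + 2x + 1
print(y.val, y.der)       # 16.0 8.0, since f(3) = 16 and f'(3) = 2*3 + 2
```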

  11. Economic consequences of mastitis and withdrawal of milk with high somatic cell count in Swedish dairy herds.

    Science.gov (United States)

    Nielsen, C; Ostergaard, S; Emanuelson, U; Andersson, H; Berglund, B; Strandberg, E

    2010-10-01

    The main aim was to assess the impact of mastitis on technical and economic results of a dairy herd under current Swedish farming conditions. The second aim was to investigate the effects obtained by withdrawing milk with high somatic cell count (SCC). A dynamic and stochastic simulation model, SimHerd, was used to study the effects of mastitis in a herd with 150 cows. Results given the initial incidence of mastitis (32 and 33 clinical and subclinical cases per 100 cow-years, respectively) were studied, together with the consequences of reducing or increasing the incidence of mastitis by 50%, modelling no clinical mastitis (CM) while keeping the incidence of subclinical mastitis (SCM) constant and vice versa. Six different strategies to withdraw milk with high SCC were compared. The decision to withdraw milk was based on herd-level information in three scenarios: withdrawal was initiated when the predicted bulk tank SCC exceeded 220 000, 200 000 or 180 000 cells/ml, and on cow-level information in three scenarios: withdrawal was initiated when the predicted SCC in an individual cow's milk exceeded 1 000 000, 750 000 or 500 000 cells/ml. The accuracy with which SCC was measured and predicted was assumed to affect the profitability of withdrawing milk with high SCC and this was investigated by applying high, low or no uncertainty to true SCC. The yearly avoidable cost of mastitis was estimated at €8235, assuming that the initial incidence of mastitis could be reduced by 50%. This cost corresponded to 5% of the herd net return given the initial incidence of mastitis. Expressed per cow-year, the avoidable cost of mastitis was €55. The costs per case of CM and SCM were estimated at €278 and €60, respectively. Withdrawing milk with high SCC was never profitable because this generated a substantial amount of milk withdrawal that was not offset by a sufficient increase in the average price per delivered kg milk. It had the most negative impact on net return when

  12. Microbiological screening test validation for detection of tylosin excretion in milk of cows with low and high somatic cell counts.

    Science.gov (United States)

    Litterio, N J; Calvinho, L F; Flores, M M; Tarabla, H D; Boggio, J C

    2007-02-01

Antibiotic residues in milk above tolerance levels interfere with dairy product processing and pose potential health risks to consumers. Residue avoidance programmes include, among other components, the observance of withdrawal times indicated in label instructions. Persistence of antibiotics in milk following treatment is influenced by drug, dosage, route of administration, body weight and mammary gland health status. Compositional changes that take place during intramammary infection (IMI) can affect antibiotic excretion in milk, thus modifying milk withdrawal time. The objectives of this study were to validate the sensitivity and specificity of a qualitative microbiological method (Charm AIM-96) to detect tylosin in bovine composite milk and to determine the influence of subclinical IMI on tylosin excretion following intramuscular administration. For test validation, two groups of approximately 120 cows were used; one received a single intramuscular injection of tylosin tartrate at a dose of 20 mg/kg, while the other group remained as untreated control. Test sensitivity and specificity were 100% and 94.1%, respectively. To determine the influence of subclinical IMI on tylosin excretion, two groups of seven cows, one with low somatic cell counts (SCC) and one with SCC ≥900 000, were each administered a single intramuscular injection of tylosin tartrate at a dose of 20 mg/kg. Milk samples were obtained every 12 h for 10 days following treatment. Tylosin excretion in milk averaged 5 and 9 days for cows with low and high SCC, respectively; subclinical IMI thus extended the presence of the antibiotic in milk, influencing milk withdrawal times.
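Sensitivity and specificity are simple ratios of the confusion-matrix counts; the counts below are hypothetical but chosen to be consistent with the reported 100% and 94.1%:

```python
def sensitivity(tp, fn):
    """True-positive rate: treated cows correctly flagged by the test."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: untreated cows correctly cleared by the test."""
    return tn / (tn + fp)

# Hypothetical confusion counts (the abstract gives only the percentages)
print(f"{sensitivity(120, 0):.1%}")   # 100.0%
print(f"{specificity(112, 7):.1%}")   # 94.1%
```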

  13. Alternative Optimizations of X-ray TES Arrays: Soft X-rays, High Count Rates, and Mixed-Pixel Arrays

    Science.gov (United States)

    Kilbourne, C. A.; Bandler, S. R.; Brown, A.-D.; Chervenak, J. A.; Figueroa-Feliciano, E.; Finkbeiner, F. M.; Iyomoto, N.; Kelley, R. L.; Porter, F. S.; Smith, S. J.

    2007-01-01

    We are developing arrays of superconducting transition-edge sensors (TES) for imaging spectroscopy telescopes such as the XMS on Constellation-X. While our primary focus has been on arrays that meet the XMS requirements (of which, foremost, is an energy resolution of 2.5 eV at 6 keV and a bandpass from approx. 0.3 keV to 12 keV), we have also investigated other optimizations that might be used to extend the XMS capabilities. In one of these optimizations, improved resolution below 1 keV is achieved by reducing the heat capacity. Such pixels can be based on our XMS-style TES's with the separate absorbers omitted. These pixels can added to an array with broadband response either as a separate array or interspersed, depending on other factors that include telescope design and science requirements. In one version of this approach, we have designed and fabricated a composite array of low-energy and broad-band pixels to provide high spectral resolving power over a broader energy bandpass than could be obtained with a single TES design. The array consists of alternating pixels with and without overhanging absorbers. To explore optimizations for higher count rates, we are also optimizing the design and operating temperature of pixels that are coupled to a solid substrate. We will present the performance of these variations and discuss other optimizations that could be used to enhance the XMS or enable other astrophysics experiments.

  14. Direct calibration of click-counting detectors

    Science.gov (United States)

    Bohmann, M.; Kruse, R.; Sperling, J.; Silberhorn, C.; Vogel, W.

    2017-03-01

    We introduce and experimentally implement a method for the detector calibration of photon-number-resolving time-bin multiplexing layouts based on the measured click statistics of superconducting nanowire detectors. In particular, the quantum efficiencies, the dark count rates, and the positive operator-valued measures of these measurement schemes are directly obtained with high accuracy. The method is based on the moments of the click-counting statistics for coherent states with different coherent amplitudes. The strength of our analysis is that we can directly conclude—on a quantitative basis—that the detection strategy under study is well described by a linear response function for the light-matter interaction and that it is sensitive to the polarization of the incident light field. Moreover, our method is further extended to a two-mode detection scenario. Finally, we present possible applications for such well-characterized detectors, such as sensing of atmospheric loss channels and phase sensitive measurements.
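The moment-based calibration can be illustrated for an idealized N-bin time-multiplexed detector: for a coherent probe of mean photon number mu, the mean click number is N(1 - e^(-eta*mu/N)) when dark counts are neglected, so the efficiency eta can be recovered by inverting this first moment. A toy sketch (the experiment's actual analysis also accounts for dark counts and reconstructs full POVMs):

```python
from math import exp, log

def mean_clicks(eta, mu, n_bins):
    """Expected click number for a coherent state of mean photon number mu
    on an n_bins time-multiplexed click detector (dark counts neglected)."""
    return n_bins * (1 - exp(-eta * mu / n_bins))

def estimate_eta(mean_c, mu, n_bins):
    """Invert the first moment of the click statistics to recover eta."""
    return -n_bins / mu * log(1 - mean_c / n_bins)

# Hypothetical efficiency, probe amplitude, and bin count
true_eta, mu, N = 0.55, 4.0, 8
c_bar = mean_clicks(true_eta, mu, N)
print(estimate_eta(c_bar, mu, N))  # recovers 0.55
```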

  15. High Goblet Cell Count Is Inversely Associated with Ploidy Abnormalities and Risk of Adenocarcinoma in Barrett's Esophagus.

    Directory of Open Access Journals (Sweden)

    Amitabh Srivastava

Goblet cells may represent a potentially successful adaptive response to acid and bile by producing a thick mucous barrier that protects against cancer development in Barrett's esophagus (BE). The aim of this study was to determine the relationship between goblet cells (GC) and risk of progression to adenocarcinoma, and DNA content flow cytometric abnormalities, in BE patients. Baseline mucosal biopsies (N=2988) from 213 patients, 32 of whom developed cancer during the follow-up period, enrolled in a prospective dynamic cohort of BE patients, were scored in a blinded fashion for the total number (#) of GC, mean # of GC/crypt (GC density), # of crypts with ≥1 GC, and the proportion of crypts with ≥1 GC, in both dysplastic and non-dysplastic epithelium separately. The relationship between these four GC parameters and DNA content flow cytometric abnormalities and adenocarcinoma outcome was compared, after adjustment for age, gender, and BE segment length. High GC parameters were inversely associated with DNA content flow cytometric abnormalities, such as aneuploidy, ploidy >2.7N, and an elevated 4N fraction >6%, and with risk of adenocarcinoma. However, a Kaplan-Meier analysis showed that the total # of GC and the total # of crypts with ≥1 GC were the only significant GC parameters (p<0.001 and 0.003, respectively). The results of this study show, for the first time, an inverse relationship between high GC counts and flow cytometric abnormalities and risk of adenocarcinoma in BE. Further studies are needed to determine whether GC-depleted foci within esophageal columnar mucosa are more prone to neoplastic progression or whether loss of GC occurs secondary to underlying genetic abnormalities.

  16. A high resolution, high frame rate detector based on a microchannel plate read out with the Medipix2 counting CMOS pixel chip.

    CERN Document Server

    Mikulec, Bettina; McPhate, J B; Tremsin, A S; Siegmund, O H W; Clark, Allan G; CERN. Geneva

    2005-01-01

The future of ground-based optical astronomy lies with advancements in adaptive optics (AO) to overcome the limitations that the atmosphere places on high-resolution imaging. A key technology for AO systems on future very large telescopes is the wavefront sensor (WFS), which detects the optical phase error and sends corrections to deformable mirrors. Telescopes with >30 m diameters will require WFS detectors that have large pixel formats (512x512), low noise (<3 e-/pixel) and very high frame rates (~1 kHz). These requirements have led to the idea of a bare CMOS active pixel device (the Medipix2 chip) functioning in counting mode as an anode with noiseless readout for a microchannel plate (MCP) detector at a 1 kHz continuous frame rate. First measurement results obtained with this novel detector are presented both for UV photons and beta particles.

  17. Quantum optical signatures in strong-field laser physics: Infrared photon counting in high-order-harmonic generation.

    Science.gov (United States)

    Gonoskov, I A; Tsatrafyllis, N; Kominis, I K; Tzallas, P

    2016-09-07

    We analytically describe the strong-field light-electron interaction using a quantized coherent laser state with arbitrary photon number. We obtain a light-electron wave function which is a closed-form solution of the time-dependent Schrödinger equation (TDSE). This wave function provides information about the quantum optical features of the interaction not accessible by semiclassical theories. With this approach we can reveal the quantum optical properties of the high-order-harmonic generation (HHG) process in gases by measuring the photon statistics of the transmitted infrared (IR) laser radiation. This work can lead to novel experiments in high-resolution spectroscopy in the extreme ultraviolet (XUV) and attosecond science without the need to measure the XUV light, and it can pave the way for the development of intense non-classical light sources.

  18. Pyogenic arthritis, pyoderma gangrenosum, and acne (PAPA) syndrome: differential diagnosis of septic arthritis by regular detection of exceedingly high synovial cell counts.

    Science.gov (United States)

    Löffler, W; Lohse, P; Weihmayr, T; Widenmayer, W

    2017-03-01

    Pyogenic arthritis, pyoderma gangrenosum and acne syndrome was diagnosed in a 42-year-old patient, after an unusual persistency of high synovial cell counts had been noticed. Clinical peculiarities and problems with diagnosing septic versus non-septic arthritis are discussed.

  19. High plasma fibrinogen concentration and platelet count unfavorably impact survival in non-small cell lung cancer patients with brain metastases.

    Science.gov (United States)

    Zhu, Jian-Fei; Cai, Ling; Zhang, Xue-Wen; Wen, Yin-Sheng; Su, Xiao-Dong; Rong, Tie-Hua; Zhang, Lan-Jun

    2014-02-01

    High expression of fibrinogen and platelets is often observed in non-small cell lung cancer (NSCLC) patients with local regional or distant metastasis. However, the role of these factors remains unclear. The aims of this study were to evaluate the prognostic significance of plasma fibrinogen concentration and platelet count, as well as to determine the overall survival of NSCLC patients with brain metastases. A total of 275 NSCLC patients with brain metastasis were enrolled into this study. Univariate analysis showed that high plasma fibrinogen concentration was associated with age ≥65 years (P = 0.011), smoking status (P = 0.009), intracranial symptoms (P = 0.022), clinical T category (P = 0.010), clinical N category (P = 0.003), increased partial thromboplastin time (P < 0.001), and platelet count (P < 0.001). Patients with low plasma fibrinogen concentration demonstrated longer overall survival compared with those with high plasma fibrinogen concentration (median, 17.3 months versus 11.1 months; P ≤ 0.001). A similar result was observed for platelet counts (median, 16.3 months versus 11.4 months; P = 0.004). Multivariate analysis showed that both plasma fibrinogen concentration and platelet count were independent prognostic factors for NSCLC with brain metastases (R2 = 1.698, P < 0.001 and R2 = 1.699, P < 0.001, respectively). Our results suggest that high plasma fibrinogen concentration and platelet count indicate poor prognosis for NSCLC patients with brain metastases. Thus, these two biomarkers might be independent prognostic predictors for this subgroup of NSCLC patients.
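
The median survival comparison above rests on Kaplan-Meier estimates. As a reminder of how the product-limit estimator itself works, here is a minimal pure-Python sketch; the times and event flags are invented for illustration, since the study's individual patient data are not available in the abstract.

```python
# Minimal Kaplan-Meier (product-limit) estimator, pure Python.
# Times and event flags below are INVENTED for illustration only.

def kaplan_meier(times, events):
    """Return [(t, S(t))] survival steps for right-censored data.
    events[i] is True if a death was observed at times[i], False if censored."""
    data = sorted(zip(times, events))
    at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e)
        n_t = at_risk                      # number at risk just before t
        while i < len(data) and data[i][0] == t:
            at_risk -= 1                   # deaths and censorings leave the risk set
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_t     # product-limit update
            curve.append((t, surv))
    return curve

# Toy cohort: survival times in months; True = death observed
curve = kaplan_meier([5, 8, 11, 11, 14, 20],
                     [True, True, True, False, True, False])
```

The median survival is read off as the first time S(t) drops to 0.5 or below (month 11 in this toy cohort); the study compares such medians between the high- and low-fibrinogen groups.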

  20. Role of quenching on alpha/beta separation in liquid scintillation counting for several high capacity cocktails

    Energy Technology Data Exchange (ETDEWEB)

    Pujol, L.; Sanchez-Cabeza, J.-A. [Universidad Autonoma de Barcelona (Spain). Facultad de Ciencias

    1997-04-01

    The optimization of alpha/beta separation in liquid scintillation counting using pulse shape analysis is convenient for the simultaneous determination of alpha and beta emitters in natural water and other samples. In this work, alpha/beta separation was studied for different scintillant/vial combinations and it was observed that both the optimum pulse shape discrimination level and the total interference value (that is, the summed relative interference between the alpha and beta spectra) were dependent on sample quenching and independent of the scintillant/vial combination. These results provide a simple method for modifying the counting configuration, such as a change in the cocktail, vial or sample characteristics, without the need to perform exhaustive parameter optimizations. Also, it was observed that, for our counting conditions, the combination of Ultima Gold AB scintillation cocktail with Zinsser low diffusion vials presented the lowest total interference, namely 0.94 ± 0.28%, which is insignificant for the counting of environmental samples. (Author).

  1. A user configurable data acquisition and signal processing system for high-rate, high channel count applications

    Energy Technology Data Exchange (ETDEWEB)

    Salim, Arwa, E-mail: arwa.salim@eee.strath.ac.uk [University of Strathclyde, Scotland (United Kingdom); Crockett, Louise [University of Strathclyde, Scotland (United Kingdom); McLean, John; Milne, Peter [D-TACQ Solutions, Scotland (United Kingdom)

    2012-12-15

    Highlights: • The development of a new digital signal processing platform is described. • The system will allow users to configure the real-time signal processing through software routines. • The architecture of the DRUID system and signal processing elements is described. • A prototype of the DRUID system has been developed for the digital chopper-integrator. • The results of acquisition on 96 channels at 500 kSamples/s per channel are presented. - Abstract: Real-time signal processing in plasma fusion experiments is required for control and for data reduction as plasma pulse times grow longer. The development time and cost for these high-rate, multichannel signal processing systems can be significant. This paper proposes a new digital signal processing (DSP) platform for the data acquisition system that will allow users to easily customize real-time signal processing systems to meet their individual requirements. The D-TACQ reconfigurable user in-line DSP (DRUID) system carries out the signal processing tasks in hardware co-processors (CPs) implemented in an FPGA, with an embedded microprocessor (µP) for control. In the fully developed platform, users will be able to choose co-processors from a library and configure programmable parameters through the µP to meet their requirements. The DRUID system is implemented on a Spartan 6 FPGA, on the new rear transition module (RTM-T), a field upgrade to existing D-TACQ digitizers. As proof of concept, a multiply-accumulate (MAC) co-processor has been developed, which can be configured as a digital chopper-integrator for long pulse magnetic fusion devices. The DRUID platform allows users to set options for the integrator, such as the number of masking samples. Results from the digital integrator are presented for a data acquisition system with 96 channels simultaneously acquiring data at 500 kSamples/s per channel.
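
The abstract does not spell out the integrator's algorithm. A plausible minimal sketch of a digital integrator with a configurable number of masking samples might look like the following; the masking semantics (skip the first n_mask samples, e.g. to reject switching transients) are an assumption, not taken from the paper.

```python
# Hypothetical sketch of a digital integrator with configurable masking,
# loosely modeled on the DRUID chopper-integrator option described above.
# The masking behaviour is an ASSUMPTION for illustration.

def masked_integrate(samples, n_mask=4, dt=2e-6):
    """Running integral of a sampled signal, ignoring the first n_mask samples.
    dt = 2 µs corresponds to the 500 kSamples/s rate quoted in the abstract."""
    total = 0.0
    out = []
    for i, s in enumerate(samples):
        if i >= n_mask:
            total += s * dt        # rectangle-rule integration
        out.append(total)
    return out

# Constant 1.0 V input for 10 samples, first 2 masked:
trace = masked_integrate([1.0] * 10, n_mask=2)
```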

  2. High-voltage integrated active quenching circuit for single photon count rate up to 80 Mcounts/s.

    Science.gov (United States)

    Acconcia, Giulia; Rech, Ivan; Gulinatti, Angelo; Ghioni, Massimo

    2016-08-01

    Single photon avalanche diodes (SPADs) have undergone fast improvement in recent years. In particular, custom technologies specifically developed to fabricate SPAD devices give the designer the freedom to pursue the best detector performance required by applications. A significant breakthrough in this field is represented by the recent introduction of a red-enhanced SPAD (RE-SPAD) technology, capable of attaining a good photon detection efficiency in the near-infrared range (e.g. 40% at a wavelength of 800 nm) while maintaining a remarkable timing resolution of about 100 ps full width at half maximum. Being planar, the RE-SPAD custom technology opened the way to the development of SPAD arrays particularly suited for demanding applications in the field of life sciences. However, to achieve such excellent performance custom SPAD detectors must be operated with an external active quenching circuit (AQC) designed for the purpose. Next steps toward the development of compact and practical multichannel systems will require a new generation of monolithically integrated AQC arrays. In this paper we present a new, fully integrated AQC fabricated in a high-voltage 0.18 µm CMOS technology, able to provide quenching pulses up to 50 V with fast leading and trailing edges. Although specifically designed for optimal operation of RE-SPAD devices, the new AQC is quite versatile: it can be used with any SPAD detector, regardless of its fabrication technology, reaching remarkable count rates up to 80 Mcounts/s and generating a photon detection pulse with a timing jitter as low as 119 ps full width at half maximum. The compact design of our circuit has been specifically laid out to make this IC a suitable building block for monolithically integrated AQC arrays.

  3. Low Counts of Plasmacytoid Dendritic Cells after Engraftment Are Associated with High Early Mortality after Allogeneic Stem Cell Transplantation.

    Science.gov (United States)

    Gonçalves, Matheus Vescovi; Yamamoto, Mihoko; Kimura, Eliza Yurico Sugano; Colturato, Vergílio Antônio Rensi; de Souza, Mair Pedro; Mauad, Marcos; Ikoma, Maura Valerio; Novis, Yana; Rocha, Vanderson; Ginani, Valeria Cortez; Wanderley de Oliveira Felix, Olga Margareth; Seber, Adriana; Kerbauy, Fabio Rodrigues; Hamerschlak, Nelson; Orfao, Alberto; Rodrigues, Celso Arrais

    2015-07-01

    Dendritic cells (DCs) are antigen-presenting cells that drive immune responses and tolerance and are divided into different subsets: myeloid DCs (mDCs: lineage-, HLA-DR+, CD11c+), plasmacytoid dendritic cells (pDCs: HLA-DR+, CD123+), and monocyte-derived DCs (moDCs: lineage-, CD11c+, CD16+). After hematopoietic stem cell transplantation (HSCT), low DC counts in the recipients' peripheral blood (PB) have been associated with worse outcomes, but the relevance of DC graft content remains unclear, and there are few data in the setting of unrelated donor HSCT. We evaluated the DC graft content and monitored DC recovery in PB from 111 HSCT recipients (median age, 17 years; range, 1 to 74), who received bone marrow (46%), umbilical cord blood (32%), or PB (22%) from unrelated (81%) or related donors (19%). In 86 patients with sustained allogeneic recovery, patients with higher counts of all DC subsets (pDC, mDC, and moDC) 3 weeks after engraftment had a lower incidence of nonrelapse mortality (NRM) and acute graft-versus-host disease (aGVHD) and better survival. pDC counts were associated with more striking results: patients with higher pDC counts had much lower incidence of NRM (3% versus 47%, P < .0001), lower incidence of aGVHD (24% versus 67%, P < .0001), and better overall survival (92% versus 45%, P < .0001). In contrast, higher pDC counts in the graft were associated with an increased risk of aGVHD (55% versus 26%, P = .02). Our results indicate that DC counts are closely correlated with HSCT outcomes and warrant further prospective evaluation and possible early therapeutic interventions to ameliorate severe aGVHD and decrease mortality.

  4. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images.

    Science.gov (United States)

    Hu, Qin; Victor, Jonathan D

    2016-09-01

    Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study, largely because they are numerous and varied. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a 1-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank.
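
As a toy illustration of the quantities involved (not the paper's method or data), one can filter an image with Cartesian products of 1-D Hermite functions, a simple stand-in for TDH filters, and summarize the coefficient distribution by its skewness and excess kurtosis. For a Gaussian white-noise image both statistics should be near zero, whereas natural images give systematically nonzero values.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

# Toy illustration only: separable Hermite-function filters stand in for
# the paper's two-dimensional Hermite filters, and the "image" is
# Gaussian white noise rather than a natural image.

def hermite_fn(n, x):
    """Sampled 1-D Hermite function H_n(x) exp(-x^2/2), unit-normalized."""
    h = np.polynomial.hermite.hermval(x, [0] * n + [1]) * np.exp(-x**2 / 2)
    return h / np.sqrt(np.sum(h**2))

def tdh_filter(n, m, size=9):
    x = np.linspace(-3, 3, size)
    return np.outer(hermite_fn(n, x), hermite_fn(m, x))

def shape_stats(image, filt):
    """Skewness and excess kurtosis of filter coefficients over all patches."""
    patches = sliding_window_view(image, filt.shape)
    coeffs = np.tensordot(patches, filt, axes=([2, 3], [0, 1])).ravel()
    z = (coeffs - coeffs.mean()) / coeffs.std()
    return float(np.mean(z**3)), float(np.mean(z**4) - 3.0)

rng = np.random.default_rng(0)
img = rng.standard_normal((128, 128))
skew, kurt = shape_stats(img, tdh_filter(2, 1))   # both near 0 for noise
```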

  5. Counting statistics: a Feynman-Kac perspective.

    Science.gov (United States)

    Zoia, A; Dumonteil, E; Mazzolo, A

    2012-01-01

    By building upon a Feynman-Kac formalism, we assess the distribution of the number of collisions in a given region for a broad class of discrete-time random walks in absorbing and nonabsorbing media. We derive the evolution equation for the generating function of the number of collisions, and we complete our analysis by examining the moments of the distribution and their relation to the walker equilibrium density. Some significant applications are discussed in detail: in particular, we revisit the gambler's ruin problem and generalize to random walks with absorption the arcsine law for the number of collisions on the half-line.
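
The collision-number distribution studied here is easy to probe by Monte Carlo. The following sketch (all parameters invented for illustration) counts the collisions a discrete-time walker makes in a region before being absorbed:

```python
import random

# Monte Carlo probe of the quantity analyzed above: the number of
# collisions a discrete-time random walker undergoes inside a region V
# before being absorbed. Parameters are illustrative.

def collisions_in_region(p_absorb=0.1, region=range(0, 5), max_steps=10_000):
    """One walker starting at x=0; at each collision it is absorbed with
    probability p_absorb, otherwise it scatters one unit left or right.
    Returns the number of collisions that occurred inside `region`."""
    x, count = 0, 0
    for _ in range(max_steps):
        if x in region:
            count += 1
        if random.random() < p_absorb:    # absorption terminates the walk
            return count
        x += random.choice((-1, 1))       # isotropic unit scattering
    return count

random.seed(1)
counts = [collisions_in_region() for _ in range(2000)]
mean_in_region = sum(counts) / len(counts)
# The mean TOTAL collision number is 1/p_absorb = 10; the in-region mean
# is necessarily smaller, since excursions can leave the region.
```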

  6. Teaching Emotionally Disturbed Students to Count Feelings.

    Science.gov (United States)

    Bartels, Cynthia S.; Calkin, Abigail B.

    The paper describes a program to teach high school students with emotional and behavior problems to count their feelings, thereby improving their self-concept. To aid in instruction, a hierarchy was developed which involved four phases: counting tasks completed and tasks not completed, counting independent actions in class, counting perceptions of…

  7. High plasma fibrinogen concentration and platelet count unfavorably impact survival in non-small cell lung cancer patients with brain metastases

    Institute of Scientific and Technical Information of China (English)

    Jian-Fei Zhu; Ling Cai; Xue-Wen Zhang; Yin-Sheng Wen; Xiao-Dong Su; Tie-Hua Rong; Lan-Jun Zhang

    2014-01-01

    High expression of fibrinogen and platelets is often observed in non-small cell lung cancer (NSCLC) patients with local regional or distant metastasis. However, the role of these factors remains unclear. The aims of this study were to evaluate the prognostic significance of plasma fibrinogen concentration and platelet count, as well as to determine the overall survival of NSCLC patients with brain metastases. A total of 275 NSCLC patients with brain metastasis were enrolled into this study. Univariate analysis showed that high plasma fibrinogen concentration was associated with age ≥65 years (P = 0.011), smoking status (P = 0.009), intracranial symptoms (P = 0.022), clinical T category (P = 0.010), clinical N category (P = 0.003), increased partial thromboplastin time (P < 0.001), and platelet count (P < 0.001). Patients with low plasma fibrinogen concentration demonstrated longer overall survival compared with those with high plasma fibrinogen concentration (median, 17.3 months versus 11.1 months; P ≤ 0.001). A similar result was observed for platelet counts (median, 16.3 months versus 11.4 months; P = 0.004). Multivariate analysis showed that both plasma fibrinogen concentration and platelet count were independent prognostic factors for NSCLC with brain metastases (R2 = 1.698, P < 0.001 and R2 = 1.699, P < 0.001, respectively). Our results suggest that high plasma fibrinogen concentration and platelet count indicate poor prognosis for NSCLC patients with brain metastases. Thus, these two biomarkers might be independent prognostic predictors for this subgroup of NSCLC patients.

  8. Every photon counts: improving low, mid, and high-spatial frequency errors on astronomical optics and materials with MRF

    Science.gov (United States)

    Maloney, Chris; Lormeau, Jean Pierre; Dumas, Paul

    2016-07-01

    Many astronomical sensing applications operate in low-light conditions; for these applications every photon counts. Controlling mid-spatial frequencies and surface roughness on astronomical optics is critical for mitigating scattering effects such as flare and energy loss. By improving these two frequency regimes, higher contrast images can be collected with improved efficiency. Classically, Magnetorheological Finishing (MRF) has offered an optical fabrication technique to correct low-order errors as well as quilting/print-through errors left over in light-weighted optics from conventional polishing techniques. MRF is a deterministic, sub-aperture polishing process that has been used to improve figure on an ever-expanding assortment of optical geometries, such as planos, spheres, on- and off-axis aspheres, primary mirrors and freeform optics. Precision optics are routinely manufactured by this technology with sizes ranging from 5-2,000 mm in diameter. MRF can be used for form corrections, turning a sphere into an asphere or freeform, but more commonly for figure corrections, achieving figure errors as low as 1 nm RMS while using careful metrology setups. Recent advancements in MRF technology have improved the polishing performance expected for astronomical optics in the low, mid and high spatial frequency regimes. Deterministic figure correction with MRF is compatible with most materials, including some recent examples on Silicon Carbide and RSA905 Aluminum. MRF also has the ability to produce 'perfectly-bad' compensating surfaces, which may be used to compensate for measured or modeled optical deformation from sources such as gravity or mounting. In addition, recent advances in MRF technology allow for corrections of mid-spatial wavelengths as small as 1 mm simultaneously with form error correction. Efficient mid-spatial frequency corrections make use of optimized process conditions including raster polishing in combination with a small tool size. 
Furthermore, a novel MRF

  9. Fast global convergence of gradient methods for high-dimensional statistical recovery

    CERN Document Server

    Agarwal, Alekh; Wainwright, Martin J

    2011-01-01

    Many statistical M-estimators are based on convex optimization problems formed by the combination of a data-dependent loss function with a norm-based regularizer. We analyze the convergence rates of projected gradient methods for solving such problems, working within a high-dimensional framework that allows the data dimension d to grow with (and possibly exceed) the sample size n. This high-dimensional structure precludes the usual global assumptions, namely strong convexity and smoothness conditions, that underlie much of classical optimization analysis. We define appropriately restricted versions of these conditions, and show that they are satisfied with high probability for various statistical models. Under these conditions, our theory guarantees that projected gradient descent has a globally geometric rate of convergence up to the statistical precision of the model, meaning the typical distance between the true unknown parameter $\theta^*$ and an optimal solution $\hat{\theta}$. This result is s...
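
A minimal sketch of the projected gradient iteration the abstract analyzes, here for noiseless least squares over the nonnegative orthant; the problem, dimensions and data are illustrative, not from the paper.

```python
import numpy as np

# Projected gradient descent for a constrained least-squares M-estimator:
#   minimize 0.5 * ||y - X w||^2  subject to  w >= 0.
# Toy data, NOT from the paper.

def projected_gradient(X, y, step, n_iter=200):
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)               # gradient of the loss
        w = np.maximum(w - step * grad, 0.0)   # Euclidean projection onto w >= 0
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
w_true = np.array([1.0, 0.5, 0.0, 2.0, 0.0])   # feasible (nonnegative) truth
y = X @ w_true                                  # noiseless observations
step = 1.0 / np.linalg.norm(X, ord=2) ** 2      # 1/L, L = smoothness constant
w_hat = projected_gradient(X, y, step)
```

In this well-conditioned toy problem the iterates contract geometrically toward w_true, mirroring the paper's "globally geometric rate up to statistical precision" (here the statistical precision is zero because the data are noiseless).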

  10. Pulse shape discrimination of Cs2LiYCl6:Ce3+ detectors at high count rate based on triangular and trapezoidal filters

    Science.gov (United States)

    Wen, Xianfei; Enqvist, Andreas

    2017-09-01

    Cs2LiYCl6:Ce3+ (CLYC) detectors have demonstrated the capability to simultaneously detect γ-rays and thermal and fast neutrons with medium energy resolution, reasonable detection efficiency, and substantially high pulse shape discrimination performance. A disadvantage of CLYC detectors is the long scintillation decay times, which causes pulse pile-up at moderate input count rate. Pulse processing algorithms were developed based on triangular and trapezoidal filters to discriminate between neutrons and γ-rays at high count rate. The algorithms were first tested using low-rate data. They exhibit a pulse-shape discrimination performance comparable to that of the charge comparison method, at low rate. Then, they were evaluated at high count rate. Neutrons and γ-rays were adequately identified with high throughput at rates of up to 375 kcps. The algorithm developed using the triangular filter exhibits discrimination capability marginally higher than that of the trapezoidal filter based algorithm irrespective of low or high rate. The algorithms exhibit low computational complexity and are executable on an FPGA in real-time. They are also suitable for application to other radiation detectors whose pulses are piled-up at high rate owing to long scintillation decay times.
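
The paper's pile-up-tolerant algorithms are not reproduced here, but the shaping filters they are named after are standard: convolving two boxcar (moving-average) windows yields a trapezoidal impulse response, or a triangular one when the flat top is zero. A sketch on a synthetic scintillation-like pulse:

```python
import numpy as np

# Standard trapezoidal/triangular shaping as a boxcar-boxcar convolution.
# The pulse model and parameters are illustrative, not CLYC calibration data.

def trapezoidal_filter(signal, rise, flat):
    """Shape `signal` with a trapezoid of given rise time and flat top (samples)."""
    b1 = np.ones(rise) / rise
    b2 = np.ones(rise + flat) / (rise + flat)
    return np.convolve(np.convolve(signal, b1), b2)

# Synthetic scintillation-like pulse: fast rise, slow exponential decay
t = np.arange(256)
pulse = np.exp(-t / 40.0) - np.exp(-t / 2.0)

trap = trapezoidal_filter(pulse, rise=8, flat=16)   # trapezoidal shaping
tri = trapezoidal_filter(pulse, rise=8, flat=0)     # triangular shaping
```

Pulse-shape discrimination schemes then compare amplitudes extracted with different shaping parameters, since neutron and γ-ray pulses in CLYC carry different mixtures of scintillation decay times.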

  11. Statistical behavior of high doses in medical radiodiagnosis; Comportamento estatistico das altas doses em radiodiagnostico medico

    Energy Technology Data Exchange (ETDEWEB)

    Barboza, Adriana Elisa, E-mail: adrianaebarboza@gmail.com, E-mail: elisa@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    This work's main purpose is to statistically characterize occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, doses of 372 occupationally exposed individuals (IOEs) in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose database of the Dose Management Sector of IRD/CNEN-RJ, Brazil. Identifying these states allows the responsible Sanitary Surveillance agency (VISA) to become aware of such events and to work on programs to reduce them. (author)

  12. Research on the statistics index system and evaluation method of high and new technology industry development

    Institute of Scientific and Technical Information of China (English)

    李晓东; 张霞

    2004-01-01

    During the evolution of the global economy, high and new technology industry has become the main driver impelling global economic growth via technological advancement, and the main means of guaranteeing the sustainable development of the global economy. In view of China's situation, this article analyzes the experiences of the OECD in high and new technology industry and gives a statistics index system, along with an evaluation method, to estimate the development of high and new technology industry.

  13. Total and differential white blood cell counts, high-sensitivity C-reactive protein, and the metabolic syndrome in non-affective psychoses.

    Science.gov (United States)

    Miller, Brian J; Mellor, Andrew; Buckley, Peter

    2013-07-01

    The metabolic syndrome is highly prevalent in patients with schizophrenia, and is associated with a state of chronic, low-grade inflammation. Schizophrenia is also associated with increased inflammation, including aberrant blood levels of pro-inflammatory cytokines and high-sensitivity C-reactive protein (hsCRP). The purpose of this study is to investigate the relationship between total and differential white blood cell (WBC) counts, hsCRP, and the metabolic syndrome in patients with schizophrenia and related non-affective psychoses. Fifty-nine inpatients and outpatients age 18-70 with non-affective psychotic disorders and 22 controls participated in this cross-sectional study. Subjects had a fasting blood draw between 8 and 9 am for glucose, lipids, total and differential WBC counts, and hsCRP. Vital signs and anthropometric measures were obtained. Patients with non-affective psychosis and the metabolic syndrome had significantly higher total WBC counts, monocytes, and hsCRP levels than patients without the metabolic syndrome (p≤0.04 for each). In binary logistic regression analyses, after controlling for potential confounding effects of age, race, sex, age at first hospitalization for psychosis, parental history of diabetes, smoking, and psychotropic medications, total WBC count, monocytes, and hsCRP were significant predictors of metabolic syndrome in patients (p≤0.04 for each). hsCRP was also a significant predictor of increased waist circumference and triglycerides in patients (p≤0.05 for each). Our findings suggest that measurement of total and differential WBC counts and hsCRP blood levels may be germane to the clinical care of patients with schizophrenia and related disorders, and support an association between inflammation and metabolic disturbance in these patients.

  14. Improvement of the Trapezoid Method Using Raw Landsat Image Digital Count Data for Soil Moisture Estimation in the Texas (USA High Plains

    Directory of Open Access Journals (Sweden)

    Sanaz Shafian

    2015-01-01

    Full Text Available Variations in soil moisture strongly affect surface energy balances, regional runoff, land erosion and vegetation productivity (i.e., potential crop yield). Hence, the estimation of soil moisture is very valuable in the social, economic, humanitarian (food security) and environmental segments of society. Extensive efforts to exploit the potential of remotely sensed observations to help quantify this complex variable are ongoing. This study aims at developing a new index, the Thermal Ground cover Moisture Index (TGMI), for estimating soil moisture content. This index is based on empirical parameterization of the relationship between raw image digital count (DC) data in the thermal infrared spectral band and ground cover (determined from raw image digital count data in the red and near-infrared spectral bands). The index uses satellite-derived information only, and the potential for its operational application is therefore great. This study was conducted in 18 commercial agricultural fields near Lubbock, TX (USA). Soil moisture was measured in these fields over two years and statistically compared to corresponding values of TGMI determined from Landsat image data. Results indicate statistically significant correlations between TGMI and field measurements of soil moisture (R2 = 0.73, RMSE = 0.05, MBE = 0.17 and AAE = 0.049), suggesting that soil moisture can be estimated using this index. It was further demonstrated that maps of TGMI developed from Landsat imagery could be constructed to show the relative spatial distribution of soil moisture across a region.

  15. Improvement of the Trapezoid Method Using Raw Landsat Image Digital Count Data for Soil Moisture Estimation in the Texas (USA) High Plains

    Science.gov (United States)

    Shafian, S.; Maas, S. J.

    2015-12-01

    Variations in soil moisture strongly affect surface energy balances, regional runoff, land erosion and vegetation productivity (i.e., potential crop yield). Hence, the estimation of soil moisture is very valuable in the social, economic, humanitarian (food security) and environmental segments of society. Extensive efforts to exploit the potential of remotely sensed observations to help quantify this complex variable are ongoing. This study aims at developing a new index, the Thermal Ground cover Moisture Index (TGMI), for estimating soil moisture content. This index is based on empirical parameterization of the relationship between raw image digital count (DC) data in the thermal infrared spectral band and ground cover (determined from raw image digital count data in the red and near-infrared spectral bands).The index uses satellite-derived information only, and the potential for its operational application is therefore great. This study was conducted in 18 commercial agricultural fields near Lubbock, TX (USA). Soil moisture was measured in these fields over two years and statistically compared to corresponding values of TGMI determined from Landsat image data. Results indicate statistically significant correlations between TGMI and field measurements of soil moisture (R2 = 0.73, RMSE = 0.05, MBE = 0.17 and AAE = 0.049), suggesting that soil moisture can be estimated using this index. It was further demonstrated that maps of TGMI developed from Landsat imagery could be constructed to show the relative spatial distribution of soil moisture across a region.

  16. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  17. Statistics-based reconstruction method with high random-error tolerance for integral imaging.

    Science.gov (United States)

    Zhang, Juan; Zhou, Liqiu; Jiao, Xiaoxue; Zhang, Lei; Song, Lipei; Zhang, Bo; Zheng, Yi; Zhang, Zan; Zhao, Xing

    2015-10-01

    A three-dimensional (3D) digital reconstruction method for integral imaging with high random-error tolerance based on statistics is proposed. By statistically analyzing the points reconstructed by triangulation from all corresponding image points in an elemental images array, 3D reconstruction with high random-error tolerance can be realized. To simulate the impacts of random errors, random offsets with different error levels are added to different numbers of elemental images in simulation and optical experiments. The results of simulation and optical experiments showed that the proposed statistics-based reconstruction method has relatively stable and better reconstruction accuracy than the conventional reconstruction method. It can be verified that the proposed method can effectively reduce the impacts of random errors on 3D reconstruction of integral imaging. This method is simple and very helpful to the development of integral imaging technology.
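
The core statistical idea can be illustrated in a few lines: triangulating one 3D point from every elemental image yields many candidate positions, and a per-coordinate median tolerates gross random errors in a few candidates where a mean would not. The data below are invented, and this is not the paper's actual algorithm.

```python
import statistics

# Toy illustration of robust, statistics-based point recovery: a median
# over candidate triangulations from many elemental images shrugs off a
# corrupted candidate. Candidate coordinates are INVENTED.

def robust_point(candidates):
    """Per-coordinate median of candidate (x, y, z) triangulations."""
    xs, ys, zs = zip(*candidates)
    return (statistics.median(xs), statistics.median(ys), statistics.median(zs))

# Eight candidates agree near the true point (1, 2, 3); one is corrupted.
cands = [(1.0, 2.0, 3.0)] * 8 + [(9.0, -5.0, 40.0)]
print(robust_point(cands))   # -> (1.0, 2.0, 3.0)
```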

  18. An ALMA survey of submillimetre galaxies in the Extended Chandra Deep Field South: High resolution 870um source counts

    CERN Document Server

    Karim, Alexander; Hodge, Jackie; Smail, Ian; Walter, Fabian; Biggs, Andy; Simpson, James; Danielson, Alice; Alexander, David; Bertoldi, Frank; Chapman, Scott; Coppin, Kristen; Dannerbauer, Helmut; Edge, Alastair; Greve, Thomas; Ivison, Rob; Knudsen, Kirsten; Menten, Karl; Schinnerer, Eva; Wardlow, Julie; Weiß, Axel; van der Werf, Paul

    2012-01-01

    We report the first counts of faint submillimetre galaxies (SMG) in the 870-um band derived from arcsecond resolution observations with the Atacama Large Millimeter Array (ALMA). We have used ALMA to map a sample of 122 870-um-selected submillimetre sources drawn from the (0.5x0.5)deg^2 LABOCA Extended Chandra Deep Field South Submillimetre Survey (LESS). These ALMA maps have an average depth of sigma(870um)~0.4mJy, some ~3x deeper than the original LABOCA survey and critically the angular resolution is more than an order of magnitude higher, FWHM of ~1.5" compared to ~19" for the LABOCA discovery map. This combination of sensitivity and resolution allows us to precisely pin-point the SMGs contributing to the submillimetre sources from the LABOCA map, free from the effects of confusion. We show that our ALMA-derived SMG counts broadly agree with the submillimetre source counts from previous, lower-resolution single-dish surveys, demonstrating that the bulk of the submillimetre sources are not caused by blendi...
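
For readers unfamiliar with number counts: the cumulative count N(>S) is simply the number of catalogued sources brighter than each flux-density threshold; a real measurement further divides by survey area and corrects for completeness and flux boosting. A toy sketch with invented fluxes:

```python
import numpy as np

# Tabulating cumulative source counts N(>S) from a flux catalogue.
# The flux list is INVENTED for illustration, not the ALMA-LESS catalogue.

def cumulative_counts(fluxes_mjy, thresholds_mjy):
    """Number of catalogued sources brighter than each flux threshold."""
    f = np.asarray(fluxes_mjy)
    return np.array([(f > s).sum() for s in thresholds_mjy])

fluxes = [1.2, 1.5, 2.0, 2.2, 3.1, 3.5, 4.2, 5.0, 6.8, 9.1]   # mJy
thresholds = [1.0, 2.0, 4.0, 8.0]                              # mJy
counts = cumulative_counts(fluxes, thresholds)                 # [10, 7, 4, 1]
```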

  19. Common statistical and research design problems in manuscripts submitted to high-impact medical journals

    Directory of Open Access Journals (Sweden)

    Harris Alex HS

    2011-08-01

    Background: To assist educators and researchers in improving the quality of medical research, we surveyed the editors and statistical reviewers of high-impact medical journals to ascertain the most frequent and critical statistical errors in submitted manuscripts. Findings: The Editors-in-Chief and statistical reviewers of the 38 medical journals with the highest impact factor in the 2007 Science Journal Citation Report and the 2007 Social Science Journal Citation Report were invited to complete an online survey about the statistical and design problems they most frequently found in manuscripts. Content analysis of the responses identified major issues. Editors and statistical reviewers (n = 25) from 20 journals responded. Respondents described problems that we classified into two broad themes: (A) statistical and sampling issues and (B) inadequate reporting clarity or completeness. Problems included in the first theme were (1) inappropriate or incomplete analysis, including violations of model assumptions and analysis errors, (2) uninformed use of propensity scores, (3) failing to account for clustering in data analysis, (4) improperly addressing missing data, and (5) power/sample size concerns. Issues subsumed under the second theme were (1) inadequate description of the methods and analysis and (2) misstatement of results, including undue emphasis on p-values and incorrect inferences and interpretations. Conclusions: The scientific quality of submitted manuscripts would increase if researchers addressed these common design, analytical, and reporting issues. Improving the application and presentation of quantitative methods in scholarly manuscripts is essential to advancing medical research.

  20. New Image Statistics for Detecting Disturbed Galaxy Morphologies at High Redshift

    CERN Document Server

    Freeman, P E; Lee, A B; Newman, J A; Conselice, C J; Koekemoer, A M; Lotz, J M; Mozena, M

    2013-01-01

    Testing theories of hierarchical structure formation requires estimating the distribution of galaxy morphologies and its change with redshift. One aspect of this investigation involves identifying galaxies with disturbed morphologies (e.g., merging galaxies). This is often done by summarizing galaxy images using, e.g., the CAS and Gini-M20 statistics of Conselice (2003) and Lotz et al. (2004), respectively, and associating particular statistic values with disturbance. We introduce three statistics that enhance detection of disturbed morphologies at high-redshift (z ~ 2): the multi-mode (M), intensity (I), and deviation (D) statistics. We show their effectiveness by training a machine-learning classifier, random forest, using 1,639 galaxies observed in the H band by the Hubble Space Telescope WFC3, galaxies that had been previously classified by eye by the CANDELS collaboration (Grogin et al. 2011, Koekemoer et al. 2011). We find that the MID statistics (and the A statistic of Conselice 2003) are the most usef...

  1. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    Science.gov (United States)

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    Visual stimulus statistics are fundamental parameters that provide a reference for studying visual coding rules. In this study, multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore neural response properties under changes in stimulus statistics. Changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish changes in high-order statistics, such as skewness and kurtosis, from the firing rate alone. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, obtained from either neuronal ON spikes or OFF spikes, exhibited significant changes when the high-order stimulus statistics were changed. These results indicate that, in the retina, the neuronal response properties may be reliable and powerful in carrying complex and subtle visual information.
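The distinction between low- and high-order stimulus statistics can be illustrated with a toy example: two synthetic stimulus ensembles matched in mean ("intensity") and variance ("contrast") but differing in skewness, so a code sensitive only to low-order moments cannot separate them. This is a sketch with invented numbers, not the study's stimuli.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two luminance sequences with matched low-order statistics
gauss = rng.normal(0.0, 1.0, n)          # skewness ~ 0
expo = rng.exponential(1.0, n) - 1.0     # mean 0, variance 1, skewness ~ 2

def skew(x):
    """Sample skewness (third standardized moment)."""
    x = x - x.mean()
    return (x**3).mean() / x.std()**3

print(gauss.mean(), expo.mean())  # both ~ 0  -> same "intensity"
print(gauss.var(), expo.var())    # both ~ 1  -> same "contrast"
print(skew(gauss), skew(expo))    # ~0 vs ~2  -> differ only at high order
```

A mean-rate readout of these two ensembles is indistinguishable; only a statistic of third order or higher separates them, which mirrors the abstract's finding that firing rate alone did not reflect skewness or kurtosis changes.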

  2. High-statistics study of K^0_S pair production in two-photon collisions

    CERN Document Server

    Uehara, S; Nakazawa, H; Adachi, I; Aihara, H; Asner, D M; Aulchenko, V; Aushev, T; Bakich, A M; Bala, A; Bhardwaj, V; Bhuyan, B; Bondar, A; Bonvicini, G; Bozek, A; Bračko, M; Chekelian, V; Chen, A; Chen, P; Cheon, B G; Chilikin, K; Chistov, R; Cho, K; Chobanova, V; Choi, S -K; Choi, Y; Cinabro, D; Dalseno, J; Dingfelder, J; Doležal, Z; Dutta, D; Eidelman, S; Epifanov, D; Farhat, H; Fast, J E; Feindt, M; Ferber, T; Frey, A; Gaur, V; Gabyshev, N; Ganguly, S; Gillard, R; Giordano, F; Goh, Y M; Golob, B; Haba, J; Hayasaka, K; Hayashii, H; Hoshi, Y; Hou, W -S; Hyun, H J; Iijima, T; Ishikawa, A; Itoh, R; Iwasaki, Y; Julius, T; Kah, D H; Kang, J H; Kato, E; Kawai, H; Kawasaki, T; Kiesling, C; Kim, D Y; Kim, H O; Kim, J B; Kim, J H; Kim, Y J; Klucar, J; Ko, B R; Kodyš, P; Korpar, S; Križan, P; Krokovny, P; Kumita, T; Kuzmin, A; Kwon, Y -J; Lee, S -H; Li, J; Li, Y; Liu, C; Liu, Z Q; Liventsev, D; Lukin, P; Matvienko, D; Miyabayashi, K; Miyata, H; Mizuk, R; Moll, A; Mori, T; Muramatsu, N; Mussa, R; Nagasaka, Y; Nakao, M; Ng, C; Nisar, N K; Nishida, S; Nitoh, O; Ogawa, S; Okuno, S; Pakhlova, G; Park, C W; Park, H; Park, H K; Pedlar, T K; Pestotnik, R; Petrič, M; Piilonen, L E; Ritter, M; Röhrken, M; Rostomyan, A; Sahoo, H; Saito, T; Sakai, Y; Sandilya, S; Santelj, L; Sanuki, T; Savinov, V; Schneider, O; Schnell, G; Schwanda, C; Seidl, R; Senyo, K; Seon, O; Shapkin, M; Shen, C P; Shibata, T -A; Shiu, J -G; Shwartz, B; Sibidanov, A; Simon, F; Sohn, Y -S; Sokolov, A; Solovieva, E; Starič, M; Steder, M; Sumihama, M; Sumiyoshi, T; Tamponi, U; Tanida, K; Tatishvili, G; Teramoto, Y; Uchida, M; Uglov, T; Unno, Y; Uno, S; Urquijo, P; Vahsen, S E; Van Hulse, C; Varner, G; Wagner, M N; Wang, C H; Wang, M -Z; Wang, P; Wang, X L; Williams, K M; Won, E; Yamashita, Y; Yashchenko, S; Yook, Y; Yuan, C Z; Yusa, Y; Zhang, C C; Zhang, Z P; Zhilich, V; Zhulanov, V; Zupanc, A

    2013-01-01

    We report a high-statistics measurement of the differential cross section of the process gamma gamma --> K^0_S K^0_S for two-photon centre-of-mass energies above 1.05 GeV. The detailed behavior of the cross section is updated and compared with QCD-based calculations.

  3. Development of High-Count, High-Density Fabric for Home Textiles

    Institute of Scientific and Technical Information of China (English)

    张慧霞; 郭杰; 李华辉

    2013-01-01

    There are technical difficulties in producing high-count, high-density home-textile fabrics from new fibers such as Lyocell and Modal. In practice, many hard-to-control quality problems in spinning, sizing and weaving must be addressed through raw-material selection, equipment improvement and process optimization. Drawing on production experience, this article proposes countermeasures that enable the smooth manufacture of these new high-count, high-density home-textile fabrics.

  4. Immune activation, CD4+ T cell counts, and viremia exhibit oscillatory patterns over time in patients with highly resistant HIV infection.

    Directory of Open Access Journals (Sweden)

    Christina M R Kitchen

    The rates of immunologic and clinical progression are lower in patients with drug-resistant HIV compared to wild-type HIV. This difference is not fully explained by viral load. It has been argued that reductions in T cell activation and/or viral fitness might result in preserved target cells and an altered relationship between the level of viremia and the rate of CD4+ T cell loss. We tested this hypothesis over time in a cohort of patients with highly resistant HIV. Fifty-four antiretroviral-treated patients with multi-drug resistant HIV and detectable plasma HIV RNA were followed longitudinally. CD4+ T cell counts and HIV RNA levels were measured every 4 weeks and T cell activation (CD38/HLA-DR) was measured every 16 weeks. We found that the levels of CD4+ T cell activation over time were a strong independent predictor of CD4+ T cell counts, while CD8+ T cell activation was more strongly associated with viremia. Using spectral analysis, we found strong evidence for oscillatory (or cyclic) behavior in CD4+ T cell counts, HIV RNA levels, and T cell activation. Each of the cell populations exhibited oscillatory behavior with similar frequencies. Collectively, these data suggest that there may be a mechanistic link between T cell activation, CD4+ T cell counts, and viremia, and lend support to the hypothesis of altered predator-prey dynamics as a possible explanation for the stability of CD4+ T cell counts in the presence of sustained multi-drug resistant viremia.
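The spectral-analysis step described in this abstract can be illustrated with a minimal periodogram sketch. All numbers are invented for illustration (a 4-weekly CD4 series with an assumed 32-week cycle plus noise); this is not the study's data or its exact method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic longitudinal series: one measurement every 4 weeks
t = np.arange(64) * 4.0                 # time in weeks
period = 32.0                           # hypothetical cycle length (weeks)
cd4 = 400 + 50 * np.sin(2 * np.pi * t / period) + rng.normal(0, 10, t.size)

# Periodogram via FFT; drop the zero-frequency (mean) term
detrended = cd4 - cd4.mean()
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(t.size, d=4.0)  # cycles per week

dominant = freqs[np.argmax(power[1:]) + 1]
print(f"dominant period = {1 / dominant:.1f} weeks")
```

The peak of the periodogram recovers the injected 32-week cycle; applying the same transform to counts, viral load, and activation series is one way to check whether they oscillate at similar frequencies.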

  5. A new model test in high energy physics in frequentist and Bayesian statistical formalisms

    CERN Document Server

    Kamenshchikov, Andrey

    2016-01-01

    Testing a new physical model against observed experimental data is a typical problem in modern high energy physics (HEP) experiments. A solution may be provided by two alternative statistical formalisms, frequentist and Bayesian, both of which are widespread in contemporary HEP searches. A characteristic experimental situation is modeled from general considerations, and both approaches are used to test a new model. The results are juxtaposed, demonstrating their consistency in this work. The effect of systematic uncertainty treatment in the statistical analysis is also considered.

  6. A new model test in high energy physics in frequentist and Bayesian statistical formalisms

    Science.gov (United States)

    Kamenshchikov, A.

    2017-01-01

    Testing a new physical model against observed experimental data is a typical problem in modern high energy physics (HEP) experiments. A solution may be provided by two alternative statistical formalisms, frequentist and Bayesian, both of which are widespread in contemporary HEP searches. A characteristic experimental situation is modeled from general considerations, and both approaches are used to test a new model. The results are juxtaposed, demonstrating their consistency in this work. The effect of systematic uncertainty treatment in the statistical analysis is also considered.

  7. Counting carbon

    DEFF Research Database (Denmark)

    Damsø, Tue; Kjær, Tyge; Christensen, Thomas Budde

    2016-01-01

    The article contains an analysis of GHG accounting methodologies applied by local governments in Denmark. Eight Danish methodologies have been identified, and a Danish best practice distinguished and assessed against the criteria for good practice in GHG accounting: relevance, comparability, transparency, completeness, consistency and accuracy. In doing so, a number of key concepts are defined and compared, and the relative relevance of the different criteria is discussed. We observe a high degree of convergence among the Danish approaches in the application of data sources, quantification approaches and scope, identifying data availability as the key barrier to improving the specificity and dynamicity of local GHG accounts, and with it the accuracy and ability to monitor changes in emissions. In furthering an international best practice, the Danish approach indicates that an adaptive approach...

  8. Electrical cell counting process characterization in a microfluidic impedance cytometer.

    Science.gov (United States)

    Hassan, Umer; Bashir, Rashid

    2014-10-01

    Particle counting in microfluidic devices based on the Coulter principle finds many applications in health and medicine. Cell enumeration using microfluidic particle counters is fast, requires small sample volumes, and is being used for disease diagnostics in humans and animals. A complete characterization of the cell counting process is critical for accurate cell counting, especially in complex systems in which samples of heterogeneous populations interact with different reagents in a microfluidic device. In this paper, we have characterized the electrical cell counting process using a microfluidic impedance cytometer. Erythrocytes were lysed on-chip from whole blood and the lysing was quenched to preserve leukocytes, which subsequently pass through a 15 μm × 15 μm measurement channel used to electrically count the cells. We show that cell counting over time is a non-homogeneous Poisson process and that the electrical cell counts over time follow a log-normal distribution, whose skewness can be attributed to diffusion of cells in the buffer that is used to meter the blood. We further found that a heterogeneous cell population (i.e. different cell types) shows different diffusion characteristics based on cell size: lymphocytes diffuse spatially more than granulocytes and monocytes. The time difference between cell occurrences follows an exponential distribution and, when plotted over time, verifies the cell diffusion characteristics. We also characterized the probability of occurrence of more than one cell at the counter within specified time intervals using Poisson counting statistics. For samples with high cell concentrations, we also derived the required sample dilution based on our particle counting characterization. Characterizing the buffer by considering size-based particle diffusion and estimating the required dilution are critical for accurate counting results.
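The coincidence calculation the abstract refers to can be sketched from textbook Poisson statistics: for arrivals at rate λ, the probability of two or more cells within a window τ is 1 − e^(−λτ)(1 + λτ), and a dilution factor can be chosen to push that probability below a target. The rate, window, and target below are hypothetical, not the paper's values.

```python
import math

def coincidence_prob(rate_hz, window_s):
    """P(>= 2 cells in a window), assuming Poisson arrivals at rate_hz."""
    mu = rate_hz * window_s
    return 1.0 - math.exp(-mu) * (1.0 + mu)

# Hypothetical numbers: 2000 cells/s through the counter, 100 us transit window
rate, window = 2000.0, 100e-6
p = coincidence_prob(rate, window)
print(f"coincidence probability: {p:.4f}")  # -> 0.0175

# Dilution: scale the rate down until P(>=2) falls below a 0.1% target
target = 1e-3
dilution = 1.0
while coincidence_prob(rate / dilution, window) > target:
    dilution *= 1.1
print(f"required dilution: ~{dilution:.1f}x")
```

For small λτ the coincidence probability behaves like (λτ)²/2, so halving the concentration cuts coincidences roughly fourfold, which is why modest dilutions suffice.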

  9. Association between Resting Heart Rate and Inflammatory Markers (White Blood Cell Count and High-Sensitivity C-Reactive Protein) in Healthy Korean People.

    Science.gov (United States)

    Park, Woo-Chul; Seo, Inho; Kim, Shin-Hye; Lee, Yong-Jae; Ahn, Song Vogue

    2017-01-01

    Inflammation is an important underlying mechanism in the pathogenesis of atherosclerosis, and an elevated resting heart rate underlies the process of atherosclerotic plaque formation. We hypothesized an association between resting heart rate and subclinical inflammation. Resting heart rate was recorded at baseline in the KoGES-ARIRANG (Korean Genome and Epidemiology Study on Atherosclerosis Risk of Rural Areas in the Korean General Population) cohort study and was then divided into quartiles. Subclinical inflammation was measured by white blood cell count and high-sensitivity C-reactive protein. We used progressively adjusted regression models with terms for muscle mass, body fat proportion, and adiponectin in the fully adjusted models. We examined inflammatory markers as both continuous and categorical variables, using the clinical cut points of the highest quartile of white blood cell count (≥7,900/mm^3) and ≥3 mg/dL for high-sensitivity C-reactive protein. Participants had a mean age of 56.3±8.1 years and a mean resting heart rate of 71.4±10.7 beats/min; 39.1% were men. In a fully adjusted model, an increased resting heart rate was significantly associated with a higher white blood cell count and higher levels of high-sensitivity C-reactive protein in both continuous and categorical analyses. An elevated resting heart rate is thus associated with a higher level of subclinical inflammation among healthy Korean people.

  10. Model Accuracy Comparison for High Resolution InSAR Coherence Statistics Over Urban Areas

    Science.gov (United States)

    Zhang, Yue; Fu, Kun; Sun, Xian; Xu, Guangluan; Wang, Hongqi

    2016-06-01

    The interferometric coherence map derived from the cross-correlation of two co-registered complex synthetic aperture radar (SAR) images reflects the properties of the imaged targets. In many applications it can act as an independent information source, or provide information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there has been much less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, the existing work on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types; yet the properties of the coherence may differ across resolutions and scene types. In this paper, we investigate coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. First, several regions are selected manually from the coherence map and labelled with their corresponding classes. We then model the statistics of the pixel coherence for each type of region with different models, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. Experiments on TanDEM-X data show that the Beta model performs better than the other distributions.
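The model-comparison step can be sketched with a minimal likelihood test: fit two candidate distributions to a coherence sample by the method of moments and compare their log-likelihoods. The data below are a synthetic stand-in (a Beta-distributed sample on (0, 1), mimicking coherence values for one land class), and only Gaussian and Beta are compared; the paper's actual fitting and accuracy criteria may differ.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for per-pixel coherence of one land class
coh = rng.beta(8.0, 3.0, size=5000)

def gauss_loglik(x):
    """Log-likelihood under a moment-matched Gaussian."""
    mu, sd = x.mean(), x.std()
    return float(np.sum(-0.5 * ((x - mu) / sd) ** 2 - np.log(sd * math.sqrt(2 * math.pi))))

def beta_loglik(x):
    """Log-likelihood under a method-of-moments Beta fit."""
    m, v = x.mean(), x.var()
    common = m * (1 - m) / v - 1
    a, b = m * common, (1 - m) * common
    log_beta_fn = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return float(np.sum((a - 1) * np.log(x) + (b - 1) * np.log(1 - x)) - x.size * log_beta_fn)

print(f"Gaussian log-likelihood: {gauss_loglik(coh):.1f}")
print(f"Beta log-likelihood: {beta_loglik(coh):.1f}")  # higher = better fit here
```

Because coherence is bounded on (0, 1) and typically skewed, a bounded family like the Beta can out-score a Gaussian of the same mean and variance, consistent with the paper's conclusion.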

  11. Exploratory Visual Analysis of Statistical Results from Microarray Experiments Comparing High and Low Grade Glioma

    Directory of Open Access Journals (Sweden)

    Jason H. Moore

    2007-01-01

    The biological interpretation of gene expression microarray results is a daunting challenge. For complex diseases such as cancer, wherein the body of published research is extensive, the incorporation of expert knowledge provides a useful analytical framework. We have previously developed the Exploratory Visual Analysis (EVA) software for exploring data analysis results in the context of annotation information about each gene, as well as biologically relevant groups of genes. We present EVA as a flexible combination of statistics and biological annotation that provides a straightforward visual interface for the interpretation of microarray analyses of gene expression in the most commonly occurring class of brain tumors, glioma. We demonstrate the utility of EVA for the biological interpretation of statistical results by analyzing publicly available gene expression profiles of two important glial tumors. The results of a statistical comparison between 21 malignant, high-grade glioblastoma multiforme (GBM) tumors and 19 indolent, low-grade pilocytic astrocytomas were analyzed using EVA. By using EVA to examine the results of a relatively simple statistical analysis, we were able to identify tumor class-specific gene expression patterns having both statistical and biological significance. Our interactive analysis highlighted the potential importance of genes involved in cell cycle progression, proliferation, signaling, adhesion, migration, motility, and structure, as well as candidate gene loci on a region of Chromosome 7 that has been implicated in glioma. Because EVA does not require statistical or computational expertise and has the flexibility to accommodate any type of statistical analysis, we anticipate EVA will prove a useful addition to the repertoire of computational methods used for microarray data analysis. EVA is available at no charge to academic users and can be found at http://www.epistasis.org.

  12. KIDS COUNT New Hampshire, 2000.

    Science.gov (United States)

    Shemitz, Elllen, Ed.

    This Kids Count report presents statewide trends in the well-being of New Hampshire's children. The statistical report is based on 22 indicators of child well-being in 5 interrelated areas: (1) children and families (including child population, births, children living with single parent, and children experiencing parental divorce); (2) economic…

  13. The Statistics of Radio Astronomical Polarimetry: Bright Sources and High Time Resolution

    CERN Document Server

    Van Straten, W

    2008-01-01

    A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal broadening phenomenon in single pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single and giant pulse polarimetry typically involves sources with large flux densities and observations with high time resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self noise is shown to fully explain the excess polarization dispersion previously noted in single pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes param...

  14. Statistical methods for integrating multiple types of high-throughput data.

    Science.gov (United States)

    Xie, Yang; Ahn, Chul

    2010-01-01

    Large-scale sequencing, copy number, mRNA, and protein data hold great promise for biomedical research, while posing great challenges to data management and data analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate why there is an urgent need to develop reliable and robust methods for integrating heterogeneous data. We then introduce and review some recently developed statistical methods for integrative analysis, for both statistical inference and classification purposes. Finally, we present some useful public-access databases and program code to facilitate integrative analysis in practice.

  15. Susceptibility to cephalosporins of bacteria causing intramammary infections in dairy cows with a high somatic cell count in Germany.

    Science.gov (United States)

    Wente, N; Zoche-Golob, V; Behr, M; Krömker, V

    2016-09-01

    The objective of this cross-sectional study was to determine the minimal inhibitory concentrations of cephalosporins of the first (cefalonium and cefapirin) and fourth generation (cefquinome) against bacteria isolated from intramammary infections in dairy cows with elevated somatic cell counts in Germany. Additionally, possible regional differences in the minimal inhibitory concentrations within Germany were evaluated. In total, 6936 quarter milk samples from cows with a somatic cell count >200,000 cells/ml were taken in 43 herds. The concentrations of the first generation cephalosporins inhibiting at least 90% of the isolates of a pathogen (MIC90) were ≥64 μg/ml against Gram-negative bacteria and enterococci, whereas the respective MIC90 against the other Gram-positive bacteria were ≤4 μg/ml. The MIC90 of cefquinome were ≥16 μg/ml against Gram-negative bacteria, bacilli and enterococci, and ≤2 μg/ml against the other Gram-positive bacteria. Only the minimal inhibitory concentrations against coagulase-negative staphylococci differed significantly between regions in parametric survival models with shared frailties for the herds. However, the minimal inhibitory concentrations of cefquinome against staphylococci were higher than those of the tested first generation cephalosporins. Therefore, cefquinome should not be the first choice to treat staphylococcal mastitis in dairy cows.
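The MIC90 summary statistic used above has a simple conventional definition: the lowest tested concentration that inhibits at least 90% of the isolates, i.e. the value at the 90% coverage rank of the sorted MICs. The MIC values below are hypothetical, chosen only to illustrate the computation on a doubling-dilution scale.

```python
import math

# Hypothetical MIC values (ug/ml) for one pathogen on a doubling-dilution scale
mics = [0.5, 0.5, 1, 1, 1, 2, 2, 2, 4, 4, 8, 16, 16, 32, 64]

def mic90(values):
    """Lowest concentration inhibiting at least 90% of the isolates."""
    ranked = sorted(values)
    idx = math.ceil(0.9 * len(ranked)) - 1  # isolate at which >= 90% are covered
    return ranked[idx]

print(mic90(mics))  # -> 32
```

Note that MIC90 is a rank statistic, not an average: a single highly resistant isolate in a small sample can move it by whole dilution steps, which is why sample sizes per pathogen matter in studies like this one.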

  16. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images

    Science.gov (United States)

    Hu, Qin; Victor, Jonathan D.

    2016-01-01

    Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study – largely because their number and variety is large. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions that are organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a 1-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank. PMID:27713838

  17. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images

    Directory of Open Access Journals (Sweden)

    Qin Hu

    2016-09-01

    Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study, largely because their number and variety is large. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions that are organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a one-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank.

  18. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than tones in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
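One way to see why a physically identical outlier is more surprising in the narrow distribution is to compute its surprisal (negative log-likelihood) under each Gaussian. The tone frequency, mean, and standard deviations below are illustrative assumptions, not the study's stimulus parameters.

```python
import math

def surprisal(x, mu, sigma):
    """Negative log-density of x under N(mu, sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (x - mu) ** 2 / (2 * sigma**2)

outlier_hz = 1200.0           # same physical tone presented in both streams
mean_hz = 1000.0
narrow, wide = 50.0, 150.0    # hypothetical standard deviations (Hz)

print(f"surprise in narrow stream: {surprisal(outlier_hz, mean_hz, narrow):.2f} nats")
print(f"surprise in wide stream: {surprisal(outlier_hz, mean_hz, wide):.2f} nats")
```

Because the quadratic term scales as 1/sigma^2, the same deviation from the mean carries far more surprisal under the narrow distribution, matching the larger prediction error responses reported for narrow-distribution outliers.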

  19. Selective Detection and Automated Counting of Fluorescently-Labeled Chrysotile Asbestos Using a Dual-Mode High-Throughput Microscopy (DM-HTM) Method

    Directory of Open Access Journals (Sweden)

    Jung Kyung Kim

    2013-05-01

    Phase contrast microscopy (PCM) is a widely used analytical method for airborne asbestos, but it is unable to distinguish asbestos from non-asbestos fibers and requires time-consuming and laborious manual counting of fibers. Previously, we developed a high-throughput microscopy (HTM) method that could greatly reduce human intervention and analysis time through automated image acquisition and counting of fibers. In this study, we designed a dual-mode HTM (DM-HTM) device for combined reflection and fluorescence imaging of asbestos, and automated a series of built-in image processing commands of ImageJ software to test its capabilities. We used DksA, a chrysotile-adhesive protein, for selective detection of chrysotile fibers in a mixed, dust-free suspension of chrysotile and amosite prepared in the laboratory. We demonstrate that fluorescently-stained chrysotile and total fibers can be identified and enumerated automatically in a high-throughput manner by the DM-HTM system. Combined with more advanced software that can correctly identify overlapping and branching fibers and distinguish between fibers and elongated dust particles, the DM-HTM method should enable fully automated counting of airborne asbestos.

  20. High Performance Human Face Recognition using Independent High Intensity Gabor Wavelet Responses: A Statistical Approach

    CERN Document Server

    Kar, Arindam; Basu, Dipak Kumar; Nasipuri, Mita; Kundu, Mahantapas

    2011-01-01

    In this paper, we present a technique in which high-intensity feature vectors extracted from the Gabor wavelet transformation of frontal face images are combined with Independent Component Analysis (ICA) for enhanced face recognition. First, the high-intensity feature vectors are automatically extracted using the local characteristics of each individual face from the Gabor-transformed images. ICA is then applied to these locally extracted high-intensity feature vectors of the facial images to obtain the independent high intensity feature (IHIF) vectors. These IHIF vectors form the basis of the work. Finally, image classification is done using these IHIF vectors, which are considered representatives of the images. The importance of implementing ICA along with the high-intensity features of the Gabor wavelet transformation is twofold. On the one hand, selecting peaks of the Gabor-transformed face images exhibits strong characteristics of spatial locality, scale, and orientation selectivity. Thus these...

  1. Comparison of antral and preantral ovarian follicle populations between Bos indicus and Bos indicus-taurus cows with high or low antral follicle counts.

    Science.gov (United States)

    Silva-Santos, K C; Siloto, L S; Santos, G M G; Morotti, F; Marcantonio, T N; Seneda, M M

    2014-02-01

    The objective was to compare populations of antral and pre-antral ovarian follicles in Bos indicus and Bos indicus-taurus cows with high and low antral follicle counts. Nelore (Bos indicus, n = 20) and Nelore X Angus (1/2 Bos indicus-taurus, n = 20) cows were subjected to follicular aspiration without regard to the stage of their oestrous cycle (day of aspiration = D0) to remove all follicles ≥3 mm and induce growth of a new follicular wave. Ovaries were examined by ultrasonography on D4, D19, D34, D49 and D64, and antral follicles ≥3 mm were counted. Thereafter, cows were assigned to one of two groups: high or low antral follicular count (AFC, ≥30 and ≤15 antral follicles, respectively). After D64, ovaries were collected after slaughter and processed for histological evaluation. There was high repeatability in the numbers of antral follicles for all groups (range 0.77-0.96). The mean (±SD) numbers of antral follicles were 35 ± 9 (Bos indicus) and 38 ± 6 (Bos indicus-taurus) for the high AFC group and 10 ± 3 (Bos indicus) and 12 ± 2 (Bos indicus-taurus) follicles for the low AFC. The mean number of preantral follicles in the ovaries of Bos indicus-taurus cows with high AFC (116 226 ± 83 156 follicles) was greater (p < 0.05) than that of Bos indicus cows (63 032 ± 58 705 follicles) with high AFC. However, there was no significant correlation between numbers of antral and preantral follicles.

  2. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    Energy Technology Data Exchange (ETDEWEB)

    Adams, T.; /Florida State U.; Batra, P.; /Columbia U.; Bugel, Leonard G.; /Columbia U.; Camilleri, Leslie Loris; /Columbia U.; Conrad, Janet Marie; /MIT; de Gouvea, A.; /Northwestern U.; Fisher, Peter H.; /MIT; Formaggio, Joseph Angelo; /MIT; Jenkins, J.; /Northwestern U.; Karagiorgi, Georgia S.; /MIT; Kobilarcik, T.R.; /Fermilab /Texas U.

    2009-06-01

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass) to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parameterized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint of 'Beyond the Standard Model' physics.

  3. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    CERN Document Server

    Adams, T; Bugel, L; Camilleri, L; Conrad, J M; De Gouvêa, A; Fisher, P H; Formaggio, J A; Jenkins, J; Karagiorgi, G; Kobilarcik, T R; Kopp, S; Kyle, G; Loinaz, W A; Mason, D A; Milner, R; Moore, R; Morfín, J G; Nakamura, M; Naples, D; Nienaber, P; Olness, F I; Owens, J F; Pate, S F; Pronin, A; Seligman, W G; Shaevitz, M H; Schellman, H; Schienbein, I; Syphers, M J; Tait, T M P; Takeuchi, T; Tan, C Y; Van de Water, R G; Yamamoto, R K; Yu, J Y

    2009-01-01

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass) to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parameterized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint of "Beyond the Standard Model" physics.

  4. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was first deployed as a desktop application and then adapted to a mobile environment for mobile learning (m-learning). The system is designed to be adaptable to the needs of each student and can adapt to three different teaching models that correspond to three student profiles.

  5. Generation of high frequency photons with sub-Poissonian statistics at consecutive interactions

    CERN Document Server

    Chirkin, A S

    2003-01-01

    The process of parametric amplification with high-frequency pumping, accompanied by optical frequency mixing in the same nonlinear crystal (NC), is considered. It is shown that if a signal wave is in a coherent state at the input of the NC, then the radiation at the signal and sum frequencies can have sub-Poissonian photon statistics at the output of the NC in the deamplification regime. The Fano factors are studied as functions of the parameters of the problem.
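
The sub-Poissonian character discussed above is usually quantified by the Fano factor F = Var(n)/⟨n⟩, which equals 1 for Poissonian (coherent) light and drops below 1 for number-squeezed light. A minimal sketch on synthetic photocounts (not data from the paper; the "squeezed" source is a toy binomial model):

```python
import math, random, statistics

def poisson(lam, rng):
    """Knuth's algorithm for sampling a Poisson-distributed photocount."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def fano_factor(counts):
    """F = Var(n)/<n>; F = 1 for Poissonian light, F < 1 is sub-Poissonian."""
    return statistics.pvariance(counts) / statistics.mean(counts)

rng = random.Random(0)
coherent = [poisson(10, rng) for _ in range(5000)]  # coherent-state counts
# Toy sub-Poissonian source: binomial counts with n=20, p=0.5, for which the
# variance np(1-p) is half the mean np, so F = 0.5.
squeezed = [sum(rng.random() < 0.5 for _ in range(20)) for _ in range(5000)]
print(round(fano_factor(coherent), 1), round(fano_factor(squeezed), 1))
```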

  6. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis…, multivariate analysis and data interpretation. We furthermore discuss the potential of future developments that will help to gain deep insight into the PTM-ome and its biological role in cells…

  7. Statistical Methods for Integrating Multiple Types of High-Throughput Data

    OpenAIRE

    Xie, Yang; Ahn, Chul

    2010-01-01

    Large-scale sequencing, copy number, mRNA, and protein data hold great promise for biomedical research, while posing great challenges to data management and data analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate why there is an urgent need to develop reliable and robust methods for integratin...

  8. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays*

    OpenAIRE

    Sturino, Joseph; Zorych, Ivan; Mallick, Bani; Pokusaeva, Karina; Chang, Ying-Ying; Carroll, Raymond J.; Bliznuyk, Nikolay

    2010-01-01

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second ap...

  9. High-speed particle image velocimetry for the efficient measurement of turbulence statistics

    Science.gov (United States)

    Willert, Christian E.

    2015-01-01

    A high-frame-rate camera and a continuous-wave laser are used to capture long particle image sequences exceeding 100,000 consecutive frames at framing frequencies up to 20 kHz. The electronic shutter of the high-speed CMOS camera is reduced to s to prevent excessive particle image streaking. The combination of a large number of images and a high frame rate is made possible by limiting the field of view to a narrow strip, primarily to capture temporally resolved profiles of velocity and derived quantities, such as vorticity, as well as higher-order statistics. Multi-frame PIV processing algorithms are employed to improve the dynamic range of the recovered PIV data. The recovered data are temporally well resolved and provide sufficient samples for statistical convergence of the fluctuating velocity components. The measurement technique is demonstrated on a spatially developing turbulent boundary layer inside a small wind tunnel with and . The chosen magnification permits a reliable estimation of the mean velocity profile down to a few wall units and yields statistical information such as the Reynolds stress components and probability density functions. By means of single-line correlation, it is further possible to extract the near-wall velocity profile in the viscous sublayer, both time-averaged and instantaneous, which permits estimation of the wall shear rate and, along with it, the shear stress and friction velocity . These data are then used for the calculation of space-time correlation maps of wall shear stress and velocity.
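
The fluctuating-velocity statistics mentioned above follow from the Reynolds decomposition u = ū + u′: subtract the time average and form moments of the fluctuations, e.g. the Reynolds shear stress component −⟨u′v′⟩. A toy sketch on synthetic, anti-correlated samples (all numbers illustrative, not PIV data):

```python
import random

def reynolds_stats(u, v):
    """Mean velocities and the Reynolds shear stress component -<u'v'>,
    computed from time series of two velocity components."""
    n = len(u)
    u_mean = sum(u) / n
    v_mean = sum(v) / n
    uv = sum((ui - u_mean) * (vi - v_mean) for ui, vi in zip(u, v)) / n
    return u_mean, v_mean, -uv

rng = random.Random(1)
# Synthetic anti-correlated fluctuations, as near a wall where u' > 0 tends to
# pair with v' < 0 (sweeps/ejections), giving a positive -<u'v'>.
u, v = [], []
for _ in range(50000):
    a, b = rng.gauss(0, 1), rng.gauss(0, 1)
    u.append(10.0 + a)           # mean streamwise velocity 10 (arbitrary units)
    v.append(-0.4 * a + b)       # v' partially anti-correlated with u'
u_mean, v_mean, tau = reynolds_stats(u, v)
print(round(u_mean, 1), round(tau, 2))  # near 10.0 and near 0.4
```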

  10. Towards Direct Simulation of Future Tropical Cyclone Statistics in a High-Resolution Global Atmospheric Model

    Directory of Open Access Journals (Sweden)

    Michael F. Wehner

    2010-01-01

    Full Text Available We present a set of high-resolution global atmospheric general circulation model (AGCM simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments as well as a focus on the Northern hemisphere tropical storm seasons.

  11. APPLICATION OF THE UNIFIED STATISTICAL MATERIAL DATABASE FOR DESIGN AND LIFE/RISK ASSESSMENT OF HIGH TEMPERATURE COMPONENTS

    Institute of Scientific and Technical Information of China (English)

    K.Fujiyama; T.Fujiwara; Y.Nakatani; K.Saito; A.Sakuma; Y.Akikuni; S.Hayashi; S.Matsumoto

    2004-01-01

    Statistical manipulation of material data was conducted for probabilistic life assessment and risk-based design and maintenance of high-temperature components of power plants. To obtain the statistical distributions of material properties, dominant parameters affecting those properties, such as hardness, chemical composition, and characteristic microstructural features, are introduced to normalize the statistical variables. Creep and fatigue properties are expressed by the normalized parameters, and unified statistical distributions are obtained. These probability distribution functions agree well, statistically, with the field database of steam turbine components. It was concluded that the unified statistical baseline approach is useful for the risk management of components in power plants.

  12. Statistical Issues in High-Energy Gamma-Ray Astronomy for GLAST

    Energy Technology Data Exchange (ETDEWEB)

    Digel, S

    2004-04-06

    This paper describes the statistical issues involved in analyzing data from high-energy gamma-ray telescopes, at levels from event reconstruction to correlations of populations of astrophysical sources. Some motivation for attempting to do astronomy with high-energy gamma rays is also given, along with some of the constraints implied by operating the instrument in orbit. Specific attention is given to the Large Area Telescope (LAT) under development for launch in late 2006 on the Gamma-ray Large Area Space Telescope (GLAST) mission.

  13. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  14. Efficient statistical significance approximation for local similarity analysis of high-throughput time series data.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob; Fuhrman, Jed A; Sun, Fengzhu

    2013-01-15

    Local similarity analysis of biological time series data helps elucidate the varying dynamics of biological systems. However, its applications to large scale high-throughput data are limited by slow permutation procedures for statistical significance evaluation. We developed a theoretical approach to approximate the statistical significance of local similarity analysis based on the approximate tail distribution of the maximum partial sum of independent identically distributed (i.i.d.) random variables. Simulations show that the derived formula approximates the tail distribution reasonably well (starting at time points > 10 with no delay and > 20 with delay) and provides P-values comparable with those from permutations. The new approach enables efficient calculation of statistical significance for pairwise local similarity analysis, making possible all-to-all local association studies otherwise prohibitive. As a demonstration, local similarity analysis of human microbiome time series shows that core operational taxonomic units (OTUs) are highly synergetic and some of the associations are body-site specific across samples. The new approach is implemented in our eLSA package, which now provides pipelines for faster local similarity analysis of time series data. The tool is freely available from eLSA's website: http://meta.usc.edu/softs/lsa. Supplementary data are available at Bioinformatics online. fsun@usc.edu.
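
A simplified sketch of the pairwise local similarity score (the largest partial-sum run of products of the z-normalized series, no time delay) together with the slow permutation p-value that the paper's theoretical tail approximation is designed to replace. The sqrt(n) scaling and the synthetic series are illustrative choices, not the exact eLSA definitions:

```python
import math, random, statistics

def znorm(series):
    m, s = statistics.mean(series), statistics.pstdev(series)
    return [(v - m) / s for v in series]

def local_similarity(x, y):
    """Local similarity score without delay: largest partial-sum run of
    products of the z-normalized series, scaled by sqrt(n)."""
    pos = neg = best = 0.0
    for a, b in zip(znorm(x), znorm(y)):
        p = a * b
        pos = max(pos + p, 0.0)  # best positively-associated run ending here
        neg = max(neg - p, 0.0)  # best negatively-associated run ending here
        best = max(best, pos, neg)
    return best / len(x) ** 0.5

def permutation_pvalue(x, y, n_perm=500, seed=0):
    """Permutation significance evaluation: shuffle one series repeatedly."""
    rng = random.Random(seed)
    s0 = local_similarity(x, y)
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if local_similarity(x, y) >= s0:
            hits += 1
    return (hits + 1) / (n_perm + 1)

random.seed(2)
x = [math.sin(t / 3) + random.gauss(0, 0.3) for t in range(40)]
y = [math.sin(t / 3) + random.gauss(0, 0.3) for t in range(40)]  # co-varying series
print(permutation_pvalue(x, y) < 0.05)  # expected: True (strong local association)
```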

  15. On high time-range resolution observations of PMSE: Statistical characteristics

    Science.gov (United States)

    Sommer, Svenja; Chau, Jorge L.; Schult, Carsten

    2016-06-01

    We present observations of polar mesospheric summer echoes (PMSE) with an unprecedented temporal sampling of 2 ms and range resolution down to 75 m. On these time and spatial scales, PMSE exhibit features, like correlation in time and range, that have not been described before. To characterize our high resolution observations, we provide a 4-D statistical model, based on random processes. In this way we can distinguish between geophysical and instrumental effects on our measurements. In our simulations, PMSE is statistically characterized in frequency, angular space, and inverse altitude. With this model, we are able to reproduce our observations on a statistical basis and estimate the intrinsic spectral width of PMSE. For chosen data sets, such values range between 0.5 Hz and 4 Hz (1.4 m s⁻¹ to 11.2 m s⁻¹). Furthermore, we show that apparent oscillations in time and an apparent high speed motion of the mean scattering center are just representations of the random nature of PMSE measurements on short time scales.

  16. VSRR Provisional Drug Overdose Death Counts

    Data.gov (United States)

    U.S. Department of Health & Human Services — This data contains provisional counts for drug overdose deaths based on a current flow of mortality data in the National Vital Statistics System. National...

  17. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
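
The three parameters named above reduce to simple formulas: energy saved per year scales with the difference of reciprocal efficiencies (input power = shaft power / efficiency), cost saving is the saved energy times the tariff, and the simple payback period is the price premium divided by the annual cost saving. A sketch with hypothetical figures, none taken from the cited study:

```python
def motor_savings(shaft_kw, hours_per_year, eff_std, eff_high, tariff, price_premium):
    """Annual energy saving (kWh), annual cost saving, and simple payback
    period (years) for replacing a standard-efficiency motor with a
    high-efficiency one of the same rating."""
    kwh_saved = shaft_kw * hours_per_year * (1 / eff_std - 1 / eff_high)
    cost_saved = kwh_saved * tariff
    return kwh_saved, cost_saved, price_premium / cost_saved

# Hypothetical 15 kW motor running 6000 h/year; efficiencies, tariff (per kWh)
# and price premium are illustrative values only.
kwh, cost, payback = motor_savings(15, 6000, eff_std=0.88, eff_high=0.93,
                                   tariff=0.12, price_premium=400)
print(round(kwh), round(cost, 2), round(payback, 2))
```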

  18. Association between Resting Heart Rate and Inflammatory Markers (White Blood Cell Count and High-Sensitivity C-Reactive Protein) in Healthy Korean People

    Science.gov (United States)

    Park, Woo-Chul; Seo, Inho; Kim, Shin-Hye

    2017-01-01

    Background Inflammation is an important underlying mechanism in the pathogenesis of atherosclerosis, and an elevated resting heart rate underlies the process of atherosclerotic plaque formation. We hypothesized an association between resting heart rate and subclinical inflammation. Methods Resting heart rate was recorded at baseline in the KoGES-ARIRANG (Korean Genome and Epidemiology Study on Atherosclerosis Risk of Rural Areas in the Korean General Population) cohort study, and was then divided into quartiles. Subclinical inflammation was measured by white blood cell count and high-sensitivity C-reactive protein. We used progressively adjusted regression models with terms for muscle mass, body fat proportion, and adiponectin in the fully adjusted models. We examined inflammatory markers as both continuous and categorical variables, using the clinical cut point of the highest quartile of white blood cell count (≥7,900/mm³) and ≥3 mg/dL for high-sensitivity C-reactive protein. Results Participants had a mean age of 56.3±8.1 years and a mean resting heart rate of 71.4±10.7 beats/min; 39.1% were men. In a fully adjusted model, an increased resting heart rate was significantly associated with a higher white blood cell count and higher levels of high-sensitivity C-reactive protein in both continuous (P for trend <0.001) and categorical (P for trend <0.001) models. Conclusion An increased resting heart rate is associated with a higher level of subclinical inflammation among healthy Korean people.

  19. Effects of Fluoride Varnish on Streptococcus mutans Count in Saliva.

    Science.gov (United States)

    Badjatia, Sourabh; Badjatia, Rini G; Thanveer, K; Krishnan, Ajith Cg

    2017-01-01

    To evaluate the effect of fluoride varnish on Streptococcus mutans count in saliva among 12-year-old school children. A field experiment was conducted among 12-year-old school children to evaluate the effect of fluoride varnish on the S. mutans count in saliva. A total of 42 school-going children attending schools in Vadodara district, Gujarat, India, were divided into two groups. Group I was treated with fluoride varnish and group II received no treatment. Assessment of S. mutans was carried out at baseline and at 3 and 6 months after fluoride varnish application. The Friedman analysis of variance test and post hoc tests were applied to detect statistically significant differences between baseline and the 3- and 6-month time points, and also between groups I and II. The mean salivary S. mutans counts in group I at baseline, 3 months, and 6 months were 31.23 ± 1.119, 9.27 ± 0.852, and 9.39 ± 0.908 × 10⁴ colony-forming units (CFU)/mL, respectively. The reduction in S. mutans count from baseline to 3 and 6 months was highly statistically significant (p = 0.000), but the difference between 3 and 6 months was not statistically significant (p = 0.142). In the control group, the mean S. mutans counts at baseline, 3 months, and 6 months were 30.63 ± 1.436, 31.23 ± 1.351, and 31.40 ± 1.374 × 10⁴ CFU/mL, respectively; the differences between these values were not statistically significant (p = 0.11). A statistically significant reduction in the salivary S. mutans count was seen 3 to 6 months after fluoride varnish application. Badjatia S, Badjatia RG, Thanveer K, Krishnan ACG. Effects of Fluoride Varnish on Streptococcus mutans Count in Saliva. Int J Clin Pediatr Dent 2017;10(1):62-66.
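
The Friedman test used above ranks each child's counts across the three time points and compares rank sums against a chi-square reference. A minimal sketch of the statistic on invented data shaped like the study's pattern (a drop after treatment that then holds steady; ties within a subject are not handled here):

```python
def friedman_statistic(data):
    """Friedman chi-square statistic for repeated measures.
    `data` is a list of per-subject tuples, one value per time point.
    Minimal sketch: within-subject ties are not corrected for."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        order = sorted(range(k), key=lambda j: row[j])  # ranks within subject
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))

# Illustrative counts (x10^4 CFU/mL) for 5 subjects at baseline, 3 and 6 months.
subjects = [(31.1, 9.4, 9.5), (30.8, 9.0, 9.3), (31.6, 9.6, 9.2),
            (31.0, 8.9, 9.4), (31.4, 9.3, 9.6)]
chi2 = friedman_statistic(subjects)
# Compare against the chi-square critical value 5.99 (df = 2, alpha = 0.05).
print(round(chi2, 2), chi2 > 5.99)
```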

  20. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, the statistical variations in a subthreshold MOSFET's high-frequency characteristics, defined in terms of gate capacitance and transition frequency, are analyzed, and comprehensive analytical models of these variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET's manufacturing process, are taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of the statistical variation in the MOSFET's parameters is used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level against 65 nm BSIM4-based benchmarks and found to be very accurate, with average errors below 5%. Hence, the performed analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET based VHF circuits, systems and applications.

  1. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
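
The first approach can be sketched directly: compute a distance between the pointwise mean curves of the two treatments, then compare it with the distances obtained after repeatedly permuting the group labels. The toy growth curves below are illustrative, not Biolog PM data:

```python
import random

def mean_curve(curves):
    """Pointwise mean of a list of equally sampled curves."""
    return [sum(vals) / len(vals) for vals in zip(*curves)]

def curve_distance(group_a, group_b):
    """L1 distance between the mean curves of two treatment groups."""
    return sum(abs(a - b) for a, b in zip(mean_curve(group_a), mean_curve(group_b)))

def permutation_test(group_a, group_b, n_perm=1000, seed=0):
    """Permutation test: reassign curves to groups at random and see how often
    the relabelled distance reaches the observed one."""
    rng = random.Random(seed)
    observed = curve_distance(group_a, group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if curve_distance(pooled[:n_a], pooled[n_a:]) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

gen = random.Random(42)
# Toy "growth curves" over 24 hourly readings; treatment b grows a bit faster.
a = [[10 * h + gen.gauss(0, 5) for h in range(24)] for _ in range(8)]
b = [[12 * h + gen.gauss(0, 5) for h in range(24)] for _ in range(8)]
obs, p = permutation_test(a, b)
print(p < 0.05)  # expected: True, the mean curves differ
```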

  2. High order statistics based blind deconvolution of bi-level images with unknown intensity values.

    Science.gov (United States)

    Kim, Jeongtae; Jang, Soohyun

    2010-06-07

    We propose a novel linear blind deconvolution method for bi-level images. The proposed method seeks an optimal point spread function and two parameters that maximize a high-order-statistics-based objective function. Unlike existing minimum entropy deconvolution and least squares minimization methods, the proposed method requires neither the unrealistic assumption that the pixel values of a bi-level image are independently and identically distributed samples of a random variable nor the tuning of regularization parameters. We demonstrate the effectiveness of the proposed method in simulations and experiments.

  3. High Statistics Analysis using Anisotropic Clover Lattices: (II) Three-Baryon Systems

    OpenAIRE

    2009-01-01

    We present the results of an exploratory Lattice QCD calculation of three-baryon systems through a high-statistics study of one ensemble of anisotropic clover gauge-field configurations with a pion mass of mπ ≈ 390 MeV. Because of the computational cost of the necessary contractions, we focus on correlation functions generated by interpolating operators with the quantum numbers of the Ξ⁰Ξ⁰n system, one of the least demanding three-baryon systems in terms of the number of contracti...

  4. STATISTICAL PROPERTIES OF HIGH-FREQUENCY INTERNAL WAVES IN QINGDAO OFFSHORE AREA OF THE YELLOW SEA

    Institute of Scientific and Technical Information of China (English)

    王涛; 高天赋

    2002-01-01

    Densely-sampled thermistor-chain data obtained from a shallow-water acoustics experiment in the Yellow Sea off the coast of Qingdao were analyzed to examine the statistical properties of the observed internal waves in the 6 to 520 cpd frequency band. The negative skewness coefficients and the greater-than-3 kurtosis coefficients indicated the non-Gaussianity of the internal waves: the probability distributions were negatively skewed and had abnormally high peaks. Nonlinear properties, as exemplified by the asymmetric waveshapes of the internal waves in the offshore area, are described quantitatively.

  5. A Statistical Approach to Describe Highly Excited Heavy and Superheavy Nuclei

    CERN Document Server

    Chen, Peng-Hui; Li, Jun-Qing; Zhang, Hong-Fei

    2016-01-01

    A statistical approach based on the Weisskopf evaporation theory has been developed to describe the de-excitation process of highly excited heavy and superheavy nuclei, in particular proton-rich nuclei. The excited nucleus is cooled by evaporating γ-rays and light particles (neutrons, protons, α particles, etc.) in competition with binary fission, with structure effects (shell correction, fission barrier, particle separation energy) contributing to these processes. The formation of residual nuclei is evaluated via the sequential emission of possible particles above the separation energies. The available data on fusion-evaporation excitation functions in the ²⁸Si+¹⁹⁸Pt reaction are reproduced well by the approach.

  6. Statistical methods for comparative phenomics using high-throughput phenotype microarrays.

    Science.gov (United States)

    Sturino, Joseph; Zorych, Ivan; Mallick, Bani; Pokusaeva, Karina; Chang, Ying-Ying; Carroll, Raymond J; Bliznuyk, Nikolay

    2010-08-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.

  7. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays*

    Science.gov (United States)

    Sturino, Joseph; Zorych, Ivan; Mallick, Bani; Pokusaeva, Karina; Chang, Ying-Ying; Carroll, Raymond J; Bliznuyk, Nikolay

    2010-01-01

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform. PMID:20865133

  8. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
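
The sorting-game counting behind these distributions reduces to three occupancy formulas for n particles in g single-particle states: gⁿ for distinguishable (Maxwell-Boltzmann) particles, C(n+g−1, n) for indistinguishable bosons, and C(g, n) for fermions under the Pauli exclusion principle. A quick check:

```python
from math import comb

# Number of microstates for n particles in g single-particle states under the
# three counting rules of statistical mechanics:
def maxwell_boltzmann(n, g):   # distinguishable particles, any occupancy
    return g ** n

def bose_einstein(n, g):       # indistinguishable bosons, any occupancy
    return comb(n + g - 1, n)  # stars-and-bars count

def fermi_dirac(n, g):         # indistinguishable fermions, at most one per state
    return comb(g, n)

n, g = 3, 5
print(maxwell_boltzmann(n, g), bose_einstein(n, g), fermi_dirac(n, g))
# 125 microstates for distinguishable particles, 35 for bosons, 10 for fermions
```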

  9. Hanford whole body counting manual

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, H.E.; Rieksts, G.A.; Lynch, T.P.

    1990-06-01

    This document describes the Hanford Whole Body Counting Program as it is administered by Pacific Northwest Laboratory (PNL) in support of the US Department of Energy--Richland Operations Office (DOE-RL) and its Hanford contractors. Program services include providing in vivo measurements of internally deposited radioactivity in Hanford employees (or visitors). Specific chapters of this manual deal with the following subjects: program operational charter, authority, administration, and practices, including interpreting applicable DOE Orders, regulations, and guidance into criteria for in vivo measurement frequency, etc., for the plant-wide whole body counting services; state-of-the-art facilities and equipment used to provide the best in vivo measurement results possible for the approximately 11,000 measurements made annually; procedures for performing the various in vivo measurements at the Whole Body Counter (WBC) and related facilities including whole body counts; operation and maintenance of counting equipment, quality assurance provisions of the program, WBC data processing functions, statistical aspects of in vivo measurements, and whole body counting records and associated guidance documents. 16 refs., 48 figs., 22 tabs.

  10. The Big Pumpkin Count.

    Science.gov (United States)

    Coplestone-Loomis, Lenny

    1981-01-01

    Pumpkin seeds are counted after students convert pumpkins to jack-o-lanterns. Among the activities involved, pupils learn to count by 10s, make estimates, and to construct a visual representation of 1,000. (MP)

  11. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    Science.gov (United States)

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  12. Pile-up corrections for high-precision superallowed β decay half-life measurements via γ-ray photopeak counting

    Energy Technology Data Exchange (ETDEWEB)

    Grinyer, G.F. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada)], E-mail: ggrinyer@physics.uoguelph.ca; Svensson, C.E.; Andreoiu, C. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Andreyev, A.N. [TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Austin, R.A.E. [Department of Astronomy and Physics, St. Mary' s University, Halifax, NS, B3H 3C3 (Canada); Ball, G.C. [TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Bandyopadhyay, D. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Chakrawarthy, R.S. [TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Finlay, P. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Garrett, P.E. [Department of Physics, University of Guelph, Guelph, Ont, Canada N1G 2W1 (Canada); TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Hackman, G. [TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Hyland, B. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Kulp, W.D. [School of Physics, Georgia Institute of Technology, Atlanta, GA 30332 0430 (United States); Leach, K.G. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Leslie, J.R. [Department of Physics, Queen' s University, Kingston, Ont., K7L 3N6 (Canada); Morton, A.C.; Pearson, C.J. [TRIUMF, 4004 Wesbrook Mall, Vancouver, BC, V6T 2A3 (Canada); Phillips, A.A. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada); Sarazin, F. [Department of Physics, Colorado School of Mines, Golden, CO 80401 (United States); Schumaker, M.A. [Department of Physics, University of Guelph, Guelph, Ont, N1G 2W1 (Canada)] (and others)

    2007-09-11

    A general technique that corrects γ-ray-gated β-decay-curve data for detector pulse pile-up is presented. The method includes corrections for non-zero time-resolution and energy-threshold effects, in addition to a special treatment of saturating events due to cosmic rays. The technique is verified through a Monte Carlo simulation and experimental data obtained using radioactive beams of ²⁶Na implanted at the center of the 8π γ-ray spectrometer at the ISAC facility at TRIUMF in Vancouver, Canada. The β-decay half-life of ²⁶Na obtained from counting 1809-keV γ-ray photopeaks emitted by the daughter ²⁶Mg was determined to be T₁/₂ = 1.07167 ± 0.00055 s following a 27σ correction for detector pulse pile-up. This result is in excellent agreement with the result of a previous measurement that employed direct β counting and demonstrates the feasibility of high-precision β-decay half-life measurements using high-purity germanium γ-ray detectors. The technique presented here, while motivated by superallowed-Fermi β-decay studies, is general and can be used for any half-life determination (e.g. α, β, X-ray, fission) in which a γ-ray photopeak is used to select the decays of a particular isotope.
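    The half-life extraction described above ultimately reduces to fitting photopeak counts to an exponential decay curve. A minimal sketch of that final step (ignoring the pile-up, threshold, and cosmic-ray corrections that are the subject of the paper; the synthetic data below are illustrative, not the experimental data):

```python
import math

def half_life_from_counts(times, counts):
    """Log-linear least-squares fit of counts(t) = N0 * exp(-lam * t);
    returns T_1/2 = ln(2) / lam."""
    ys = [math.log(c) for c in counts]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((t - mx) * (y - my) for t, y in zip(times, ys))
             / sum((t - mx) ** 2 for t in times))
    return math.log(2) / -slope

# Noise-free synthetic decay curve using the 26Na half-life quoted above
T_HALF = 1.07167  # seconds
lam = math.log(2) / T_HALF
times = [0.1 * i for i in range(60)]
counts = [1.0e6 * math.exp(-lam * t) for t in times]
print(half_life_from_counts(times, counts))  # recovers ~1.07167 s
```

    On real γ-gated data the log-linear fit would be replaced by a maximum-likelihood fit with the pile-up and background corrections applied first.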

  13. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.

  14. Drug-excipient compatibility testing using a high-throughput approach and statistical design.

    Science.gov (United States)

    Wyttenbach, Nicole; Birringer, Christian; Alsenz, Jochem; Kuentz, Martin

    2005-01-01

    The aim of our research was to develop a miniaturized, high-throughput drug-excipient compatibility test. Experiments were planned and evaluated using statistical experimental design. Binary mixtures of a drug (acetylsalicylic acid or fluoxetine hydrochloride) and of excipients commonly used in solid dosage forms were prepared at a ratio of approximately 1:100 in 96-well microtiter plates. Samples were exposed to different temperatures (40 °C/50 °C) and humidities (10%/75%) for different times (1 week/4 weeks), and chemical drug degradation was analyzed using fast-gradient high-pressure liquid chromatography (HPLC). A categorical statistical design was applied to identify the effects and interactions of time, temperature, humidity, and excipient on drug degradation. Acetylsalicylic acid was least stable in the presence of magnesium stearate, dibasic calcium phosphate, or sodium starch glycolate. Fluoxetine hydrochloride exhibited marked degradation only with lactose. Factor-interaction plots revealed that relative humidity had the strongest effect on the drug-excipient blends tested. In conclusion, the developed technique enables fast drug-excipient compatibility testing and identification of interactions. Since only 0.1 mg of drug is needed per data point, fast rational preselection of pharmaceutical additives can be performed early in solid dosage form development.
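    The categorical factorial layout described here can be enumerated mechanically. A minimal sketch (the excipient list below is an illustrative subset, not the paper's full panel):

```python
import itertools

# Two-level factors plus a categorical excipient factor (illustrative subset)
factors = {
    "temperature": ["40C", "50C"],
    "humidity": ["10%", "75%"],
    "time": ["1 week", "4 weeks"],
    "excipient": ["magnesium stearate", "dibasic calcium phosphate",
                  "sodium starch glycolate", "lactose"],
}

# Full factorial design: one run per combination of factor levels
design = [dict(zip(factors, combo))
          for combo in itertools.product(*factors.values())]
print(len(design))  # 2 * 2 * 2 * 4 = 32 runs
```

    Each dictionary in `design` is one well condition; effects and interactions are then estimated from the measured degradation across these runs.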

  15. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    Energy Technology Data Exchange (ETDEWEB)

    Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)]; Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain)]; Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)]

    2013-02-15

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of high-pressure homogenization (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNPs by HPH. Spherical LNPs with mean sizes ranging from 65 nm to 11.623 μm, negative zeta potentials below -30 mV, and smooth surfaces were produced. Manageable equations based on parameters commonly used in the pharmaceutical field were obtained. The lipid-to-emulsifier ratio (R_L/S) was shown to statistically explain the influence of oil-phase and surfactant concentration on final nanoparticle size. In addition, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersity. α-Tocopherol was used as a model drug to illustrate the release properties of LNPs as a function of particle size, which was optimized by the regression models. This study is intended as a first step toward optimizing production conditions prior to LNP production at both laboratory and industrial scale, from an eminently practical approach based on parameters extensively used in formulation.

  16. A Medium/Long-Range Forecast of Pacific Subtropical High Based on Dynamic Statistic Model Reconstruction

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the 500-hPa geopotential height field series of T106 numerical forecast products, empirical orthogonal function (EOF) analysis was used to separate the fields in time and space. Under the hypothesis that the EOF spatial modes are stable, the EOF time-coefficient series were taken as the variables of a dynamical-statistical model. Dynamic-system reconstruction together with a genetic algorithm was introduced to optimize the model parameters, and a nonlinear dynamical-statistical model of the EOF time-coefficient series was established. Through time integration of the model and EOF time-space reconstruction, a medium/long-range forecast of the subtropical high was carried out. The results show that the dynamical-model forecast and the T106 numerical forecast are approximately similar in the short range (≤5 days), but in the medium/long range (≥5 days) the dynamical model is superior to the T106 numerical products. This presents a new method and idea for diagnosing and forecasting complex weather systems such as the subtropical high, and shows good prospects for application.
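    The EOF time-space separation at the heart of this scheme can be sketched with a rank-one toy example. Here the leading spatial mode is found by power iteration on the space-space covariance matrix, and projecting the data onto it yields the time-coefficient series that the dynamical model would integrate (a stand-in for the full EOF decomposition of the 500-hPa fields; the toy field below is purely illustrative):

```python
import math

def leading_eof(X, iters=200):
    """Leading EOF of data matrix X (rows = time, cols = space):
    power iteration on C = X^T X gives the spatial mode v; projecting
    X onto v gives the time-coefficient series."""
    nt, ns = len(X), len(X[0])
    C = [[sum(X[t][i] * X[t][j] for t in range(nt)) for j in range(ns)]
         for i in range(ns)]
    v = [1.0] * ns
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(ns)) for i in range(ns)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    pc = [sum(X[t][i] * v[i] for i in range(ns)) for t in range(nt)]
    return v, pc

# Rank-one toy field: fixed spatial pattern p modulated by time series a
p = [1.0, 2.0, -1.0]
a = [math.sin(0.3 * t) for t in range(50)]
X = [[a[t] * p[i] for i in range(3)] for t in range(50)]
v, pc = leading_eof(X)
# For rank-one data, pc[t] * v[i] reconstructs X exactly
```

    With real fields, several leading modes are retained and the time coefficients `pc` become the state variables of the reconstructed dynamical system.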

  17. Genetic loci on chromosome 5 are associated with circulating levels of interleukin-5 and eosinophil count in a European population with high risk for cardiovascular disease.

    Science.gov (United States)

    McLeod, Olga; Silveira, Angela; Valdes-Marquez, Elsa; Björkbacka, Harry; Almgren, Peter; Gertow, Karl; Gådin, Jesper R; Bäcklund, Alexandra; Sennblad, Bengt; Baldassarre, Damiano; Veglia, Fabrizio; Humphries, Steve E; Tremoli, Elena; de Faire, Ulf; Nilsson, Jan; Melander, Olle; Hopewell, Jemma C; Clarke, Robert; Björck, Hanna M; Hamsten, Anders; Öhrvik, John; Strawbridge, Rona J

    2016-05-01

    IL-5 is a Th2 cytokine that activates eosinophils and is suggested to have an atheroprotective role. Genetic variants in the IL5 locus have been associated with increased risk of CAD and ischemic stroke. In this study we aimed to identify genetic variants associated with IL-5 concentrations and apply a Mendelian randomisation approach to assess IL-5 levels for causal effect on intima-media thickness in a European population at high risk of coronary artery disease. We analysed SNPs within robustly associated candidate loci for immune, inflammatory, metabolic and cardiovascular traits. We identified 2 genetic loci for IL-5 levels (chromosome 5, rs56183820, BETA=0.11, P=6.73E(-5) and chromosome 14, rs4902762, BETA=0.12, P=5.76E(-6)) and one for eosinophil count (rs72797327, BETA=-0.10, P=1.41E(-6)). Both chromosome 5 loci were in the vicinity of the IL5 gene; however, the association with IL-5 levels failed to replicate in a meta-analysis of 2 independent cohorts (rs56183820, BETA=0.04, P=0.2763, I(2)=24, I(2)-P=0.2516). No significant associations were observed between SNPs associated with IL-5 levels or eosinophil count and IMT measures. Expression quantitative trait analyses indicate effects of the IL-5 and eosinophil-associated SNPs on RAD50 mRNA expression levels (rs12652920 (r2=0.93 with rs56183820) BETA=-0.10, P=8.64E(-6) and rs11739623 (r2=0.96 with rs72797327) BETA=-0.23, P=1.74E(-29), respectively). Our data do not support a role for IL-5 levels and eosinophil count in intima-media thickness; however, SNPs associated with IL-5 and eosinophils might influence stability of the atherosclerotic plaque via modulation of RAD50 levels.

  18. Statistical Tensile Strength for High Strain Rate of Aramid and UHMWPE Fibers

    Institute of Scientific and Technical Information of China (English)

    YANG Bin; XIONG Tao; XIONG Jie

    2006-01-01

    Dynamic tensile impact properties of aramid (Technora®) and UHMWPE (DC851) fiber bundles were studied at two high strain rates by means of a reflecting-type split Hopkinson bar, and stress-strain curves of the fiber yarns at different strain rates were obtained. Experimental results show that the initial elastic modulus, failure strength, and unstable strain of aramid fiber yarns are insensitive to strain rate, whereas the initial elastic modulus and unstable strain of UHMWPE fiber yarns are strain-rate sensitive. A fiber-bundle statistical constitutive equation was used to describe the tensile behavior of aramid and UHMWPE fiber bundles at high strain rates. The good agreement between the simulated results and the experimental data indicates that the modified double-Weibull function can represent the tensile strength distribution of aramid and UHMWPE fibers and that the method of extracting Weibull parameters from fiber-bundle stress-strain data is valid.
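    The idea behind a fiber-bundle statistical constitutive equation can be sketched with the standard single-Weibull form (a simplified stand-in for the modified double-Weibull function used in the paper; all parameter values below are illustrative): the mean bundle stress is the fiber stress weighted by the fraction of fibers still surviving.

```python
import math

def bundle_stress(eps, E, m, eps0):
    """sigma(eps) = E * eps * exp(-(eps/eps0)**m): linear-elastic fibers
    whose failure strains follow a Weibull distribution with shape m and
    scale eps0; the exponential is the surviving fiber fraction."""
    return E * eps * math.exp(-(eps / eps0) ** m)

# Illustrative parameters; the bundle stress peaks at eps0 * m**(-1/m),
# which follows from setting d(sigma)/d(eps) = 0
E, m, eps0 = 100.0, 5.0, 0.03
peak_eps = eps0 * m ** (-1.0 / m)
```

    Fitting `m` and `eps0` to measured bundle stress-strain curves is what "extracting Weibull parameters from fiber-bundle stress-strain data" amounts to.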

  19. Pair normalized channel feature and statistics-based learning for high-performance pedestrian detection

    Science.gov (United States)

    Zeng, Bobo; Wang, Guijin; Ruan, Zhiwei; Lin, Xinggang; Meng, Long

    2012-07-01

    High-performance pedestrian detection with good accuracy and fast speed is an important yet challenging task in computer vision. We design a novel feature named pair normalized channel feature (PNCF), which simultaneously combines and normalizes two channel features in image channels, achieving a highly discriminative power and computational efficiency. PNCF applies to both gradient channels and color channels so that shape and appearance information are described and integrated in the same feature. To efficiently explore the formidably large PNCF feature space, we propose a statistics-based feature learning method to select a small number of potentially discriminative candidate features, which are fed into the boosting algorithm. In addition, channel compression and a hybrid pyramid are employed to speed up the multiscale detection. Experiments illustrate the effectiveness of PNCF and its learning method. Our proposed detector outperforms the state-of-the-art on several benchmark datasets in both detection accuracy and efficiency.

  20. High-level production of astaxanthin by Xanthophyllomyces dendrorhous mutant JH1 using statistical experimental designs.

    Science.gov (United States)

    Kim, Jeong-Hwan; Kang, Seong-Woo; Kim, Seung-Wook; Chang, Hyo-Ihl

    2005-09-01

    Medium composition was optimized for high-level production of astaxanthin by Xanthophyllomyces dendrorhous mutant JH1 using statistical experimental designs. Glucose and yeast extract were the most important factors affecting astaxanthin production. Glucose 3.89%, yeast extract 0.29%, KH2PO4 0.25%, MgSO4 0.05%, MnSO4 0.02%, and CaCl2 0.01% were optimum for high-level production of astaxanthin. Under optimized conditions, the maximum concentration of astaxanthin obtained after 7 d of cultivation was 36.06 mg/l. The concentration of astaxanthin predicted by a polynomial model was 36.16 mg/l.

  1. Characterization of the count rate performance of modern gamma cameras

    Science.gov (United States)

    Silosky, M.; Johnson, V.; Beasley, C.; Cheenu Kappadath, S.

    2013-01-01

    Purpose: Evaluation of count rate performance (CRP) is an integral component of gamma camera quality assurance, and system deadtime (τ) may be utilized for image correction in quantitative studies. This work characterizes the CRP of three modern gamma cameras and estimates τ using two established methods (decay and dual source) under a variety of experimental conditions. Methods: For the decay method, uncollimated detectors were exposed to a Tc-99m source of relatively high activity and count rates were sampled regularly over 48 h. The input count rate at each time point was based on the lowest observed count rate data point. The input count rate was plotted against the observed count rate and fit via least squares to the paralyzable detector model (PDM) to estimate τ (rates method). A novel expression for observed counts as a function of measurement time interval was derived, taking into account the PDM and the presence of background but making no assumption regarding the input count rate. The observed counts were fit via least squares to this novel expression to estimate τ (counts method). Correlation and Bland-Altman analyses were performed to assess agreement between the rates and counts estimates of τ. The dependence of τ on energy-window definition and incident energy spectrum was characterized. The dual source method was also used to estimate τ, and its agreement with estimates from the decay method under identical conditions was investigated. The dependence of τ on the total activity and the ratio of source activities was characterized. Results: The observed CRP curves for each gamma camera agreed with the PDM at low count rates but deviated substantially from it at high count rates. The estimates of τ determined from the paralyzable portion of the CRP curve using the rates method and the counts method were found to be highly correlated (r = 0.999) but with a small (∼6%) difference. No statistically significant difference was observed
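    The paralyzable detector model (PDM) invoked above relates observed to input count rate as m = n·exp(−nτ). A minimal sketch of estimating τ from (input, observed) rate pairs, using the closed-form inversion on noise-free data rather than the paper's least-squares fits (the dead-time value is illustrative, not from the paper):

```python
import math

def observed_rate(n, tau):
    """Paralyzable detector model: observed rate m for input rate n."""
    return n * math.exp(-n * tau)

def estimate_tau(pairs):
    """Invert m = n*exp(-n*tau) -> tau = ln(n/m)/n, averaged over
    (input, observed) rate pairs."""
    return sum(math.log(n / m) / n for n, m in pairs) / len(pairs)

TAU = 5e-6  # assumed 5-microsecond dead time (illustrative)
pairs = [(n, observed_rate(n, TAU)) for n in (1e4, 5e4, 1e5, 2e5)]
print(estimate_tau(pairs))  # recovers 5e-6 on noise-free data
```

    With noisy measurements, a least-squares fit of the PDM over the whole curve (the paper's rates method) is the more robust choice.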

  2. A Computer Vision Approach to Object Tracking and Counting

    Directory of Open Access Journals (Sweden)

    Sergiu Mezei

    2010-09-01

    This paper introduces a new method for counting people, or more generally objects, that enter or exit a certain area, building, or perimeter. We propose an algorithm that analyzes a video sequence, detects moving objects and their direction of motion, and filters them according to some criteria (e.g., only humans). As a result, one obtains in and out counters for objects passing the defined perimeter. Automatic object counting is a growing application in many industrial and commercial areas. Counting can be used in statistical analysis and optimal activity-scheduling methods. One of the main applications is the approximation of the number of persons passing through, or reaching, a certain area: airports (customs), shopping centers and malls, and sports or cultural events with high attendance. The main purpose is to offer an accurate estimation while still preserving the anonymity of the visitors.

  3. Bayesian inference from count data using discrete uniform priors.

    Science.gov (United States)

    Comoglio, Federico; Fracchia, Letizia; Rinaldi, Maurizio

    2013-01-01

    We consider a set of sample counts obtained by sampling arbitrary fractions of a finite volume containing a homogeneously dispersed population of identical objects. We report a Bayesian derivation of the posterior probability distribution of the population size using a binomial likelihood and non-conjugate, discrete uniform priors under sampling with or without replacement. Our derivation yields a computationally feasible formula that can prove useful in a variety of statistical problems involving absolute quantification under uncertainty. We implemented our algorithm in the R package dupiR and compared it with a previously proposed Bayesian method based on a Gamma prior. As a showcase, we demonstrate that our inference framework can be used to estimate bacterial survival curves from measurements characterized by extremely low or zero counts and rather high sampling fractions. All in all, we provide a versatile, general-purpose algorithm to infer population sizes from count data, which can find application in a broad spectrum of biological and physical problems.
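    The binomial-likelihood, discrete-uniform-prior construction can be written down directly. A minimal sketch of one simple variant, in which each object is independently captured with probability equal to the sampled fraction f (the paper and the dupiR package treat more cases, including sampling without replacement):

```python
from math import comb

def posterior(k, f, n_max):
    """Posterior P(N | k) for population size N after observing k objects
    in a sampled fraction f of the volume: binomial likelihood with a
    discrete uniform prior on N = 0..n_max."""
    like = [comb(N, k) * f**k * (1 - f) ** (N - k) if N >= k else 0.0
            for N in range(n_max + 1)]
    z = sum(like)
    return [p / z for p in like]

# Observe k = 3 objects after sampling half the volume
post = posterior(k=3, f=0.5, n_max=30)
map_n = max(range(len(post)), key=post.__getitem__)
```

    The posterior is zero for N < k (the population cannot be smaller than the observed count), and its mode sits near k/f, as intuition suggests.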

  4. High redshift galaxies in the ALHAMBRA survey: I. selection method and number counts based on redshift PDFs

    CERN Document Server

    Viironen, K; López-Sanjuan, C; Varela, J; Chaves-Montero, J; Cristóbal-Hornillos, D; Molino, A; Fernández-Soto, A; Ascaso, B; Cenarro, A J; Cerviño, M; Cepa, J; Ederoclite, A; Márquez, I; Masegosa, J; Moles, M; Oteo, I; Pović, M; Aguerri, J A L; Alfaro, E; Aparicio-Villegas, T; Benítez, N; Broadhurst, T; Cabrera-Caño, J; Castander, J F; Del Olmo, A; Delgado, R M González; Husillos, C; Infante, L; Martínez, V J; Perea, J; Prada, F; Quintana, J M

    2015-01-01

    Context. Most observational results on high-redshift rest-frame UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-alpha selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high-redshift galaxies in the ALHAMBRA survey data. Aims. Our aim is to develop a less biased methodology than the traditional dropout technique for studying high-redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we aim especially at contributing to the study of the brightest, less frequent, high-redshift galaxies. Methods. The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with a high integrated probability of being within a given redshift interval. However, reach...

  5. Broadband Transmission and Statistical Performance Properties of Overhead High-Voltage Transmission Networks

    Directory of Open Access Journals (Sweden)

    Athanasios G. Lazaropoulos

    2012-01-01

    This paper considers broadband signal transmission and statistical performance properties of high-voltage/broadband over power lines (HV/BPL) channels associated with overhead power transmission. The overhead HV/BPL transmission channel is investigated with regard to its spectral behavior, its end-to-end signal attenuation, and its statistical performance metrics. It is found that these features depend critically on the frequency, the overhead HV power grid type (150 kV, 275 kV, or 400 kV; single- or double-circuit), the coupling scheme applied, the physical properties of the cables used, the MTL configuration, and the type of branches existing along the end-to-end BPL signal propagation path. The contribution of this paper is threefold. First, the significant broadband transmission potential of overhead HV lines is revealed. The results demonstrate that, regardless of overhead HV power grid type, the overhead HV grid is a potentially excellent communications medium, offering low-loss characteristics, flat-fading features, and low multipath dispersion over a 25 km repeater span well beyond 100 MHz. Second, regarding the statistical properties of various overhead HV/BPL transmission channels, two fundamental correlations of several wireline systems, for example, coaxial cables and xDSL, are also validated in the case of overhead HV/BPL transmission channels, namely, (i) end-to-end channel attenuation in relation to root-mean-square delay spread (RMS-DS) and (ii) coherence bandwidth (CB) in relation to RMS-DS. Third, by fitting the numerical results and other field-trial measurements, two regression distributions suitable for each fundamental correlation are proposed.

  6. Herschel-ATLAS Galaxy Counts and High Redshift Luminosity Functions: The Formation of Massive Early Type Galaxies

    CERN Document Server

    Lapi, A; Fan, L; Bressan, A; De Zotti, G; Danese, L; Negrello, M; Dunne, L; Eales, S; Maddox, S; Auld, R; Baes, M; Bonfield, D G; Buttiglione, S; Cava, A; Clements, D L; Cooray, A; Dariush, A; Dye, S; Fritz, J; Herranz, D; Hopwood, R; Ibar, E; Ivison, R; Jarvis, M J; Kaviraj, S; Lopez-Caniego, M; Massardi, M; Michalowski, M J; Pascale, E; Pohlen, M; Rigby, E; Rodighiero, G; Serjeant, S; Smith, D J B; Temi, P; Wardlow, J; van der Werf, P

    2011-01-01

    Exploiting the Herschel-ATLAS Science Demonstration Phase (SDP) survey data, we have determined the luminosity functions (LFs) at rest-frame wavelengths of 100 and 250 micron and at several redshifts z>1, for bright sub-mm galaxies with star formation rates (SFR) >100 M_sun/yr. We find that the evolution of the comoving LF is strong up to z~2.5, and slows down at higher redshifts. From the LFs and the information on halo masses inferred from clustering analysis, we derived an average relation between SFR and halo mass (and its scatter). We also infer that the timescale of the main episode of dust-enshrouded star formation in massive halos (M_H>3*10^12 M_sun) amounts to ~7*10^8 yr. Given the SFRs, which are in the range 10^2-10^3 M_sun/yr, this timescale implies final stellar masses of order of 10^11-10^12 M_sun. The corresponding stellar mass function matches the observed mass function of passively evolving galaxies at z>1. The comparison of the statistics for sub-mm and UV selected galaxies suggests that the...

  7. High-Statistics Study of the tau^- -> pi^- pi^0 nu_tau Decay

    CERN Document Server

    Fujikawa, M; Eidelman, S; Adachi, I; Aihara, H; Arinstein, K; Aulchenko, V; Aushev, T; Bakich, A M; Balagura, V; Barberio, E; Bay, A; Bedny, I; Belous, K S; Bhardwaj, V; Bitenc, U; Bondar, A; Bracko, M; Brodzicka, J; Browder, T E; Chang, P; Chao, Y; Chen, A; Cheon, B G; Chistov, R; Cho, I S; Choi, Y; Dalseno, J; Dash, M; Epifanov, D; Gabyshev, N; Golob, B; Ha, H; Haba, J; Hara, K; Hasegawa, Y; Hayasaka, K; Hazumi, M; Heffernan, D; Hoshi, Y; Hou, W S; Hyun, H J; Iijima, T; Inami, K; Ishikawa, A; Ishino, H; Itoh, R; Iwabuchi, M; Iwasaki, M; Kah, D H; Kaji, H; Kataoka, S U; Kawasaki, T; Kichimi, H; Kim, H O; Kim, S K; Kim, Y I; Kim, Y J; Korpar, S; Krizan, P; Krokovny, P; Kumar, R; Kuzmin, A; Kwon, Y J; Kyeong, S H; Lange, J S; Lee, M J; Lee, S E; Limosani, A; Liu, C; Liu, Y; Liventsev, D; MacNaughton, J; Mandl, F; Matyja, A; McOnie, S; Miyabayashi, K; Miyata, H; Miyazaki, Y; Mizuk, R; Moloney, G R; Mori, T; Nagasaka, Y; Nakano, E; Nakao, M; Nakazawa, H; Natkaniec, Z; Nishida, S; Nitoh, O; Noguchi, S; Nozaki, T; Ohshima, T; Okuno, S; Olsen, S L; Ozaki, H; Pakhlov, P; Pakhlova, G; Palka, H; Park, C W; Park, H; Park, H K; Peak, L S; Pestotnik, R; Piilonen, L E; Poluektov, A; Sahoo, H; Sakai, Y; Schneider, O; Schwartz, A J; Seidl, R; Sekiya, A; Senyo, K; Sevior, M E; Shapkin, M; Shebalin, V; Shen, C P; Shiu, J G; Shwartz, B; Singh, J B; Sokolov, A; Stanic, S; Staric, M; Sumiyoshi, T; Takasaki, F; Tamura, N; Tanaka, M; Taylor, G N; Teramoto, Y; Trabelsi, K; Tsuboyama, T; Uehara, S; Uglov, T; Unno, Y; Uno, S; Urquijo, P; Usov, Yu; Varner, G; Vervink, K; Vinokurova, A; Wang, C H; Wang, P; Wang, X L; Watanabe, Y; Won, E; Yamashita, Y; Yamauchi, M; Yuan, C Z; Zhang, C C; Zhang, Z P; Zhilich, V; Zhulanov, V; Zivko, T; Zupanc, A; Zyukova, O

    2008-01-01

    We report a high-statistics measurement of the branching fraction for tau^- -> pi^- pi^0 nu_tau and the invariant mass spectrum of the produced pi^- pi^0 system using 72.2 fb^-1 of data recorded with the Belle detector at the KEKB asymmetric-energy e^+ e^- collider. The branching fraction obtained is (25.12 +/- 0.01 +/- 0.38)%, where the first error is statistical and the second is systematic. The unfolded pi^- pi^0 mass spectrum is used to determine resonance parameters for the rho(770), rho'(1450), and rho''(1700) mesons. We also use this spectrum to estimate the hadronic (2pi) contribution to the anomalous magnetic moment of the muon (a_{mu}^{pipi}). Our result for a_{mu}^{pipi} integrated over the mass range sqrt{s} = 2m_{pi} - 1.8 GeV/c^2 is a_{mu}^{pipi} = (519.1 +/- 1.5 (exp) +/- 2.6 (Br) +/- 2.5 (isospin)) x 10^{-10}, where the first error is due to the experimental uncertainties, the second is due to the uncertainties in the branching fractions and the third is due to the uncertainties in the isospin...

  8. High Accuracy Extraction of Respiratory Sinus Arrhythmia with Statistical Processing using Normal Distribution

    Science.gov (United States)

    Numata, Takashi; Ogawa, Yutaro; Yoshida, Lui; Kotani, Kiyoshi; Jimbo, Yasuhiko

    The autonomic nervous system is important in maintaining homeostasis by mediating the opposing effects of sympathetic and parasympathetic nervous activity on organs. Although the amplitude of RSA (respiratory sinus arrhythmia) is known to be an index of parasympathetic nervous activity, it is difficult to estimate that activity in real time in everyday situations, partly because of body motions and extrasystoles. In addition, automatic recognition of the R-wave on electrocardiograms is required for real-time analysis of RSA amplitude, and false recognition of the R-wave remains an unresolved problem. In this paper, we propose a method to evaluate the amplitude of RSA accurately using statistical processing with probabilistic models. We then estimate parasympathetic nervous activity during body motion and isometric exercise to examine the validity of the method. Using the proposed method, we demonstrate that the amplitude of RSA can be extracted even in the presence of false R-wave recognition. In addition, an appropriate threshold for the estimate is one or five percent, because with a threshold of ten percent the RSA-amplitude waveforms do not follow the abrupt changes in parasympathetic nervous activity evoked by isometric exercise. Furthermore, the normal distribution is found to be more appropriate than the chi-square distribution for the statistical processing. We therefore expect that the proposed method can evaluate parasympathetic nervous activity with high accuracy in everyday situations.
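    The normal-distribution thresholding idea can be illustrated as a simple outlier filter on RR intervals; beats outside the central mass of a fitted normal distribution are treated as false R-wave detections or extrasystoles (this is a sketch of the general idea, not the paper's full probabilistic model; the z-cutoffs correspond to two-sided 5% and 1% thresholds):

```python
import statistics

def filter_rr(rr, z_cut=1.96):
    """Keep RR intervals within z_cut standard deviations of the mean of
    a fitted normal distribution (z_cut ~ 1.96 for a two-sided 5%
    threshold, ~ 2.576 for 1%); the rest are discarded as artifacts."""
    mu = statistics.mean(rr)
    sd = statistics.stdev(rr)
    return [x for x in rr if abs(x - mu) <= z_cut * sd]

# 20 normal beats plus one missed-beat artifact (roughly doubled interval)
rr = [0.80] * 10 + [0.82] * 10 + [1.60]
print(len(filter_rr(rr)))  # prints 20: the 1.60 s artifact is rejected
```

    The cleaned RR series can then be band-pass filtered around the respiratory frequency to estimate RSA amplitude.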

  9. Multicomponent statistical analysis to identify flow and transport processes in a highly-complex environment

    Science.gov (United States)

    Moeck, Christian; Radny, Dirk; Borer, Paul; Rothardt, Judith; Auckenthaler, Adrian; Berg, Michael; Schirmer, Mario

    2016-11-01

    A combined approach of multivariate statistical analysis, namely factor analysis (FA) and hierarchical cluster analysis (HCA), interpretation of geochemical processes, stable water isotope data, and organic micropollutants was applied to assess spatial patterns of water types in a study area in Switzerland, where drinking-water production lies close to several potential input pathways for contamination. To avoid drinking-water contamination, artificial groundwater recharge of surface water into an aquifer is used to create a hydraulic barrier between the potential contamination pathways and the drinking-water extraction wells. Inter-aquifer mixing is identified in the subsurface, where a large amount of artificially infiltrated surface water mixes with a smaller amount of water originating from the regional flow path in the vicinity of the drinking-water extraction wells. The spatial distribution of the different water types can be estimated, and a conceptual understanding of the system developed. The results of the multivariate statistical analysis are consistent with the information gained from the isotopic data and the organic-micropollutant analyses. The integrated approach, which uses different kinds of observations, can easily be transferred to a variety of hydrological settings to synthesize and evaluate large hydrochemical datasets, and combining additional data with different information content enables effective interpretation of hydrological processes. The approach leads to a sounder conceptual understanding of the system, which is the very basis for developing improved, sustainable water-resources management practices.

  10. Parallelization and High-Performance Computing Enables Automated Statistical Inference of Multi-scale Models.

    Science.gov (United States)

    Jagiella, Nick; Rickert, Dennis; Theis, Fabian J; Hasenauer, Jan

    2017-02-22

    Mechanistic understanding of multi-scale biological processes, such as cell proliferation in a changing biological tissue, is readily facilitated by computational models. While tools exist to construct and simulate multi-scale models, the statistical inference of the unknown model parameters remains an open problem. Here, we present and benchmark a parallel approximate Bayesian computation sequential Monte Carlo (pABC SMC) algorithm, tailored for high-performance computing clusters. pABC SMC is fully automated and returns reliable parameter estimates and confidence intervals. By running the pABC SMC algorithm for ∼10^6 hr, we parameterize multi-scale models that accurately describe quantitative growth curves and histological data obtained in vivo from individual tumor spheroid growth in media droplets. The models capture the hybrid deterministic-stochastic behaviors of 10^5-10^6 cells growing in a 3D dynamically changing nutrient environment. The pABC SMC algorithm reliably converges to a consistent set of parameters. Our study demonstrates a proof of principle for robust, data-driven modeling of multi-scale biological systems and the feasibility of multi-scale model parameterization through statistical inference.
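    A minimal rejection-ABC sampler conveys the core idea behind approximate Bayesian computation (the paper's pABC SMC adds sequential tempering of the tolerance and cluster parallelization; the toy Gaussian problem below is purely illustrative):

```python
import random

def abc_rejection(observed, simulate, prior_sample, eps, n_keep=200, seed=1):
    """Rejection ABC: draw theta from the prior, simulate a summary
    statistic, and keep theta when the summary lands within eps of the
    observed one. The accepted set approximates the posterior."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_keep:
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed) < eps:
            accepted.append(theta)
    return accepted

# Toy problem: infer the mean of a Gaussian from a noisy sample mean
post = abc_rejection(
    observed=2.0,
    simulate=lambda th, rng: th + rng.gauss(0.0, 0.1),
    prior_sample=lambda rng: rng.uniform(0.0, 4.0),
    eps=0.1,
)
est = sum(post) / len(post)  # posterior mean, close to 2.0
```

    SMC variants replace the single tolerance `eps` with a decreasing schedule and reweight particles between generations, which is what makes them efficient enough for expensive multi-scale simulators.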

  11. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Full Text Available Statistical properties of the Joule heating rate, electric field and conductances in the high-latitude ionosphere are studied using a unique one-month measurement made with the EISCAT incoherent scatter radar in Tromsø (66.6° cgmlat) from 6 March to 6 April 2006. The data come from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent data set for studying the MLT and Kp dependence of these parameters with high temporal and spatial resolution.

    All the parameters show a clear MLT variation, which differs between low and high Kp conditions. Our results indicate that the response of morning-sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and the electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates on the nightside are statistically observed at 23 MLT, the location of the electric Harang discontinuity.

    An important outcome of the paper is the set of fitted functions for the Joule heating rate as a function of electric field magnitude, given separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3). In addition to the squared electric field, the fit includes a linear term to probe the possible correlation or anticorrelation between electric field and conductance. Positive correlation is found in the midday sector, and in the morning sector for the high-activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to
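    A two-term fit of the kind described, Q ≈ aE + bE², can be obtained by solving the 2×2 normal equations directly. The sketch below is purely illustrative (synthetic, noiseless data with assumed coefficients, not the paper's measurements):

    ```python
    def fit_linear_plus_square(E, Q):
        """Least-squares fit of Q = a*E + b*E**2 via the 2x2 normal equations."""
        s11 = sum(e * e for e in E)                   # sum of E^2
        s12 = sum(e ** 3 for e in E)                  # sum of E^3
        s22 = sum(e ** 4 for e in E)                  # sum of E^4
        t1 = sum(e * q for e, q in zip(E, Q))         # sum of E*Q
        t2 = sum(e * e * q for e, q in zip(E, Q))     # sum of E^2*Q
        det = s11 * s22 - s12 * s12
        a = (s22 * t1 - s12 * t2) / det
        b = (s11 * t2 - s12 * t1) / det
        return a, b

    # Synthetic check: data generated with a = -0.5 (anticorrelation-like linear
    # term) and b = 2.0 are recovered by the fit.
    E = [0.5 * k for k in range(1, 21)]           # field magnitudes, arbitrary units
    Q = [-0.5 * e + 2.0 * e * e for e in E]       # noiseless "Joule heating rates"
    a, b = fit_linear_plus_square(E, Q)
    ```

    The sign of the fitted linear coefficient then indicates correlation (positive) or anticorrelation (negative) between electric field and conductance.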

  12. Economic consequences of mastitis and withdrawal of milk with high somatic cell count in Swedish dairy herds

    DEFF Research Database (Denmark)

    Nielsen, C; Østergaard, Søren; Emanuelson, U

    2010-01-01

    no clinical mastitis (CM) while keeping the incidence of subclinical mastitis (SCM) constant and vice versa. Six different strategies to withdraw milk with high SCC were compared. The decision to withdraw milk was based on herd-level information in three scenarios: withdrawal was initiated when the predicted......% of the herd net return given the initial incidence of mastitis. Expressed per cow-year, the avoidable cost of mastitis was €55. The costs per case of CM and SCM were estimated at €278 and €60, respectively. Withdrawing milk with high SCC was never profitable because this generated a substantial amount of milk...... resulted in less negative effect on net return. It was concluded that the current milk-pricing system makes it more profitable for farmers to sell a larger amount of milk with higher SCC than to withdraw milk with high SCC to obtain payment premiums, at least in herds with mastitis incidences within...

  13. High Statistics Measurement of B → Λc X and Σc X

    Science.gov (United States)

    Wappler, Frank

    1997-04-01

    This analysis is based on about 5 fb⁻¹ of data recorded by the CLEO II detector operating at CESR on the Υ(4S) resonance and at center-of-mass energies just below the threshold for the production of BB̄ mesons. The first observation of the inclusive decays B → Λ_c^+ X, Σ_c^0 X and Σ_c^++ X has already been reported by ARGUS (1988) and CLEO (1992). We now present a high statistics measurement of these decay modes. Improved values for the branching fractions and momentum spectra will be presented. Results of the search for the inclusive decay B → Σ_c^+ X, which has not yet been observed, will also be presented.

  14. Geant4 electromagnetic physics for high statistic simulation of LHC experiments

    CERN Document Server

    Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L

    2012-01-01

    An overview of the current status of the electromagnetic (EM) physics of the Geant4 toolkit is presented. Recent improvements are focused on the performance of large-scale production for the LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy of EM particle transport without compromising CPU speed. New biasing options applicable to any EM process have been introduced, including algorithms to enhance and suppress processes, force interactions, or split secondary particles. It is shown that the performance of the EM sub-package is improved. We also report extensions of the testing suite allowing high statistics validation of EM physics, including validation of multiple scattering, bremsstrahlung and other models. Cross-checks between the standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.

  15. Modification to the Klein-Nishina cross section for Ge electrons at high statistics limit

    Energy Technology Data Exchange (ETDEWEB)

    Wang, T.F.

    1995-11-03

    Modification factors for the Klein-Nishina cross sections for gamma rays with energies between 50 keV and 250 keV incident on Ge electrons have been obtained in the high-statistics limit. In this limit, the Ge electrons can be treated as obtained from self-consistent augmented-plane-wave calculations, without considering the orientation of the crystal lattice with respect to the incident photons. The kinematic corrections (i.e. the outgoing momenta), on the other hand, have to be taken into account on an event-by-event basis. Even so, the computing time is reduced dramatically, since the relativistic calculation of the modifications to the Klein-Nishina cross sections is the most tedious part. The modification factors are almost linear in the incident photon energy over the energy range of interest for a given outgoing photon angle.

  16. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini-review is on recent advancements in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.

  17. A statistical approach to describe highly excited heavy and superheavy nuclei

    Science.gov (United States)

    Chen, Peng-Hui; Feng, Zhao-Qing; Li, Jun-Qing; Zhang, Hong-Fei

    2016-09-01

    A statistical approach based on the Weisskopf evaporation theory has been developed to describe the de-excitation process of highly excited heavy and superheavy nuclei, in particular proton-rich nuclei. The excited nucleus is cooled by evaporating γ-rays and light particles (neutrons, protons, α, etc.) in competition with binary fission, with structure effects (shell correction, fission barrier, particle separation energy) contributing to these processes. The formation of residual nuclei is evaluated via sequential emission of the possible particles above their separation energies. The available data on fusion-evaporation excitation functions in the 28Si+198Pt reaction are reproduced nicely within the approach. Supported by Major State Basic Research Development Program in China (2015CB856903), National Natural Science Foundation of China Projects (11175218, U1332207, 11475050, 11175074), and Youth Innovation Promotion Association of Chinese Academy of Sciences.

  18. Atrial fibrillatory signal estimation using blind source extraction algorithm based on high-order statistics

    Institute of Scientific and Technical Information of China (English)

    WANG Gang; RAO NiNi; ZHANG Ying

    2008-01-01

    The analysis and characterization of atrial fibrillation (AF) requires, as a key preliminary step, the extraction of the atrial activity (AA) from the 12-lead electrocardiogram (ECG). This contribution proposes a novel non-invasive approach for AA estimation in AF episodes. The method is based on blind source extraction (BSE) using high-order statistics (HOS). The validity and performance of the algorithm are confirmed by extensive computer simulations and experiments on real-world data. In contrast to blind source separation (BSS) methods, BSE extracts only the one desired signal, and it is easy for the machine to judge whether the extracted signal is the AA source by calculating its spectral concentration, whereas with a BSS method it is hard for the machine to judge which of the twelve separated signals is the AA source. The proposed method is therefore expected to have great potential in clinical monitoring.
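    A common high-order-statistics contrast used in blind source extraction is the excess kurtosis, which is near zero for Gaussian data and large for spiky, super-Gaussian sources. The toy check below is illustrative only (synthetic Gaussian vs. Laplacian samples, not the authors' algorithm or ECG data):

    ```python
    import random

    def excess_kurtosis(x):
        """Fourth-order contrast: ~0 for Gaussian data, positive for super-Gaussian."""
        n = len(x)
        m = sum(x) / n
        var = sum((v - m) ** 2 for v in x) / n
        m4 = sum((v - m) ** 4 for v in x) / n
        return m4 / (var * var) - 3.0

    rng = random.Random(42)
    gaussian = [rng.gauss(0.0, 1.0) for _ in range(20000)]
    # Laplacian (double-exponential) samples: spiky, super-Gaussian, excess kurtosis ~3.
    laplacian = [rng.choice((-1, 1)) * rng.expovariate(1.0) for _ in range(20000)]

    # A kurtosis-based extraction contrast prefers the most non-Gaussian direction.
    k_gauss = excess_kurtosis(gaussian)
    k_laplace = excess_kurtosis(laplacian)
    ```

    An extraction algorithm driven by such a contrast pulls out one maximally non-Gaussian source at a time, which is what lets BSE recover a single desired signal rather than a full set of separated channels.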

  19. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics and researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  20. StatPatternRecognition: A C++ Package for Statistical Analysis of High Energy Physics Data

    CERN Document Server

    Narsky, I

    2005-01-01

    Modern analysis of high energy physics (HEP) data needs advanced statistical tools to separate signal from background. A C++ package has been implemented to provide such tools for the HEP community. The package includes linear and quadratic discriminant analysis, decision trees, bump hunting (PRIM), boosting (AdaBoost), bagging and random forest algorithms, and interfaces to the standard backpropagation neural net and radial basis function neural net implemented in the Stuttgart Neural Network Simulator. Supplemental tools such as bootstrap, estimation of data moments, and a test of zero correlation between two variables with a joint elliptical distribution are also provided. The package offers a convenient set of tools for imposing requirements on input data and displaying output. Integrated in the BaBar computing environment, the package maintains a minimal set of external dependencies and therefore can be easily adapted to any other environment. It has been tested on many idealistic and realistic examples.

  1. Gamma-gamma coincidence performance of LaBr3:Ce scintillation detectors vs HPGe detectors in high count-rate scenarios.

    Science.gov (United States)

    Drescher, A; Yoho, M; Landsberger, S; Durbin, M; Biegalski, S; Meier, D; Schwantes, J

    2017-04-01

    A radiation detection system consisting of two cerium-doped lanthanum bromide (LaBr3:Ce) scintillation detectors in a gamma-gamma coincidence configuration has been used to demonstrate the advantages that coincident detection provides relative to a single detector, and that LaBr3:Ce detectors provide relative to high-purity germanium (HPGe) detectors. Signal-to-noise ratios of selected photopeak pairs have been compared to those of HPGe detectors in both single and coincident detector configurations in order to quantify the performance of each configuration. The efficiency and energy resolution of the LaBr3:Ce detectors have been determined and compared to HPGe detectors. Coincident gamma-ray pairs from the radionuclides 152Eu and 133Ba have been identified in a sample dominated by 137Cs. Gamma-gamma coincidence successfully reduced the Compton continuum from the large 137Cs peak, revealed several coincident gamma energies characteristic of these nuclides, and improved the signal-to-noise ratio relative to single-detector measurements. The LaBr3:Ce detectors performed at count rates several times higher than can be achieved with HPGe detectors. The standard background spectrum, consisting of peaks associated with transitions within the LaBr3:Ce crystal, has also been significantly reduced. It is shown that LaBr3:Ce detectors have the unique capability to perform gamma-gamma coincidence measurements in very high count rate scenarios, which can potentially benefit nuclear safeguards in situ measurements of spent nuclear fuel. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Automotive Interior Noise Reduction in High Frequency Using Statistical Energy Analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Xin; WANG Deng-feng; ZHU Lei; MA Zheng-dong

    2009-01-01

    Statistical energy analysis (SEA) is an effective method for predicting the high-frequency vibro-acoustic performance of automobiles. A full-vehicle SEA model is presented for interior noise reduction. It is composed of a number of subsystems based on a 3D model, with all parameters specified for each subsystem. The excitation inputs are measured through road tests under different conditions, including inputs from engine vibration and the sound pressure of the engine bay. The high-frequency accuracy of the SEA model is validated by comparing the analysis results with sound pressure level data measured at the driver's right ear. The noise contributions and sensitivities of the key subsystems are analyzed, and the effectiveness of the noise reduction is verified. Based on the SEA model, an approach combining test and simulation is proposed for noise, vibration and harshness (NVH) design in vehicle development. It comprises building the SEA model, testing for subsystem parameter identification, validating the simulation model, identifying subsystem power inputs, and analyzing design sensitivity. An example is given to demonstrate interior noise reduction at high frequency.

  3. Topical levocabastine—a review of therapeutic efficacy compared with topical sodium cromoglycate and oral terfenadine on days with high pollen counts

    Directory of Open Access Journals (Sweden)

    Marianela de Azevedo

    1995-01-01

    Full Text Available Levocabastine is a new H1-receptor antagonist specifically developed for the topical treatment of seasonal allergic rhinoconjunctivitis. Clinical experience to date clearly demonstrates that levocabastine eye drops and nasal spray are effective and well tolerated in the treatment of this allergic disorder. Analysis of data from a number of comparative trials reveals that topical levocabastine is at least as effective as sodium cromoglycate and the oral antihistamine terfenadine, even on days with high pollen counts (≥50 pollen particles/m³) when symptoms are severe. Coupled with a rapid onset of action and twice-daily dosing, these findings make topical levocabastine an attractive alternative to other therapeutic approaches as a first-line therapy for the treatment of this common condition.

  4. High white blood cell count is associated with a worsening of insulin sensitivity and predicts the development of type 2 diabetes

    DEFF Research Database (Denmark)

    Vozarova, Barbora; Weyer, Christian; Lindsay, Robert S

    2002-01-01

    Chronic low-grade inflammation may be involved in the pathogenesis of insulin resistance and type 2 diabetes. We examined whether a high white blood cell count (WBC), a marker of inflammation, predicts a worsening of insulin action, insulin secretory function, and the development of type 2 diabetes......-g oral glucose tolerance test), insulin action (M; by hyperinsulinemic clamp), and acute insulin secretory response (AIR; by 25-g intravenous glucose challenge). Among 272 subjects who were normal glucose tolerant (NGT) at baseline, 54 developed diabetes over an average follow-up of 5.5 +/- 4.......4 years. Among those who remained nondiabetic, 81 subjects had follow-up measurements of M and AIR. Cross-sectionally, WBC was related to percent body fat (r = 0.32, P AIR (r = 0.06, P = 0.4). In a multivariate analysis, when adjusted for age and sex, both...

  5. The ICESat-2 Inland Water Height Data Product: Evaluation of Water Profiles Using High Altitude Photon Counting Lidar

    Science.gov (United States)

    Jasinski, M. F.; Stoll, J.; Cook, W. B.; Arp, C. D.; Birkett, C. M.; Brunt, K. M.; Harding, D. J.; Jones, B. M.; Markus, T.; Neumann, T.

    2015-12-01

    The Advanced Topographic Laser Altimeter System (ATLAS) on the Ice, Cloud, and Land Elevation Satellite (ICESat-2), scheduled to launch in 2017, is a low-energy, high-repetition-rate, short-pulse-width, 532 nm lidar. Although primarily designed for icecap and sea ice monitoring, ATLAS will also record dense observations over Pan-Arctic inland water bodies throughout its designed three-year life span. These measurements will offer improved understanding of the linkages between climate variability and Arctic hydrology, including closure of the Pan-Arctic water balance. An ICESat-2 Inland Water Body Height Data Product is being developed, consisting of along-track water surface height, slope, and roughness for each ATLAS strong beam, as well as aspect and slope between adjacent beams. The data product will be computed for all global inland water bodies that are traversed by ICESat-2 during clear to moderately clear atmospheric conditions. While the domain of the ATL13 data product is global, the focus is on high-latitude terrestrial regions, where the convergence of the ICESat-2 orbits will provide spatially dense observations. Water bodies will be identified primarily through the use of an "Inland Water Body Shape Mask". In preparation for the mission, the Multiple Altimeter Beam Experimental Lidar (MABEL) was built and flown during numerous high-altitude experiments, observing a wide range of water targets. The current analysis examines several MABEL inland and near-coastal targets from 2012 to 2015, focusing on along-track surface water height and light penetration into water under a range of atmospheric and water conditions. Sites include several Alaska lakes, the Chesapeake Bay, and the near-shore Atlantic coast. Results indicate very good capability for retrieving along-track surface water height, its standard deviation, and penetration depth. Overall, the MABEL data and subsequent analyses have demonstrated the feasibility of the ATL13 algorithm for

  6. High-Moisture Diet for Laboratory Rats: Complete Blood Counts, Serum Biochemical Values, and Intestinal Enzyme Activity

    Science.gov (United States)

    Battles, August H.; Knapka, Joseph T.; Stevens, Bruce R.; Lewis, Laura; Lang, Marie T.; Gruendel, Douglas J.

    1991-01-01

    Rats were fed an irradiated high-moisture diet (KSC-25) with or without access to a water bottle. Physiologic values were compared between these two groups and a group of rats fed a purified diet. Hematologic and serum biochemical values, urine specific gravity, and intestinal enzyme activities were determined from samples collected from the three groups of rats. Sprague Dawley rats (n=32) fed the irradiated high-moisture diet with or without a water bottle were the test animals. Rats (n=16) fed an irradiated purified diet and water provided via a water bottle were the control group. The purified diet formulation, modified AIN-76A, is a commonly used purified diet for laboratory rodents. All rats remained alert and healthy throughout the study. A comparison of the physiologic values of rats in this study with reported normal values indicated that all of the rats in the study were in good health. Significant differences (P less than 0.05) of the physiologic values from each rat group are reported.

  8. Constraints on photoionization feedback from number counts of ultra-faint high-redshift galaxies in the Frontier Fields

    CERN Document Server

    Castellano, M; Ferrara, A; Merlin, E; Fontana, A; Amorín, R; Grazian, A; Mármol-Queralto, E; Michałowski, M J; Mortlock, A; Paris, D; Parsa, S; Pilo, S; Santini, P

    2016-01-01

    We exploit a sample of ultra-faint high-redshift galaxies (demagnified HST $H_{160}$ magnitude $>30$) in the Frontier Fields clusters A2744 and M0416 to constrain a theoretical model for the UV luminosity function (LF) in the presence of photoionization feedback. The objects have been selected on the basis of accurate photometric redshifts computed from multi-band photometry including 7 HST bands and deep $K_s$ and IRAC observations. Magnification is computed on an object-by-object basis from all available lensing models of the two clusters. We take into account source detection completeness as a function of luminosity and size, magnification effects and systematics in the lens modeling of the clusters under investigation. We find that our sample of high-$z$ galaxies constrain the cut-off halo circular velocity below which star-formation is suppressed by photo-ionization feedback to $v_c^{\\rm cut} < 50$ km s$^{-1}$. This circular velocity corresponds to a halo mass of $\\approx5.6\\times10^9~M_\\odot$ and $\\a...

  9. Enzyme pretreatment of high-count cotton fabric

    Institute of Scientific and Technical Information of China (English)

    岳仕芳

    2012-01-01

    Enzymes were applied in the pretreatment of high-count cotton fabrics to realize alkali-free clean production. The quality indexes of the semi-products and the wastewater indexes of several pretreatment processes are discussed. The results show that high-count cotton fabrics pretreated with a peroxide desizing-and-bleaching, enzyme cold pad-batch and hot-washing process achieve a whiteness of 81.39%, a capillary effect of 8.2 cm/30 min and a low strength loss of 5.8%, i.e. a high strength retention. Because it is carried out by cold pad-batch, the process realizes alkali-free pretreatment, substantially cuts water and energy consumption, and reduces wastewater discharge and the burden on wastewater treatment.

  10. Is total lymphocyte count related to nutritional markers in hospitalized older adults?

    Directory of Open Access Journals (Sweden)

    Vânia Aparecida LEANDRO-MERHI

    Full Text Available ABSTRACT BACKGROUND Older patients are commonly malnourished during hospital stay, and a high prevalence of malnutrition is found in hospitalized patients aged more than 65 years. OBJECTIVE To investigate whether total lymphocyte count is related to other nutritional markers in hospitalized older adults. METHODS Hospitalized older adults (N=131) were recruited for a cross-sectional study. Their nutritional status was assessed by the Nutritional Risk Screening (NRS), anthropometry, and total lymphocyte count. The statistical analyses included the chi-square test, Fisher's exact test, and the Mann-Whitney test. Spearman's correlation coefficient determined whether total lymphocyte count was correlated with the nutritional markers, and multiple linear regression determined the parameters associated with lymphocyte count. The significance level was set at 5%. RESULTS According to the NRS, 41.2% of the patients were at nutritional risk, and 36% had mild or moderate depletion according to total lymphocyte count. Total lymphocyte count was weakly correlated with mid-upper arm circumference (r=0.20507), triceps skinfold thickness (r=0.29036), and length of hospital stay (r=-0.21518). Total lymphocyte count differed significantly across the NRS categories: older adults who were not at nutritional risk had higher mean and median total lymphocyte counts (P=0.0245). Multiple regression analysis showed that higher lymphocyte counts were associated with higher triceps skinfold thicknesses and with no nutritional risk according to the NRS. CONCLUSION Total lymphocyte count was correlated with mid-upper arm circumference, triceps skinfold thickness, and nutritional risk according to the NRS. In multiple regression the combined factors that remained associated with lymphocyte count were NRS and triceps skinfold thickness. Therefore, total lymphocyte count may be considered a nutritional marker. Other studies should confirm these findings.
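    Spearman's coefficient, as used in the study above, is simply the Pearson correlation of the rank vectors. A minimal stdlib sketch (assuming no tied values, which would need average ranks; the data are invented for illustration):

    ```python
    def spearman_rho(x, y):
        """Spearman rank correlation: Pearson correlation computed on ranks.

        Assumes no tied values; ties would require average (fractional) ranks."""
        def ranks(v):
            order = sorted(range(len(v)), key=lambda i: v[i])
            r = [0] * len(v)
            for rank, i in enumerate(order, start=1):
                r[i] = rank
            return r

        rx, ry = ranks(x), ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)

    # Illustrative: any monotonically increasing relationship gives rho = 1,
    # even a nonlinear one; reversing one variable gives rho = -1.
    x = [1.2, 3.4, 2.2, 5.9, 4.1]
    rho_up = spearman_rho(x, [v ** 3 for v in x])
    rho_down = spearman_rho(x, [-v for v in x])
    ```

    Because it works on ranks, Spearman's rho is robust to the skewed, non-normal distributions typical of clinical markers such as lymphocyte counts.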

  11. The Suzaku View of Highly Ionized Outflows in AGN. 1; Statistical Detection and Global Absorber Properties

    Science.gov (United States)

    Gofford, Jason; Reeves, James N.; Tombesi, Francesco; Braito, Valentina; Turner, T. Jane; Miller, Lance; Cappi, Massimo

    2013-01-01

    We present the results of a new spectroscopic study of Fe K-band absorption in active galactic nuclei (AGN). Using data obtained from the Suzaku public archive we have performed a statistically driven blind search for Fe XXV Heα and/or Fe XXVI Lyα absorption lines in a large sample of 51 Type 1.0-1.9 AGN. Through extensive Monte Carlo simulations we find that statistically significant absorption is detected at E ≳ 6.7 keV in 20/51 sources at the P_MC ≥ 95 per cent level, which corresponds to approximately 40 per cent of the total sample. In all cases, individual absorption lines are detected independently and simultaneously amongst the two (or three) available X-ray imaging spectrometer (XIS) detectors, which confirms the robustness of the line detections. The most frequently observed outflow phenomenology consists of two discrete absorption troughs corresponding to Fe XXV Heα and Fe XXVI Lyα at a common velocity shift. From xstar fitting, the mean column density and ionization parameter of the Fe K absorption components are log(N_H/cm^-2) ≈ 23 and log(ξ/erg cm s^-1) ≈ 4.5, respectively. Measured outflow velocities span a continuous range from <1500 km/s up to ~100,000 km/s, with mean and median values of ~0.1c and ~0.056c, respectively. The results of this work are consistent with those recently obtained using XMM-Newton and independently provide strong evidence for the existence of very highly ionized circumnuclear material in a significant fraction of both radio-quiet and radio-loud AGN in the local universe.

  12. Cluster survey of the high-altitude cusp properties: a three-year statistical study

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2004-09-01

    Full Text Available The global characteristics of the high-altitude cusp and its surrounding regions are investigated using a three-year statistical survey based on data obtained by the Cluster spacecraft. The analysis involves an elaborate orbit-sampling methodology that uses a model field and takes into account the actual solar wind conditions and level of geomagnetic activity. The spatial distributions of the magnetic field and various plasma parameters in the vicinity of the low-magnetic-field exterior cusp are determined, and it is found that: (1) the magnetic field distribution shows the presence of an intermediate region between the magnetosheath and the magnetosphere: the exterior cusp; (2) this region is characterized by the presence of dense plasma of magnetosheath origin, and a comparison with the Tsyganenko (1996) magnetic field model shows that it is diamagnetic in nature; (3) the spatial distributions show that three distinct boundaries with the lobes, the dayside plasma sheet and the magnetosheath surround the exterior cusp; (4) the external boundary with the magnetosheath has a sharp bulk velocity gradient, as well as a density decrease and temperature increase as one goes from the magnetosheath to the exterior cusp; (5) while the two inner boundaries form a funnel, the external boundary shows no clear indentation; (6) the plasma and magnetic pressure distributions suggest that the exterior cusp is in equilibrium with its surroundings in a statistical sense; and (7) a preliminary analysis of the bulk flow distributions suggests that the exterior cusp is stagnant under northward IMF conditions but convective under southward IMF conditions.

  13. A comprehensive statistical investigation of schlieren image velocimetry (SIV) using high-velocity helium jet

    Science.gov (United States)

    Biswas, Sayan; Qiao, Li

    2017-03-01

    A detailed statistical assessment of seedless velocity measurement using Schlieren Image Velocimetry (SIV) was performed using the open-source Robust Phase Correlation (RPC) algorithm. A well-known flow field, an axisymmetric turbulent helium jet, was analyzed in the near and intermediate regions (0 ≤ x/d ≤ 20) for two Reynolds numbers, Re_d = 11,000 and Re_d = 22,000, using schlieren with a horizontal knife-edge, schlieren with a vertical knife-edge and the shadowgraph technique, and the resulting velocity fields from the SIV techniques were compared to traditional Particle Image Velocimetry (PIV) measurements. A novel, inexpensive, easy-to-set-up two-camera SIV technique was demonstrated on the high-velocity turbulent jet, with jet exit velocities of 304 m/s (Mach 0.3) and 611 m/s (Mach 0.6), respectively. Several image restoration and enhancement techniques were tested to improve the signal-to-noise ratio (SNR) of the schlieren and shadowgraph images, and the processing and post-processing parameters for the SIV techniques were examined in detail. A quantitative comparison between the self-seeded SIV techniques and traditional PIV was made using correlation statistics. While the flow fields from schlieren with a horizontal knife-edge and from shadowgraph showed excellent agreement with the PIV measurements, schlieren with a vertical knife-edge performed poorly. The performance of the spatial cross-correlations at different jet locations was evaluated for the SIV techniques and PIV. Turbulence quantities such as turbulence intensity, the mean velocity fields and the Reynolds shear stress heavily influenced the spatial correlations and the correlation-plane SNR. Several performance metrics, such as the primary peak ratio (PPR), peak-to-correlation energy (PCE) and the probability distributions of signal and noise, were used to compare the capability and potential of the different SIV techniques.
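    The core operation shared by PIV and correlation-based SIV is finding the displacement that maximizes the cross-correlation between two interrogation windows. The following is a minimal 1-D analogue with synthetic data (not the RPC algorithm, which works on 2-D images in the Fourier domain with phase normalization):

    ```python
    def best_shift(a, b, max_shift):
        """Displacement of b relative to a that maximizes the (un-normalized)
        cross-correlation -- the core idea of PIV/SIV interrogation, in 1-D."""
        def corr(shift):
            # Sum over the overlap region of a and b displaced by `shift`.
            return sum(a[i] * b[i + shift] for i in range(len(a))
                       if 0 <= i + shift < len(b))
        return max(range(-max_shift, max_shift + 1), key=corr)

    # Synthetic "interrogation window": an intensity pattern and a copy
    # displaced by 7 samples (zero-padded at the edges).
    pattern = [0.0] * 30 + [1.0, 3.0, 5.0, 3.0, 1.0] + [0.0] * 30
    shifted = [0.0] * 7 + pattern[:-7]     # pattern moved right by 7 samples
    shift = best_shift(pattern, shifted, max_shift=10)
    ```

    In SIV the role of the seeding particles is played by density-gradient structures in the schlieren or shadowgraph image, but the displacement-by-correlation-peak principle is the same; metrics like PPR and PCE quantify how sharply that peak stands out of the correlation-plane noise.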

  14. What every radiochemist should know about statistics

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, W.L.

    1994-04-01

    Radionuclide decay and measurement with appropriate counting instruments is one of the few physical processes for which exact mathematical/probabilistic models are available. This paper discusses statistical procedures associated with display and analysis of radionuclide counting data that derive from these exact models. For low count situations the attractiveness of fixed-count-random-time procedures is discussed.
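
    The exact Poisson model underlying counting data gives a standard deviation of √N for N observed counts, which is what makes fixed-count (rather than fixed-time) procedures attractive: the relative precision is chosen before the measurement, and the counting time becomes the random quantity. A minimal sketch (numbers illustrative):

    ```python
    import numpy as np

    # Poisson counting: for N observed counts the standard deviation is sqrt(N),
    # so the relative (1-sigma) uncertainty of the count is 1/sqrt(N).
    def relative_uncertainty(counts):
        return 1.0 / np.sqrt(counts)

    # Fixed-count, random-time: choose the relative precision in advance, then
    # count until that many events have been recorded.
    def counts_for_precision(rel_sigma):
        return int(round(1.0 / rel_sigma**2))

    print(relative_uncertainty(10_000))   # 0.01 -> 1% at ten thousand counts
    print(counts_for_precision(0.01))     # 10000
    ```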

  15. Statistical theory of relaxation of high-energy electrons in quantum Hall edge states

    Science.gov (United States)

    Lunde, Anders Mathias; Nigg, Simon E.

    2016-07-01

    We investigate theoretically the energy exchange between the electrons of two copropagating, out-of-equilibrium edge states with opposite spin polarization in the integer quantum Hall regime. A quantum dot tunnel coupled to one of the edge states locally injects electrons at high energy. Thereby a narrow peak in the energy distribution is created at high energy above the Fermi level. A second downstream quantum dot performs an energy-resolved measurement of the electronic distribution function. By varying the distance between the two dots, we are able to follow every step of the energy exchange and relaxation between the edge states, even analytically under certain conditions. In the absence of translational invariance along the edge, e.g., due to the presence of disorder, energy can be exchanged by non-momentum-conserving two-particle collisions. For weakly broken translational invariance, we show that the relaxation is described by coupled Fokker-Planck equations. From these we find that relaxation of the injected electrons can be understood statistically as a generalized drift-diffusion process in energy space for which we determine the drift velocity and the dynamical diffusion parameter. Finally, we provide a physically appealing picture in terms of individual edge-state heating as a result of the relaxation of the injected electrons.
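
    The generalized drift-diffusion picture in energy space can be illustrated with a 1-D explicit finite-difference evolution of an injected high-energy peak; the grid, drift velocity, and diffusion parameter below are hypothetical stand-ins, not the paper's values:

    ```python
    import numpy as np

    # 1-D drift-diffusion in energy space: df/dt = -v*df/dE + D*d2f/dE2
    E = np.linspace(0.0, 10.0, 401)          # energy grid (arbitrary units)
    dE = E[1] - E[0]
    v, D = -0.5, 0.05                        # drift toward lower E; diffusion
    dt = 0.4 * dE**2 / D                     # stable explicit time step

    f = np.exp(-((E - 8.0) / 0.2) ** 2)      # narrow injected peak at high E

    for _ in range(2000):
        adv = np.zeros_like(f)
        lap = np.zeros_like(f)
        adv[1:-1] = (f[2:] - f[:-2]) / (2 * dE)              # first derivative
        lap[1:-1] = (f[2:] - 2 * f[1:-1] + f[:-2]) / dE**2   # second derivative
        f = f + dt * (-v * adv + D * lap)
        f[0] = f[-1] = 0.0                                   # absorbing ends

    peak = E[np.argmax(f)]
    print("peak relaxed from E = 8.0 down to E = %.2f" % peak)
    ```

    The injected peak drifts toward lower energy at speed |v| while broadening at a rate set by D, the qualitative behavior the abstract describes for the relaxing electrons.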

  16. Statistical analysis of laser driven protons using a high-repetition-rate tape drive target system

    Directory of Open Access Journals (Sweden)

    Muhammad Noaman-ul-Haq

    2017-04-01

Full Text Available One of the challenges for laser-driven proton beams for many potential applications is their stability and reproducibility. We investigate the stability of laser-driven proton beams through statistical analysis of data obtained with a high-repetition-rate tape-drive target system. Characterization of the target system shows positioning of the target within ∼15 μm in the focal plane of an off-axis parabola, with less than a micron of variation in surface flatness. Employing this stable target system, we study the stability of proton beams driven by ultrashort, intense laser pulses. Protons with maximum energies of ∼6±0.3 MeV were accelerated over a large number of laser shots taken at a rate of 0.2 Hz, with less than 5% variation in cutoff energy. The development of a high-repetition-rate target system may provide a platform to understand the dynamics of laser-driven proton beams at the rate required for future applications.

  17. Statistical characteristics of the double ridges of subtropical high in the Northern Hemisphere

    Institute of Scientific and Technical Information of China (English)

    ZHAN Ruifen; LI Jianping; HE Jinhai

    2005-01-01

The generality and some climatological characteristics of the double-ridge systems of the subtropical high (SH) are investigated statistically by using the daily NCEP/NCAR reanalysis data from 1958 to 1998. The results show that the SH double-ridge event is a common phenomenon in the Northern Hemisphere, with distinct seasonal and regional features: the majority of SH double-ridge geneses concentrate over the eastern North Indian Ocean to western North Pacific as well as the central North Pacific in the period from mid-July to mid-September. Over the western North Pacific subtropics in particular, SH double-ridge events are extremely active. It is found that the life cycle of most double-ridge events of the western Pacific subtropical high (WPSH) is short, though some events persist longer. The WPSH double-ridge events occur most frequently from July to September, while there is a paucity of occurrences during November-March. It is also shown that the WPSH double-ridge events have a strong interannual variation with a degree of periodicity, including a remarkably abrupt change in the mid-1970s. Additionally, the relationship between the WPSH double ridges and the meridional movement of the WPSH is discussed.

  18. HDBStat!: A platform-independent software suite for statistical analysis of high dimensional biology data

    Directory of Open Access Journals (Sweden)

    Brand Jacob PL

    2005-04-01

    Full Text Available Abstract Background Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an Html report summarizing data analysis. Conclusion HDBStat! is a platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.

  19. Capsaicin-induced transcriptional changes in hypothalamus and alterations in gut microbial count in high fat diet fed mice.

    Science.gov (United States)

    Baboota, Ritesh K; Murtaza, Nida; Jagtap, Sneha; Singh, Dhirendra P; Karmase, Aniket; Kaur, Jaspreet; Bhutani, Kamlesh K; Boparai, Ravneet K; Premkumar, Louis S; Kondepudi, Kanthi Kiran; Bishnoi, Mahendra

    2014-09-01

    Obesity is a global health problem and recently it has been seen as a growing concern for developing countries. Several bioactive dietary molecules have been associated with amelioration of obesity and associated complications and capsaicin is one among them. The present work is an attempt to understand and provide evidence for the novel mechanisms of anti-obesity activity of capsaicin in high fat diet (HFD)-fed mice. Swiss albino mice divided in three groups (n=8-10) i.e. control, HFD fed and capsaicin (2mg/kg, po)+HFD fed were administered respective treatment for 3months. After measuring phenotypic and serum related biochemical changes, effect of capsaicin on HFD-induced transcriptional changes in hypothalamus, white adipose tissue (WAT) (visceral and subcutaneous), brown adipose tissue (BAT) and gut microbial alterations was studied and quantified. Our results suggest that, in addition to its well-known effects, oral administration of capsaicin (a) modulates hypothalamic satiety associated genotype, (b) alters gut microbial composition, (c) induces "browning" genotype (BAT associated genes) in subcutaneous WAT and (d) increases expression of thermogenesis and mitochondrial biogenesis genes in BAT. The present study provides evidence for novel and interesting mechanisms to explain the anti-obesity effect of capsaicin.

  20. Health Physics counting room

    CERN Multimedia

    1970-01-01

    The Health Physics counting room, where the quantity of induced radioactivity in materials is determined. This information is used to evaluate possible radiation hazards from the material investigated.

  1. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. ... The control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. The confidence interval method is used as the basis for adaptation. A simple statistical model ...
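
    The count-up-count-down logic can be illustrated with a toy closed loop: each detected knock retards the spark by a large step, each knock-free cycle advances it by a small step. The plant model (knock probability as a function of spark advance) and all step sizes below are hypothetical, not taken from the paper:

    ```python
    import math
    import random

    random.seed(1)

    RETARD_STEP = 2.0      # degrees retarded per detected knock (count-up)
    ADVANCE_STEP = 0.25    # degrees advanced per knock-free cycle (count-down)

    def knock_probability(advance):
        # Hypothetical plant model: knock gets likelier as spark is advanced.
        return 1.0 / (1.0 + math.exp(-(advance - 15.0) / 2.0))

    spark_advance = 20.0   # initial spark advance, degrees BTDC
    for _ in range(2000):
        if random.random() < knock_probability(spark_advance):
            spark_advance -= RETARD_STEP     # count up: back off quickly
        else:
            spark_advance += ADVANCE_STEP    # count down: creep back slowly

    # Settles near the advance where RETARD_STEP*p equals ADVANCE_STEP*(1-p)
    print("settled spark advance: %.1f deg" % spark_advance)
    ```

    The asymmetry of the two steps sets the equilibrium knock rate, which is the usual design choice in this kind of controller.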

  2. Serum inhibin-b in fertile men is strongly correlated with low but not high sperm counts: a coordinated study of 1,797 European and US men

    DEFF Research Database (Denmark)

    Jørgensen, Niels; Liu, Fan; Andersson, Anna-Maria;

    2010-01-01

To describe associations between serum inhibin-b and sperm counts, adjusted for effect of time of blood sampling, in larger cohorts than have been previously reported.

  3. Granulocyte colony-stimulating factor increases CD4+ T cell counts of human immunodeficiency virus-infected patients receiving stable, highly active antiretroviral therapy

    DEFF Research Database (Denmark)

    Aladdin, H; Ullum, H; Dam Nielsen, S.

    2000-01-01

Thirty human immunodeficiency virus (HIV)-infected patients with CD4+ T cell counts ... counts resulted from increases in CD45RO+ memory T cells and cells expressing the CD38 activation marker. Lymphocyte proliferative responses to phytohemagglutinin and Candida antigen decreased, whereas NK cell activity and plasma HIV RNA did not change during G-CSF treatment. After 24 weeks, all immune ...

  4. Practical Statistics

    CERN Document Server

    Lyons, L

    2016-01-01

Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  5. Shakespeare Live! and Character Counts.

    Science.gov (United States)

    Brookshire, Cathy A.

    This paper discusses a live production of Shakespeare's "Macbeth" (in full costume but with no sets) for all public middle school and high school students in Harrisonburg and Rockingham, Virginia. The paper states that the "Character Counts" issues that are covered in the play are: decision making, responsibility and…

  6. On Counting the Rational Numbers

    Science.gov (United States)

    Almada, Carlos

    2010-01-01

    In this study, we show how to construct a function from the set N of natural numbers that explicitly counts the set Q[superscript +] of all positive rational numbers using a very intuitive approach. The function has the appeal of Cantor's function and it has the advantage that any high school student can understand the main idea at a glance…

  7. Anarthria impairs subvocal counting.

    Science.gov (United States)

    Cubelli, R; Nichelli, P; Pentore, R

    1993-12-01

We studied subvocal counting in two pure anarthric patients. Analysis showed that they performed definitely worse than normal subjects free to articulate subvocally, and their scores were in the lower bounds of the performances of subjects suppressing articulation. These results suggest that subvocal counting is impaired after anarthria.

  8. High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions

    Energy Technology Data Exchange (ETDEWEB)

    Detmold, Will; Detmold, William; Orginos, Konstantinos; R. Beane, Silas; C. Luu, Thomas; Parreno, Assumpta; J. Savage, Martin; Torok, Aaron; Walker-Loud, Andre

    2009-01-01

We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and a Symanzik-improved gauge action. A total of $\Nprops$ sets of measurements are made using $\Ncfgs$ gauge configurations of size $20^3\times 128$ with an anisotropy parameter $\xi = b_s/b_t = 3.5$, a spatial lattice spacing of $b_s = 0.1227\pm 0.0008~{\rm fm}$, and a pion mass of $\mpi\sim 390~{\rm MeV}$. Ground-state baryon masses are extracted with fully quantified uncertainties that are at or below the $\sim 0.2\%$ level in lattice units. The lowest-lying negative-parity states are also extracted, albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the $N\pi$ threshold and, therefore, the isos

  9. Combined effects of deterministic and statistical structure on high-frequency regional seismograms

    Science.gov (United States)

    Sanborn, Christopher J.; Cormier, Vernon F.; Fitzpatrick, Michele

    2017-08-01

    Radiative transport modelling can combine the effects of both large-scale (deterministic) and the small-scale (statistical) structure on the coda envelopes of high-frequency regional seismograms. We describe a computer code to implement radiative transport modelling that propagates packets of seismic body wave energy along ray paths through large-scale deterministic 3-D structure, including the effects of velocity gradients, intrinsic attenuation, source radiation pattern and multiple scattering by layer boundaries and small-scale heterogeneities specified by a heterogeneity spectrum. The spatial distribution of these energy packets can be displayed as time snapshots to aid in the understanding of regional phase propagation or displayed as a coda envelope by summing at receiver bins. These techniques are applied to earthquakes and explosions recorded in the Lop Nor, China region to model observed narrow band passed seismic codas in the 1-4 Hz band. We predict that source discriminants in this region based on P/Lg amplitude ratios will best separate earthquake and explosion populations at frequencies 2 Hz and higher.

  10. Identifying the 'inorganic gene' for high-temperature piezoelectric perovskites through statistical learning.

    Science.gov (United States)

    Balachandran, Prasanna V; Broderick, Scott R; Rajan, Krishna

    2011-08-01

This paper develops a statistical learning approach to identify potentially new high-temperature ferroelectric piezoelectric perovskite compounds. Unlike most computational studies on crystal chemistry, where the starting point is some form of electronic structure calculation, we use a data-driven approach to initiate our search. This is accomplished by identifying patterns of behaviour between discrete scalar descriptors associated with crystal and electronic structure and the reported Curie temperature (TC) of known compounds; extracting design rules that govern critical structure-property relationships; and discovering in a quantitative fashion the exact role of these materials descriptors. Our approach applies linear manifold methods for data dimensionality reduction to discover the dominant descriptors governing structure-property correlations (the 'genes') and Shannon entropy metrics coupled to recursive partitioning methods to quantitatively assess the specific combination of descriptors that govern the link between crystal chemistry and TC (their 'sequencing'). We use this information to develop predictive models that can suggest new structures/chemistries and/or properties. In this manner, BiTmO3-PbTiO3 and BiLuO3-PbTiO3 are predicted to have a TC of 730 °C and 705 °C, respectively. A quantitative structure-property relationship model similar to those used in biology and drug discovery not only predicts our new chemistries but also validates published reports.
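
    The dimensionality-reduction step ('gene' discovery) can be sketched with plain PCA over a toy descriptor matrix; the descriptors and values are hypothetical, not the paper's dataset:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Toy descriptor matrix: rows = compounds, cols = scalar crystal-chemistry
    # descriptors (e.g. tolerance factor, ionic radii, electronegativity).
    X = rng.normal(size=(30, 6))
    X[:, 1] = 0.8 * X[:, 0] + 0.1 * rng.normal(size=30)   # a correlated pair

    # PCA via SVD of the centered matrix
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S**2 / np.sum(S**2)       # variance fraction per component

    scores = Xc @ Vt.T                    # compounds projected onto PC axes
    print("variance explained by PC1: %.0f%%" % (100 * explained[0]))
    ```

    The leading components play the role of the dominant descriptor combinations; a recursive-partitioning model would then be trained on these scores against TC.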

  11. First high-statistics and high-resolution recoil-ion data from the WITCH retardation spectrometer

    Energy Technology Data Exchange (ETDEWEB)

    Finlay, P.; Breitenfeldt, M.; Porobic, T.; Wursten, E.; Couratin, C.; Soti, G.; Severijns, N. [KU Leuven University, Instituut voor Kern-en Stralingsfysica, Leuven (Belgium); Ban, G.; Fabian, X.; Flechard, X.; Lienard, E. [Normandie Univ., ENSICAEN, UNICAEN, CNRS/IN2P3, LPC Caen, Caen (France); Beck, M.; Friedag, P.; Weinheimer, C. [Universitaet Muenster, Institut fuer Kernphysik, Muenster (Germany); Glueck, F.; Kozlov, V.Y. [Karlsruhe Institute of Technology, Institut fuer Kernphysik, Karlsruhe (Germany); Herlert, A. [FAIR, Darmstadt (Germany); Knecht, A. [KU Leuven University, Instituut voor Kern-en Stralingsfysica, Leuven (Belgium); CERN, PH Department, Geneva (Switzerland); Tandecki, M. [TRIUMF, Vancouver BC (Canada); Traykov, E. [CEA/DSM-CNRS/IN2P3, GANIL, Caen (France); Van Gorp, S. [RIKEN, Atomic Physics Laboratory, Saitama (Japan); Zakoucky, D. [ASCR, Nuclear Physics Institute, Rez (Czech Republic)

    2016-07-15

The first high-statistics and high-resolution data set for the integrated recoil-ion energy spectrum following the β⁺ decay of ³⁵Ar has been collected with the WITCH retardation spectrometer located at CERN-ISOLDE. Over 25 million recoil-ion events were recorded on a large-area multichannel plate (MCP) detector with a time-stamp precision of 2 ns and position resolution of 0.1 mm due to the newly upgraded data acquisition based on the LPC Caen FASTER protocol. The number of recoil ions was measured for more than 15 different settings of the retardation potential, complemented by dedicated background and half-life measurements. Previously unidentified systematic effects, including an energy-dependent efficiency of the main MCP and a radiation-induced time-dependent background, have been identified and incorporated into the analysis. However, further understanding and treatment of the radiation-induced background requires additional dedicated measurements and remains the current limiting factor in extracting a beta-neutrino angular correlation coefficient for ³⁵Ar decay using the WITCH spectrometer. (orig.)

  12. Quality control of high-dose-rate brachytherapy: treatment delivery analysis using statistical process control.

    Science.gov (United States)

    Able, Charles M; Bright, Megan; Frizzell, Bart

    2013-03-01

    Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy. Copyright © 2013 Elsevier Inc. All rights reserved.
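
    The individual and moving-range charts described here follow the standard I-MR construction, with individuals limits at x̄ ± 2.66·(mean moving range) and a moving-range upper limit of 3.267·(mean moving range). A minimal sketch with hypothetical dose readings (not the study's data):

    ```python
    import numpy as np

    # Hypothetical MOSFET dose readings (cGy) from repeated HDR deliveries
    doses = np.array([201.2, 198.7, 200.4, 199.9, 202.1, 198.3,
                      200.8, 199.5, 201.6, 200.2, 199.1, 200.9])

    mr = np.abs(np.diff(doses))        # moving range of successive readings
    x_bar, mr_bar = doses.mean(), mr.mean()

    # Standard I-MR chart constants (2.66 = 3/d2, 3.267 = D4 for n = 2)
    ucl_i, lcl_i = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
    ucl_mr = 3.267 * mr_bar

    out_of_control = (doses > ucl_i) | (doses < lcl_i)
    print("I-chart limits: [%.2f, %.2f]" % (lcl_i, ucl_i))
    print("points out of control:", int(out_of_control.sum()))
    ```

    A delivery error such as a displaced needle would show up as readings falling outside these limits, which is the detection mechanism the study evaluates.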

  13. Quality Control of High-Dose-Rate Brachytherapy: Treatment Delivery Analysis Using Statistical Process Control

    Energy Technology Data Exchange (ETDEWEB)

    Able, Charles M., E-mail: cable@wfubmc.edu [Department of Radiation Oncology, Wake Forest School of Medicine, Winston-Salem, North Carolina (United States); Bright, Megan; Frizzell, Bart [Department of Radiation Oncology, Wake Forest School of Medicine, Winston-Salem, North Carolina (United States)

    2013-03-01

    Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.

  14. High statistics analysis using anisotropic clover lattices: (III) Baryon-baryon interactions

    Energy Technology Data Exchange (ETDEWEB)

    Beane, S; Detmold, W; Lin, H; Luu, T; Orginos, K; Savage, M; Torok, A; Walker-Loud, A

    2010-01-19

Low-energy baryon-baryon interactions are calculated in a high-statistics lattice QCD study on a single ensemble of anisotropic clover gauge-field configurations at a pion mass of m_π ≈ 390 MeV, a spatial volume of L³ ≈ (2.5 fm)³, and a spatial lattice spacing of b ≈ 0.123 fm. Luescher's method is used to extract nucleon-nucleon, hyperon-nucleon and hyperon-hyperon scattering phase shifts at one momentum from the one- and two-baryon ground-state energies in the lattice volume. The isospin-3/2 NΣ interactions are found to be highly spin-dependent, and the interaction in the ³S₁ channel is found to be strong. In contrast, the NΛ interactions are found to be spin-independent, within the uncertainties of the calculation, consistent with the absence of one-pion-exchange. The only channel for which a negative energy-shift is found is ΛΛ, indicating that the ΛΛ interaction is attractive, as anticipated from model-dependent discussions regarding the H-dibaryon. The NN scattering lengths are found to be small, clearly indicating the absence of any fine-tuning in the NN sector at this pion mass. This is consistent with our previous lattice QCD calculation of NN interactions. The behavior of the signal-to-noise ratio in the baryon-baryon correlation functions, and in the ratio of correlation functions that yields the ground-state energy splitting, is explored. In particular, focus is placed on the window of time slices for which the signal-to-noise ratio does not degrade exponentially, as this provides the opportunity to extract quantitative information about multi-baryon systems.

  15. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comite Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better to predict the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of fly ash substitution rate was higher than that of w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.
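
    The multiple-linear-regression comparison via mean squared residuals (MSR) can be sketched with ordinary least squares on hypothetical mix-design data; the factor ranges, coefficients, and noise level below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Toy data: columns = [fly-ash replacement fraction, w/b ratio]
    n = 40
    X = np.column_stack([rng.uniform(0.4, 0.7, n), rng.uniform(0.3, 0.5, n)])
    strength = 80 - 30 * X[:, 0] - 60 * X[:, 1] + rng.normal(0, 2, n)  # MPa

    A = np.column_stack([np.ones(n), X])          # add intercept column
    coef, *_ = np.linalg.lstsq(A, strength, rcond=None)

    residuals = strength - A @ coef
    msr = np.mean(residuals**2)                    # mean squared residual
    print("coefficients:", np.round(coef, 1))
    print("MSR: %.2f MPa^2" % msr)
    ```

    Fitting the same response with weight-based and volume-based design matrices and comparing the resulting MSRs is the model-selection logic the abstract describes.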

  16. High Statistics Analysis using Anisotropic Clover Lattices: (III) Baryon-Baryon Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Beane, Silas [Univ. of New Hampshire, Durham, NH (United States); Detmold, William [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Lin, Huey-Wen [Univ. of Washington, Seattle, WA (United States); Luu, Thomas C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Orginos, Kostas [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Savage, Martin [Univ. of Washington, Seattle, WA (United States); Torok, Aaron M. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics; Walker-Loud, Andre [College of William and Mary, Williamsburg, VA (United States)

    2010-03-01

Low-energy baryon-baryon interactions are calculated in a high-statistics lattice QCD study on a single ensemble of anisotropic clover gauge-field configurations at a pion mass of m_π ≈ 390 MeV, a spatial volume of L³ ≈ (2.5 fm)³, and a spatial lattice spacing of b ≈ 0.123 fm. Luscher's method is used to extract nucleon-nucleon, hyperon-nucleon and hyperon-hyperon scattering phase shifts at one momentum from the one- and two-baryon ground-state energies in the lattice volume. The NΣ interactions are found to be highly spin-dependent, and the interaction in the ³S₁ channel is found to be strong. In contrast, the NΛ interactions are found to be spin-independent, within the uncertainties of the calculation, consistent with the absence of one-pion-exchange. The only channel for which a negative energy-shift is found is ΛΛ, indicating that the ΛΛ interaction is attractive, as anticipated from model-dependent discussions regarding the H-dibaryon. The NN scattering lengths are found to be small, clearly indicating the absence of any fine-tuning in the NN sector at this pion mass. This is consistent with our previous lattice QCD calculation of the NN interactions. The behavior of the signal-to-noise ratio in the baryon-baryon correlation functions, and in the ratio of correlation functions that yields the ground-state energy splitting, is explored.

  17. Reference intervals of complete blood count constituents are highly correlated to waist circumference: should obese patients have their own "normal values?".

    Science.gov (United States)

    Vuong, Jennifer; Qiu, Yuelin; La, Myanh; Clarke, Gwen; Swinkels, Dorine W; Cembrowski, George

    2014-07-01

Body mass index (BMI), the prevalent indicator of obesity, is not easily grasped by patients or physicians. Waist circumference (WC) is correlated to obesity, is better understood and has a stronger relationship to the metabolic syndrome. We compiled WC, complete blood count (CBC) parameters as well as other pertinent data of 6766 25-55-year-old US volunteers sampled in the US National Health and Nutrition Examination Survey, in the years 2005-2010. To determine reference intervals of typical US patients visiting their clinician, we used minimal exclusion criteria. We compiled hemoglobin, red blood cell count, hematocrit, mean corpuscular volume (MCV), mean corpuscular hemoglobin concentration, mean cell hemoglobin (MCH), red cell distribution width (RDW), platelet count, mean platelet volume, and counts of white blood cells (WBC), neutrophils, lymphocytes, monocytes, eosinophils, and basophils. In addition, we also compiled serum C reactive protein and serum iron. The three major US races were studied and reference interval diagrams were constructed for each CBC parameter plotted against WC. WBC count, RDW, lymphocyte, neutrophil, and red blood cell count increase with WC. Conversely, serum iron, MCH and MCV decrease. These relationships may be related to insulin resistance and chronic activation of the immune system and the resulting low-grade inflammatory state. WC is a strong predictor for many CBC parameters, suggesting that WC should be taken into account when evaluating blood count results. Clinicians who take care of obese patients should be aware of altered hematology and investigate and treat accordingly.
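
    WC-conditioned reference intervals of the kind proposed here amount to central 95% intervals computed within waist-circumference bins; a sketch on simulated data (not the NHANES dataset), with a mild positive WBC-WC trend built in:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simulated cohort: waist circumference (cm) and WBC count (10^9/L)
    wc = rng.uniform(70, 130, 5000)
    wbc = rng.normal(5.5 + 0.02 * (wc - 70), 1.2)

    bins = np.arange(70, 131, 10)
    for lo, hi in zip(bins[:-1], bins[1:]):
        sel = (wc >= lo) & (wc < hi)
        p2_5, p97_5 = np.percentile(wbc[sel], [2.5, 97.5])
        print("WC %3d-%3d cm: 95%% interval %.1f-%.1f" % (lo, hi, p2_5, p97_5))
    ```

    On real data each bin's interval would shift upward with WC for WBC, RDW, and red cell count, which is the pattern motivating WC-specific "normal values".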

  18. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    Science.gov (United States)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model.
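
    The tail-probability estimate and its Bayesian credible interval described above can be sketched with a toy surrogate. Everything below (the Gaussian field, threshold, and sample sizes) is invented for illustration and is not taken from the SSM itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate: realizations of one climate variable at a fixed location,
# modeled as a Gaussian random variable (parameters invented for illustration).
draws = rng.normal(loc=0.0, scale=1.2, size=100_000)

threshold = 4.0                     # hypothetical "high-consequence" level
k = int((draws > threshold).sum())  # number of exceedances observed
n = draws.size
p_hat = k / n

# Bayesian credible interval for the exceedance probability: with a
# uniform Beta(1, 1) prior, the posterior is Beta(k + 1, n - k + 1).
posterior = rng.beta(k + 1, n - k + 1, size=50_000)
lo, hi = np.quantile(posterior, [0.025, 0.975])
print(f"P(exceedance) ~ {p_hat:.2e}, 95% credible interval [{lo:.2e}, {hi:.2e}]")
```

    The point of the cheap surrogate is exactly this: with independent draws nearly free, the exceedance count is large enough for the posterior interval on the tail probability to be usefully narrow.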

  19. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  20. Application of model output statistics to the GEM-AQ high resolution air quality forecast

    Science.gov (United States)

    Struzewska, J.; Kaminski, J. W.; Jefimow, M.

    2016-11-01

    The aim of the presented work was to analyse the impact of data stratification on the efficiency of the Model Output Statistics (MOS) methodology as applied to a high-resolution deterministic air quality forecast carried out with the GEM-AQ model. The following parameters forecasted by the GEM-AQ model were selected as predictors for the MOS equation: pollutant concentration, air temperature in the lowest model layer, wind speed in the lowest model layer, temperature inversion and the precipitation rate. A representative 2-year series was used to construct the regression functions. The data series were divided into two subsets: approximately 75% of the data (the first 3 weeks of each month) were used to estimate the regression function parameters, and the remaining 25% (the last week of each month) were used to test the method (control period). The subsequent 12 months were used for method verification (verification period). A linear model was used to fit a function of the forecasted parameters to the observations. We assumed four different temperature-based data stratification methods (for each method, separate equations were constructed). For PM10, PM2.5, SO2 and NO2, the best correction results were obtained with the application of temperature thresholds in the cold season and seasonal distribution combined with temperature thresholds in the warm season. For PM10, PM2.5 and SO2, the best results were obtained using a combination of two stratification methods, applied separately to the cold and warm seasons. For CO, the systematic bias of the forecasted concentrations was partly corrected. For ozone, more sophisticated methods of data stratification did not bring a significant improvement.
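
    The regression-based MOS correction and train/control split described above can be sketched as follows. The predictors, coefficients, and the simplified day-of-month split rule are synthetic stand-ins, not the GEM-AQ setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for one station/pollutant series: three forecast
# predictors and an observed concentration (coefficients are invented).
n = 365
X = np.column_stack([
    rng.gamma(2.0, 15.0, n),   # forecast pollutant concentration
    rng.normal(10.0, 8.0, n),  # air temperature, lowest model layer
    rng.gamma(2.0, 2.0, n),    # wind speed, lowest model layer
])
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0.0, 5.0, n)

# Split as in the paper: first ~3 weeks of each month to fit, last week
# as the control period (here a simplified rule assuming 30-day months).
day_of_month = (np.arange(n) % 30) + 1
train = day_of_month <= 21

A = np.column_stack([np.ones(n), X])  # intercept + predictors
coef, *_ = np.linalg.lstsq(A[train], y[train], rcond=None)

pred = A[~train] @ coef               # corrected forecast, control period
rmse = float(np.sqrt(np.mean((pred - y[~train]) ** 2)))
print(f"control-period RMSE: {rmse:.2f}")
```

    Stratification then amounts to repeating this fit on subsets of the rows (e.g. by temperature threshold or season), producing separate coefficient vectors per stratum.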

  1. High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions

    Energy Technology Data Exchange (ETDEWEB)

    Silas R. Beane, Will Detmold, Thomas C. Luu, Konstantinos Orginos, Assumpta Parreno, Martin J. Savage, Aaron Torok, Andre Walker-Loud

    2009-06-01

    We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20^3 x 128 with an anisotropy parameter \\xi= b_s/b_t = 3.5, a spatial lattice spacing of b_s=0.1227\\pm 0.0008 fm, and pion mass of m_\\pi ~ 390 MeV. Ground state baryon masses are extracted with fully quantified uncertainties that are at or below the ~0.2%-level in lattice units. The lowest-lying negative-parity states are also extracted albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the N\\pi threshold and, therefore, the isospin-1/2 \\pi N s-wave scattering phase-shift can be extracted using Luescher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time-slices. This is due to backward propagating states arising from the anti-periodic boundary conditions imposed on the quark-propagators in the time-direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single baryon observables.
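
    The "naive expectation" for the signal-to-noise degradation mentioned above is the standard Parisi/Lepage scaling StN(t) ~ exp(-(m_N - 3/2 m_pi) t). A sketch using the quoted pion mass and lattice spacings; the nucleon mass at m_pi ~ 390 MeV is an assumed illustrative value, not taken from the abstract:

```python
import math

# Parisi/Lepage estimate of nucleon-correlator signal-to-noise decay:
# StN(t) ~ exp(-(m_N - 3/2 * m_pi) * t).
m_pi = 0.390              # GeV (quoted in the abstract)
m_N = 1.16                # GeV (assumed value at this heavy pion mass)
gap = m_N - 1.5 * m_pi    # GeV; sets the exponential decay rate

hbar_c = 0.1973           # GeV * fm
b_t = 0.1227 / 3.5        # temporal spacing in fm: b_s / xi
rate_per_slice = gap * b_t / hbar_c
factor = math.exp(rate_per_slice)
print(f"naive StN loss per time-slice: x{factor:.3f}")
```

    The abstract's observation is that on many time-slices the measured decay is even faster than this estimate, due to backward-propagating states from the anti-periodic temporal boundary conditions.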

  2. High Statistics Analysis using Anisotropic Clover Lattices: (I) Single Hadron Correlation Functions

    Energy Technology Data Exchange (ETDEWEB)

    Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A

    2009-03-23

    We present the results of high-statistics calculations of correlation functions generated with single-baryon interpolating operators on an ensemble of dynamical anisotropic gauge-field configurations generated by the Hadron Spectrum Collaboration using a tadpole-improved clover fermion action and Symanzik-improved gauge action. A total of 292,500 sets of measurements are made using 1194 gauge configurations of size 20^3 x 128 with an anisotropy parameter \xi = b_s/b_t = 3.5, a spatial lattice spacing of b_s = 0.1227 ± 0.0008 fm, and pion mass of m_\pi ~ 390 MeV. Ground state baryon masses are extracted with fully quantified uncertainties that are at or below the ~0.2%-level in lattice units. The lowest-lying negative-parity states are also extracted albeit with a somewhat lower level of precision. In the case of the nucleon, this negative-parity state is above the N\pi threshold and, therefore, the isospin-1/2 \pi N s-wave scattering phase-shift can be extracted using Luescher's method. The disconnected contributions to this process are included indirectly in the gauge-field configurations and do not require additional calculations. The signal-to-noise ratio in the various correlation functions is explored and is found to degrade exponentially faster than naive expectations on many time-slices. This is due to backward propagating states arising from the anti-periodic boundary conditions imposed on the quark-propagators in the time-direction. We explore how best to distribute computational resources between configuration generation and propagator measurements in order to optimize the extraction of single baryon observables.

  3. A comparative verification of high resolution precipitation forecasts using model output statistics

    Science.gov (United States)

    van der Plas, Emiel; Schmeits, Maurice; Hooijman, Nicolien; Kok, Kees

    2017-04-01

    Verification of localized events such as precipitation has become even more challenging with the advent of high-resolution meso-scale numerical weather prediction (NWP). The realism of a forecast suggests that it should compare well against precipitation radar imagery with similar resolution, both spatially and temporally. Spatial verification methods solve some of the representativity issues that point verification gives rise to. In this study a verification strategy based on model output statistics is applied that aims to address both double penalty and resolution effects that are inherent to comparisons of NWP models with different resolutions. Using predictors based on spatial precipitation patterns around a set of stations, an extended logistic regression (ELR) equation is deduced, leading to a probability forecast distribution of precipitation for each NWP model, analysis and lead time. The ELR equations are derived for predictands based on areal calibrated radar precipitation and SYNOP observations. The aim is to extract maximum information from a series of precipitation forecasts, like a trained forecaster would. The method is applied to the non-hydrostatic model Harmonie (2.5 km resolution), Hirlam (11 km resolution) and the ECMWF model (16 km resolution), overall yielding similar Brier skill scores for the 3 post-processed models, but larger differences for individual lead times. Besides, the Fractions Skill Score is computed using the 3 deterministic forecasts, showing somewhat better skill for the Harmonie model. In other words, despite the realism of Harmonie precipitation forecasts, they only perform similarly or somewhat better than precipitation forecasts from the 2 lower resolution models, at least in the Netherlands.
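
    The Brier score and Brier skill score used in the verification above can be written down in a few lines. The forecast and observation series below are synthetic, and the reference forecast is taken to be sample climatology (an assumption; the abstract does not state the reference used):

```python
import numpy as np

def brier_score(p, o):
    """Mean squared error of probability forecasts p against binary outcomes o."""
    p, o = np.asarray(p, dtype=float), np.asarray(o, dtype=float)
    return float(np.mean((p - o) ** 2))

rng = np.random.default_rng(2)
obs = (rng.random(1000) < 0.3).astype(float)  # did the precipitation event occur?
fcst = obs * 0.6 + rng.random(1000) * 0.4     # a sharp (synthetic) probability forecast

bs = brier_score(fcst, obs)
bs_ref = brier_score(np.full_like(obs, obs.mean()), obs)  # climatology reference
bss = 1.0 - bs / bs_ref                                   # Brier skill score
print(f"BS={bs:.3f}  BS_ref={bs_ref:.3f}  BSS={bss:.3f}")
```

    A BSS of 0 means no improvement over the reference and 1 means a perfect forecast, which is why similar BSS values across the three post-processed models is a meaningful statement of comparable skill.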

  4. High-order statistics of microtexton for HEp-2 staining pattern classification.

    Science.gov (United States)

    Han, Xian-Hua; Wang, Jian; Xu, Gang; Chen, Yen-Wei

    2014-08-01

    This study addresses the classification problem of the HEp-2 cell using indirect immunofluorescent (IIF) image analysis, which can indicate the presence of autoimmune diseases by finding antibodies in the patient serum. Generally, the method used for IIF analysis remains subjective, and depends too heavily on the experience and expertise of the physician. Recently, studies have shown that it is possible to identify the cell patterns using IIF image analysis and machine learning techniques. However, a large gap remains between machine recognition rates and those of human experts. This paper explores an approach in which the discriminative features of HEp-2 cell images in IIF are extracted and then the patterns of the HEp-2 cell are identified using machine learning techniques. Motivated by the progress in the research field of computer vision, as a result of which small local pixel pattern distributions can now be highly discriminative, the proposed strategy employs a parametric probability process to model local image patches (textons: microstructures in the cell image) and extracts the higher-order statistics of the model parameters for the image description. The proposed strategy can adaptively characterize the microtexton space of HEp-2 cell images as a generative probability model, and discover the parameters that yield a better fitting of the training space, which would lead to a more discriminant representation for the cell image. The simple linear support vector machine is used for cell pattern identification because of its low computational cost, in particular for large-scale datasets. Experiments using the open HEp-2 cell dataset used in the ICIP2013 contest validate that the proposed strategy can achieve a much better performance than the widely used local binary pattern (LBP) histogram and its extensions, rotation invariant co-occurrence LBP, and pairwise rotation invariant co-occurrence LBP, and that the achieved recognition error rate is even very

  5. Sublattice Counting and Orbifolds

    CERN Document Server

    Hanany, Amihay; Reffert, Susanne

    2010-01-01

    Abelian orbifolds of C^3 are known to be encoded by hexagonal brane tilings. To date it is not known how to count all such orbifolds. We fill this gap by employing number theoretic techniques from crystallography, and by making use of Polya's Enumeration Theorem. The results turn out to be beautifully encoded in terms of partition functions and Dirichlet Series. The same methods apply to counting orbifolds of any toric non-compact Calabi-Yau singularity. As additional examples, we count the orbifolds of the conifold, of the L^{aba} theories, and of C^4.
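
    The basic counting ingredient, enumerating sublattices of a fixed index, can be illustrated in the simplest case of Z^2 via Hermite normal forms. This is a simplified two-dimensional analogue of the C^3 orbifold counting in the abstract (before quotienting by lattice symmetries via Polya's theorem), not the authors' full construction:

```python
# Count distinct sublattices of Z^2 of index n by enumerating Hermite
# normal forms [[a, b], [0, d]] with a * d = n and 0 <= b < d.  The count
# equals sigma_1(n), the sum-of-divisors function, whose Dirichlet series
# is the kind of generating object that appears in orbifold counting.
def sublattice_count(n):
    total = 0
    for a in range(1, n + 1):
        if n % a == 0:
            d = n // a
            total += d  # d choices of the off-diagonal entry b
    return total

def sigma1(n):
    return sum(d for d in range(1, n + 1) if n % d == 0)

counts = [sublattice_count(n) for n in range(1, 7)]
print(counts)  # [1, 3, 4, 7, 6, 12]
```

    The identification with sigma_1(n) is what makes the answer expressible as a Dirichlet series: summing sigma_1(n)/n^s over n gives zeta(s) zeta(s-1).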

  6. Early upper digestive tract side effects of zidovudine with tenofovir plus emtricitabine in West African adults with high CD4 counts

    Directory of Open Access Journals (Sweden)

    Eric Ouattara

    2013-04-01

    Full Text Available Introduction: Tenofovir (TDF) with emtricitabine (FTC) and zidovudine (ZDV) is a recognized alternate first-line antiretroviral (ART) regimen for patients who cannot start treatment with non-nucleoside reverse transcriptase inhibitors (NNRTIs). Clinical studies comparing TDF+FTC+ZDV to other regimens are lacking. Methods: Participants in a trial of early ART in Côte d'Ivoire (Temprano ANRS 12136) started treatment with TDF/FTC plus either efavirenz (EFV) or ZDV (HIV-1+2 dually infected patients and women refusing contraception or previously treated with nevirapine). We compared rates of upper digestive serious adverse events (sAEs) between TDF/FTC+EFV and TDF/FTC+ZDV patients during the first six months of treatment. sAEs were defined as either grade 3–4 AEs or persistent grade 1–2 AEs leading to drug discontinuation. Results: A total of 197 patients (76% women, median CD4 count 395/mm3) started therapy with TDF/FTC, 126 with EFV and 71 with ZDV. During the first six months of ART, 94 patients had digestive AEs (nausea/vomiting) of any grade (EFV 36/126, 29%; ZDV 58/71, 82%; p<0.0001), including 20 sAEs (EFV 3/126, 5%; ZDV 17/71, 24%; p<0.0001). In patients on TDF/FTC+ZDV with digestive AEs, the median time to the first symptom was two days (IQR: 1–4). Plasma ZDV (Cmax) distributions and pill ZDV dosages were normal. Patients with digestive AEs had higher haemoglobin levels and tended to have higher body mass indices and more frequent past histories of cotrimoxazole (CTX) prophylaxis. Conclusions: We observed an unexpectedly high rate of digestive sAEs in West African adults, mostly women, who started a 3-nuc ART with TDF/FTC+ZDV in Côte d'Ivoire. These adults were participating in a trial of early ART and had much higher CD4 counts than those who currently routinely start ART in sub-Saharan Africa. They all received CTX concomitantly with ZDV. We suggest that further early prescriptions of TDF+XTC+ZDV should be carefully monitored and that whenever
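
    The quoted p-values compare adverse-event proportions between the two arms. A sketch of one standard way to make such a comparison, a chi-square statistic for the 2x2 table without continuity correction; the abstract does not state which test the authors actually used:

```python
# Chi-square statistic (no continuity correction) for a 2x2 contingency
# table, applied to the any-grade digestive-AE counts quoted above:
# EFV 36/126 vs ZDV 58/71.
def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# rows: regimen (EFV, ZDV); columns: AE yes / AE no
stat = chi2_2x2(36, 126 - 36, 58, 71 - 58)
print(f"chi-square = {stat:.1f}")  # far above the 3.84 critical value for p < 0.05
```

    A statistic this large on 1 degree of freedom is consistent with the p<0.0001 reported in the abstract.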

  7. Study of mast cell count in skin tags

    Directory of Open Access Journals (Sweden)

    Zaher Hesham

    2007-01-01

    Full Text Available Background: Skin tags or acrochordons are common tumors of middle-aged and elderly subjects. They consist of loose fibrous tissue and occur mainly on the neck and major flexures as small, soft, pedunculated protrusions. Objectives: The aim was to compare the mast cell count in skin tags to adjacent normal skin in diabetic and nondiabetic participants in an attempt to elucidate the possible role of mast cells in the pathogenesis of skin tags. Participants and Methods: Thirty participants with skin tags were divided into group I (15 nondiabetic participants) and group II (15 diabetic participants). Three biopsies were obtained from each participant: a large skin tag, a small skin tag and adjacent normal skin. Mast cell counting from all the obtained sections was carried out, and the mast cell density was expressed as the average mast cell count/high power field (HPF). Results: A statistically significant increase in mast cell count in skin tags in comparison to normal skin was detected in group I and group II. There was no statistically significant difference between mast cell counts in skin tags of the two groups. Conclusion: Both the mast cell mediators and hyperinsulinemia are capable of inducing fibroblast proliferation and epidermal hyperplasia, which are the main pathologic abnormalities seen in all types of skin tags. However, the presence of mast cells in all examined skin tags regardless of diabetes and obesity may point to a possible crucial role of mast cells in the etiogenesis of skin tags through their interaction with fibroblasts and keratinocytes.

  8. HerMES: A search for high-redshift dusty galaxies in the HerMES Large Mode Survey - Catalogue, number counts and early results

    CERN Document Server

    Asboth, V; Sayers, J; Bethermin, M; Chapman, S C; Clements, D L; Cooray, A; Dannerbauer, H; Farrah, D; Glenn, J; Golwala, S R; Halpern, M; Ibar, E; Ivison, R J; Maloney, P R; Marques-Chaves, R; Martinez-Navajas, P I; Oliver, S J; Perez-Fournon, I; Riechers, D A; Rowan-Robinson, M; Scott, Douglas; Siegel, S R; Vieira, J D; Viero, M; Wang, L; Wardlow, J; Wheeler, J

    2016-01-01

    Selecting sources with rising flux densities towards longer wavelengths from Herschel/SPIRE maps is an efficient way to produce a catalogue rich in high-redshift (z > 4) dusty star-forming galaxies. The effectiveness of this approach has already been confirmed by spectroscopic follow-up observations, but the previously available catalogues made this way are limited by small survey areas. Here we apply a map-based search method to 274 deg$^2$ of the HerMES Large Mode Survey (HeLMS) and create a catalogue of 477 objects with SPIRE flux densities $S_{500} > S_{350} >S_{250}$ and a 5 \\sigma cut-off $S_{500}$ > 52 mJy. From this catalogue we determine that the total number of these "red" sources is at least an order of magnitude higher than predicted by galaxy evolution models. These results are in agreement with previous findings in smaller HerMES fields; however, due to our significantly larger sample size we are also able to investigate the shape of the red source counts for the first time. We examine the 500 $...

  9. 1996 : Track Count Protocol

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The goal of St. Vincent National Wildlife Refuge's Track Count Protocol is to provide an index to the population size of game animals inhabiting St. Vincent Island.

  10. Blood Count Tests

    Science.gov (United States)

    Your blood contains red blood cells (RBC), white blood cells (WBC), and platelets. Blood count tests measure the number and types of cells in your blood. This helps doctors check on your overall health. ...

  11. Counting Belief Propagation

    CERN Document Server

    Kersting, Kristian; Natarajan, Sriraam

    2012-01-01

    A major benefit of graphical models is that most knowledge is captured in the model structure. Many models, however, produce inference problems with a lot of symmetries not reflected in the graphical structure and hence not exploitable by efficient inference techniques such as belief propagation (BP). In this paper, we present a new and simple BP algorithm, called counting BP, that exploits such additional symmetries. Starting from a given factor graph, counting BP first constructs a compressed factor graph of clusternodes and clusterfactors, corresponding to sets of nodes and factors that are indistinguishable given the evidence. Then it runs a modified BP algorithm on the compressed graph that is equivalent to running BP on the original factor graph. Our experiments show that counting BP is applicable to a variety of important AI tasks such as (dynamic) relational models and boolean model counting, and that significant efficiency gains are obtainable, often by orders of magnitude.

  12. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...

  13. Housing Inventory Count

    Data.gov (United States)

    Department of Housing and Urban Development — This report displays the data communities reported to HUD about the nature of their dedicated homeless inventory, referred to as their Housing Inventory Count (HIC)....

  14. Allegheny County Traffic Counts

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Traffic sensors at over 1,200 locations in Allegheny County collect vehicle counts for the Pennsylvania Department of Transportation. Data included in the Health...

  15. Liquid Scintillation Counting

    OpenAIRE

    Carlsson, Sten

    1993-01-01

    In liquid scintillation counting (LSC) we use the process of luminescence to detect ionising radiation emitted from a radionuclide. Luminescence is emission of visible light of nonthermal origin. It was early found that certain organic molecules have luminescent properties and such molecules are used in LSC. Today LSC is the most widespread method to detect pure beta-emitters like tritium and carbon-14. It has unique properties in its efficient counting geometry, detectability and the lack of...

  16. Counting curves on surfaces

    OpenAIRE

    2015-01-01

    In this paper we consider an elementary, and largely unexplored, combinatorial problem in low-dimensional topology. Consider a real 2-dimensional compact surface $S$, and fix a number of points $F$ on its boundary. We ask: how many configurations of disjoint arcs are there on $S$ whose boundary is $F$? We find that this enumerative problem, counting curves on surfaces, has a rich structure. For instance, we show that the curve counts obey an effective recursion, in the general framework of to...

  17. Counting RG flows

    OpenAIRE

    Gukov, Sergei

    2016-01-01

    Interpreting renormalization group flows as solitons interpolating between different fixed points, we ask various questions that are normally asked in soliton physics but not in renormalization theory. Can one count RG flows? Are there different "topological sectors" for RG flows? What is the moduli space of an RG flow, and how does it compare to familiar moduli spaces of (supersymmetric) domain walls? Analyzing these questions in a wide variety of contexts --- from counting RG walls to AdS/C...

  18. Resolved SZE Cluster Count

    Institute of Scientific and Technical Information of China (English)

    Jia-Yu Tang; Zu-Hui Fan

    2003-01-01

    We study the counts of resolved SZE (Sunyaev-Zel'dovich effect) clusters expected from an interferometric survey in different cosmological models under different conditions. The self-similar universal gas model and Press-Schechter mass function are used. We take the observing frequency to be 90 GHz, and consider two dish diameters, 1.2 m and 2.5 m. We calculate the number density of the galaxy clusters dN/(dΩdz) at a high flux limit S_ν^lim = 100 mJy and at a relatively low S_ν^lim = 10 mJy. The total numbers of SZE clusters N in two low-Ω0 models are compared. The results show that the influence of the resolved effect depends not only on D, but also on S_ν^lim: at a given D, the effect is more significant for a high than for a low S_ν^lim. Also, the resolved effect for a flat universe is more pronounced than that for an open universe. For D = 1.2 m and S_ν^lim = 10 mJy, the resolved effect is very weak. Considering the designed interferometers which will be used to survey SZE clusters, we find that the resolved effect is insignificant when estimating the expected yield of the SZE cluster surveys.

  19. Design, theoretical analysis, and experimental verification of a CMOS current integrator with 1.2 × 2.05 µm2 microelectrode array for high-sensitivity bacterial counting

    Science.gov (United States)

    Gamo, Kohei; Nakazato, Kazuo; Niitsu, Kiichi

    2017-01-01

    In this paper, we present the design and experimental verification of an amperometric CMOS-based sensor with a current integrator and a 1.2 × 2.05 µm2 bacterial-sized microelectrode array for high-sensitivity bacterial counting. For high-sensitivity bacterial counting with a sufficient signal-to-noise ratio (SNR), noise must be reduced because bacterial-sized microelectrodes can handle only a low current of the order of 100 pA. Thus, we implement a current integrator that is highly effective for noise reduction. Furthermore, for the first time, we use the current integrator in conjunction with the bacterial-sized microelectrode array. On the basis of the results of the proposed current integration, we successfully reduce noise and achieve a high SNR of 30.4 dB. To verify the effectiveness of the proposed CMOS-based sensor, we perform two-dimensional counting of microbeads, which are almost of the same size as bacteria. The measurement results demonstrate successful high-sensitivity two-dimensional (2D) counting of microbeads with a high SNR of 27 dB.
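
    The quoted SNR figures follow the usual decibel convention for amplitude-like quantities. A sketch of the arithmetic; only the ~100 pA current scale comes from the abstract, while the integration window and rms noise charge are invented to make the numbers concrete:

```python
import math

# Decibel SNR for an integrated electrode current.  Integrating boosts the
# signal charge linearly in time while uncorrelated noise grows more slowly,
# which is why the current integrator improves the SNR.
def snr_db(signal, noise):
    return 20.0 * math.log10(signal / noise)

i_signal = 100e-12           # ~100 pA microelectrode current (from the abstract)
t_int = 10e-3                # hypothetical 10 ms integration window
q_signal = i_signal * t_int  # integrated signal charge: 1 pC

q_noise = 3e-14              # hypothetical rms noise charge (30 fC)
snr = snr_db(q_signal, q_noise)
print(f"SNR = {snr:.1f} dB")
```

    With these invented noise numbers the result lands near the 30.4 dB reported in the abstract, which illustrates the scale involved rather than reproducing the measurement.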

  20. Neutron counting with cameras

    Energy Technology Data Exchange (ETDEWEB)

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo [Institut Laue Langevin, Grenoble (France)

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, thus allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is next confronted with the first experimental results. (authors)

  1. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm^-3 in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  2. Statistical downscaling of precipitation using local regression and high accuracy surface modeling method

    Science.gov (United States)

    Zhao, Na; Yue, Tianxiang; Zhou, Xun; Zhao, Mingwei; Liu, Yu; Du, Zhengping; Zhang, Lili

    2017-07-01

    Downscaling precipitation is required in local scale climate impact studies. In this paper, a statistical downscaling scheme was presented with a combination of geographically weighted regression (GWR) model and a recently developed method, high accuracy surface modeling method (HASM). This proposed method was compared with another downscaling method using the Coupled Model Intercomparison Project Phase 5 (CMIP5) database and ground-based data from 732 stations across China for the period 1976-2005. The residual which was produced by GWR was modified by comparing different interpolators including HASM, Kriging, inverse distance weighted method (IDW), and Spline. The spatial downscaling from 1° to 1-km grids for period 1976-2005 and future scenarios was achieved by using the proposed downscaling method. The prediction accuracy was assessed at two separate validation sites throughout China and Jiangxi Province on both annual and seasonal scales, with the root mean square error (RMSE), mean relative error (MRE), and mean absolute error (MAE). The results indicate that the developed model in this study outperforms the method that builds transfer function using the gauge values. There is a large improvement in the results when using a residual correction with meteorological station observations. In comparison with other three classical interpolators, HASM shows better performance in modifying the residual produced by local regression method. The success of the developed technique lies in the effective use of the datasets and the modification process of the residual by using HASM. The results from the future climate scenarios show that precipitation exhibits overall increasing trend from T1 (2011-2040) to T2 (2041-2070) and T2 to T3 (2071-2100) in RCP2.6, RCP4.5, and RCP8.5 emission scenarios. The most significant increase occurs in RCP8.5 from T2 to T3, while the lowest increase is found in RCP2.6 from T2 to T3, increased by 47.11 and 2.12 mm, respectively.
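
    The three verification metrics used above (RMSE, MRE, MAE) can be written down directly; the precipitation values below are invented for illustration:

```python
import numpy as np

# Standard verification metrics for downscaled-vs-observed precipitation.
def rmse(pred, obs):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def mae(pred, obs):
    return float(np.mean(np.abs(pred - obs)))

def mre(pred, obs):
    return float(np.mean(np.abs(pred - obs) / obs))  # assumes obs > 0

obs = np.array([120.0, 80.0, 45.0, 200.0])   # station precipitation (mm)
pred = np.array([110.0, 95.0, 40.0, 185.0])  # downscaled estimate (mm)
print(rmse(pred, obs), mae(pred, obs), mre(pred, obs))
```

    RMSE penalizes large errors quadratically, MAE weights all errors equally, and MRE normalizes by the observation, which is why all three are reported together when precipitation amounts span a wide range.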

  3. High Statistics Analysis using Anisotropic Clover Lattices: (II) Three-Baryon Systems

    Energy Technology Data Exchange (ETDEWEB)

    Beane, S; Detmold, W; Luu, T; Orginos, K; Parreno, A; Savage, M; Torok, A; Walker-Loud, A

    2009-05-05

    We present the results of an exploratory Lattice QCD calculation of three-baryon systems through a high-statistics study of one ensemble of anisotropic clover gauge-field configurations with a pion mass of m_π ≈ 390 MeV. Because of the computational cost of the necessary contractions, we focus on correlation functions generated by interpolating-operators with the quantum numbers of the Ξ⁰Ξ⁰n system, one of the least demanding three baryon systems in terms of the number of contractions. We find that the ground state of this system has an energy of E(Ξ⁰Ξ⁰n) = 3877.9 ± 6.9 ± 9.2 ± 3.3 MeV, corresponding to an energy-shift due to interactions of δE(Ξ⁰Ξ⁰n) = E(Ξ⁰Ξ⁰n) - 2M(Ξ⁰) - M(n) = 4.6 ± 5.0 ± 7.9 ± 4.2 MeV. There are a significant number of time-slices in the three-baryon correlation function for which the signal-to-noise ratio is only slowly degrading with time. This is in contrast to the exponential degradation of the signal-to-noise ratio that is observed at larger times, and is due to the suppressed overlap of the source and sink interpolating-operators that are associated with the variance of the three-baryon correlation function onto the lightest eigenstates in the lattice volume (mesonic systems). As one of the motivations for this area of exploration is the calculation of the structure and reactions of light nuclei, we also present initial results for a system with the quantum numbers of the triton (pnn). This present work establishes a path to multi-baryon systems, and shows that Lattice QCD calculations of the properties and interactions of systems containing four and five baryons are now within sight.

  4. Statistical downscaling of precipitation using local regression and high accuracy surface modeling method

    Science.gov (United States)

    Zhao, Na; Yue, Tianxiang; Zhou, Xun; Zhao, Mingwei; Liu, Yu; Du, Zhengping; Zhang, Lili

    2016-03-01

Downscaling precipitation is required in local-scale climate impact studies. In this paper, a statistical downscaling scheme is presented that combines a geographically weighted regression (GWR) model with a recently developed method, the high accuracy surface modeling method (HASM). The proposed method was compared with another downscaling method using the Coupled Model Intercomparison Project Phase 5 (CMIP5) database and ground-based data from 732 stations across China for the period 1976-2005. The residual produced by GWR was corrected by comparing different interpolators, including HASM, Kriging, the inverse distance weighted method (IDW), and Spline. Spatial downscaling from 1° to 1-km grids for the period 1976-2005 and for future scenarios was achieved using the proposed method. Prediction accuracy was assessed over two validation regions, China as a whole and Jiangxi Province, on both annual and seasonal scales, using the root mean square error (RMSE), mean relative error (MRE), and mean absolute error (MAE). The results indicate that the model developed in this study outperforms the method that builds a transfer function using the gauge values. There is a large improvement in the results when a residual correction with meteorological station observations is used. In comparison with the other three classical interpolators, HASM shows better performance in correcting the residual produced by the local regression method. The success of the developed technique lies in the effective use of the datasets and in the modification of the residual using HASM. The results for the future climate scenarios show that precipitation exhibits an overall increasing trend from T1 (2011-2040) to T2 (2041-2070) and from T2 to T3 (2071-2100) in the RCP2.6, RCP4.5, and RCP8.5 emission scenarios. The most significant increase occurs in RCP8.5 from T2 to T3, and the smallest in RCP2.6 from T2 to T3 (increases of 47.11 and 2.12 mm, respectively).
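The residual-correction step can be sketched with plain inverse-distance weighting (IDW), one of the interpolators the study compares; HASM itself is not publicly packaged, and the station coordinates, residual values, and the `idw` helper below are hypothetical illustrations, not the paper's data:

```python
import math

def idw(stations, values, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` from station values."""
    num = den = 0.0
    for (x, y), v in zip(stations, values):
        d = math.hypot(x - target[0], y - target[1])
        if d < 1e-12:          # target coincides with a station: return it exactly
            return v
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Hypothetical example: a coarse-model precipitation value downscaled by regression,
# then corrected by the interpolated residual (gauge minus model) at nearby stations.
stations  = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
residuals = [5.0, -2.0, 1.0]    # mm, observed minus regression estimate
downscaled = 120.0              # mm, raw regression estimate at the target cell
corrected = downscaled + idw(stations, residuals, (0.4, 0.3))
```

The same correction loop would simply swap `idw` for Kriging, Spline, or HASM, which is how the study compares interpolators.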

  5. High Statistics Analysis using Anisotropic Clover Lattices: (II) Three-Baryon Systems

    Energy Technology Data Exchange (ETDEWEB)

Andre Walker-Loud, William Detmold, Aaron Torok, Konstantinos Orginos, Silas Beane, Tom Luu, Martin Savage, Assumpta Parreno

    2009-10-01

We present the results of an exploratory Lattice QCD calculation of three-baryon systems through a high-statistics study of one ensemble of anisotropic clover gauge-field configurations with a pion mass of m_π ≈ 390 MeV. Because of the computational cost of the necessary contractions, we focus on correlation functions generated by interpolating-operators with the quantum numbers of the Ξ⁰Ξ⁰n system, one of the least demanding three-baryon systems in terms of the number of contractions. We find that the ground state of this system has an energy of E_{Ξ⁰Ξ⁰n} = 3877.9 ± 6.9 ± 9.2 ± 3.3 MeV corresponding to an energy-shift due to interactions of δE_{Ξ⁰Ξ⁰n} = E_{Ξ⁰Ξ⁰n} − 2M_{Ξ⁰} − M_n = 4.6 ± 5.0 ± 7.9 ± 4.2 MeV. There are a significant number of time-slices in the three-baryon correlation function for which the signal-to-noise ratio is only slowly degrading with time. This is in contrast to the exponential degradation of the signal-to-noise ratio that is observed at larger times, and is due to the suppressed overlap of the source and sink interpolating-operators that are associated with the variance of the three-baryon correlation function onto the lightest eigenstates in the lattice volume (mesonic systems). As one of the motivations for this area of exploration is the calculation of the structure and reactions of light nuclei, we also present initial results for a system with the quantum numbers of the triton (pnn). This present work establishes a path to multi-baryon systems, and shows that Lattice QCD calculations of the properties and interactions of systems containing four and five baryons are now within sight.

  6. Relationship between high white blood cell count and insulin resistance (HOMA-IR) in Korean children and adolescents: Korean National Health and Nutrition Examination Survey 2008-2010.

    Science.gov (United States)

    Park, J-M; Lee, D-C; Lee, Y-J

    2017-05-01

    Increasing evidence has indicated that insulin resistance is associated with inflammation. However, few studies have investigated the association between white blood cell (WBC) count and insulin resistance, as measured by a homeostasis model assessment of insulin resistance (HOMA-IR) in a general pediatric population. This study aimed to examine the association between WBC count and insulin resistance as measured by HOMA-IR in a nationally representative sample of children and adolescents. In total, 2761 participants (1479 boys and 1282 girls) aged 10-18 years were selected from the 2008-2010 Korean National Health and Nutrition Examination Survey. Insulin resistance was defined as a HOMA-IR value greater than the 90th percentile. The odds ratios and 95% confidence intervals for insulin resistance were determined using multiple logistic regression analysis. The mean values of most cardiometabolic variables tended to increase proportionally with WBC count quartiles. The prevalence of insulin resistance significantly increased in accordance with WBC count quartiles in both boys and girls. Compared to individuals in the lowest WBC count quartile, the odds ratio for insulin resistance for individuals in the highest quartile was 2.84 in boys and 3.20 in girls, after adjusting for age, systolic blood pressure, body mass index, and waist circumference. A higher WBC count was positively associated with an increased risk of insulin resistance in Korean children and adolescents. This study suggests that WBC count could facilitate the identification of children and adolescents with insulin resistance. Copyright © 2017 The Italian Society of Diabetology, the Italian Society for the Study of Atherosclerosis, the Italian Society of Human Nutrition, and the Department of Clinical Medicine and Surgery, Federico II University. Published by Elsevier B.V. All rights reserved.
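The HOMA-IR index behind the study's definition of insulin resistance is fasting glucose (mmol/L) × fasting insulin (µU/mL) / 22.5, with resistance flagged above the cohort's 90th percentile. A minimal sketch with made-up cohort values; the `percentile` helper uses one simple nearest-rank convention (others exist):

```python
def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an ascending list (one simple convention)."""
    k = max(0, min(len(sorted_vals) - 1, int(round(p / 100 * len(sorted_vals))) - 1))
    return sorted_vals[k]

# Hypothetical cohort: flag insulin resistance as HOMA-IR above the 90th
# percentile, the definition used in the study.
cohort = sorted(homa_ir(g, i) for g, i in [(4.8, 6.0), (5.1, 9.0), (5.6, 14.0),
                                           (4.9, 5.0), (5.3, 11.0), (6.0, 22.0),
                                           (5.0, 7.5), (5.2, 8.0), (5.5, 10.0),
                                           (5.8, 18.0)])
cutoff = percentile(cohort, 90)
is_resistant = homa_ir(5.9, 25.0) > cutoff
```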

  7. Life-cycle assessment of high-voltage assets using statistical tool

    NARCIS (Netherlands)

    Chmura, L.A.

    2014-01-01

Nowadays, utilities are confronted with assets reaching or even exceeding their designed life. This, in turn, implies upcoming replacements to assure reliable network operation. In this thesis, the application of statistical tools for life-time and residual life assessment of

  8. [SEX HORMONE INFLUENCE ON PERIPHERAL NATURAL KILLER CELLS COUNT].

    Science.gov (United States)

    Ivanov, P; Konova, E; Blajeva, Sv; Lukanov, Tsv; Angelova, P; Georgieva, V; Totev, V; Komsa-Penkova, R

    2015-01-01

Proper evaluation of the immunological factors involved in pregnancy establishment increases the possibility of targeted treatment in high-risk gestation cases. Hormonal changes during an ovarian cycle may affect the immune response, which is crucial for embryonic implantation. Peripheral natural killer (pNK) cells are key components of the immune system, and their activities can be regulated by sex hormones. In the present study we investigated the effect of estrogen fluctuation on the number of NK cells in vivo during the early follicular and middle luteal phases of the menstrual cycle. In 63 healthy women with at least one full-term pregnancy and a regular menstrual cycle lasting between 24 and 32 days, blood samples were collected twice for investigation of CD3/CD16/CD56-positive lymphocytes. The mean pNK count in the follicular phase was 11.6% with 4.7% variation (median 10.6%). The mean pNK count in the luteal phase was 12.1% with 5.1% variation (median 11.8%). A two-tailed t-test found no statistically significant difference despite the slight elevation of the pNK cell count in the luteal phase. This insignificant variation in pNK cell count argues against the suggestion that immunological status in women with adverse pregnancy outcomes should be evaluated in a specific phase of the menstrual cycle.

  9. Modeling the envelope statistics of three-dimensional high-frequency ultrasound echo signals from dissected human lymph nodes

    Science.gov (United States)

    Bui, Thanh Minh; Coron, Alain; Mamou, Jonathan; Saegusa-Beecroft, Emi; Yamaguchi, Tadashi; Yanagihara, Eugene; Machi, Junji; Bridal, S. Lori; Feleppa, Ernest J.

    2014-01-01

This work investigates the statistics of the envelope of three-dimensional (3D) high-frequency ultrasound (HFU) data acquired from dissected human lymph nodes (LNs). Nine distributions were employed, and their parameters were estimated using the method of moments. The Kolmogorov-Smirnov (KS) metric was used to quantitatively compare the fit of each candidate distribution to the experimental envelope distribution. The study indicates that the generalized gamma distribution best models the statistics of the envelope data of the three media encountered: LN parenchyma, fat and phosphate-buffered saline (PBS). Furthermore, the envelope statistics of the LN parenchyma satisfy the pre-Rayleigh condition. In terms of high fitting accuracy and computationally efficient parameter estimation, the gamma distribution is the best choice to model the envelope statistics of LN parenchyma, while the Weibull distribution is the best choice to model the envelope statistics of fat and PBS. These results will contribute to the development of more-accurate and automatic 3D segmentation of LNs for ultrasonic detection of clinically significant LN metastases. PMID:25346951
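Fitting the generalized gamma requires special functions, but the study's fit-then-test structure (moment-based parameter estimates followed by a Kolmogorov-Smirnov distance) can be sketched with the Rayleigh distribution, whose CDF is closed-form. The data below are synthetic, not LN echo envelopes:

```python
import math, random

def fit_rayleigh_moments(samples):
    """Method of moments: E[X] = sigma * sqrt(pi/2) for a Rayleigh envelope."""
    mean = sum(samples) / len(samples)
    return mean / math.sqrt(math.pi / 2)

def ks_distance(samples, cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF and a model CDF."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

random.seed(0)
sigma_true = 2.0
# Rayleigh samples by inverse-CDF sampling: x = sigma * sqrt(-2 ln U), U in (0, 1]
env = [sigma_true * math.sqrt(-2.0 * math.log(1.0 - random.random()))
       for _ in range(5000)]
sigma_hat = fit_rayleigh_moments(env)
d = ks_distance(env, lambda x: 1.0 - math.exp(-x * x / (2.0 * sigma_hat ** 2)))
```

Repeating the estimate/KS pair for each candidate distribution and keeping the smallest distance is exactly the model-selection loop the abstract describes.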

  10. Modeling the envelope statistics of three-dimensional high-frequency ultrasound echo signals from dissected human lymph nodes.

    Science.gov (United States)

    Bui, Thanh Minh; Coron, Alain; Mamou, Jonathan; Saegusa-Beecroft, Emi; Yamaguchi, Tadashi; Yanagihara, Eugene; Machi, Junji; Bridal, S Lori; Feleppa, Ernest J

    2014-01-01

This work investigates the statistics of the envelope of three-dimensional (3D) high-frequency ultrasound (HFU) data acquired from dissected human lymph nodes (LNs). Nine distributions were employed, and their parameters were estimated using the method of moments. The Kolmogorov-Smirnov (KS) metric was used to quantitatively compare the fit of each candidate distribution to the experimental envelope distribution. The study indicates that the generalized gamma distribution best models the statistics of the envelope data of the three media encountered: LN parenchyma, fat and phosphate-buffered saline (PBS). Furthermore, the envelope statistics of the LN parenchyma satisfy the pre-Rayleigh condition. In terms of high fitting accuracy and computationally efficient parameter estimation, the gamma distribution is the best choice to model the envelope statistics of LN parenchyma, while the Weibull distribution is the best choice to model the envelope statistics of fat and PBS. These results will contribute to the development of more-accurate and automatic 3D segmentation of LNs for ultrasonic detection of clinically significant LN metastases.

  11. Kids Count in Delaware: Fact Book, 1995.

    Science.gov (United States)

    Delaware Univ., Newark. Kids Count in Delaware.

    This Kids Count fact book examines statewide trends in the well-being of Delaware's children. The statistical portrait is based on key indicators in four areas: single-parent families, births to teenage mothers, juvenile crime and violence, and education. Following brief sections on the state's demographics and economic status, the fact book…

  12. KIDS COUNT in Virginia, 2001 [Data Book].

    Science.gov (United States)

    Action Alliance for Virginia's Children and Youth, Richmond.

    This Kids Count data book details statewide trends in the well-being of Virginia's children. The statistical portrait is based on the following four areas of children's well-being: health and safety; education; family; and economy. Key indicators examined are: (1) prenatal care; (2) low birth weight babies; (3) infant mortality; (4) child abuse or…

  13. Lagrangian and Eulerian statistics of pipe flows measured with 3D-PTV at moderate and high Reynolds numbers

    NARCIS (Netherlands)

    Oliveira, J.L.G.; Geld, van der C.W.M.; Kuerten, J.G.M.

    2013-01-01

    Three-dimensional particle tracking velocimetry (3D-PTV) measurements have provided accurate Eulerian and Lagrangian high-order statistics of velocity and acceleration fluctuations and correlations at Reynolds number 10,300, based on the bulk velocity and the pipe diameter. Spatial resolution requir

  14. Statistical Association: Alignment of Current U.S. High School Textbooks with the Common Core State Standards for Mathematics

    Science.gov (United States)

    Tran, Dung

    2016-01-01

    This study examined the alignment of three selected U.S. high school textbooks series with the Common Core State Standards for Mathematics (CCSSM) regarding the treatment of statistical association. A framework grounded in the literature for inclusion and exclusion of reasoning about association topics was developed, and textbook entries were…

  15. A Multi-Institutional Study of the Relationship between High School Mathematics Achievement and Performance in Introductory College Statistics

    Science.gov (United States)

    Dupuis, Danielle N.; Medhanie, Amanuel; Harwell, Michael; LeBeau, Brandon; Monson, Debra; Post, Thomas R.

    2012-01-01

    In this study we examined the effects of prior mathematics achievement and completion of a commercially developed, National Science Foundation-funded, or University of Chicago School Mathematics Project high school mathematics curriculum on achievement in students' first college statistics course. Specifically, we examined the relationship between…

  16. DISTRIBUTION AND ORIGIN OF HIGH-VELOCITY CLOUDS .2. STATISTICAL-ANALYSIS OF THE WHOLE-SKY SURVEY

    NARCIS (Netherlands)

    WAKKER, BP

    1991-01-01

    A sensitive, almost complete, whole-sky survey of high-velocity clouds (HVCs) has been made available by Bajaja et al. (1985) and Hulsbosch & Wakker (1988, Paper I). This paper (Paper II in a series on HVCs) is dedicated to the analysis of the statistical properties of these surveys. The main conclu

  17. A Multi-Institutional Study of the Relationship between High School Mathematics Achievement and Performance in Introductory College Statistics

    Science.gov (United States)

    Dupuis, Danielle N.; Medhanie, Amanuel; Harwell, Michael; LeBeau, Brandon; Monson, Debra; Post, Thomas R.

    2012-01-01

    In this study we examined the effects of prior mathematics achievement and completion of a commercially developed, National Science Foundation-funded, or University of Chicago School Mathematics Project high school mathematics curriculum on achievement in students' first college statistics course. Specifically, we examined the relationship between…

  18. High white blood cell count is associated with a worsening of insulin sensitivity and predicts the development of type 2 diabetes.

    Science.gov (United States)

    Vozarova, Barbora; Weyer, Christian; Lindsay, Robert S; Pratley, Richard E; Bogardus, Clifton; Tataranni, P Antonio

    2002-02-01

    Chronic low-grade inflammation may be involved in the pathogenesis of insulin resistance and type 2 diabetes. We examined whether a high white blood cell count (WBC), a marker of inflammation, predicts a worsening of insulin action, insulin secretory function, and the development of type 2 diabetes in Pima Indians. We measured WBC in 352 nondiabetic Pima Indians (215 men and 137 women, aged 27 +/- 6 years [means +/- SD], body fat 32 +/- 8%, WBC 8,107 +/- 2,022 cells/mm(3)) who were characterized for body composition (by hydrodensitometry or dual-energy X-ray absorptiometry), glucose tolerance (by 75-g oral glucose tolerance test), insulin action (M; by hyperinsulinemic clamp), and acute insulin secretory response (AIR; by 25-g intravenous glucose challenge). Among 272 subjects who were normal glucose tolerant (NGT) at baseline, 54 developed diabetes over an average follow-up of 5.5 +/- 4.4 years. Among those who remained nondiabetic, 81 subjects had follow-up measurements of M and AIR. Cross-sectionally, WBC was related to percent body fat (r = 0.32, P < 0.0001) and M (r = -0.24, P < 0.0001), but not to AIR (r = 0.06, P = 0.4). In a multivariate analysis, when adjusted for age and sex, both percent body fat (P < 0.0001) and M (P = 0.03) were independently associated with WBC. A high WBC value predicted diabetes (relative hazard 90th vs. 10th percentiles [95%CI] of 2.7 [1.3-5.4], P = 0.007) when adjusted for age and sex. The predictive effect of WBC persisted after additional adjustment for established predictors of diabetes, i.e., percent body fat, M, and AIR (relative hazard 2.6 [1.1-6.2], P = 0.03). After adjustment for follow-up duration, a high WBC at baseline was associated with a subsequent worsening of M (P = 0.003), but not a worsening of AIR. A high WBC predicts a worsening of insulin action and the development of type 2 diabetes in Pima Indians. These findings are consistent with the hypothesis that a chronic activation of the immune system may play a

  19. [Blood Count Specimen].

    Science.gov (United States)

    Tamura, Takako

    2015-12-01

The circulating blood volume accounts for 8% of body weight, of which 45% comprises cellular components (blood cells) and 55% liquid components. In a complete blood count (CBC) we can measure the number and morphological features of blood cells (leukocytes, red blood cells, platelets) and the amount of hemoglobin. Blood counts are often used to detect inflammatory conditions such as infection, to assess anemia and bleeding tendency, and to screen for abnormal cells in blood disease. The CBC is widely used as a basic item in health examinations. In recent years, clinical testing before consultation has become common in outpatient clinics, and the influence of laboratory values on the consultation has grown. The CBC, which counts live cells and checks their morphological features, is easily influenced by the environment, techniques, etc., during specimen collection and transportation, so special attention is necessary when reading laboratory data. Providing correct test values that accurately reflect a patient's condition from the laboratory to the clinical side is crucial; inappropriate medical treatment caused by erroneous values from altered specimens must be avoided. To provide correct test values, daily management of devices is a matter of course, and it is also important to understand sources of variability in the data and to proactively provide information to the clinical side. In this chapter, I discuss the effects of specimen collection, blood collection tubes, specimen handling, transportation, and storage on the CBC, along with management and handling methods.

  20. Expectation Maximization for Hard X-ray Count Modulation Profiles

    CERN Document Server

    Benvenuto, Federico; Piana, Michele; Massone, Anna Maria

    2013-01-01

This paper is concerned with the image reconstruction problem in which the measured data are solar hard X-ray modulation profiles obtained from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) instrument. Our goal is to demonstrate that a statistical iterative method classically applied to the image deconvolution problem is very effective when utilized for the analysis of count modulation profiles in solar hard X-ray imaging based on Rotating Modulation Collimators. The algorithm described in this paper solves the maximum likelihood problem iteratively, encoding a positivity constraint into the iterative optimization scheme. The result is therefore a classical Expectation Maximization method, this time applied not to an image deconvolution problem but to image reconstruction from count modulation profiles. The technical reason that makes our implementation particularly effective in this application is the use of a very reliable stopping rule which is able to regularize the solution providing, ...
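For Poisson-distributed counts, the EM iteration described here takes the multiplicative Richardson-Lucy form, which preserves positivity automatically. A dependency-free 1D sketch with a hypothetical 3×3 response matrix standing in for the RHESSI modulation operator, and a fixed iteration count in place of the paper's stopping rule:

```python
def em_step(x, A, y):
    """One EM (Richardson-Lucy) update for Poisson data y ~ Poisson(A x).
    The multiplicative form keeps every component of x nonnegative."""
    m, n = len(A), len(A[0])
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
    ratio = [y[i] / Ax[i] if Ax[i] > 0 else 0.0 for i in range(m)]
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    return [x[j] * sum(A[i][j] * ratio[i] for i in range(m)) / col_sum[j]
            for j in range(n)]

# Hypothetical response (rows: measured profile bins, columns: image pixels)
A = [[0.8, 0.1, 0.1],
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]
x_true = [10.0, 2.0, 5.0]
y = [sum(A[i][j] * x_true[j] for j in range(3)) for i in range(3)]  # noise-free counts
x = [1.0, 1.0, 1.0]
for _ in range(500):   # in practice a stopping rule regularizes the iteration
    x = em_step(x, A, y)
```

Because the columns of `A` sum to one, each update conserves total counts, another well-known property of this iteration.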

  1. Statistical significance of variables driving systematic variation in high-dimensional data

    Science.gov (United States)

    Chung, Neo Christopher; Storey, John D.

    2015-01-01

    Motivation: There are a number of well-established methods such as principal component analysis (PCA) for automatically capturing systematic variation due to latent variables in large-scale genomic data. PCA and related methods may directly provide a quantitative characterization of a complex biological variable that is otherwise difficult to precisely define or model. An unsolved problem in this context is how to systematically identify the genomic variables that are drivers of systematic variation captured by PCA. Principal components (PCs) (and other estimates of systematic variation) are directly constructed from the genomic variables themselves, making measures of statistical significance artificially inflated when using conventional methods due to over-fitting. Results: We introduce a new approach called the jackstraw that allows one to accurately identify genomic variables that are statistically significantly associated with any subset or linear combination of PCs. The proposed method can greatly simplify complex significance testing problems encountered in genomics and can be used to identify the genomic variables significantly associated with latent variables. Using simulation, we demonstrate that our method attains accurate measures of statistical significance over a range of relevant scenarios. We consider yeast cell-cycle gene expression data, and show that the proposed method can be used to straightforwardly identify genes that are cell-cycle regulated with an accurate measure of statistical significance. We also analyze gene expression data from post-trauma patients, allowing the gene expression data to provide a molecularly driven phenotype. Using our method, we find a greater enrichment for inflammatory-related gene sets compared to the original analysis that uses a clinically defined, although likely imprecise, phenotype. The proposed method provides a useful bridge between large-scale quantifications of systematic variation and gene
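A simplified sketch of the jackstraw idea: permute a variable, re-estimate the latent principal component, and use the permuted variable's association with the recomputed PC as a null draw. Everything below is illustrative (synthetic data, power iteration for the leading PC, absolute correlation as the association statistic) and omits important details of the actual method:

```python
import math, random

def center(row):
    m = sum(row) / len(row)
    return [x - m for x in row]

def leading_pc(data, iters=200):
    """Leading principal-component scores via power iteration on X^T X.
    Rows of `data` are (already centered) variables; columns are observations."""
    n = len(data[0])
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(row[j] * v[j] for j in range(n)) for row in data]                # w = X v
        v = [sum(data[i][j] * w[i] for i in range(len(data))) for j in range(n)]  # v = X^T w
        norm = math.sqrt(sum(c * c for c in v)) or 1.0
        v = [c / norm for c in v]
    return v

def abs_corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return abs(sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (va * vb))

random.seed(1)
n_obs = 40
signal = [math.sin(j / 3.0) for j in range(n_obs)]   # hypothetical latent variable
data = [center([2.0 * s + random.gauss(0.0, 0.5) for s in signal]) for _ in range(8)]  # PC-driven
data += [center([random.gauss(0.0, 1.0) for _ in range(n_obs)]) for _ in range(8)]     # noise-only

pc = leading_pc(data)
obs_stat = [abs_corr(row, pc) for row in data]

# Jackstraw-style null: permute one variable, recompute the PC, and record the
# permuted variable's association with the recomputed PC.
null = []
for _ in range(50):
    i = random.randrange(len(data))
    perm = data[i][:]
    random.shuffle(perm)
    null.append(abs_corr(perm, leading_pc(data[:i] + [perm] + data[i + 1:])))

p_values = [(1 + sum(s >= t for s in null)) / (1 + len(null)) for t in obs_stat]
```

Permuting the variable before recomputing the PC is what avoids the over-fitting the abstract warns about: the variable no longer contributes its own signal to the component it is tested against.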

  2. Identification of Microorganisms by High Resolution Tandem Mass Spectrometry with Accurate Statistical Significance

    Science.gov (United States)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo

    2016-02-01

Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  3. Turbulence statistics in a spectral element code: a toolbox for High-Fidelity Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Vinuesa, Ricardo [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden); Fick, Lambert [Argonne National Lab. (ANL), Argonne, IL (United States); Negi, Prabal [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden); Marin, Oana [Argonne National Lab. (ANL), Argonne, IL (United States); Merzari, Elia [Argonne National Lab. (ANL), Argonne, IL (United States); Schlatter, Phillip [KTH Mechanics, Stockholm (Sweden); Swedish e-Science Research Center (SeRC), Stockholm (Sweden)

    2017-02-01

In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 × 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows one to compute mean-velocity components, the Reynolds-stress tensor as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox allows one to compute turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
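The core averaging operation of such a toolbox, sketched here in Python rather than in Nek5000's own language: samples are averaged over time snapshots and over the homogeneous direction to give mean velocities and Reynolds stresses. The `mean_and_reynolds_stress` helper and its inputs are illustrative, not part of the toolbox:

```python
def mean_and_reynolds_stress(u, v):
    """Average over time snapshots and the homogeneous (streamwise) direction.
    u[t][z] and v[t][z] are velocity samples at one (x, y) point; returns
    (U, V, <u'u'>, <u'v'>) where primes denote fluctuations about the mean."""
    nt, nz = len(u), len(u[0])
    n = nt * nz
    U = sum(sum(row) for row in u) / n
    V = sum(sum(row) for row in v) / n
    uu = sum((u[t][z] - U) ** 2 for t in range(nt) for z in range(nz)) / n
    uv = sum((u[t][z] - U) * (v[t][z] - V) for t in range(nt) for z in range(nz)) / n
    return U, V, uu, uv

# Tiny deterministic example: 2 snapshots x 2 streamwise points
U, V, uu, uv = mean_and_reynolds_stress([[1.0, 3.0], [3.0, 1.0]],
                                        [[2.0, 2.0], [2.0, 2.0]])
```

In a fully three-dimensional flow with no periodic direction, the inner `z` average would simply be dropped and only the time average kept, mirroring the two modes the abstract describes.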

  4. Improving the Precision of Tree Counting by Combining Tree Detection with Crown Delineation and Classification on Homogeneity Guided Smoothed High Resolution (50 cm) Multispectral Airborne Digital Data

    Directory of Open Access Journals (Sweden)

    Masato Katoh

    2012-05-01

A method of counting the number of coniferous trees by species within forest compartments was developed by combining an individual tree-crown delineation technique with a treetop detection technique, using high-spatial-resolution optical sensor data. When this method was verified against field data from the Shinshu University Campus Forest, which comprises various cover types, the accuracy for the total number of trees per stand was higher than 84%. This is an improvement over the individual tree-crown delineation technique alone, which had accuracies lower than 62%, and over the treetop detection technique alone, which had accuracies lower than 78%. However, the accuracy of the number of trees classified by species was less than 84%. The total number of trees by species per stand improved when understory species were excluded, ranging from 45.2% to 93.8% for Chamaecyparis obtusa and C. pisifera and from 37.9% to 98.1% for broad-leaved trees, because many of these were understory species. The better overall results are attributable primarily to the overestimation of Pinus densiflora, Larix kaempferi and broad-leaved trees compensating for the underestimation of C. obtusa and C. pisifera. Practical forest management can be enhanced by registering the output of this technology in a forest geographical information system database. This approach is most useful for conifer plantations containing medium- to old-age trees, which have a higher timber value.

  5. Expression of androgen-producing enzyme genes and testosterone concentration in Angus and Nellore heifers with high and low ovarian follicle count.

    Science.gov (United States)

    Loureiro, Bárbara; Ereno, Ronaldo L; Favoreto, Mauricio G; Barros, Ciro M

    2016-07-15

Follicle population is important when animals are used in assisted reproductive programs. Bos indicus animals have more follicles per follicular wave than Bos taurus animals; on the other hand, B. taurus animals present better fertility than B. indicus animals. Androgens are positively related to the number of antral follicles; moreover, they increase growth factor expression in granulosa cells and oocytes. The experiment was designed to compare testosterone concentration in plasma and follicular fluid, and the mRNA expression of androgen-producing enzymes (CYP11A1, CYP17A1, 3BHSD, and 17BHSD), in follicles from Angus and Nellore heifers. Heifers were assigned to two groups according to the number of follicles: low and high follicle count (HFC) groups. Increased testosterone concentration was measured in both plasma and follicular fluid of Angus heifers; however, there was no difference within groups. Expression of the CYP11A1 gene was higher in follicles from Angus heifers, again with no difference within groups. Expression of the CYP17A1, 3BHSD, and 17BHSD genes was higher in follicles from Nellore heifers, and expression of the CYP17A1 and 3BHSD genes was also higher in the HFC groups of both breeds. Nellore heifers were found to have more antral follicles than Angus heifers. Testosterone concentration was higher in Angus heifers; this increase could be associated with the increased mRNA expression of CYP11A1. Increased expression of androgen-producing enzyme genes (CYP17A1, 3BHSD, and 17BHSD) was detected in Nellore heifers. It can be suggested that testosterone acts through different mechanisms to increase follicle development in Nellore heifers and improve fertility in Angus heifers. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. High Transmitter CD4+ T-Cell Count Shortly after the Time of Transmission in a Study of African Serodiscordant Couples.

    Directory of Open Access Journals (Sweden)

    Etienne Karita

2013 WHO guidelines recommend starting ART at CD4+ T-cell counts ≤500 cells/μL. We present the T-cell counts from adult Africans with HIV shortly following transmission to their sexual partners. HIV-discordant couples in Zambia, Uganda and Rwanda were followed prospectively and received couples counseling and condoms. HIV-uninfected partners were tested for HIV at least quarterly, and HIV-infected partners received HIV care and referral for ART per national guidelines. Upon diagnosis of incident HIV infection in the previously HIV-uninfected partner, a blood sample was collected from both partners to measure CD4+ T-cells and perform viral linkage. The estimated date of infection (EDI) of the incident case was calculated based on testing history. EDI was unknown for suspected transmitting partners. From 2006 to 2011, 4,705 HIV-discordant couples were enrolled in this cohort, and 443 cases of incident HIV infection were documented. Virus linkage analysis was performed in 374 transmission pairs, and 273 (73%) transmissions were linked genetically. CD4 counts in the transmitting partner were measured a median of 56 days after EDI (mean: 90.5, min: 10, max: 396). The median CD4 count was 339 cells/μl (mean: 386.4, min: 15, max: 1,434), and the proportion of partners with a CD4+ T-cell count above 500/μl was 25% (95% CI: 21, 31). In our cohort of discordant couples, 73% of HIV transmissions occurred within the relationship, and the transmitter CD4+ T-cell count shortly after the transmission event was frequently higher than the WHO 2013 ART-initiation threshold.
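The reported proportion with its 95% CI can be reproduced approximately with a Wilson score interval. The study's exact CI method is not stated, and the counts 68/273 below are illustrative numbers consistent with the reported 25%:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion (z=1.96 for 95%)."""
    p = successes / n
    denom = 1.0 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return center - half, center + half

# Illustrative: about 25% of 273 linked transmitting partners had CD4 > 500/uL
lo, hi = wilson_interval(68, 273)
```

With these counts the interval comes out close to the (21%, 31%) range quoted in the abstract.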

  7. Statistics is Easy

    CERN Document Server

    Shasha, Dennis

    2010-01-01

    Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
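
The resampling idea summarized in this abstract can be illustrated with a short sketch (a generic percentile-bootstrap example, not code from the book; the data are made up):

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_resamples=10_000, alpha=0.05, seed=42):
    """Distribution-free percentile bootstrap CI for any statistic."""
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(stat([rng.choice(data) for _ in range(n)])
                   for _ in range(n_resamples))
    return (stats[int(alpha / 2 * n_resamples)],
            stats[int((1 - alpha / 2) * n_resamples) - 1])

data = [12, 15, 9, 14, 11, 16, 10, 13, 12, 14]   # made-up sample
lo, hi = bootstrap_ci(data)
print(lo, hi)  # interval bracketing the sample mean 12.6
```

No normality assumption is made anywhere: the sampling distribution of the statistic is approximated purely by re-drawing from the observed data, which is the counting idea the book builds on.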

  8. Virological profile of pregnant HIV positive women with high levels of CD4 count in low income settings: Can viral load help as eligibility criteria for maternal triple ARV prophylaxis (WHO 2010 option B?

    Directory of Open Access Journals (Sweden)

    Anne Esther Njom Nlend

    2011-10-01

    Full Text Available INTRODUCTION: The objective of the study was to determine HIV-1 RNA load profile during pregnancy and assess the eligibility for the maternal triple antiretroviral prophylaxis. It was an observational cohort of pregnant HIV positive women ignorant of antiretroviral therapy with CD4 cell count of > 350/mm3. METHODS:Routine CD4 cell count assessment in HIV positive pregnant women completed by non exclusive measurement of the viral load by PCR /ARN in those with CD4 cell count > 350/mm3. Exclusion criteria: highly active antiretroviral therapy prior to pregnancy. RESULTS:Between January and December 2010, CD4 cell count was systematically performed in all pregnant women diagnosed as HIV-infected (n=266 in a referral center of 25 antenatal clinics. 63% (N=170 had CD4 cell count > 350/mm3, median: 528 (IQR: 421-625. 145 underwent measurement of viral load by PCR/RNA at a median gestational of 23 weeks of pregnancy (IQR: 19-28. Median viral load 4.4log10/ml, IQR (3.5-4.9.19/145(13% had an undetectable viral load of=1.8log10/ml. 89/145(61% had a viral load of = 4 log10/ml and were eligible for maternal triple ARV prophylaxis. CONCLUSION: More than 6 in 10 pregnant HIV positive women with CD4 cell count of > 350/mm3 may require triple antiretroviral for prophylaxis of MTCT. Regardless of cost, such results are conclusive and may be considered in HIV high burden countries for universal access to triple antiretroviral prophylaxis in order to move towards virtual elimination of HIV MTCT.

  9. phenix.model_vs_data: a high-level tool for the calculation of crystallographic model and data statistics.

    Science.gov (United States)

    Afonine, Pavel V; Grosse-Kunstleve, Ralf W; Chen, Vincent B; Headd, Jeffrey J; Moriarty, Nigel W; Richardson, Jane S; Richardson, David C; Urzhumtsev, Alexandre; Zwart, Peter H; Adams, Paul D

    2010-08-01

    phenix.model_vs_data is a high-level command-line tool for the computation of crystallographic model and data statistics, and the evaluation of the fit of the model to data. Analysis of all Protein Data Bank structures that have experimental data available shows that in most cases the reported statistics, in particular R factors, can be reproduced within a few percentage points. However, there are a number of outliers where the recomputed R values are significantly different from those originally reported. The reasons for these discrepancies are discussed.
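
The R factor mentioned in this abstract has a standard definition, R = Σ| |Fobs| − |Fcalc| | / Σ|Fobs| over all reflections; a minimal sketch (not phenix code, amplitudes made up):

```python
import numpy as np

def r_factor(f_obs, f_calc):
    """R = sum(||Fobs| - |Fcalc||) / sum(|Fobs|) over all reflections."""
    f_obs = np.abs(np.asarray(f_obs, dtype=float))
    f_calc = np.abs(np.asarray(f_calc, dtype=float))
    return np.abs(f_obs - f_calc).sum() / f_obs.sum()

f_obs = [100.0, 50.0, 25.0, 10.0]   # made-up structure-factor amplitudes
f_calc = [95.0, 55.0, 24.0, 12.0]
print(r_factor(f_obs, f_calc))  # 13/185 ≈ 0.0703
```

A discrepancy of "a few percentage points" in this quantity is what the abstract reports when recomputing deposited PDB statistics.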

  10. Dominance of statistical fluctuation in the factorial-moment study of chaos in low multiplicity events of high energy collisions

    Institute of Scientific and Technical Information of China (English)

    刘连寿; 傅菁华; 吴元芳

    2000-01-01

    Using Monte Carlo simulation it is shown that in low multiplicity events the single-event factorial moments are saturated by statistical fluctuations. The diversification of the event-space moments C_{p,q} of the single-event moments as the phase-space scale diminishes, called "erraticity", observed in experiment can readily be reproduced by a flat probability distribution with only statistical fluctuations, and therefore it has nothing to do with chaos as suggested. The possibility of studying chaos in high multiplicity events using erraticity analysis is discussed.
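
The factorial moments at issue are commonly normalized as F_q = ⟨n(n−1)⋯(n−q+1)⟩/⟨n⟩^q, and purely statistical (Poisson) fluctuations give F_q = 1 for every q, which is the baseline the abstract's argument rests on. A minimal sketch of that baseline (illustrative, not the paper's simulation):

```python
import numpy as np

def factorial_moment(counts, q):
    """Normalized factorial moment F_q = <n(n-1)...(n-q+1)> / <n>**q."""
    n = np.asarray(counts, dtype=float)
    prod = np.ones_like(n)
    for k in range(q):
        prod *= n - k
    return prod.mean() / n.mean() ** q

# Purely statistical (Poisson) fluctuations give F_q = 1 for every q;
# dynamical fluctuations would make F_q deviate from 1.
rng = np.random.default_rng(0)
counts = rng.poisson(5.0, size=500_000)
print(factorial_moment(counts, 2), factorial_moment(counts, 3))  # both ≈ 1
```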

  11. Photon counting digital holography

    Science.gov (United States)

    Demoli, Nazif; Skenderović, Hrvoje; Stipčević, Mario; Pavičić, Mladen

    2016-05-01

    Digital holography uses electronic sensors for hologram recording and numerical method for hologram reconstruction enabling thus the development of advanced holography applications. However, in some cases, the useful information is concealed in a very wide dynamic range of illumination intensities and successful recording requires an appropriate dynamic range of the sensor. An effective solution to this problem is the use of a photon-counting detector. Such detectors possess counting rates of the order of tens to hundreds of millions counts per second, but conditions of recording holograms have to be investigated in greater detail. Here, we summarize our main findings on this problem. First, conditions for optimum recording of digital holograms for detecting a signal significantly below detector's noise are analyzed in terms of the most important holographic measures. Second, for time-averaged digital holograms, optimum recordings were investigated for exposures shorter than the vibration cycle. In both cases, these conditions are studied by simulations and experiments.

  12. A high-resolution photon-counting breast CT system with tensor-framelet based iterative image reconstruction for radiation dose reduction

    Science.gov (United States)

    Ding, Huanjun; Gao, Hao; Zhao, Bo; Cho, Hyo-Min; Molloi, Sabee

    2014-10-01

    Both computer simulations and experimental phantom studies were carried out to investigate the radiation dose reduction with tensor framelet based iterative image reconstruction (TFIR) for a dedicated high-resolution spectral breast computed tomography (CT) based on a silicon strip photon-counting detector. The simulation was performed with a 10 cm-diameter water phantom including three contrast materials (polyethylene, 8 mg ml-1 iodine and B-100 bone-equivalent plastic). In the experimental study, the data were acquired with a 1.3 cm-diameter polymethylmethacrylate (PMMA) phantom containing iodine in three concentrations (8, 16 and 32 mg ml-1) at various radiation doses (1.2, 2.4 and 3.6 mGy), and CT images were then reconstructed using the filtered-back-projection (FBP) technique and the TFIR technique, respectively. The image quality of the two techniques was evaluated by quantitative analysis of the contrast-to-noise ratio (CNR) and of the spatial resolution, which was assessed using the task-based modulation transfer function (MTF). Both the simulation and experimental results indicated that the task-based MTF obtained from TFIR reconstruction with one-third of the radiation dose was comparable to that from the FBP reconstruction for low-contrast targets. For high-contrast targets, the TFIR was substantially superior to the FBP reconstruction in terms of spatial resolution. In addition, TFIR was able to achieve a factor of 1.6-1.8 increase in CNR, depending on the target contrast level. This study demonstrates that the TFIR can reduce the required radiation dose by two-thirds for CT image reconstruction compared to the FBP technique. It achieves much better CNR and spatial resolution for high-contrast targets while retaining similar spatial resolution for low-contrast targets. This TFIR technique has been implemented with a graphic processing unit system and it takes approximately 10 s to reconstruct a single-slice CT image.
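
The CNR figure of merit used in studies like this one has the standard form |mean_target − mean_background| / std_background; a minimal sketch on synthetic data (the disc, masks and noise level are made up, not the paper's phantom):

```python
import numpy as np

def cnr(img, target_mask, bg_mask):
    """Contrast-to-noise ratio: |mean_target - mean_bg| / std_bg."""
    t, b = img[target_mask], img[bg_mask]
    return abs(t.mean() - b.mean()) / b.std()

# Synthetic example: a +20 disc on a noisy (sigma = 5) background.
rng = np.random.default_rng(1)
img = rng.normal(100.0, 5.0, size=(64, 64))
yy, xx = np.mgrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2
img[disc] += 20.0
print(cnr(img, disc, ~disc))  # close to 20/5 = 4
```

An iterative reconstruction that suppresses noise (lowering std_bg) without washing out contrast raises this ratio, which is how a 1.6-1.8x CNR gain at fixed dose is possible.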

  13. Rainflow counting revisited

    Energy Technology Data Exchange (ETDEWEB)

    Soeker, H. [Deutsches Windenergie-Institut (Germany)

    1996-09-01

    As a state-of-the-art method, the rainflow counting technique is presently applied everywhere in fatigue analysis. However, the author feels that the potential of the technique is not fully recognized in the wind energy industry, as it is used, most of the time, as a mere data-reduction technique, disregarding some of the inherent information in the rainflow counting results. The ideas described in the following aim at exploiting this information and making it available for use in the design and verification process. (au)
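
The rainflow technique itself can be sketched with the standard three-point counting rule (a generic implementation for illustration, not the author's method; residual reversals left at the end are reported as half cycles):

```python
def rainflow(series):
    """Three-point rainflow counting: returns (full_cycles, half_cycles)
    as lists of ranges; residual reversals become half cycles."""
    # 1) Reduce the series to turning points (local extrema).
    tp = [series[0]]
    for x in series[1:]:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                      # same direction: extend
        elif x != tp[-1]:
            tp.append(x)
    # 2) Stack-based cycle extraction.
    full, half, stack = [], [], []
    for p in tp:
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            if len(stack) == 3:
                half.append(y)              # range contains the start point
                stack.pop(0)
            else:
                full.append(y)              # a closed hysteresis loop
                del stack[-3:-1]
    half.extend(abs(b - a) for a, b in zip(stack, stack[1:]))
    return full, half

full, half = rainflow([-2, 1, -3, 5, -1, 3, -4, 4, -2])
print(full, sorted(half))  # [4] [3, 4, 6, 8, 8, 9]
```

On this classic test series the routine extracts one full cycle of range 4 and half cycles of ranges 3, 4, 8, 9, 8 and 6; the resulting range spectrum is the "reduced" form of the data that the author argues still carries exploitable information.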

  14. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  15. Evaluation of DAPI direct count, computer assisted and plate count methods

    OpenAIRE

    Chivu, Bogdan

    2010-01-01

    The feasibility of using automatic counting of bacteria stained with the highly specific and sensitive fluorescing DNA stain DAPI (4',6-diamidino-2-phenylindole), and direct manual counting, to enumerate both an overnight pure culture of Pseudomonas putida and a sea water enrichment culture, was tested in correlation with direct plate counting, turbidity and absorbance at 600 nm, to obtain cross-validation. Six diluted samples from an overnight pure culture of Pseudomonas putida and a sea water culture ...

  16. Tsallis Statistical Interpretation of Transverse Momentum Spectra in High-Energy pA Collisions

    Directory of Open Access Journals (Sweden)

    Bao-Chun Li

    2015-01-01

    Full Text Available In Tsallis statistics, we investigate charged pion and proton production for pCu and pPb interactions at 3, 8, and 15 GeV/c. Two versions of Tsallis distribution are implemented in a multisource thermal model. A comparison with experimental data of the HARP-CDP group shows that they both can reproduce the transverse momentum spectra, but the improved form gives a better description. It is also found that the difference between q and q′ is small when the temperature T = T′ for the same incident momentum and angular interval, and the value of q is greater than q′ in most cases.
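
One commonly used Tsallis form for transverse-momentum spectra (one of several variants in the literature; the exact forms compared in the paper may differ) reduces to the Boltzmann exponential as q → 1, which the sketch below checks numerically with made-up parameter values:

```python
import numpy as np

def tsallis(pt, T, q, m=0.139):
    """One common Tsallis form: f(pT) = pT * [1 + (q-1)*mT/T]^(-1/(q-1)),
    with mT = sqrt(pT^2 + m^2); as q -> 1 it reduces to pT*exp(-mT/T)."""
    mt = np.sqrt(pt ** 2 + m ** 2)
    return pt * (1.0 + (q - 1.0) * mt / T) ** (-1.0 / (q - 1.0))

pt = np.linspace(0.1, 2.0, 5)          # GeV/c, illustrative values
boltzmann = pt * np.exp(-np.sqrt(pt ** 2 + 0.139 ** 2) / 0.12)
print(np.allclose(tsallis(pt, T=0.12, q=1.00001), boltzmann, rtol=1e-2))  # True
```

For q > 1 the power-law tail falls off more slowly than the exponential, which is why the nonextensivity parameter q controls the high-pT behavior of the fitted spectra.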

  17. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to simultaneously monitor the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers … in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results obtained on a hapten-antibody assay, where complete characterization of the biochemical …

  18. Low white blood cell count and cancer

    Science.gov (United States)


  19. Reference intervals of complete blood count constituents are highly correlated to waist circumference: Should obese patients have their own "normal values?"

    NARCIS (Netherlands)

    Vuong, J.; Qiu, Y.; La, M.; Clarke, G.; Swinkels, D.W.; Cembrowski, G.

    2014-01-01

    Body mass index (BMI), the prevalent indicator of obesity, is not easily grasped by either patients or physicians. Waist circumference (WC) is correlated to obesity, is better understood and has a stronger relationship to the metabolic syndrome. We compiled WC, complete blood count (CBC) parameters as well

  20. Factors associated with development of opportunistic infections in HIV-1 infected adults with high CD4 cell counts: a EuroSIDA study

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Mocroft, A; Dragsted, Ulrik Bak

    2006-01-01

    BACKGROUND: Limited data exist on factors predicting the development of opportunistic infections (OIs) at higher-than-expected CD4(+) cell counts in human immunodeficiency virus (HIV) type 1-infected adults. METHODS: Multivariate Poisson regression models were used to determine factors related to...

  1. High white blood cell count at diagnosis of childhood acute lymphoblastic leukaemia: biological background and prognostic impact. Results from the NOPHO ALL-92 and ALL-2000 studies

    DEFF Research Database (Denmark)

    Vaitkeviciene, G; Forestier, E; Hellebostad, M;

    2011-01-01

    Prognostic impact of peripheral blood white blood cell count (WBC) at the diagnosis of childhood acute lymphoblastic leukaemia (ALL) was evaluated in a population-based consecutive series of 2666 children aged 1–15 treated for ALL between 1992 and 2008 in the five Nordic countries (Denmark, Finland...

  2. High-tech exports from developing countries: A symptom of technology spurts or statistical illusion?

    OpenAIRE

    Martin Srholec

    2005-01-01

    Specialization in high-tech products is frequently used to capture technology intensity of exports. The literature suggests that developing countries are increasingly becoming exporters of high-tech products, and some may even be among the most deeply specialized countries in the field of high-tech exports. The paper scrutinizes the relevance of the taxonomies that classify exports by technological intensity in this context. It is shown that specialization in high-tech exports typically does ...

  3. What Counts as Evidence?

    Science.gov (United States)

    Dougherty Stahl, Katherine A.

    2014-01-01

    Each disciplinary community has its own criteria for determining what counts as evidence of knowledge in their academic field. The criteria influence the ways that a community's knowledge is created, communicated, and evaluated. Situating reading, writing, and language instruction within the content areas enables teachers to explicitly…

  4. Reticulocyte Count Test

    Science.gov (United States)

    ... may be ordered when: CBC results show a decreased RBC count and/or a decreased hemoglobin and hematocrit. A healthcare practitioner wants to ... and hematocrit, to help determine the degree and rate of overproduction of RBCs ... during pregnancy. Newborns have a higher percentage of reticulocytes, but ...

  5. What Counts as Prostitution?

    Directory of Open Access Journals (Sweden)

    Stuart P. Green

    2016-08-01

    Full Text Available What counts, or should count, as prostitution? In the criminal law today, prostitution is understood to involve the provision of sexual services in exchange for money or other benefits. But what exactly is a ‘sexual service’? And what exactly is the nature of the required ‘exchange’? The key to answering these questions is to recognize that how we choose to define prostitution will inevitably depend on why we believe one or more aspects of prostitution are wrong or harmful, or should be criminalized or otherwise deterred, in the first place. These judgements, in turn, will often depend on an assessment of the contested empirical evidence on which they rest. This article describes a variety of real-world contexts in which the ‘what counts as prostitution’ question has arisen, surveys a range of leading rationales for deterring prostitution, and demonstrates how the answer to the definition question depends on the answer to the normative question. The article concludes with some preliminary thoughts on how analogous questions about what should count as sexual conduct arise in the context of consensual offences such as adultery and incest, as well as non-consensual offences such as sexual assault.

  6. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By asymptotically tight bound we mean that the high-SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
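
The higher-order statistics in question are the moments E[C^n] of the instantaneous capacity C = log2(1 + γ); a Monte Carlo sketch for Rayleigh fading (illustrative only, not the paper's closed-form framework, and the 20 dB operating point is arbitrary):

```python
import numpy as np

def capacity_moments(snr_db, n_max=2, n_samples=1_000_000, seed=0):
    """Monte Carlo estimate of E[C^n] for C = log2(1 + snr*|h|^2),
    with |h|^2 ~ Exp(1), i.e. Rayleigh fading."""
    rng = np.random.default_rng(seed)
    gamma = 10.0 ** (snr_db / 10.0) * rng.exponential(1.0, n_samples)
    c = np.log2(1.0 + gamma)
    return [(c ** n).mean() for n in range(1, n_max + 1)]

m1, m2 = capacity_moments(20.0)
print(m1, m2 - m1 ** 2)  # ergodic capacity (bits/s/Hz) and its variance
```

Closed-form or asymptotic expressions such as those derived in the paper replace exactly this kind of brute-force averaging.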

  7. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance to understand various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  8. The Nonlinear Statistics of High-contrast Patches in Natural Images

    DEFF Research Database (Denmark)

    Lee, Ann; Pedersen, Kim Steenstrup; Mumford, David

    2003-01-01

    Recently, there has been a great deal of interest in modeling the non-Gaussian structures of natural images. However, despite the many advances in the direction of sparse coding and multi-resolution analysis, the full probability distribution of pixel values in a neighborhood has not yet been … of natural images, not just marginals, and the need to understand the intrinsic dimensionality and nature of the data. We believe that object-like structures in the world and the sensor properties of the probing device generate observations that are concentrated along predictable shapes in state space. Our … study of natural image statistics accounts for local geometries (such as edges) in natural scenes, but does not impose such strong assumptions on the data as independent components or sparse coding by linear change of bases.

  9. Deep Source-Counting at 3 GHz

    Science.gov (United States)

    Vernstrom, Tessa; Wall, Jasper; Scott, Douglas

    2014-05-01

    We describe an analysis of 3-GHz confusion-limited data from the Karl J. Jansky Very Large Array (VLA). We show that with minimal model assumptions, P(D), Bayesian and Markov-Chain Monte-Carlo (MCMC) methods can define the source count to levels some 10 times fainter than the conventional confusion limit. Our verification process includes a full realistic simulation that considers known information on source angular extent and clustering. It appears that careful analysis of the statistical properties of an image is more effective than counting individual objects.

  10. Monitoring Milk Somatic Cell Counts

    Directory of Open Access Journals (Sweden)

    Gheorghe Şteţca

    2014-11-01

    Full Text Available The presence of somatic cells in milk is a widely disputed issue in the milk production sector. The somatic cell count in raw milk is a marker for specific cow diseases such as mastitis or swollen udder. A high level of somatic cells causes physical and chemical changes to milk composition and nutritional value, and to milk products as well. Also, mastitic milk is not proper for human consumption due to its contribution to the spreading of certain diseases and to food poisoning. Given these effects, EU Regulations set the maximum threshold of admitted somatic cells in raw milk at 400,000 cells/mL starting with 2014. This study was carried out to examine raw milk samples from small farms, industrial-type farms and milk processing units. There are several ways to count somatic cells in milk, but the accepted reference method is the microscopic method described by SR EN ISO 13366-1/2008. Generally, samples registered values in accordance with the admissible limit. By periodical monitoring of the somatic cell count, certain technological process issues are avoided and consumer health is ensured.

  11. High frequency statistical energy analysis applied to fluid filled pipe systems

    NARCIS (Netherlands)

    Beek, P.J.G. van; Smeulers, J.P.M.

    2013-01-01

    In pipe systems, carrying gas with high velocities, broadband turbulent pulsations can be generated causing strong vibrations and fatigue failure, called Acoustic Fatigue. This occurs at valves with high pressure differences (i.e. chokes), relief valves and obstructions in the flow, such as sharp

  12. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  14. Comparison of single-photon counting and charge-integrating detectors for X-ray high-resolution imaging of small biological objects

    Science.gov (United States)

    Frallicciardi, Paola Maria; Jakubek, Jan; Vavrik, Daniel; Dammer, Jiri

    2009-08-01

    This work presents a direct comparison of two pixel detectors: a charge-integrating flat-panel imager coupled to a CsI:Tl scintillator, and a hybrid silicon detector of the Medipix2 type working in single-photon counting mode. The comparison concerns image quality in terms of system spatial resolution, signal-to-noise ratio and contrast in imaging of small biological objects. It will be shown that, at photon energies below 40 keV and for low-attenuating biological objects, single-photon counting detectors are more appropriate for small-animal imaging than flat-panel devices, owing to their better spatial resolution, signal-to-noise ratio and contrast.
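
The counting-statistics advantage can be sketched with a simplified noise model (the numbers are illustrative, not the measured detector data): an ideal photon counter is Poisson-limited, so its SNR grows as √N, while an integrating detector adds an electronic read-noise term that dominates at low counts.

```python
import numpy as np

def snr_counting(n):
    """Ideal photon counter: Poisson noise only, SNR = N/sqrt(N) = sqrt(N)."""
    return n / np.sqrt(n)

def snr_integrating(n, sigma_r):
    """Integrating detector: an added electronic-noise term sigma_r."""
    return n / np.sqrt(n + sigma_r ** 2)

n = 100.0  # detected photons per pixel (illustrative)
print(snr_counting(n), snr_integrating(n, sigma_r=10.0))  # 10.0 vs ~7.07
```

The gap widens as the photon count drops, which is consistent with photon-counting detectors performing best on weakly attenuating, low-dose imaging tasks.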

  15. Source Counts from the 15 microns ISOCAM Deep Surveys

    CERN Document Server

    Elbaz, D; Fadda, D; Aussel, H; Désert, F X; Franceschini, A; Flores, H; Harwit, M; Puget, J L; Starck, J L; Danese, L; Koo, D C; Mandolesi, R

    1999-01-01

    We present the results of the five mid-IR 15 micron (12-18 micron LW3 band) ISOCAM Guaranteed Time Extragalactic Surveys performed in the regions of the Lockman Hole and Marano Field. The roughly 1000 sources detected, 600 of which have a flux above the 80% completeness limit, guarantee a very high statistical significance for the integral and differential source counts from 0.1 mJy up to 5 mJy. By adding the ISOCAM surveys of the HDF-North and South (plus flanking fields) and the lensing cluster A2390 at low fluxes, and IRAS at high fluxes, we cover four decades in flux from 50 microJy to 0.3 Jy. The slope of the differential counts is very steep (α = -3.0) in the flux range 0.4-4 mJy, hence much above the Euclidean expectation of α = -2.5. When compared with no-evolution models based on IRAS, our counts show a factor of 10 excess at 400 microJy, and a fast convergence, with α = -1.6, at lower fluxes.
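
Differential counts dN/dS follow a power law S^α, with α = −2.5 the Euclidean expectation; a quick sanity-check sketch that draws fluxes from such a law by inverse-CDF sampling and recovers the slope from the binned counts (synthetic data, not the ISOCAM catalogue):

```python
import numpy as np

def sample_counts_slope(alpha=-2.5, s_min=0.1, n=500_000, seed=3):
    """Draw fluxes from dN/dS ~ S^alpha (alpha < -1) and re-estimate
    alpha from a log-log fit to the binned differential counts."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    s = s_min * (1.0 - u) ** (1.0 / (alpha + 1.0))  # inverse-CDF sampling
    bins = np.logspace(np.log10(s_min), np.log10(30.0 * s_min), 25)
    hist, edges = np.histogram(s, bins=bins)
    centers = np.sqrt(edges[:-1] * edges[1:])       # geometric bin centers
    dnds = hist / np.diff(edges)
    good = hist > 0
    slope, _ = np.polyfit(np.log10(centers[good]), np.log10(dnds[good]), 1)
    return slope

print(sample_counts_slope())  # close to the Euclidean -2.5
```

A measured slope of −3.0, as reported above, is steeper than anything a non-evolving Euclidean population can produce, which is why strong evolution is required to fit the counts.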

  16. Atypical manifestation of progressive outer retinal necrosis in AIDS patient with CD4+ T-cell counts more than 100 cells/microL on highly active antiretroviral therapy.

    Science.gov (United States)

    Vichitvejpaisal, Pornpattana; Reeponmahar, Somporn; Tantisiriwat, Woraphot

    2009-06-01

    Typical progressive outer retinal necrosis (PORN) is an acute ocular infectious disease in acquired immunodeficiency syndrome (AIDS) patients with extremely low CD4+ T-cell counts. It is a form of Varicella-zoster virus (VZV) infection. This destructive infection has an extremely rapid course that may lead to blindness in affected eyes within days or weeks. Attempts at its treatment have had limited success. We describe a case of bilateral PORN in an AIDS patient with an initial CD4+ T-cell count >100 cells/microL that developed after initiation of highly active antiretroviral therapy (HAART). A 29-year-old Thai female, initially diagnosed with human immunodeficiency virus (HIV) in 1998, presented with bilaterally decreased visual acuity after initiating HAART two months earlier. Multiple yellowish spots appeared in the deep retina without evidence of intraocular inflammation or retinal vasculitis. Her CD4+ T-cell count was 127 cells/microL. She was diagnosed as having PORN based on clinical features and positive VZV in the aqueous humor and vitreous by polymerase chain reaction (PCR). Despite combined treatment with intravenous acyclovir and intravitreous ganciclovir, the patient's visual acuity worsened, with no light perception in either eye. This case suggests that PORN should be included in the differential diagnosis of reduced visual acuity in AIDS patients initiating HAART with higher CD4+ T-cell counts. PORN may be a manifestation of the immune reconstitution syndrome.

  17. Factors associated with development of opportunistic infections in HIV-1 infected adults with high CD4 cell counts: a EuroSIDA study

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Mocroft, A; Dragsted, Ulrik Bak;

    2006-01-01

    BACKGROUND: Limited data exist on factors predicting the development of opportunistic infections (OIs) at higher-than-expected CD4(+) cell counts in human immunodeficiency virus (HIV) type 1-infected adults. METHODS: Multivariate Poisson regression models were used to determine factors related … , incidence rate ratio [IRR] per 50% lower CD4(+) cell count, 5.37 [95% confidence interval {CI}, 3.71-7.77]; for group 2, 4.28 [95% CI, 2.98-6.14]). Injection drug use but not current CD4(+) cell count predicted risk in group 3. Use of antiretroviral treatment was associated with a lower incidence of OIs in all groups, likely by reducing HIV-1 RNA levels (IRR per 1-log(10) copies/mL higher HIV-1 RNA levels for group 1, 1.50 [95% CI, 1.15-1.95]; for group 2, 1.68 [95% CI, 1.40-2.02]; and for group 3, 1.89 [95% CI, 1.40-2.54]). CONCLUSION: Although the absolute incidence is low, the current CD4(+) cell …

  18. Additive effects in high-voltage layered-oxide cells: A statistics of mixtures approach

    Science.gov (United States)

    Sahore, Ritu; Peebles, Cameron; Abraham, Daniel P.; Gilbert, James; Bloom, Ira

    2017-09-01

    Li1.03(Ni0.5Mn0.3Co0.2)0.97O2 (NMC)-based coin cells containing the electrolyte additives vinylene carbonate (VC) and tris(trimethylsilyl)phosphite (TMSPi) in the range of 0-2 wt% were cycled between 3.0 and 4.4 V. The changes in capacity at rates of C/10 and C/1 and resistance at 60% state of charge were found to follow linear-with-time kinetic rate laws. Further, the C/10 capacity and resistance data were amenable to modeling by a statistics of mixtures approach. Applying physical meaning to the terms in the empirical models indicated that the interactions between the electrolyte and additives were not simple. For example, there were strong, synergistic interactions between VC and TMSPi affecting C/10 capacity loss, as expected, but there were other, more subtle interactions between the electrolyte components. The interactions between these components controlled the C/10 capacity decline and resistance increase.
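
A statistics-of-mixtures analysis of this kind is often expressed as a Scheffé-type blending model, e.g. y = b1·x1 + b2·x2 + b12·x1·x2 for a two-component mixture, where a nonzero b12 captures the synergistic interaction the abstract describes. A minimal sketch with made-up numbers (not the paper's data or its exact model):

```python
import numpy as np

# Hypothetical capacity-fade response at blended additive fractions
# (x1 = VC share, x2 = TMSPi share of a fixed total loading, x1 + x2 = 1).
x1 = np.array([1.0, 0.75, 0.5, 0.25, 0.0])
x2 = 1.0 - x1
y = np.array([4.0, 3.2, 2.9, 3.1, 3.8])   # made-up % capacity fade

# Scheffe quadratic mixture model: y = b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([x1, x2, x1 * x2])
b1, b2, b12 = np.linalg.lstsq(X, y, rcond=None)[0]
print(b1, b2, b12)  # ≈ 4.0, 3.8, -4.0
```

Here b12 < 0 means the blend fades less than linear mixing of the pure additives would predict, i.e. a beneficial synergy; applying physical meaning to such fitted terms is the approach the paper takes.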

  19. A Profile of Romanian Highly Educated Eco-Consumers Interested in Product Recycling A Statistical Approach

    Directory of Open Access Journals (Sweden)

    Simionescu Mihaela

    2014-07-01

    Full Text Available The objective of this research is to create a profile of the Romanian eco-consumer with university education. The profile is not limited to information regarding the environmental and economic benefits of recycling, but focuses on ecological behaviour. A detailed statistical analysis was made based on a large representative sample of respondents with secondary and university education. Indeed, the tendency toward practical eco-behaviour becomes more pronounced for people with university education. For people older than 30, the chance of being aware of the significance of the recycling symbols on packages decreases, with the lowest chance among people aged over 50. Respondents who are interested in environmental protection buy products with ecological symbols. However, those who already know the meaning of these symbols do not buy this type of product for ecological reasons, even if they are interested in environmental protection. This research also offers an extensive description of its results, giving respondents an opportunity to learn more about the meaning of the recycling symbols. The results also provide guidance for consumers. This study achieves two main goals: an ecological one (eco-consumers were identified and ordinary consumers were attracted through ecological behaviour) and an economic one (resource allocation will be more efficient and marketers will be able to address eco-consumers who have specific characteristics).

  20. THE HIGH RESOLUTION MIMO RADAR SYSTEM BASED ON MINIMIZING THE STATISTICAL COHERENCE OF COMPRESSED SENSING MATRIX

    Institute of Scientific and Technical Information of China (English)

    Zhu Yanping; Song Yaoliang; Chen Jinli; Zhao Delin

    2012-01-01

    Compressed Sensing (CS) theory is a great breakthrough relative to the traditional Nyquist sampling theory. It can accomplish compressive sampling and signal recovery based on the sparsity of the signal of interest, the randomness of the measurement matrix, and nonlinear optimization methods for signal recovery. Firstly, the CS principle is reviewed. Then the ambiguity function of Multiple-Input Multiple-Output (MIMO) radar is deduced. After that, combined with CS theory, the ambiguity function of MIMO radar is analyzed and simulated in detail. At last, the resolutions of coherent and non-coherent MIMO radars under CS theory are discussed. Simulation results show that the coherent MIMO radar has better resolution performance than the non-coherent one, but the coherent ambiguity function has higher sidelobes, which degrade radar target detection performance. A stochastic sparse-array deployment method based on minimizing the statistical coherence of the sensing matrix is proposed, and simulation results show that it can effectively suppress sidelobes of the ambiguity function and improve the capability of weak target detection.
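
The statistical coherence being minimized is commonly the mutual coherence of the sensing matrix, i.e. the largest normalized inner product between distinct columns; a minimal sketch of evaluating it over random candidate matrices and keeping the best one (illustrative of the stochastic-search idea, not the paper's array-placement procedure):

```python
import numpy as np

def mutual_coherence(A):
    """Largest |inner product| between distinct normalized columns of A."""
    A = A / np.linalg.norm(A, axis=0)
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)
    return G.max()

# A stochastic search keeps the candidate measurement matrix with the
# lowest coherence -- the same idea as randomized sparse-array placement.
rng = np.random.default_rng(0)
best = min(mutual_coherence(rng.standard_normal((32, 64))) for _ in range(20))
print(best)  # well below 1; lower coherence favors CS recovery
```

Lower mutual coherence loosens the sparsity conditions under which CS recovery is guaranteed, which is why minimizing it suppresses ambiguity-function sidelobes and helps weak-target detection.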