WorldWideScience

Sample records for modeling counting statistics

  1. Statistical modelling for falls count data.

    Science.gov (United States)

    Ullah, Shahid; Finch, Caroline F; Day, Lesley

    2010-03-01

Falls and their injury outcomes have count distributions that are highly skewed toward the right with clumping at zero, posing analytical challenges. Different modelling approaches have been used in the published literature to describe falls count distributions, often without consideration of the underlying statistical and modelling assumptions. This paper compares the use of modified Poisson and negative binomial (NB) models as alternatives to Poisson (P) regression, for the analysis of fall outcome counts. Four different count-based regression models (P, NB, zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB)) were each individually fitted to four separate fall count datasets from Australia, New Zealand and the United States. Finite mixtures of P and NB regression models were also compared to the standard NB model. Both analytical (F, Vuong and bootstrap tests) and graphical approaches were used to select and compare models. Simulation studies assessed the size and power of each model fit. This study confirms that falls count distributions are over-dispersed, but not dispersed due to excess zero counts or a heterogeneous population. Accordingly, the P model generally provided the poorest fit to all datasets. The fit improved significantly with the NB and both zero-inflated models. The fit also improved with the NB model compared to finite mixtures of both P and NB regression models. Although there was little difference in fit between the NB and ZINB models, in the interests of parsimony it is recommended that future studies involving modelling of falls count data routinely use the NB model in preference to the P, ZINB, or finite mixture distributions. The fact that these conclusions apply across four separate datasets, from four different samples of older people participating in studies of different methodology, adds strength to this general guiding principle.
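
    A minimal sketch of this kind of model comparison on simulated over-dispersed counts; the covariate, rates and the use of AIC are illustrative assumptions, not the authors' datasets or criteria:

```python
# Hedged sketch: compare Poisson and negative binomial fits on simulated,
# over-dispersed "falls-like" count data (not the paper's datasets).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)          # hypothetical covariate (e.g. standardised age)
mu = np.exp(0.2 + 0.5 * x)      # true mean structure
lam = rng.gamma(1.5, mu / 1.5)  # gamma heterogeneity -> over-dispersion
y = rng.poisson(lam)            # marginally negative binomial counts

X = sm.add_constant(x)
pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
nb = sm.NegativeBinomial(y, X).fit(disp=False)

# On over-dispersed data the NB model should fit clearly better (lower AIC).
print(f"Poisson AIC: {pois.aic:.1f}   NB AIC: {nb.aic:.1f}")
```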

  2. Mesoscopic full counting statistics and exclusion models

    Science.gov (United States)

    Roche, P.-E.; Derrida, B.; Douçot, B.

    2005-02-01

We calculate the distribution of current fluctuations in two simple exclusion models. Although these models are classical, we recover, even for small systems such as a simple or a double barrier, the same distribution of current as given by traditional formalisms for quantum mesoscopic conductors. Due to their simplicity, the full counting statistics in exclusion models can be reduced to the calculation of the largest eigenvalue of a matrix, the size of which is the number of internal configurations of the system. As examples, we derive the shot noise power and higher-order statistics of current fluctuations (skewness, full counting statistics, ...) of various conductors, including multiple barriers, diffusive islands between tunnel barriers and diffusive media. Special attention is devoted to the third cumulant, whose experimental measurability has recently been demonstrated.
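
    The reduction to a largest-eigenvalue problem can be made concrete for the simplest case, a single level between two leads in the large-bias limit; the 2x2 counting-field-dressed rate matrix and its rates gL, gR below are an illustrative sketch, not one of the paper's worked examples:

```python
# Hedged sketch: full counting statistics of a single level between two
# leads (states: empty / occupied), reduced to the largest eigenvalue of a
# 2x2 tilted rate matrix. Rates are assumed unidirectional (large bias).
import numpy as np

gL, gR = 1.0, 2.0  # in-rate from the left, out-rate to the right

def scgf(s):
    """Scaled cumulant generating function: largest eigenvalue of M(s)."""
    M = np.array([[-gL, gR * np.exp(s)],
                  [ gL, -gR           ]])
    return np.max(np.linalg.eigvals(M).real)

# Cumulants = derivatives of scgf at s = 0 (central finite differences).
h = 1e-3
c1 = (scgf(h) - scgf(-h)) / (2 * h)                 # mean current
c2 = (scgf(h) - 2 * scgf(0) + scgf(-h)) / h**2      # shot noise
print(f"current {c1:.4f}  (exact {gL * gR / (gL + gR):.4f})")
print(f"Fano    {c2 / c1:.4f}  (exact {(gL**2 + gR**2) / (gL + gR)**2:.4f})")
```

    The extracted Fano factor reproduces the familiar double-barrier suppression (gL^2 + gR^2)/(gL + gR)^2.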

  3. Full counting statistics in the self-dual interacting resonant level model.

    Science.gov (United States)

    Carr, Sam T; Bagrets, Dmitry A; Schmitteckert, Peter

    2011-11-11

We present a general technique to obtain the zero temperature cumulant generating function of the full counting statistics of charge transfer in interacting impurity models out of equilibrium from time-dependent simulations on a lattice. We demonstrate the technique with an application to the self-dual interacting resonant level model, where very good agreement is found between numerical simulations using the density matrix renormalization group and analytical results obtained from the thermodynamic Bethe ansatz. We show from the exact form of the counting statistics that the quasiparticles involved in transport carry charge 2e in the low bias regime and e/2 in the high bias regime.

  4. Oscillations in counting statistics

    CERN Document Server

    Wilk, Grzegorz

    2016-01-01

    The very large transverse momenta and large multiplicities available in present LHC experiments on pp collisions allow a much closer look at the corresponding distributions. Some time ago we discussed a possible physical meaning of apparent log-periodic oscillations showing up in p_T distributions (suggesting that the exponent of the observed power-like behavior is complex). In this talk we concentrate on another example of oscillations, this time connected with multiplicity distributions P(N). We argue that some combinations of the experimentally measured values of P(N) (satisfying the recurrence relations used in the description of cascade-stochastic processes in quantum optics) exhibit distinct oscillatory behavior, not observed in the usual Negative Binomial Distributions used to fit data. These oscillations provide yet another example of oscillations seen in counting statistics in many different, apparently very disparate branches of physics further demonstrating the universality of this phenomenon.
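
    A hedged sketch of the kind of analysis meant here: given a measured P(N), coefficients C_j can be extracted from the recurrence (N+1)P(N+1) = <N> * sum_{j<=N} C_j P(N-j); for a pure negative binomial input they decay smoothly, so oscillatory C_j obtained from data indicate structure beyond the NBD (the NBD parameters are illustrative):

```python
# Hedged sketch: extract the recurrence coefficients C_j from a multiplicity
# distribution P(N) via (N+1) P(N+1) = <N> * sum_{j=0}^{N} C_j P(N-j).
# For a pure negative binomial P(N) the C_j decay smoothly, without
# oscillations; data-driven oscillations would signal extra structure.
import numpy as np
from scipy.stats import nbinom

k, p = 5.0, 0.3
N = np.arange(0, 60)
P = nbinom.pmf(N, k, p)
mean = nbinom.mean(k, p)

C = np.zeros(len(N) - 1)
for n in range(len(C)):
    tail = sum(C[j] * P[n - j] for j in range(n))
    C[n] = ((n + 1) * P[n + 1] / mean - tail) / P[0]

print(np.round(C[:10], 4))  # smooth, sign-definite decay for the NBD
```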

  5. Monte Carlo study of single-barrier structure based on exclusion model full counting statistics

    Institute of Scientific and Technical Information of China (English)

    Chen Hua; Du Lei; Qu Cheng-Li; He Liang; Chen Wen-Hao; Sun Peng

    2011-01-01

Unlike the usual theoretical work on full counting statistics, which focuses on computing higher-order cumulants from the cumulant generating function of electrical structures, a Monte Carlo simulation of a single-barrier structure is performed here to obtain time series for two widely applicable exclusion models: the counter-flows model and the tunnel model. Using high-order spectrum analysis in Matlab, the validity of the Monte Carlo method is demonstrated by the first four cumulants extracted from the time series, which agree with those from the cumulant generating function. A comparison of the counter-flows model and the tunnel model in a single-barrier structure shows that the essential difference between them is that the Pauli principle holds strictly in the former, while it enters only statistically in the latter.
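
    A minimal Monte Carlo sketch in the spirit of the tunnel-type simulation: one site that enforces Pauli exclusion (injection only when empty, emission only when occupied), simulated with a Gillespie scheme; rates, window length and the use of k-statistics are illustrative assumptions:

```python
# Hedged sketch: Gillespie-type Monte Carlo of a single-barrier exclusion
# model, then the first four cumulants of the number of transferred charges
# per counting window, estimated by unbiased k-statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gin, gout, T = 1.0, 1.0, 200.0   # in/out rates, counting window
windows = 2000

counts = np.empty(windows, dtype=int)
for w in range(windows):
    t, occ, n = 0.0, 0, 0
    while True:
        rate = gout if occ else gin     # Pauli exclusion: one state at a time
        t += rng.exponential(1.0 / rate)
        if t > T:
            break
        if occ:
            n += 1                      # charge leaves through the barrier
        occ ^= 1                        # toggle occupation
    counts[w] = n

for k in range(1, 5):
    print(f"kappa_{k} = {stats.kstat(counts, k):.2f}")
```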

  6. Multiterminal counting statistics

    OpenAIRE

    2003-01-01

A review is given of calculational schemes that allow for easy evaluation of the full current statistics (FCS) in multi-terminal mesoscopic systems. First, the scattering approach to FCS by Levitov et al. is outlined. Then the multi-terminal FCS of non-interacting electrons is considered. We show that this theory appears to be a circuit theory of 2×2 matrices associated with Keldysh Green functions. Further on, the FCS in the opposite situation of mesoscopic systems placed...

  7. Using spatiotemporal statistical models to estimate animal abundance and infer ecological dynamics from survey counts

    Science.gov (United States)

    Conn, Paul B.; Johnson, Devin S.; Ver Hoef, Jay M.; Hooten, Mevin B.; London, Joshua M.; Boveng, Peter L.

    2015-01-01

    Ecologists often fit models to survey data to estimate and explain variation in animal abundance. Such models typically require that animal density remains constant across the landscape where sampling is being conducted, a potentially problematic assumption for animals inhabiting dynamic landscapes or otherwise exhibiting considerable spatiotemporal variation in density. We review several concepts from the burgeoning literature on spatiotemporal statistical models, including the nature of the temporal structure (i.e., descriptive or dynamical) and strategies for dimension reduction to promote computational tractability. We also review several features as they specifically relate to abundance estimation, including boundary conditions, population closure, choice of link function, and extrapolation of predicted relationships to unsampled areas. We then compare a suite of novel and existing spatiotemporal hierarchical models for animal count data that permit animal density to vary over space and time, including formulations motivated by resource selection and allowing for closed populations. We gauge the relative performance (bias, precision, computational demands) of alternative spatiotemporal models when confronted with simulated and real data sets from dynamic animal populations. For the latter, we analyze spotted seal (Phoca largha) counts from an aerial survey of the Bering Sea where the quantity and quality of suitable habitat (sea ice) changed dramatically while surveys were being conducted. Simulation analyses suggested that multiple types of spatiotemporal models provide reasonable inference (low positive bias, high precision) about animal abundance, but have potential for overestimating precision. Analysis of spotted seal data indicated that several model formulations, including those based on a log-Gaussian Cox process, had a tendency to overestimate abundance. By contrast, a model that included a population closure assumption and a scale prior on total

  8. Unifying quantum heat transfer in a nonequilibrium spin-boson model with full counting statistics

    Science.gov (United States)

    Wang, Chen; Ren, Jie; Cao, Jianshu

    2017-02-01

To study the full counting statistics of quantum heat transfer in a driven nonequilibrium spin-boson model, we develop a generalized nonequilibrium polaron-transformed Redfield equation with an auxiliary counting field. This enables us to study the impact of qubit-bath coupling ranging from weak to strong regimes. Without external modulations, we observe maximal values of both the steady-state heat flux and the noise power in moderate coupling regimes, below which these two transport quantities are enhanced by a finite qubit energy bias. With external modulations, the geometric-phase-induced heat flux shows a monotonic decrease upon increasing the qubit-bath coupling at zero qubit energy bias (without bias), whereas under a finite qubit energy bias (with bias) it exhibits an interesting reversal behavior in the strong coupling regime. Our results unify the seemingly contradictory results in the weak and strong qubit-bath coupling regimes and provide detailed dissections of the quantum fluctuations of nonequilibrium heat transfer.

  9. Variability in faecal egg counts – a statistical model to achieve reliable determination of anthelmintic resistance in livestock

    DEFF Research Database (Denmark)

    Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret

... A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying ... The methodology was shown to be unaffected by single outlier horses on the farms, while traditional calculations were strongly biased. The statistical model combines information between farms to distinguish between variability and genuine reduction in efficacy and can be adapted to handle FECRT data obtained from other ...

  10. Variability in faecal egg counts – a statistical model to achieve reliable determination of anthelmintic resistance in livestock

    DEFF Research Database (Denmark)

Nielsen, Martin Krarup; Vidyashankar, Anand N.; Hanlon, Bret

... A statistical model was therefore developed for analysis of FECRT data from multiple farms. Horse age, gender, zip code and pre-treatment egg count were incorporated into the model. Horses and farms were kept as random effects. Resistance classifications were based on model-based 95% lower confidence limit (LCL) values of predicted mean efficacies, and cutoff values were justified statistically. The model was used to evaluate the efficacy of pyrantel embonate paste on 64 Danish horse farms. Of 1644 horses, 614 had egg counts > 200 eggs per gram (EPG) and were treated. The cutoff LCL values used for classifying ... arithmetic calculations classified nine farms (14.1%) as resistant and 11 farms (17.2%) as suspect resistant. Using 10,000 Monte Carlo simulated data sets, our methodology provides a reliable classification of farms into different resistance categories with a false discovery rate of 1.02%. The methodology ...

  11. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculations for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Simply comparing the motif count p-values in each sequence is not sufficient to decide whether this motif is significantly more exceptional in one sequence than in the other; a statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly adapted to small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
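
    A hedged sketch of the exact-binomial idea under a plain Poisson model (the paper's construction additionally handles overlapping motifs and sequence composition):

```python
# Hedged sketch of an exact binomial comparison: if motif occurrences in two
# sequences are modelled as Poisson with expected counts e1 and e2 (derived,
# e.g., from sequence length and composition), then conditionally on the
# total, n1 is Binomial(n1 + n2, e1 / (e1 + e2)) under the null of equal
# exceptionality. The paper's exact construction may differ in detail.
from scipy.stats import binomtest

n1, n2 = 42, 18        # observed motif counts (illustrative)
e1, e2 = 30.0, 25.0    # expected counts under each sequence's composition

res = binomtest(n1, n1 + n2, p=e1 / (e1 + e2))
print(f"p-value = {res.pvalue:.4g}")
```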

  12. The statistical distribution of the number of counted scintillation photons in digital silicon photomultipliers: model and validation.

    Science.gov (United States)

    van Dam, Herman T; Seifert, Stefan; Schaart, Dennis R

    2012-08-07

    In the design and application of scintillation detectors based on silicon photomultipliers (SiPMs), e.g. in positron emission tomography imaging, it is important to understand and quantify the non-proportionality of the SiPM response due to saturation, crosstalk and dark counts. A new type of SiPM, the so-called digital silicon photomultiplier (dSiPM), has recently been introduced. Here, we develop a model of the probability distribution of the number of fired microcells, i.e. the number of counted scintillation photons, in response to a given amount of energy deposited in a scintillator optically coupled to a dSiPM. Based on physical and functional principles, the model elucidates the statistical behavior of dSiPMs. The model takes into account the photon detection efficiency of the detector; the light yield, excess variance and time profile of the scintillator; and the crosstalk probability, dark count rate, integration time and the number of microcells of the dSiPM. Furthermore, relations for the expectation value and the variance of the number of fired cells are deduced. These relations are applied in the experimental validation of the model using a dSiPM coupled to a LSO:Ce,Ca scintillator. Finally, we propose an accurate method for the correction of energy spectra measured with dSiPM-based scintillation detectors.
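
    The saturation ingredient of such a model can be sketched in isolation (crosstalk and dark counts, which the paper's full model includes, are ignored here): if Poisson-distributed detected photons land uniformly on N microcells, the per-cell hit counts are independent Poisson variables, so the number of fired cells is exactly binomial with q = 1 - exp(-mu/N):

```python
# Hedged sketch of (d)SiPM saturation without crosstalk or dark counts:
# with Poisson(mu) detected photons spread uniformly over N cells, each cell
# fires independently with q = 1 - exp(-mu/N), so the fired count k is
# Binomial(N, q): E[k] = N*q, Var[k] = N*q*(1-q). Parameters illustrative.
import numpy as np

rng = np.random.default_rng(2)
N_cells, mu = 6400, 3000.0        # microcells; mean detected photons
trials = 5000

photons = rng.poisson(mu, size=trials)
fired = np.array([np.unique(rng.integers(0, N_cells, size=n)).size
                  for n in photons])

q = 1 - np.exp(-mu / N_cells)
print(f"E[k]: MC {fired.mean():.1f}  model {N_cells * q:.1f}")
print(f"Var : MC {fired.var():.1f}  model {N_cells * q * (1 - q):.1f}")
```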

  13. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration.
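
    A hedged sketch of the statistical idea on simulated intensities: fit Gaussian mixtures of increasing order to the column intensities, select the order with an information criterion (BIC here; the paper discusses an ICL-type criterion), and read off atom counts from the ordering of the component means:

```python
# Hedged sketch of statistical atom counting: column intensities are
# modelled as a Gaussian mixture; the selected number of components gives
# the distinct column thicknesses, and each column is assigned the atom
# count of its most probable component. Simulated data, not STEM images.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
means_true = np.array([1.0, 1.9, 2.7, 3.4])   # 1-4 atoms (arbitrary units)
X = np.concatenate([rng.normal(m, 0.12, size=80) for m in means_true])
X = X.reshape(-1, 1)

bics = [GaussianMixture(n_components=k, n_init=5, random_state=0)
        .fit(X).bic(X) for k in range(1, 9)]
best = int(np.argmin(bics)) + 1
print(f"selected components: {best}")          # expected: 4

gm = GaussianMixture(n_components=best, n_init=5, random_state=0).fit(X)
ranks = np.argsort(np.argsort(gm.means_.ravel()))   # component -> thickness
counts = ranks[gm.predict(X)] + 1
print(counts[:10])                             # atom count per column
```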

  14. Counting statistics for genetic switches based on effective interaction approximation

    Science.gov (United States)

    Ohkubo, Jun

    2012-09-01

The applicability of counting statistics to a system with an infinite number of states is investigated. Counting statistics has been studied extensively for systems with a finite number of states. While the scheme can in principle be used to count specific transitions in a system with an infinite number of states, the resulting equations are in general not closed. A simple genetic switch can be described by a master equation with an infinite number of states, and we use counting statistics to count the number of transitions from inactive to active states of the gene. To avoid the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated approximately as a simple two-state model, which immediately indicates that the switching obeys non-Poisson statistics.

  15. Counting statistics for genetic switches based on effective interaction approximation

    CERN Document Server

    Ohkubo, Jun

    2012-01-01

The applicability of counting statistics to a system with an infinite number of states is investigated. Counting statistics has been studied extensively for systems with a finite number of states. While the scheme can in principle be used to count specific transitions in a system with an infinite number of states, the resulting equations are in general not closed. A simple genetic switch can be described by a master equation with an infinite number of states, and we use counting statistics to count the number of transitions from inactive to active states of the gene. To avoid the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated approximately as a simple two-state model, which immediately indicates that the switching obeys non-Poisson statistics.

  16. A statistical model investigating the prevalence of tuberculosis in New York City using counting processes with two change-points.

    Science.gov (United States)

    Achcar, J A; Martinez, E Z; Ruffino-Netto, A; Paulino, C D; Soares, P

    2008-12-01

We considered a Bayesian analysis of the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled the counts using non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data was carried out using Markov chain Monte Carlo methods; simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software.

  17. Statistical Methods for Unusual Count Data

    DEFF Research Database (Denmark)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads

    2016-01-01

... Quantitative microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative ...

  18. Use of the HPI Model 2080 pulsed neutron detector at the LANSCE complex - vulnerabilities and counting statistics

    Energy Technology Data Exchange (ETDEWEB)

Jones, K.W. [Los Alamos National Lab., NM (United States)]; Browman, A. [Amparo Corp., Santa Fe, NM (United States)]

    1997-01-01

The HPI Model 2080 Pulsed Neutron Detector has been used for over seven years as an area radiation monitor and dose limiter at the LANSCE accelerator complex. Operating experience and changing environments over this time have revealed several vulnerabilities (susceptibility to electrical noise, paralysis in high dose rate fields, etc.). Identified vulnerabilities have been corrected; these modifications include component replacement and circuit design changes. The data and experiments leading to these modifications will be presented and discussed. Calibration of the instrument is performed in mixed static gamma and neutron source fields. The statistical characteristics of the Geiger-Müller tubes, coupled with significantly different sensitivities to gamma and neutron doses, require that careful attention be paid to acceptable fluctuations in dose rate over time during calibration. The performance of the instrument has been modeled using simple Poisson statistics and the operating characteristics of the Geiger-Müller tubes. The results are in excellent agreement with measurements. The analysis and comparison with experimental data will be presented.

  19. Particle number counting statistics in ideal Bose gases

    National Research Council Canada - National Science Library

    Christoph Weiss; Martin Wilkens

    1997-01-01

    We discuss the exact particle number counting statistics of degenerate ideal Bose gases in the microcanonical, canonical, and grand-canonical ensemble, respectively, for various trapping potentials...

  20. Count response model for the CMB spots

    CERN Document Server

    Giovannini, Massimo

    2010-01-01

    The statistics of the curvature quanta generated during a stage of inflationary expansion is used to derive a count response model for the large-scale phonons determining, in the concordance lore, the warmer and the cooler spots of the large-scale temperature inhomogeneities. The multiplicity distributions for the counting statistics are shown to be generically overdispersed in comparison with conventional Poissonian regressions. The generalized count response model deduced hereunder accommodates an excess of correlations in the regime of high multiplicities and prompts dedicated analyses with forthcoming data collected by instruments of high angular resolution and high sensitivity to temperature variations per pixel.

  1. Full counting statistics of a nonadiabatic electron pump

    Science.gov (United States)

    Croy, Alexander; Saalmann, Ulf

    2016-04-01

    Nonadiabatic charge pumping through a single-level quantum dot with periodically modulated parameters is studied theoretically. By means of a quantum-master-equation approach the full counting statistics of the system is obtained. We find a trinomial-probability distribution of the charge transfer, which adequately describes the reversal of the pumping current by sweeping the driving frequency. Further, we derive equations of motion for current and noise and solve those numerically for two different driving schemes. Both show interesting features, which can be fully analyzed due to the simple and generic model studied.

  2. Counting Statistics and Ion Interval Density in AMS

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, J S; Ognibene, T; Palmblad, M; Reimer, P

    2004-08-03

Confidence in the precision of AMS and decay measurements must be comparable for the application of the ¹⁴C calibration to age determinations using both technologies. We confirmed the random nature of the temporal distribution of ¹⁴C ions in an AMS spectrometer for a number of sample counting rates and properties of the sputtering process. The temporal distribution of ion counts was also measured to confirm the applicability of traditional counting statistics.

  3. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

It is shown that random Lexis fluctuations of probabilities, such as the probability of decay or detection, cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and to more complex experiments are given, as well as to distinguishing between source and background in the presence of overdispersion. Monte Carlo verifications are provided.
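
    The mechanism is easy to illustrate numerically; here gamma-distributed rate fluctuations stand in for the Lexis fluctuations, and mixing a Poisson law over the fluctuating rate yields negative binomial counts whose variance exceeds the mean (parameters illustrative):

```python
# Hedged illustration of overdispersion from fluctuating probabilities:
# a Poisson rate that itself fluctuates (gamma-distributed from trial to
# trial) gives marginally negative binomial counts with
# Var = mean + mean^2 / shape > mean.
import numpy as np

rng = np.random.default_rng(4)
mean_rate, shape = 10.0, 4.0
lam = rng.gamma(shape, mean_rate / shape, size=100000)  # fluctuating rate
counts = rng.poisson(lam)

m, v = counts.mean(), counts.var()
print(f"mean {m:.2f}, var {v:.2f}, "
      f"predicted var {mean_rate * (1 + mean_rate / shape):.2f}")
```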

  4. Finite-frequency counting statistics of electron transport: Markovian theory

    Energy Technology Data Exchange (ETDEWEB)

Marcos, D; Aguado, R [Departamento de Teoria y Simulacion de Materiales, Instituto de Ciencia de Materiales de Madrid, CSIC, Cantoblanco 28049, Madrid (Spain)]; Emary, C; Brandes, T, E-mail: david.marcos@icmm.csic.es [Institut für Theoretische Physik, Hardenbergstrasse 36, TU Berlin, D-10623 Berlin (Germany)]

    2010-12-15

    We present a theory of frequency-dependent counting statistics of electron transport through nanostructures within the framework of Markovian quantum master equations. Our method allows the calculation of finite-frequency current cumulants of arbitrary order, as we explicitly show for the second- and third-order cumulants. Our formulae generalize previous zero-frequency expressions in the literature and can be viewed as an extension of MacDonald's formula beyond shot noise. When combined with an appropriate treatment of tunneling using, e.g., the Liouvillian perturbation theory in Laplace space, our method can deal with arbitrary bias voltages and frequencies, as we illustrate with the paradigmatic example of transport through a single resonant level model. We discuss various interesting limits, including the recovery of the fluctuation-dissipation theorem near linear response, as well as some drawbacks inherent to the Markovian description arising from the neglect of quantum fluctuations.

  5. Submillimeter Number Counts From Statistical Analysis of BLAST Maps

    CERN Document Server

    Patanchon, Guillaume; Bock, James J; Chapin, Edward L; Devlin, Mark J; Dicker, Simon R; Griffin, Matthew; Gundersen, Joshua O; Halpern, Mark; Hargrave, Peter C; Hughes, David H; Klein, Jeff; Marsden, Gaelen; Mauskopf, Philip; Moncelsi, Lorenzo; Netterfield, Calvin B; Olmi, Luca; Pascale, Enzo; Rex, Marie; Scott, Douglas; Semisch, Christopher; Thomas, Nicholas; Truch, Matthew D P; Tucker, Carole; Tucker, Gregory S; Viero, Marco P; Wiebe, Donald V

    2009-01-01

We describe the application of a statistical method to estimate submillimeter galaxy number counts from the confusion-limited observations of the Balloon-borne Large Aperture Submillimeter Telescope (BLAST). Our method is based on a maximum likelihood fit to the pixel histogram, sometimes called 'P(D)', an approach which has been used before to probe faint counts; the difference is that here we advocate its use even for sources with relatively high signal-to-noise ratios. This method has an advantage over standard source-extraction techniques in providing an unbiased estimate of the counts from the bright end down to flux densities well below the confusion limit. We specifically analyse BLAST observations of a roughly 10 sq. deg map centered on the Great Observatories Origins Deep Survey South field. We provide estimates of number counts at the three BLAST wavelengths, 250, 350, and 500 microns; instead of counting sources in flux bins we estimate the counts at several flux density nodes connected with ...

  6. Predictive Model Assessment for Count Data

    Science.gov (United States)

    2007-09-05

... critique count regression models for patent data, and assess the predictive performance of Bayesian age-period-cohort models for larynx cancer counts in Germany. We consider a recent suggestion by Baker and ... [Figure 5: Boxplots for various scores for the patent data count regressions. Table 1: Four predictive models for larynx cancer counts in Germany, 1998–2002.]
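
    The abstract survives only in fragments; predictive assessment of count models is commonly based on proper scoring rules and the probability integral transform, as in this generic, hedged sketch (a Poisson forecast with illustrative parameters, not necessarily the report's exact tools):

```python
# Hedged sketch: logarithmic score (smaller is better) and randomized
# probability integral transform (PIT; uniform if the forecast is
# calibrated) for a Poisson predictive distribution of count data.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(5)
y = rng.poisson(6.0, size=1000)    # observations
mu_hat = 6.0                       # predictive Poisson mean (illustrative)

log_score = -poisson.logpmf(y, mu_hat).mean()

# Randomized PIT for discrete data: draw U between F(y-1) and F(y).
u = rng.uniform(poisson.cdf(y - 1, mu_hat), poisson.cdf(y, mu_hat))
print(f"mean log score {log_score:.3f}; "
      f"PIT mean {u.mean():.3f} (about 0.5 if calibrated)")
```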

  7. Probing the Conformations of Single Molecule via Photon Counting Statistics

    CERN Document Server

    Peng, Yonggang; Yang, Chuanlu; Zheng, Yujun

    2014-01-01

We suggest an approach to detecting the conformation of a single molecule by using photon counting statistics. The generalized Smoluchowski equation is employed to describe the dynamical process of conformational change of the single molecule. The resonant trajectories of the emission photon numbers and of Mandel's Q parameter, in the space of conformational coordinates X and external-field frequency ω_L (the X-ω_L space), can be used to rebuild the conformation of the single molecule. As an example, we consider the Thioflavin T molecule. We demonstrate that the conformations extracted by employing photon counting statistics are in excellent agreement with the results of ab initio computation.
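
    For reference, Mandel's Q parameter is a simple functional of a photon counting record, Q = Var(n)/E(n) - 1, as in this minimal sketch (the stand-in record is Poissonian, so Q is about 0):

```python
# Hedged sketch: Mandel's Q parameter from a record of photon counts n per
# counting window; Q = 0 for Poissonian light, Q < 0 sub-Poissonian,
# Q > 0 super-Poissonian.
import numpy as np

counts = np.random.default_rng(6).poisson(4.2, size=5000)  # stand-in record
Q = counts.var() / counts.mean() - 1.0
print(f"Q = {Q:.3f}")
```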

  8. Large deviations of ergodic counting processes: a statistical mechanics approach.

    Science.gov (United States)

    Budini, Adrián A

    2011-07-01

The large-deviation method allows one to characterize an ergodic counting process in terms of a thermodynamic frame where a free energy function determines the asymptotic nonstationary statistical properties of its fluctuations. Here we study this formalism through a statistical mechanics approach, that is, with an auxiliary counting process that maximizes an entropy function associated with the thermodynamic potential. We show that the realizations of this auxiliary process can be obtained after applying a conditional measurement scheme to the original ones, providing in this way an alternative measurement interpretation of the thermodynamic approach. General results are obtained for renewal counting processes, that is, those where the time intervals between consecutive events are independent and defined by a unique waiting time distribution. The underlying statistical mechanics is controlled by the same waiting time distribution, rescaled by an exponential decay measured by the free energy function. Scale invariance, shift closure, and intermittency phenomena are obtained and interpreted in this context. Similar conclusions apply to nonrenewal processes when the memory between successive events is induced by a stochastic waiting time distribution.
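
    A minimal instance of this large-deviation structure for the simplest renewal counting process, a Poisson process of rate lambda: the scaled cumulant generating function is phi(s) = lambda*(e^s - 1), and the rate function follows by a numerical Legendre transform (checked against the closed form x*ln(x/lambda) - x + lambda):

```python
# Hedged minimal instance: rate function of a Poisson counting process via
# the Legendre transform I(x) = sup_s [s*x - phi(s)] of its scaled cumulant
# generating function phi(s) = lam * (exp(s) - 1).
import numpy as np

lam = 2.0
s = np.linspace(-5, 5, 20001)
phi = lam * (np.exp(s) - 1.0)

for x in (0.5, 2.0, 4.0):
    I_num = np.max(s * x - phi)
    I_exact = x * np.log(x / lam) - x + lam
    print(f"x={x}: numeric {I_num:.4f}, exact {I_exact:.4f}")
```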

  9. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-01-25

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials.

  10. Regression Models for Count Data in R

    Directory of Open Access Journals (Sweden)

    Christian Kleiber

    2008-06-01

The classical Poisson, geometric and negative binomial regression models for count data belong to the family of generalized linear models and are available at the core of the statistics toolbox in the R system for statistical computing. After reviewing the conceptual and computational features of these methods, a new implementation of hurdle and zero-inflated regression models in the functions hurdle() and zeroinfl() from the package pscl is introduced. It re-uses the design and functionality of the basic R functions, just as the underlying conceptual tools extend the classical models. Both hurdle and zero-inflated models are able to incorporate over-dispersion and excess zeros, two problems that typically occur in count data sets in economics and the social sciences, better than their classical counterparts. Using cross-section data on the demand for medical care, it is illustrated how the classical as well as the zero-augmented models can be fitted, inspected and tested in practice.
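
    A rough Python analogue of the zero-inflated fit that the article performs with zeroinfl() from pscl; this uses statsmodels' ZeroInflatedPoisson on simulated data, not the pscl implementation or the medical-care dataset:

```python
# Hedged Python analogue of a zero-inflated Poisson regression (the article
# itself works in R with pscl's zeroinfl()). Simulated data with 30%
# structural zeros; constant-only inflation component.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(7)
n = 1000
x = rng.normal(size=n)
mu = np.exp(0.5 + 0.4 * x)
zero = rng.uniform(size=n) < 0.3          # structural zeros
y = np.where(zero, 0, rng.poisson(mu))

X = sm.add_constant(x)
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(
    disp=False, maxiter=500)
print(zip_fit.summary().tables[1])
```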

  11. Extreme value statistics of weak lensing shear peak counts

    CERN Document Server

    Reischke, Robert; Bartelmann, Matthias

    2015-01-01

The statistics of peaks in weak gravitational lensing maps is a promising technique for constraining cosmological parameters in present and future surveys. Here we investigate its power when using general extreme value statistics, which are very sensitive to the exponential tail of the halo mass function. To this end, we use an analytic method to quantify the number of weak lensing peaks caused by galaxy clusters, large-scale structures and observational noise. We further improve the method in the regime of high signal-to-noise ratios dominated by non-linear structures by accounting for the embedding of those counts into the surrounding shear caused by large-scale structures. We derive the extreme value and order statistics for both over-densities (positive peaks) and under-densities (negative peaks) and provide an optimized criterion to split a wide-field survey into sub-fields in order to sample the distribution of extreme values, such that the expected objects causing the largest signals are mostly due ...

  12. Gene coexpression measures in large heterogeneous samples using count statistics.

    Science.gov (United States)

    Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan

    2014-11-18

    With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.

  13. Reprint of : Full counting statistics of Majorana interferometers

    Science.gov (United States)

    Strübi, Grégory; Belzig, Wolfgang; Schmidt, Thomas L.; Bruder, Christoph

    2016-08-01

    We study the full counting statistics of interferometers for chiral Majorana fermions with two incoming and two outgoing Dirac fermion channels. In the absence of interactions, the FCS can be obtained from the 4×4 scattering matrix S that relates the outgoing Dirac fermions to the incoming Dirac fermions. After presenting explicit expressions for the higher-order current correlations for a modified Hanbury Brown-Twiss interferometer, we note that the cumulant-generating function can be interpreted such that unit-charge transfer processes correspond to two independent half-charge transfer processes, or alternatively, to two independent electron-hole conversion processes. By a combination of analytical and numerical approaches, we verify that this factorization property holds for a general SO(4) scattering matrix, i.e. for a general interferometer geometry.

  14. Particle number counting statistics in ideal Bose gases.

    Science.gov (United States)

    Weiss, C; Wilkens, M

    1997-11-10

We discuss the exact particle number counting statistics of degenerate ideal Bose gases in the microcanonical, canonical, and grand-canonical ensemble, respectively, for various trapping potentials. We then invoke the Maxwell's Demon ensemble [Navez et al., Phys. Rev. Lett. (1997)] and show that for a large total number of particles the root-mean-square fluctuation of the condensate occupation scales as δn₀ ∝ (T/T_c)^r N^s, with scaling exponents r = 3/2, s = 1/2 for the 3D harmonic oscillator trapping potential, and r = 1, s = 2/3 for the 3D box. We derive an explicit expression for r and s in terms of the spatial dimension D and the spectral index σ of the single-particle energy spectrum. Our predictions also apply to systems where Bose-Einstein condensation does not occur. We point out that the condensate fluctuations in the microcanonical and canonical ensembles respect the principle of thermodynamic equivalence.

  15. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. It is therefore of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to data from an open clinical trial on vertigo attacks, which are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions, which are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results: The instruments under study ...

  16. Statistical analysis of dark count rate in Geiger-mode APD FPAs

    Science.gov (United States)

    Itzler, Mark A.; Krishnamachari, Uppili; Chau, Quan; Jiang, Xudong; Entwistle, Mark; Owens, Mark; Slomkowski, Krystyna

    2014-10-01

    We present a temporal statistical analysis of the array-level dark count behavior of Geiger-mode avalanche photodiode (GmAPD) focal plane arrays that distinguishes between Poissonian intrinsic dark count rate and non-Poissonian crosstalk counts by considering "inter-arrival" times between successive counts from the entire array. For 32 x 32 format sensors with 100 μm pixel pitch, we show the reduction of crosstalk for smaller active area sizes within the pixel. We also compare the inter-arrival time behavior for arrays with narrow band (900 - 1100 nm) and broad band (900 - 1600 nm) spectral response. We then consider a similar analysis of larger format 128 x 32 arrays. As a complement to the temporal analysis, we describe the results of a spatial analysis of crosstalk events. Finally, we propose a simple model for the impact of crosstalk events on the Poissonian statistics of intrinsic dark counts that provides a qualitative explanation for the results of the inter-arrival time analysis for arrays with varying degrees of crosstalk.
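
    A hedged sketch of the inter-arrival-time diagnostic (dark count rate, crosstalk probability and delay scale are illustrative): crosstalk produces an excess of very short intervals over the exponential expectation of a purely Poissonian dark count record:

```python
# Hedged sketch: Poissonian dark counts give exponentially distributed
# inter-arrival times; crosstalk adds nearly simultaneous secondary events,
# visible as an excess of very short intervals.
import numpy as np

rng = np.random.default_rng(8)
rate, p_ct, T = 1e4, 0.2, 1.0    # dark rate [1/s], crosstalk prob., duration [s]

t = np.sort(rng.uniform(0.0, T, size=rng.poisson(rate * T)))   # Poisson darks
mask = rng.uniform(size=t.size) < p_ct
ct = t[mask] + rng.exponential(5e-9, size=mask.sum())          # ns-delayed crosstalk
events = np.sort(np.concatenate([t, ct]))

dt = np.diff(events)
short = np.mean(dt < 1e-7)       # fraction of "instantaneous" pairs
print(f"fraction of intervals < 100 ns: {short:.3f} "
      f"(pure Poisson would give about {1 - np.exp(-rate * 1e-7):.4f})")
```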

  17. Statistical analysis of data from dilution assays with censored correlated counts.

    Science.gov (United States)

    Quiroz, Jorge; Wilson, Jeffrey R; Roychoudhury, Satrajit

    2012-01-01

Frequently, count data obtained from dilution assays are subject to an upper detection limit, and as such, data obtained from these assays are usually censored. Also, counts from the same subject at different dilution levels are correlated. Ignoring the censoring and the correlation may provide unreliable and misleading results. Therefore, any meaningful data modeling requires that the censoring and the correlation be simultaneously addressed. Such comprehensive approaches to modeling censoring and correlation are not widely used in the analysis of dilution assay data. Traditionally, these data are analyzed using a general linear model on a logarithmic-transformed average count per subject. However, this traditional approach ignores the between-subject variability and risks providing inconsistent results and unreliable conclusions. In this paper, we propose the use of a censored negative binomial model with normal random effects to analyze such data. This model addresses, in addition to the censoring and the correlation, any overdispersion that may be present in count data. The model is shown to be widely accessible through several modern statistical software packages.

  18. Modelling of fire count data: fire disaster risk in Ghana.

    Science.gov (United States)

    Boadi, Caleb; Harvey, Simon K; Gyeke-Dako, Agyapomaa

    2015-01-01

Stochastic dynamics in ecological count data require distribution-fitting procedures for modelling and informed judgment. The study provides empirical research focused on the provision of an early warning system and a spatial graph that can detect societal fire risks. It offers an opportunity for communities, organizations, risk managers, actuaries and governments to be aware of and understand fire risks, so that they will increase the direct tackling of the threats posed by fire. A statistical distribution-fitting method that best identifies the stochastic dynamics of fire count data is used, with the aim of providing a fire-prediction model and a fire spatial graph for the observed fire count data. An empirical probability distribution model is fitted to the fire count data and compared to the theoretical probability distribution of the stochastic process of fire counts. The distribution fitted to the fire frequency count data helps identify the class of models exhibited by the fires and provides time-leading decisions. The research suggests that fire frequency and loss (fire fatalities) count data in Ghana are best modelled with a negative binomial distribution. The spatial map of observed fire frequency and fatality measured over 5 years (2007-2011) offers a first regional assessment of fire frequency and fire fatality in Ghana.

  19. RESPONSE OF NEUTRON MONITORS TO COSMIC RAY COUNTS: A STATISTICAL APPROACH

    Directory of Open Access Journals (Sweden)

    R. BHATTACHARYA

    2013-09-01

The study of cosmic rays became a subject of systematic investigation with the invention of the neutron monitor by Simpson, but regular recording of cosmic ray counts started with the International Geophysical Year at different locations in different climatic zones across the globe. Here, statistical analysis is performed to investigate the degree of response of different monitors to cosmic ray counts. No significant difference is observed in the statistical analysis when cosmic ray counts are normalized with respect to their mean counts in the respective solar cycles. The correlation between the cosmic ray counts of any two stations is found to range from 0.88 to 0.99.

  20. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus

    2007-01-01

In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. Th... ... with ground truth in the form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes.

  1. Experimental reconstruction of photon statistics without photon counting.

    Science.gov (United States)

    Zambra, Guido; Andreoni, Alessandra; Bondani, Maria; Gramegna, Marco; Genovese, Marco; Brida, Giorgio; Rossi, Andrea; Paris, Matteo G A

    2005-08-05

    Experimental reconstructions of photon number distributions of both continuous-wave and pulsed light beams are reported. Our scheme is based on on/off avalanche photo-detection assisted by maximum-likelihood estimation and does not involve photon counting. Reconstructions of the distribution for both semiclassical and quantum states of light are reported for single-mode as well as for multi-mode beams.
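
    A hedged sketch of the on/off idea: at quantum efficiency eta the no-click probability is p_off(eta) = sum_n (1 - eta)^n rho_n, linear in the photon-number distribution rho, so measurements at several efficiencies determine rho. The paper uses maximum-likelihood estimation; a nonnegative least-squares solve stands in for it here:

```python
# Hedged sketch: reconstruct a photon-number distribution rho from simulated
# on/off click frequencies at several quantum efficiencies. The linear model
# p_off(eta) = sum_n (1-eta)^n rho_n is inverted by nonnegative least
# squares (approximate; the paper uses maximum-likelihood estimation).
import numpy as np
from scipy.optimize import nnls
from scipy.stats import poisson

n_max = 15
n = np.arange(n_max + 1)
rho_true = poisson.pmf(n, 3.0)            # coherent-state (Poissonian) truth

etas = np.linspace(0.03, 0.99, 25)
A = (1.0 - etas[:, None]) ** n[None, :]   # p_off design matrix
rng = np.random.default_rng(9)
K = 100000                                # on/off trials per efficiency
p_off = rng.binomial(K, A @ rho_true) / K # simulated no-click frequencies

rho_hat, _ = nnls(A, p_off)
print(np.round(rho_hat[:8], 3))
print(np.round(rho_true[:8], 3))
```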

  2. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    Science.gov (United States)

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.

  3. Modelling, simulation and inference for multivariate time series of counts

    OpenAIRE

    Veraart, Almut E. D.

    2016-01-01

This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise - called a trawl process - where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...

  4. An analytical model of crater count equilibrium

    Science.gov (United States)

    Hirabayashi, Masatoshi; Minton, David A.; Fassett, Caleb I.

    2017-06-01

    Crater count equilibrium occurs when new craters form at the same rate that old craters are erased, such that the total number of observable impacts remains constant. Despite substantial efforts to understand this process, there remain many unsolved problems. Here, we propose an analytical model that describes how a heavily cratered surface reaches a state of crater count equilibrium. The proposed model formulates three physical processes contributing to crater count equilibrium: cookie-cutting (simple, geometric overlap), ejecta-blanketing, and sandblasting (diffusive erosion). These three processes are modeled using a degradation parameter that describes the efficiency for a new crater to erase old craters. The flexibility of our newly developed model allows us to represent the processes that underlie crater count equilibrium problems. The results show that when the slope of the production function is steeper than that of the equilibrium state, the power law of the equilibrium slope is independent of that of the production function slope. We apply our model to the cratering conditions in the Sinus Medii region and at the Apollo 15 landing site on the Moon and demonstrate that a consistent degradation parameterization can successfully be determined based on the empirical results of these regions. Further developments of this model will enable us to better understand the surface evolution of airless bodies due to impact bombardment.

  5. A Statistical Method to Constrain Faint Radio Source Counts Below the Detection Threshold

    CERN Document Server

    Mitchell-Wynne, Ketron; Afonso, Jose; Jarvis, Matt J

    2013-01-01

We present a statistical method based on a maximum likelihood approach to constrain the number counts of extragalactic sources below the nominal flux-density limit of continuum imaging surveys. We extract flux densities from a radio map using positional information from an auxiliary catalogue and show that we can model the number counts of this undetected population down to flux density levels well below the detection threshold of the radio survey. We demonstrate the capabilities that our method will have with future-generation wide-area radio surveys by performing simulations over various sky areas with a power-law dN/dS model. We generate a simulated power-law distribution with flux densities ranging from 0.1σ to 2σ, convolve this distribution with a Gaussian noise distribution of rms 10 micro-Jy/beam, and are able to recover the counts from the noisy distribution. We then demonstrate the application of our method using data from the Faint Images of the Radio Sky at Twenty-Centimeters survey (FI...

  6. It's not the voting that's democracy, it's the counting: Statistical detection of systematic election irregularities

    CERN Document Server

    Klimek, Peter; Hanel, Rudolf; Thurner, Stefan

    2012-01-01

Democratic societies are built around the principle of free and fair elections, in which each citizen's vote should count equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies certain statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data collapse, we find that vote distributions of elections with alleged fraud show a kurtosis a hundred times greater than that of normal elections. As an example we show that reported irregularities in the 2011 Duma election are indeed well explained by systematic ballot stuffing, and we develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We show that if specific statistical properties are present in an election, the results do not represent the will of the people. We formulate a parametric test detecting these stati...

  7. Counting

    Institute of Scientific and Technical Information of China (English)

    许有国

    2005-01-01

    Most people began to count in tens because they had ten fingers on their hands. But in some countries, people counted on one hand and used the three parts of their four fingers. So they counted in twelves, not in tens.

  8. Ecotoxicology is not normal: A comparison of statistical approaches for analysis of count and proportion data in ecotoxicology.

    Science.gov (United States)

    Szöcs, Eduard; Schäfer, Ralf B

    2015-09-01

Ecotoxicologists often encounter count and proportion data that are rarely normally distributed. To meet the assumptions of the linear model, such data are usually transformed, or non-parametric methods are used if the transformed data still violate the assumptions. Generalized linear models (GLMs) allow such data to be modelled directly, without transformation. Here, we compare the performance of (1) the linear model (assuming normality of transformed data), (2) GLMs (assuming a Poisson, negative binomial, or binomially distributed response), and (3) non-parametric methods. We simulated typical data mimicking low-replicated ecotoxicological experiments for two common data types (counts and proportions from counts). We compared the performance of the different methods in terms of statistical power and Type I error for detecting a general treatment effect and for determining the lowest observed effect concentration (LOEC). In addition, we outline differences on a real-world mesocosm data set. For count data, we found that the quasi-Poisson model yielded the highest power. The negative binomial GLM resulted in increased Type I errors, which could be fixed using the parametric bootstrap. For proportions, binomial GLMs performed better than the linear model, except for determining the LOEC at extremely low sample sizes. The compared non-parametric methods had generally lower power. We recommend that counts in one-factorial experiments be analyzed using quasi-Poisson models and proportions from counts by binomial GLMs. These methods should become standard in ecotoxicology.
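
    A hedged sketch of a quasi-Poisson fit outside R: statsmodels has no quasipoisson family, but a Poisson GLM whose scale is estimated from Pearson's chi-square reproduces quasi-Poisson standard errors (design and rates below are illustrative):

```python
# Hedged sketch: quasi-Poisson analysis of over-dispersed abundance counts,
# via a Poisson GLM with Pearson-chi-square-estimated scale (scale='X2').
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
conc = np.repeat([0, 1, 2, 4, 8], 6)         # 5 treatments, 6 replicates
mu = 20 * np.exp(-0.25 * conc)               # true declining abundance
y = rng.negative_binomial(5, 5 / (5 + mu))   # over-dispersed counts

X = sm.add_constant(conc)
quasi = sm.GLM(y, X, family=sm.families.Poisson()).fit(scale='X2')
print(quasi.summary().tables[1])
```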

  9. Modeling cosmic void statistics

    Science.gov (United States)

    Hamaus, Nico; Sutter, P. M.; Wandelt, Benjamin D.

    2016-10-01

    Understanding the internal structure and spatial distribution of cosmic voids is crucial when considering them as probes of cosmology. We present recent advances in modeling void density- and velocity-profiles in real space, as well as void two-point statistics in redshift space, by examining voids identified via the watershed transform in state-of-the-art ΛCDM n-body simulations and mock galaxy catalogs. The simple and universal characteristics that emerge from these statistics indicate the self-similarity of large-scale structure and suggest cosmic voids to be among the most pristine objects to consider for future studies on the nature of dark energy, dark matter and modified gravity.

  10. Statistical Methods for Unusual Count Data: Examples From Studies of Microchimerism.

    Science.gov (United States)

    Guthrie, Katherine A; Gammill, Hilary S; Kamper-Jørgensen, Mads; Tjønneland, Anne; Gadi, Vijayakrishna K; Nelson, J Lee; Leisenring, Wendy

    2016-10-21

    Natural acquisition of small amounts of foreign cells or DNA, referred to as microchimerism, occurs primarily through maternal-fetal exchange during pregnancy. Microchimerism can persist long-term and has been associated with both beneficial and adverse human health outcomes. Quantitative microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per total cell equivalents tested utilizes all available data and facilitates a comparison of rates between groups. We found that both the marginalized zero-inflated Poisson model and the negative binomial model can provide unbiased and consistent estimates of the overall association of exposure or study group with microchimerism detection rates. The negative binomial model remains the more accessible of these 2 approaches; thus, we conclude that the negative binomial model may be most appropriate for analyzing quantitative microchimerism data.
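
    A minimal sketch of the recommended negative binomial rate model, with the log of total cell equivalents tested entering as an offset, could look as follows (simulated data; not the authors' code):

        # Sketch: negative binomial model for microchimerism-like rates, with
        # log(total cell equivalents tested) as an offset. Data are simulated.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 200
        tested = rng.integers(50_000, 200_000, n)   # cell equivalents tested
        group = rng.integers(0, 2, n)               # exposure indicator
        rate = 2e-5 * np.exp(0.7 * group)           # group raises the rate
        counts = rng.poisson(rng.gamma(0.5, rate * tested / 0.5))  # NB via gamma mix

        X = sm.add_constant(group.astype(float))
        fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(),
                     offset=np.log(tested)).fit()
        print(fit.params)   # exp(coefficient on group) estimates the rate ratio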

  11. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...... on high LTV mortgages. Borrowers seeking to finance more than 80% of a house's value with a mortgage usually either purchase mortgage insurance, allowing a first mortgage greater than 80% from many lenders, or use second mortgages. Are there differences in performance between loans financed...

  12. Fitting statistical distributions to sea duck count data: implications for survey design and abundance estimation

    Science.gov (United States)

    Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.

    2014-01-01

    Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are
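
    The distribution comparison described above can be sketched by fitting candidate flock-size distributions by maximum likelihood and comparing AIC values. Here a zero-truncated (positive) negative binomial is pitted against a discretized lognormal on simulated flock sizes; the sample and starting values are illustrative only.

        # Sketch: compare a positive negative binomial against a discretized
        # lognormal for flock sizes by AIC; the flock-size sample is simulated.
        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(3)
        flocks = np.maximum(1, np.round(rng.lognormal(1.0, 1.2, 500))).astype(int)

        def nll_disc_lognorm(theta):
            mu, sig = theta
            # P(k) = CDF(k+0.5) - CDF(k-0.5), renormalized to k >= 1
            upper = stats.lognorm.cdf(flocks + 0.5, s=sig, scale=np.exp(mu))
            lower = stats.lognorm.cdf(flocks - 0.5, s=sig, scale=np.exp(mu))
            norm = 1.0 - stats.lognorm.cdf(0.5, s=sig, scale=np.exp(mu))
            return -np.sum(np.log((upper - lower) / norm))

        def nll_pos_nbinom(theta):
            n, p = theta
            pmf = stats.nbinom.pmf(flocks, n, p) / (1.0 - stats.nbinom.pmf(0, n, p))
            return -np.sum(np.log(pmf))

        for name, nll, x0, bounds in [
                ("disc-lognormal", nll_disc_lognorm, [1.0, 1.0], [(-5, 5), (0.05, 5)]),
                ("pos-nbinom", nll_pos_nbinom, [1.0, 0.5], [(0.01, 50), (0.01, 0.99)])]:
            res = optimize.minimize(nll, x0, bounds=bounds)
            print(name, "AIC =", round(2 * res.fun + 2 * len(x0), 1))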

  13. Bias Expansion of Spatial Statistics and Approximation of Differenced Lattice Point Counts

    Indian Academy of Sciences (India)

    Daniel J Nordman; Soumendra N Lahiri

    2011-05-01

    Investigations of spatial statistics, computed from lattice data in the plane, can lead to a special lattice point counting problem. The statistical goal is to expand the asymptotic expectation or large-sample bias of certain spatial covariance estimators, where this bias typically depends on the shape of a spatial sampling region. In particular, such bias expansions often require approximating a difference between two lattice point counts, where the counts correspond to a set of increasing domain (i.e., the sampling region) and an intersection of this set with a vector translate of itself. Non-trivially, the approximation error needs to be of smaller order than the spatial region’s perimeter length. For all convex regions in 2-dimensional Euclidean space and certain unions of convex sets, we show that a difference in areas can approximate a difference in lattice point counts to this required accuracy, even though area can poorly measure the lattice point count of any single set involved in the difference. When investigating large-sample properties of spatial estimators, this approximation result facilitates direct calculation of limiting bias, because, unlike counts, differences in areas are often tractable to compute even with non-rectangular regions. We illustrate the counting approximations with two statistical examples.

  14. Count data modeling and classification using finite mixtures of distributions.

    Science.gov (United States)

    Bouguila, Nizar

    2011-02-01

    In this paper, we consider the problem of constructing accurate and flexible statistical representations for count data, which we often confront in many areas such as data mining, computer vision, and information retrieval. In particular, we analyze and compare several generative approaches widely used for count data clustering, namely multinomial, multinomial Dirichlet, and multinomial generalized Dirichlet mixture models. Moreover, we propose a clustering approach via a mixture model based on a composition of the Liouville family of distributions, from which we select the Beta-Liouville distribution, and the multinomial. The novel proposed model, which we call multinomial Beta-Liouville mixture, is optimized by deterministic annealing expectation-maximization and minimum description length, and strives to achieve a high accuracy of count data clustering and model selection. An important feature of the multinomial Beta-Liouville mixture is that it has fewer parameters than the recently proposed multinomial generalized Dirichlet mixture. The performance evaluation is conducted through a set of extensive empirical experiments, which concern text and image texture modeling and classification and shape modeling, and highlights the merits of the proposed models and approaches.

  15. Counting statistics of chaotic resonances at optical frequencies: Theory and experiments

    Science.gov (United States)

    Lippolis, Domenico; Wang, Li; Xiao, Yun-Feng

    2017-07-01

    A deformed dielectric microcavity is used as an experimental platform for the analysis of the statistics of chaotic resonances, in the perspective of testing fractal Weyl laws at optical frequencies. In order to surmount the difficulties that arise from reading strongly overlapping spectra, we exploit the mixed nature of the phase space at hand, and only count the high-Q whispering-gallery modes (WGMs) directly. That enables us to draw statistical information on the more lossy chaotic resonances, coupled to the high-Q regular modes via dynamical tunneling. Three different models [classical, Random-Matrix-Theory (RMT) based, semiclassical] to interpret the experimental data are discussed. On the basis of least-squares analysis, theoretical estimates of Ehrenfest time, and independent measurements, we find that a semiclassically modified RMT-based expression best describes the experiment in all its realizations, particularly when the resonator is coupled to visible light, while RMT alone still works quite well in the infrared. In this work we reexamine and substantially extend the results of a short paper published earlier [L. Wang et al., Phys. Rev. E 93, 040201(R) (2016), 10.1103/PhysRevE.93.040201].

  16. Algebraic Statistics for Network Models

    Science.gov (United States)

    2014-02-19

    AFRL-OSR-VA-TR-2014-0070: Algebraic Statistics for Network Models, final report for DARPA GRAPHS Phase I grant FA9550-12-1-0392. Sonja Petrović (petrovic@psu.edu), Department of Statistics, Pennsylvania State University, 02/19/2014, with collaborators at the Department of Statistics, Heinz College, Machine Learning Department, and CyLab, Carnegie Mellon University. Abstract: This project focused on the family of

  17. Statistical connection of peak counts to power spectrum and moments in weak-lensing field

    Science.gov (United States)

    Shirasaki, Masato

    2017-02-01

    The number density of local maxima of the weak-lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in the weak-lensing field by using 1000 ray-tracing simulations. We construct a local transformation of the lensing field K to a new Gaussian field y, named the local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of K can be reproduced from a single Gaussian field y and a monotonic relation between y and K. Therefore, the correct information on two-point clustering and moments of any order in the weak-lensing field should be preserved under the local-Gaussianized transformation. We then examine whether the local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to explain weak-lensing peak counts in the absence of shape noise. The prediction by the local-Gaussianized transformation underestimates the simulated peak counts at the level of ~20-30 per cent over a wide range of peak heights. The local-Gaussianized transformation can predict the weak-lensing peak counts with ~10 per cent accuracy in the presence of shape noise. Our analyses suggest that cosmological information beyond the power spectrum and its moments is necessary to predict the weak-lensing peak counts with percent-level accuracy, which is the expected statistical uncertainty in upcoming wide-field galaxy surveys.
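
    A generic, rank-based version of such a Gaussianizing transformation (the paper calibrates its own on simulations) can be sketched as follows: map each field value to the standard-normal quantile of its empirical rank, which matches the one-point distribution to a Gaussian while preserving the monotonic ordering.

        # Sketch: rank-based "Gaussianization" of a non-Gaussian field, i.e. a
        # monotonic map sending the one-point distribution to a standard normal.
        import numpy as np
        from scipy.stats import norm, skew

        rng = np.random.default_rng(4)
        kappa = rng.lognormal(0.0, 0.5, size=(256, 256))  # toy non-Gaussian "field"

        ranks = kappa.ravel().argsort().argsort()         # ranks 0 .. N-1
        y = norm.ppf((ranks + 0.5) / ranks.size).reshape(kappa.shape)

        print("skewness before:", round(skew(kappa.ravel()), 2),
              " after:", round(skew(y.ravel()), 2))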

  18. Statistical Model for Content Extraction

    DEFF Research Database (Denmark)

    2011-01-01

    We present a statistical model for content extraction from HTML documents. The model operates on Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and associated statistical features to predict significance of the node towards overall content...

  19. Methods of statistical model estimation

    CERN Document Server

    Hilbe, Joseph

    2013-01-01

    Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. Th

  20. LP Approach to Statistical Modeling

    OpenAIRE

    Mukhopadhyay, Subhadeep; Parzen, Emanuel

    2014-01-01

    We present an approach to statistical data modeling and exploratory data analysis called `LP Statistical Data Science.' It aims to generalize and unify traditional and novel statistical measures, methods, and exploratory tools. This article outlines fundamental concepts along with real-data examples to illustrate how the `LP Statistical Algorithm' can systematically tackle different varieties of data types, data patterns, and data structures under a coherent theoretical framework. A fundament...

  1. Full counting statistics of level renormalization in electron transport through double quantum dots.

    Science.gov (United States)

    Luo, JunYan; Jiao, HuJun; Shen, Yu; Cen, Gang; He, Xiao-Ling; Wang, Changrong

    2011-04-13

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  2. Full counting statistics of level renormalization in electron transport through double quantum dots

    Energy Technology Data Exchange (ETDEWEB)

    Luo Junyan; Shen Yu; Cen Gang; He Xiaoling; Wang Changrong [School of Science, Zhejiang University of Science and Technology, Hangzhou 310023 (China); Jiao Hujun, E-mail: jyluo@zust.edu.cn [Department of Physics, Shanxi University, Taiyuan, Shanxi 030006 (China)

    2011-04-13

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  3. Full-counting statistics of energy transport of molecular junctions in the polaronic regime

    Science.gov (United States)

    Tang, Gaomin; Yu, Zhizhou; Wang, Jian

    2017-08-01

    We investigate the full-counting statistics (FCS) of energy transport carried by electrons in molecular junctions for the Anderson-Holstein model in the polaronic regime. Using the two-time quantum measurement scheme, the generating function (GF) for the energy transport is derived and expressed as a Fredholm determinant in terms of Keldysh nonequilibrium Green’s function in the time domain. Dressed tunneling approximation is used in decoupling the phonon cloud operator in the polaronic regime. This formalism enables us to analyze the time evolution of energy transport dynamics after a sudden switch-on of the coupling between the dot and the leads towards the stationary state. The steady state energy current cumulant GF in the long time limit is obtained in the energy domain as well. Universal relations for steady state energy current FCS are derived under a finite temperature gradient with zero bias and this enabled us to express the equilibrium energy current cumulant by a linear combination of lower order cumulants. The behaviors of energy current cumulants in steady state under temperature gradient and external bias are numerically studied and explained. The transient dynamics of energy current cumulants is numerically calculated and analyzed. Universal scaling of normalized transient energy cumulants is found under both temperature gradient and external bias.

  4. Full counting statistics of laser excited Rydberg aggregates in a one-dimensional geometry

    CERN Document Server

    Schempp, H; Robert-de-Saint-Vincent, M; Hofmann, C S; Breyel, D; Komnik, A; Schönleber, D W; Gärttner, M; Evers, J; Whitlock, S; Weidemüller, M

    2014-01-01

    We experimentally study the full counting statistics of few-body Rydberg aggregates excited from a quasi-one-dimensional Rydberg gas. We measure asymmetric excitation spectra and increased second and third order statistical moments of the Rydberg number distribution, from which we determine the average aggregate size. Direct comparisons with numerical simulations reveal the presence of liquid-like spatial correlations, and indicate sequential growth of the aggregates around an initial grain. These findings demonstrate the importance of dissipative effects in strongly correlated Rydberg gases and introduce a way to study spatio-temporal correlations in strongly-interacting many-body quantum systems without imaging.

  5. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge on wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines.

  6. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  7. Modeling patterns in count data using loglinear and related models

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, C.L.

    1995-12-01

    This report explains the use of loglinear and logit models, for analyzing Poisson and binomial counts in the presence of explanatory variables. The explanatory variables may be unordered categorical variables or numerical variables, or both. The report shows how to construct models to fit data, and how to test whether a model is too simple or too complex. The appropriateness of the methods with small data sets is discussed. Several example analyses, using the SAS computer package, illustrate the methods.

  8. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Full Text Available Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily due to the fact that goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, with no way to delineate between the two sources of error and apportion blame. The paper argues that the error-statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way, with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error-statistical underpinnings.

  9. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one put all the ideas together to form what turns out to be a general approach to statistics.Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data.Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...

  10. Injury count model for quantification of risk of occupational injury.

    Science.gov (United States)

    Khanzode, Vivek V; Maiti, J; Ray, P K

    2011-06-01

    Reduction of the risk of occupational injuries is one of the most challenging problems faced by industry. Assessing and comparing the risks involved in different jobs is one of the important steps towards reducing injury risk. In this study, a comprehensive scheme is given for assessing and comparing injury risks through the development of an injury count model, an injury risk model, and derived statistics. The hazards present in a work system and the nature of the job carried out by workers are perceived as important drivers of the injury potential of a work system. A loglinear model is used to quantify injury counts, and the event-tree approach with joint, marginal and conditional probabilities is used to quantify injury risk. A case study was carried out in an underground coal mine. Finally, a number of indices are proposed for the case study mine to capture the risk of injury in different jobs. The findings of this study will help in designing injury intervention strategies for the mine studied. The job-wise risk profiles will be used to prioritise the jobs for redesign. The absolute indices can be applied for benchmarking job-wise risks, and the relative indices can be used for comparing job-wise risks across work systems.

  11. Unveiling the Gamma-ray Source Count Distribution below the Fermi Detection Limit with Photon Statistics

    CERN Document Server

    Zechlin, Hannes-S; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea

    2015-01-01

    The source-count distribution as a function of their flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi-LAT photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b|>30 deg) between 1 GeV and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into: (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6-year Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power-law of index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2x10^{-11} cm^{-2}s^{-1}, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken pow...

  12. Sensometrics: Thurstonian and Statistical Models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen

    of human senses. Thurstonian models provide a stochastic model for the data-generating mechanism through a psychophysical model for the cognitive processes and in addition provides an independent measure for quantification of sensory differences. In the interest of cost-reduction and health...... of generalized linear mixed models, cumulative link models and cumulative link mixed models. The relation between the Wald, likelihood and score statistics is expanded upon using the shape of the (profile) likelihood function as common reference....

  13. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  14. Automated counting of morphologically normal red blood cells by using digital holographic microscopy and statistical methods

    Science.gov (United States)

    Moon, Inkyu; Yi, Faliu

    2015-09-01

    In this paper we present an overview of a method to automatically count morphologically normal red blood cells (RBCs) by using off-axis digital holographic microscopy and statistical methods. Three kinds of RBC are used as training and testing data. All of the RBC phase images are obtained with digital holographic microscopy (DHM), which is robust for transparent or semitransparent biological cells. For the determination of morphologically normal RBCs, the RBC phase images are first segmented with a marker-controlled watershed transform algorithm. Multiple features are extracted from the segmented cells. Moreover, the statistical method of Hotelling's T-square test is conducted to show that the 3D features from the 3D imaging method can improve the discrimination performance for counting normal shapes of RBCs. Finally, the classifier is designed using a statistical Bayesian algorithm, and the misclassification rates are measured with the leave-one-out technique. Experimental results show the feasibility of the classification method for calculating the percentage of each typical normal RBC shape.

  15. Modeling Zero-Inflated and Overdispersed Count Data: An Empirical Study of School Suspensions

    Science.gov (United States)

    Desjardins, Christopher David

    2016-01-01

    The purpose of this article is to develop a statistical model that best explains variability in the number of school days suspended. Number of school days suspended is a count variable that may be zero-inflated and overdispersed relative to a Poisson model. Four models were examined: Poisson, negative binomial, Poisson hurdle, and negative…
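
    Such a comparison can be sketched with off-the-shelf count models: fit Poisson, negative binomial and zero-inflated Poisson models to simulated zero-heavy counts and compare AICs. This is illustrative only; the hurdle variant compared in the article is omitted here.

        # Sketch: compare Poisson, negative binomial, and zero-inflated Poisson
        # fits to zero-heavy suspension-like counts via AIC. Data are simulated.
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(5)
        n = 400
        never = rng.uniform(size=n) < 0.6           # structural zeros (60%)
        days = np.where(never, 0, rng.poisson(3.0, n))
        X = np.ones((n, 1))                         # intercept-only models

        fits = {
            "Poisson": sm.Poisson(days, X).fit(disp=0),
            "NegBin": sm.NegativeBinomial(days, X).fit(disp=0),
            "ZIP": ZeroInflatedPoisson(days, X).fit(disp=0),
        }
        for name, f in fits.items():
            print(f"{name:8s} AIC = {f.aic:8.1f}")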

  16. Statistical analysis of the fluctuating counts of fecal bacteria in the water of Lake Kinneret.

    Science.gov (United States)

    Hadas, Ora; Corradini, Maria G; Peleg, Micha

    2004-01-01

    Counts of E. coli, Enterococci and fecal coliforms at four sites around Lake Kinneret (the Sea of Galilee), collected every 2-4 weeks for about 5 years during 1995-2002, showed irregular fluctuations punctuated by aperiodic outbursts of variable magnitude. Because of the haphazard nature of fecal contamination and the large intervals between successive counts, these patterns were described by probabilistic models, based on the truncated Laplace or Extreme Value distribution. Their applicability was tested by comparing the predicted frequencies of counts exceeding different levels, calculated from the first half of each record, with those actually observed in its second half. Despite the records' imperfections and minor violations of the underlying models' assumptions, there was reasonable agreement between the estimated and actual frequencies. This demonstrated that it is possible to translate the irregular fluctuation pattern into a set of probabilities of future high counts. In principle, such probabilities can be used to quantify the water's fecal contamination pattern and as a tool to assess the efficacy of preventive measures to reduce it.
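
    The validation logic, fitting a distribution to the first half of a record and checking exceedance frequencies in the second half, can be sketched as follows; the log counts and the threshold are simulated and invented, respectively.

        # Sketch: fit a Laplace distribution to log counts from the first half
        # of a record, then compare predicted and observed frequencies of
        # exceeding a high level in the second half. Data are simulated.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        log_counts = stats.laplace.rvs(loc=2.0, scale=0.6, size=120,
                                       random_state=rng)
        first, second = log_counts[:60], log_counts[60:]

        loc, scale = stats.laplace.fit(first)
        level = 3.0                                   # log10 CFU threshold
        p_pred = stats.laplace.sf(level, loc, scale)  # predicted P(X > level)
        p_obs = np.mean(second > level)               # observed frequency
        print(f"predicted {p_pred:.3f} vs observed {p_obs:.3f}")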

  17. Statistical connection of peak counts to power spectrum and moments in weak lensing field

    CERN Document Server

    Shirasaki, Masato

    2016-01-01

    The number density of local maxima of weak lensing field, referred to as weak-lensing peak counts, can be used as a cosmological probe. However, its relevant cosmological information is still unclear. We study the relationship between the peak counts and other statistics in weak lensing field by using 1000 ray-tracing simulations. We construct a local transformation of lensing field $\\cal K$ to a new Gaussian field $y$, named local-Gaussianized transformation. We calibrate the transformation with numerical simulations so that the one-point distribution and the power spectrum of $\\cal K$ can be reproduced from a single Gaussian field $y$ and monotonic relation between $y$ and $\\cal K$. Therefore, the correct information of two-point clustering and any order of moments in weak lensing field should be preserved under local-Gaussianized transformation. We then examine if local-Gaussianized transformation can predict weak-lensing peak counts in simulations. The local-Gaussianized transformation is insufficient to ...

  18. Statistical Measurement of the Gamma-ray Source-count Distribution as a Function of Energy

    CERN Document Server

    Zechlin, Hannes-S; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco

    2016-01-01

    Statistical properties of photon count maps have recently been proven as a new tool to study the composition of the gamma-ray sky with high precision. We employ the 1-point probability distribution function of 6 years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 GeV and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of 50 GeV. The index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of 2.2^{+0.7}_{-0.3} in the energy band between 50 GeV and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point source populations probed by this method can explain 83^{+7}_{-13}% (81^{+52}_{-19}%) of the extrag...

  19. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    assignment mechanism, which is based on the discretionary choice of case workers. This is done in a duration model context, using the timing-of-events framework to identify causal effects. We compare different assignment mechanisms, and the results suggest that a significant reduction in the average...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers....

  20. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  1. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present...... assignment mechanism, which is based on the discretionary choice of case workers. This is done in a duration model context, using the timing-of-events framework to identify causal effects. We compare different assignment mechanisms, and the results suggest that a significant reduction in the average...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers....

  2. Image quantization: statistics and modeling

    Science.gov (United States)

    Whiting, Bruce R.; Muka, Edward

    1998-07-01

    A method for analyzing the effects of quantization, developed for temporal one-dimensional signals, is extended to two- dimensional radiographic images. By calculating the probability density function for the second order statistics (the differences between nearest neighbor pixels) and utilizing its Fourier transform (the characteristic function), the effect of quantization on image statistics can be studied by the use of standard communication theory. The approach is demonstrated by characterizing the noise properties of a storage phosphor computed radiography system and the image statistics of a simple radiographic object (cylinder) and by comparing the model to experimental measurements. The role of quantization noise and the onset of contouring in image degradation are explained.

  3. Optimization of statistical methods for HpGe gamma-ray spectrometer used in wide count rate ranges

    Science.gov (United States)

    Gervino, G.; Mana, G.; Palmisano, C.

    2016-07-01

    The need to perform γ-ray measurements with HpGe detectors is common in many fields such as nuclear physics, radiochemistry, nuclear medicine and neutron activation analysis. HpGe detectors are chosen in situations where isotope identification is needed, because of their excellent resolution. Our challenge is to obtain the "best" spectroscopy data possible in every measurement situation: "best" is a combination of statistical quality (number of counts) and spectral quality (peak width and position) over a wide range of counting rates. In this framework, we applied Bayesian methods and Ellipsoidal Nested Sampling (a multidimensional integration technique) to study the most likely distribution for the shape of HpGe spectra. In treating these experiments, the prior information suggests modelling the likelihood function as a product of Poisson distributions. We present the efforts made to optimize the statistical methods for HpGe detector outputs, with the aim of evaluating to a better order of precision the detector efficiency, the absolute measured activity and the spectral background. Reaching a more precise knowledge of the statistical and systematic uncertainties of the measured physical observables is the final goal of this research project.
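
    The product-of-Poissons likelihood mentioned above can be sketched for a toy spectrum, fitting a Gaussian peak on a flat background by maximizing the Poisson likelihood of the channel counts. All rates, positions and starting values here are invented.

        # Sketch: maximum-likelihood fit of a Gaussian peak on a flat
        # background to channel counts, using a product-of-Poissons likelihood.
        import numpy as np
        from scipy import optimize

        rng = np.random.default_rng(7)
        ch = np.arange(200)

        def model(theta):
            amp, mu, sig, bkg = theta
            return bkg + amp * np.exp(-0.5 * ((ch - mu) / sig) ** 2)

        truth = (120.0, 100.0, 3.0, 5.0)
        counts = rng.poisson(model(truth))

        def neg_log_like(theta):
            lam = model(theta)
            return np.sum(lam - counts * np.log(lam))  # Poisson NLL up to const.

        res = optimize.minimize(neg_log_like, x0=(80, 95, 5, 3),
                                bounds=[(1, None), (50, 150), (0.5, 20), (0.1, None)])
        print("fitted (amp, mu, sigma, bkg):", np.round(res.x, 2))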

  4. Full counting statistics of renormalized dynamics in open quantum transport system

    Energy Technology Data Exchange (ETDEWEB)

    Luo, JunYan, E-mail: jyluo@zust.edu.cn [School of Science, Zhejiang University of Science and Technology, Hangzhou, 310023 (China); Shen, Yu; He, Xiao-Ling [School of Science, Zhejiang University of Science and Technology, Hangzhou, 310023 (China); Li, Xin-Qi [Department of Chemistry, Hong Kong University of Science and Technology, Kowloon, Hong Kong SAR (China); State Key Laboratory for Superlattices and Microstructures, Institute of Semiconductors, Chinese Academy of Sciences, P.O. Box 912, Beijing 100083 (China); Department of Physics, Beijing Normal University, Beijing 100875 (China); Yan, YiJing [Department of Chemistry, Hong Kong University of Science and Technology, Kowloon, Hong Kong SAR (China)

    2011-11-28

    The internal dynamics of a double quantum dot system is renormalized due to coupling respectively with transport electrodes and a dissipative heat bath. Their essential differences are identified unambiguously in the context of full counting statistics. The electrode coupling caused level detuning renormalization gives rise to a fast-to-slow transport mechanism, which is not resolved at all in the average current, but revealed uniquely by pronounced super-Poissonian shot noise and skewness. The heat bath coupling introduces an interdot coupling renormalization, which results in asymmetric Fano factor and an intriguing change of line shape in the skewness. -- Highlights: ► We study full counting statistics of electron transport through double quantum dots. ► Essential differences due to coupling to the electrodes and heat bath are identified. ► Level detuning induced by electrodes results in strongly enhanced shot noise and skewness. ► Interdot coupling renormalization due to heat bath leads to asymmetric noise and intriguing skewness.

  5. Statistical modeling of program performance

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

    Full Text Available The task of evaluating program performance often arises in the design of computer systems or during iterative compilation. A traditional way to solve this problem is emulation of program execution on the target system. A modern alternative approach is based on statistical modeling of program performance on the computer under investigation. This work introduces such a statistical method of modeling program performance, called Velocitas, and presents the method and its implementation in the Adaptor framework. An investigation of the method's effectiveness showed high adequacy of program performance prediction.

  6. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  7. Syndromic surveillance: STL for modeling, visualizing, and monitoring disease counts

    Directory of Open Access Journals (Sweden)

    Abusalah Ahmad

    2009-04-01

    Full Text Available Background: Public health surveillance is the monitoring of data to detect and quantify unusual health events. Monitoring pre-diagnostic data, such as emergency department (ED) patient chief complaints, enables rapid detection of disease outbreaks. There are many sources of variation in such data; statistical methods need to model them accurately as a basis for timely and accurate disease outbreak methods. Methods: Our new methods for modeling daily chief complaint counts are based on a seasonal-trend decomposition procedure based on loess (STL) and were developed using data from the 76 EDs of the Indiana surveillance program from 2004 to 2008. Square-root counts are decomposed into inter-annual, yearly-seasonal, day-of-the-week, and random-error components. Using this decomposition method, we develop a new synoptic-scale (days to weeks) outbreak detection method and carry out a simulation study to compare detection performance to four well-known methods for nine outbreak scenarios. Results: The components of the STL decomposition reveal insights into the variability of the Indiana ED data. Day-of-the-week components tend to peak Sunday or Monday, fall steadily to a minimum Thursday or Friday, and then rise to the peak. Yearly-seasonal components show seasonal influenza, some with bimodal peaks. Some inter-annual components increase slightly due to increasing patient populations. A new outbreak detection method based on the decomposition modeling performs well with 90 days or more of data. Control limits were set empirically so that all methods had a specificity of 97%. STL had the largest sensitivity in all nine outbreak scenarios. The STL method also exhibited a well-behaved false positive rate when run on the data with no outbreaks injected. Conclusion: The STL decomposition method for chief complaint counts leads to a rapid and accurate detection method for disease outbreaks, and requires only 90 days of historical data to be put into
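
    The weekly building block of the decomposition can be sketched with the STL implementation in statsmodels; the counts below are simulated, and the full method combines several components and passes rather than this single weekly STL.

        # Sketch: STL decomposition of square-root daily counts with a weekly
        # period, the building block of the method above. Counts are simulated.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.seasonal import STL

        rng = np.random.default_rng(8)
        days = pd.date_range("2004-01-01", periods=730, freq="D")
        dow = np.array([1.3, 1.1, 1.0, 0.9, 0.85, 0.9, 1.2])[days.dayofweek]
        season = 1 + 0.2 * np.sin(2 * np.pi * days.dayofyear / 365)
        counts = rng.poisson(40 * dow * season)

        series = pd.Series(np.sqrt(counts), index=days)
        res = STL(series, period=7, robust=True).fit()
        print(res.seasonal.head(7))   # estimated day-of-week component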

  8. Systematic and Statistical Errors Associated with Nuclear Decay Constant Measurements Using the Counting Technique

    Science.gov (United States)

    Koltick, David; Wang, Haoyu; Liu, Shih-Chieh; Heim, Jordan; Nistor, Jonathan

    2016-03-01

    Typical nuclear decay constants are measured at the 10^{-2} accuracy level. Numerous applications, such as tests of unconventional theories, dating of materials, and long-term inventory evolution, require decay constants accurate at the 10^{-4} to 10^{-5} level. The statistical and systematic errors associated with precision measurements of decays using the counting technique are presented. Precision requires high count rates, which introduce time-dependent dead-time and pile-up corrections. An approach to overcome these issues by continuous recording of the detector current is presented. Other systematic corrections include the time-dependent dead time due to background radiation, control of target motion and radiation flight-path variation due to environmental conditions, and time-dependent effects caused by scattered events. The incorporation of blind experimental techniques can help make measurements independent of past results. A spectrometer design and data analysis that can accomplish these goals are reviewed. The author would like to thank TechSource, Inc. and Advanced Physics Technologies, LLC for their support of this work.

  9. Parameter counting in models with global symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Joshua [Institute for High Energy Phenomenology, Newman Laboratory of Elementary Particle Physics, Cornell University, Ithaca, NY 14853 (United States)], E-mail: jb454@cornell.edu; Grossman, Yuval [Institute for High Energy Phenomenology, Newman Laboratory of Elementary Particle Physics, Cornell University, Ithaca, NY 14853 (United States)], E-mail: yuvalg@lepp.cornell.edu

    2009-05-18

    We present rules for determining the number of physical parameters in models with exact flavor symmetries. In such models the total number of parameters (physical and unphysical) needed to describe a matrix is smaller than in a model without the symmetries. Several toy examples are studied in order to demonstrate the rules. The use of global symmetries in studying the minimal supersymmetric standard model (MSSM) is examined.

  10. RCT: Module 2.03, Counting Errors and Statistics, Course 8768

    Energy Technology Data Exchange (ETDEWEB)

    Hillmer, Kurt T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-01

    Radiological sample analysis involves the observation of a random process that may or may not occur and an estimation of the amount of radioactive material present based on that observation. Across the country, radiological control personnel use activity measurements to make decisions that may affect the health and safety of workers at those facilities and their surrounding environments. This course will present an overview of measurement processes, a statistical evaluation of both measurements and equipment performance, and some actions to take to minimize the sources of error in count room operations. The course will prepare the student with the skills necessary for radiological control technician (RCT) qualification, through quizzes, tests, and the RCT Comprehensive Phase 1, Unit 2 Examination (TEST 27566), and by providing in-the-field skills.
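
    The basic counting-statistics calculation covered by such a course, propagating Poisson uncertainties into a net count rate, can be sketched as follows (the counts and live times are illustrative):

        # Sketch: standard counting statistics for a net count rate. With N
        # recorded counts, the Poisson uncertainty is sqrt(N); gross and
        # background rates then combine in quadrature.
        import math

        N_gross, t_gross = 4_810, 600.0   # counts, seconds (sample + background)
        N_bkg, t_bkg = 1_260, 600.0       # counts, seconds (background only)

        rate_net = N_gross / t_gross - N_bkg / t_bkg
        sigma_net = math.sqrt(N_gross / t_gross**2 + N_bkg / t_bkg**2)
        print(f"net rate = {rate_net:.3f} +/- {sigma_net:.3f} counts/s")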

  11. Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY Ecole de physique Département de physique nucléaire et corpusculaire 24, quai Ernest-Ansermet 1211 Genève 4 Tél.: (022) 379 62 73 Fax: (022) 379 69 92   Monday 17 October 2011, 17:00 - Ecole de Physique, Auditoire Stueckelberg PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics » Prof. Jukka Pekola, Low Temperature Laboratory, Aalto University, Helsinki, Finland   First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is on developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...

  12. Improved model for statistical alignment

    Energy Technology Data Exchange (ETDEWEB)

    Miklos, I.; Toroczkai, Z. (Zoltan)

    2001-01-01

    The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l^3) running time, where l is the geometric mean of the sequence lengths.

  13. A Model for Positively Correlated Count Variables

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    2010-01-01

    An α-permanental random field is, briefly speaking, a model for a collection of non-negative integer-valued random variables with positive associations. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their poten......

  14. Unveiling the Gamma-Ray Source Count Distribution Below the Fermi Detection Limit with Photon Statistics

    Science.gov (United States)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea

    2016-08-01

    The source-count distribution as a function of their flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| ≥ 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6 yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2x10^{-11} cm^{-2} s^{-1}, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at (2.1^{+1.0}_{-1.3})x10^{-8} cm^{-2} s^{-1}. The power-law index n_1 = 3.1^{+0.7}_{-0.5} for bright sources above the break hardens to n_2 = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4x10^{-11} cm^{-2} s^{-1} at 95% confidence level. The high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.

  15. Counting statistics of transport through Coulomb blockade nanostructures: High-order cumulants and non-Markovian effects

    DEFF Research Database (Denmark)

    Flindt, Christian; Novotny, Tomás; Braggio, Alessandro

    2010-01-01

    Recent experimental progress has made it possible to detect in real-time single electrons tunneling through Coulomb blockade nanostructures, thereby allowing for precise measurements of the statistical distribution of the number of transferred charges, the so-called full counting statistics...

  16. Mixture Models for the Analysis of Repeated Count Data.

    NARCIS (Netherlands)

    van Duijn, M.A.J.; Böckenholt, U

    1995-01-01

    Repeated count data showing overdispersion are commonly analysed by using a Poisson model with varying intensity parameter, resulting in a mixed model. A mixed model with a gamma distribution for the Poisson parameter does not adequately fit a data set on 721 children's spelling errors. An

  17. Applications of some discrete regression models for count data

    Directory of Open Access Journals (Sweden)

    B. M. Golam Kibria

    2006-01-01

    Full Text Available In this paper we consider several regression models to fit the count data encountered in the fields of biometry, environmental and social sciences, and transportation engineering. We fitted Poisson (PO), Negative Binomial (NB), Zero-Inflated Poisson (ZIP) and Zero-Inflated Negative Binomial (ZINB) regression models to run-off-road (ROR) crash data collected on arterial roads in the southern (rural) region of the State of Florida. To compare the performance of these models, we analyzed data with a moderate to high percentage of zero counts. Because the variances were almost three times greater than the means, it appeared that both the NB and ZINB models performed better than the PO and ZIP models for these zero-inflated and overdispersed count data.

  18. Statistical bootstrap model and annihilations

    CERN Document Server

    Möhring, H J

    1974-01-01

    The statistical bootstrap model (SBM) describes the decay of single, high-mass hadronic states (fireballs, clusters) into stable particles. Coupling constants B, one for each isospin multiplet of stable particles, are the only free parameters of the model. They are related to the maximum temperature parameter T_0. The various versions of the SBM can be classified into two groups: full statistical bootstrap models and linear ones. The main results of the model are the following: (i) All momentum spectra are isotropic; in particular, the exclusive ones are described by invariant phase space. The inclusive and semi-inclusive single-particle distributions are asymptotically of pure exponential shape; the slope is governed by T_0 only. (ii) The model parameter B for pions has been obtained by fitting the multiplicity distribution in pp and pn at rest, and corresponds to T_0 = 0.167 GeV in the full SBM with exotics. The average π^- multiplicity for the linear and the full SBM (both with exotics) is c...

  19. Flexible models for spike count data with both over- and under- dispersion.

    Science.gov (United States)

    Stevenson, Ian H

    2016-08-01

    A key observation in systems neuroscience is that neural responses vary, even in controlled settings where stimuli are held constant. Many statistical models assume that trial-to-trial spike count variability is Poisson, but there is considerable evidence that neurons can be substantially more or less variable than Poisson depending on the stimuli, attentional state, and brain area. Here we examine a set of spike count models based on the Conway-Maxwell-Poisson (COM-Poisson) distribution that can flexibly account for both over- and under-dispersion in spike count data. We illustrate applications of this noise model for Bayesian estimation of tuning curves and peri-stimulus time histograms. We find that COM-Poisson models with group/observation-level dispersion, where spike count variability is a function of time or stimulus, produce more accurate descriptions of spike counts compared to Poisson models as well as negative-binomial models often used as alternatives. Since dispersion is one determinant of parameter standard errors, COM-Poisson models are also likely to yield more accurate model comparison. More generally, these methods provide a useful, model-based framework for inferring both the mean and variability of neural responses.
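
    A minimal sketch of the COM-Poisson distribution at the heart of these models shows how the dispersion parameter nu moves the Fano factor (variance/mean) above or below the Poisson value of 1; the rate and truncation point below are arbitrary choices.

        # Sketch: the Conway-Maxwell-Poisson pmf, p(k) proportional to
        # lambda^k / (k!)^nu, with the normalizer computed by truncated
        # summation. nu < 1 gives over-dispersion, nu > 1 under-dispersion.
        import numpy as np
        from scipy.special import gammaln

        def com_poisson_pmf(lam, nu, kmax=500):
            k = np.arange(kmax)
            logp = k * np.log(lam) - nu * gammaln(k + 1)
            p = np.exp(logp - logp.max())
            return p / p.sum()

        for nu in (0.5, 1.0, 2.0):
            p = com_poisson_pmf(lam=5.0, nu=nu)
            k = np.arange(p.size)
            mean = (k * p).sum()
            var = ((k - mean) ** 2 * p).sum()
            print(f"nu={nu}: Fano factor var/mean = {var / mean:.2f}")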

  20. Details of Programming a Model of Children's Counting in ACTP.

    Science.gov (United States)

    Riley, Mary S.; Greeno, James G.

    Presented is an introduction to the operation and mechanics of the ACTP production system, a version of Anderson's (1976) ACT system. ACTP is already in use modeling geometry theorem proving and counting of a set of objects, and has been identified as a potentially useful programing framework for developing models of the cognitive processes used…

  1. Photon counts statistics of squeezed and multi-mode thermal states of light on multiplexed on-off detectors

    CERN Document Server

    Chrapkiewicz, Radosław

    2015-01-01

    Photon number resolving detectors can be highly useful for studying the statistics of multi-photon quantum states of light. In this work we study the counts statistics of different states of light measured on multiplexed on-off detectors. We put special emphasis on artificial nonclassical features of the obtained statistics. We show new ways to derive analytical formulas for counts statistics and their moments. Using our approach, we derive, for the first time, the moments of the counts statistics for multi-mode thermal states measured on multiplexed on-off detectors. We use them to determine empirical Mandel parameters and recently proposed subbinomial parameters suitable for tests of nonclassicality of the measured states. Additionally, we investigate subpoissonian and superbunching properties of the two-mode squeezed state measured on a pair of multiplexed detectors, and we present results for the Fano factor and second-order correlation function for these states.
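
    The click ("counts") statistics described here can be approximated with a simple Monte Carlo sketch, assuming a single-mode thermal input and uniform splitting over N on-off detectors (illustrative assumptions, not the authors' analytical derivation):

```python
# Simulate clicks on N multiplexed on-off detectors: each photon falls into
# one of N bins uniformly at random; a bin with >= 1 photon gives one click.
import numpy as np

rng = np.random.default_rng(1)
N, trials, nbar = 8, 50_000, 3.0
# single-mode thermal photon numbers: geometric on {0, 1, 2, ...}
n_photons = rng.geometric(1.0 / (1.0 + nbar), size=trials) - 1

clicks = np.empty(trials, dtype=int)
for t, n in enumerate(n_photons):
    bins = rng.integers(0, N, size=n)
    clicks[t] = np.unique(bins).size            # occupied bins = clicks

m, v = clicks.mean(), clicks.var()
print(f"mean clicks = {m:.2f}, Mandel-like Q for clicks = {v / m - 1:+.2f}")
```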

  2. Statistical models for trisomic phenotypes

    Energy Technology Data Exchange (ETDEWEB)

    Lamb, N.E.; Sherman, S.L.; Feingold, E. [Emory Univ., Atlanta, GA (United States)

    1996-01-01

    Certain genetic disorders are rare in the general population but more common in individuals with specific trisomies, which suggests that the genes involved in the etiology of these disorders may be located on the trisomic chromosome. As with all aneuploid syndromes, however, a considerable degree of variation exists within each phenotype so that any given trait is present only among a subset of the trisomic population. We have previously presented a simple gene-dosage model to explain this phenotypic variation and developed a strategy to map genes for such traits. The mapping strategy does not depend on the simple model but works in theory under any model that predicts that affected individuals have an increased likelihood of disomic homozygosity at the trait locus. This paper explores the robustness of our mapping method by investigating what kinds of models give an expected increase in disomic homozygosity. We describe a number of basic statistical models for trisomic phenotypes. Some of these are logical extensions of standard models for disomic phenotypes, and some are more specific to trisomy. Where possible, we discuss genetic mechanisms applicable to each model. We investigate which models and which parameter values give an expected increase in disomic homozygosity in individuals with the trait. Finally, we determine the sample sizes required to identify the increased disomic homozygosity under each model. Most of the models we explore yield detectable increases in disomic homozygosity for some reasonable range of parameter values, usually corresponding to smaller trait frequencies. It therefore appears that our mapping method should be effective for a wide variety of moderately infrequent traits, even though the exact mode of inheritance is unlikely to be known. 21 refs., 8 figs., 1 tab.

  3. Marginalized zero-altered models for longitudinal count data.

    Science.gov (United States)

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.

  4. Counting Vesicular Release Events Reveals Binomial Release Statistics at Single Glutamatergic Synapses.

    Science.gov (United States)

    Malagon, Gerardo; Miki, Takafumi; Llano, Isabel; Neher, Erwin; Marty, Alain

    2016-04-06

    Many central glutamatergic synapses contain a single presynaptic active zone and a single postsynaptic density. However, the basic functional properties of such "simple synapses" remain unclear. One important step toward understanding simple synapse function is to analyze the number of synaptic vesicles released in such structures per action potential, but this goal has remained elusive until now. Here, we describe procedures that allow reliable vesicular release counting at simple synapses between parallel fibers and molecular layer interneurons of rat cerebellar slices. Our analysis involves local extracellular stimulation of single parallel fibers and deconvolution of resulting EPSCs using quantal signals as template. We observed a reduction of quantal amplitudes (amplitude occlusion) in pairs of consecutive EPSCs due to receptor saturation. This effect is larger (62%) than previously reported and primarily reflects receptor activation rather than desensitization. In addition to activation-driven amplitude occlusion, each EPSC reduces amplitudes of subsequent events by an estimated 3% due to cumulative desensitization. Vesicular release counts at simple synapses follow binomial statistics with a maximum that varies from 2 to 10 among experiments. This maximum presumably reflects the number of docking sites at a given synapse. These results show striking similarities, as well as significant quantitative differences, with respect to previous results at simple GABAergic synapses. It is generally accepted that the output signal of individual central synapses saturates at high release probability, but it remains unclear whether the source of saturation is presynaptic, postsynaptic, or both presynaptic and postsynaptic. To clarify this and other issues concerning the function of synapses, we have developed new recording and analysis methods at single central glutamatergic synapses. We find that individual release events engage a high proportion of postsynaptic
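
    The reported binomial release statistics can be illustrated with a toy maximum-likelihood fit; the simulated counts and the profile over the number of docking sites N are assumptions for illustration:

```python
# Estimate binomial parameters (N docking sites, release probability p)
# from per-trial vesicular release counts by profiling the likelihood.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(7)
counts = rng.binomial(6, 0.4, size=400)         # simulated release counts

best = None
for N in range(max(counts.max(), 1), 21):       # N must cover the largest count
    p_hat = counts.mean() / N                   # ML estimate of p for fixed N
    ll = binom.logpmf(counts, N, p_hat).sum()
    if best is None or ll > best[0]:
        best = (ll, N, p_hat)

print(f"estimated N = {best[1]}, p = {best[2]:.2f}")
```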

  5. Evaluation of models of spectral distortions in photon-counting detectors for computed tomography.

    Science.gov (United States)

    Cammin, Jochen; Kappler, Steffen; Weidinger, Thomas; Taguchi, Katsuyuki

    2016-04-01

    A semi-analytical model describing spectral distortions in photon-counting detectors (PCDs) for clinical computed tomography was evaluated using simulated data. The distortions were due to count rate-independent spectral response effects and count rate-dependent pulse-pileup effects and the model predicted both the mean count rates and the spectral shape. The model parameters were calculated using calibration data. The model was evaluated by comparing the predicted x-ray spectra to Monte Carlo simulations of a PCD at various count rates. The data-model agreement expressed as weighted coefficient of variation [Formula: see text] was better than [Formula: see text] for dead time losses up to 28% and [Formula: see text] or smaller for dead time losses up to 69%. The accuracy of the model was also tested for the purpose of material decomposition by estimating material thicknesses from simulated projection data. The estimated attenuator thicknesses generally agreed with the true values within one standard deviation of the statistical uncertainty obtained from multiple noise realizations.
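
    For orientation only (this is not the semi-analytical spectral model evaluated in the record), the two textbook dead-time relations behind count-rate-dependent losses can be sketched as follows, with hypothetical rates and dead time:

```python
# Compare non-paralyzable and paralyzable dead-time losses at several
# true count rates; tau is a hypothetical per-pulse dead time.
import numpy as np

tau = 100e-9                                    # dead time [s], assumed
n_true = np.array([1e5, 1e6, 5e6, 1e7])         # true rates [counts/s]

m_nonpar = n_true / (1.0 + n_true * tau)        # non-paralyzable detector
m_par = n_true * np.exp(-n_true * tau)          # paralyzable detector

for n, m in zip(n_true, m_par):
    print(f"true {n:9.0f} cps -> paralyzable {m:9.0f} cps ({1 - m / n:.0%} loss)")
```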

  6. A review on models for count data with extra zeros

    Science.gov (United States)

    Zamri, Nik Sarah Nik; Zamzuri, Zamira Hasanah

    2017-04-01

    Zero-inflated models are typically used in modelling count data with excess zeros. The extra zeros may be structural, or random zeros which occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences. As found in the literature, the most popular zero-inflated models are the zero-inflated Poisson and the zero-inflated negative binomial. Recently, more complex models have been developed to account for overdispersion and unobserved heterogeneity. In addition, more extended distributions are also considered in modelling data with this feature. In this paper, we review the related literature and provide a summary of recent developments in models for count data with extra zeros.

  7. Bayesian Model Selection and Statistical Modeling

    CERN Document Server

    Ando, Tomohiro

    2010-01-01

    Bayesian model selection is a fundamental part of the Bayesian statistical modeling process. The quality of these solutions usually depends on the goodness of the constructed Bayesian model. Realizing how crucial this issue is, many researchers and practitioners have been extensively investigating the Bayesian model selection problem. This book provides comprehensive explanations of the concepts and derivations of the Bayesian approach for model selection and related criteria, including the Bayes factor, the Bayesian information criterion (BIC), the generalized BIC, and the pseudo marginal lik

  8. Multimode model for projective photon-counting measurements

    DEFF Research Database (Denmark)

    Tualle-Brouri, Rosa; Ourjoumtsev, Alexei; Dantan, Aurélien

    2009-01-01

    We present a general model to account for the multimode nature of the quantum electromagnetic field in projective photon-counting measurements. We focus on photon-subtraction experiments, where non-Gaussian states are produced conditionally. These are useful states for continuous-variable quantum...

  9. Bilinear modulation models for seasonal tables of counts

    NARCIS (Netherlands)

    B.D. Marx (Brian); P.H.C. Eilers (Paul); J. Gampe (Jutta); R. Rau (Roland)

    2010-01-01

    textabstractWe propose generalized linear models for time or age-time tables of seasonal counts, with the goal of better understanding seasonal patterns in the data. The linear predictor contains a smooth component for the trend and the product of a smooth component (the modulation) and a periodic t

  10. Full-counting statistics of heat transport in harmonic junctions: transient, steady states, and fluctuation theorems.

    Science.gov (United States)

    Agarwalla, Bijay Kumar; Li, Baowen; Wang, Jian-Sheng

    2012-05-01

    We study the statistics of heat transferred in a given time interval t_{M}, through a finite harmonic chain, called the center, which is connected to two heat baths, the left (L) and the right (R), that are maintained at two temperatures. The center atoms are driven by external time-dependent forces. We calculate the cumulant generating function (CGF) for the heat transferred out of the left lead, Q_{L}, based on the two-time quantum measurement concept and using the nonequilibrium Green's function method. The CGF can be concisely expressed in terms of Green's functions of the center and an argument-shifted self-energy of the lead. The expression of the CGF is valid in both transient and steady-state regimes. We consider three initial conditions for the density operator and show numerically, for a one-atom junction, how their transient behaviors differ from each other but, finally, approach the same steady state, independent of the initial distributions. We also derive the CGF for the joint probability distribution P(Q_{L},Q_{R}), and discuss the correlations between Q_{L} and Q_{R}. We calculate the CGF for total entropy production in the reservoirs. In the steady state we explicitly show that the CGFs obey steady-state fluctuation theorems. We obtain classical results by taking ℏ→0. We also apply our method to the counting of the electron number and electron energy, for which the associated self-energy is obtained from the usual lead self-energy by multiplying a phase and shifting the contour time, respectively.
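
    The link between a cumulant generating function and the cumulants of the transferred quantity can be checked numerically; the Gaussian stand-in samples below are an assumption, not the harmonic-junction result:

```python
# Estimate the first two cumulants of Q from K(x) = ln<exp(x Q)> by
# central finite differences; for Gaussian Q they are the mean and variance.
import numpy as np

rng = np.random.default_rng(2)
Q = rng.normal(loc=2.0, scale=1.5, size=200_000)    # hypothetical heat samples

def K(x):
    return np.log(np.mean(np.exp(x * Q)))

h = 1e-2
c1 = (K(h) - K(-h)) / (2 * h)                  # first cumulant: mean
c2 = (K(h) - 2 * K(0.0) + K(-h)) / h**2        # second cumulant: variance
print(f"<Q> ~ {c1:.3f} (expect 2.0), var(Q) ~ {c2:.3f} (expect 2.25)")
```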

  11. Statistical Analysis by Statistical Physics Model for the STOCK Markets

    Science.gov (United States)

    Wang, Tiansong; Wang, Jun; Fan, Bingli

    A new stochastic stock price model of stock markets based on the contact process of statistical physics systems is presented in this paper. The contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and the Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and the SZSE Component Index are analyzed, and the corresponding simulations are performed by computer. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of the skewness-kurtosis test, Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.

  12. Visualizing statistical models and concepts

    CERN Document Server

    Farebrother, RW

    2002-01-01

    Examines classic algorithms, geometric diagrams, and mechanical principles for enhancing visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming.

  13. Cosmological constraints with weak lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, Francois; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-01-01

    Peak statistics in weak lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. To prepare for the high precision afforded by next-generation weak lensing surveys, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w0de. In particular, we study how CAMELUS, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. We measure the abundance histogram of peaks in a mock shear catalogue of approximately 5,000 deg² using a multiscale mass map filtering technique, and we then constrain the parameters of the mock survey using CAMELUS combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. We find that peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, indicating the need to better understand and control the model's systematics before applying it to a real survey of this size or larger. We perform a calibration of the model to remove the bias and compare results to those from the two-point correlation functions (2PCF) measured on the same field. In this case, we find the derived parameter Σ8 = σ8(Ωm/0.27)^α = 0.76 (-0.03 +0.02) with α = 0.65 for peaks, while for 2PCF the values are Σ8 = 0.76 (-0.01 +0.02) and α = 0.70. We conclude that the constraining power can therefore be comparable between the two weak lensing observables in large-field surveys. Furthermore, the tilt in the σ8-Ωm degeneracy direction for peaks with respect to that of 2PCF suggests that a combined analysis would yield tighter constraints than either measure alone. As expected, w0de cannot be

  14. The joint statistics of mildly non-linear cosmological densities and slopes in count-in-cells

    CERN Document Server

    Bernardeau, Francis; Pichon, Christophe

    2015-01-01

    In the context of count-in-cells statistics, the joint probability distribution of the density in two concentric spherical shells is predicted from first principles for values of sigma of the order of one. The agreement with simulation is found to be excellent. These statistics allow us to deduce the conditional one-dimensional probability distribution function of the slope within underdense (resp. overdense) regions, or of the density for positive or negative slopes. The former conditional distribution is likely to be more robust in constraining the cosmological parameters, as the underlying dynamics is less evolved in such regions. A fiducial dark energy experiment is implemented on such counts derived from Lambda-CDM simulations.

  15. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  16. Certification Can Count: The Case of Aircraft Mechanics. Issues in Labor Statistics. Summary 02-03.

    Science.gov (United States)

    Bureau of Labor Statistics, Washington, DC.

    This document is a summary of aerospace industry technician statistics gathered by the Occupational Employment Statistics Survey for the year 2000 by the Department of Labor, Bureau of Labor Statistics. The data includes the following: (1) a comparison of wages earned by Federal Aviation Administration (FAA) certified and non-FAA certified…

  17. The Invention of Counting: The Statistical Measurement of Literacy in Nineteenth-Century England

    Science.gov (United States)

    Vincent, David

    2014-01-01

    This article examines the invention of counting literacy on a national basis in nineteenth-century Britain. Through an analysis of Registrar Generals' reports, it describes how the early statisticians wrestled with the implications of their new-found capacity to describe a nation's communications skills in a single table and how they were unable…

  19. Fermi breakup and the statistical multifragmentation model

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, B.V., E-mail: brett@ita.br [Departamento de Fisica, Instituto Tecnologico de Aeronautica - CTA, 12228-900 Sao Jose dos Campos (Brazil); Donangelo, R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Facultad de Ingenieria, Universidad de la Republica, Julio Herrera y Reissig 565, 11.300 Montevideo (Uruguay); Souza, S.R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Universidade Federal do Rio Grande do Sul, Av. Bento Goncalves 9500, CP 15051, 91501-970, Porto Alegre (Brazil); Lynch, W.G.; Steiner, A.W.; Tsang, M.B. [Joint Institute for Nuclear Astrophysics, National Superconducting Cyclotron Laboratory and the Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States)

    2012-02-15

    We demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical statistical multifragmentation model used to describe the disintegration of highly excited fragments of nuclear reactions. We argue that such a model better fulfills the hypothesis of statistical equilibrium than the Fermi breakup model generally used to describe statistical disintegration of light mass nuclei.

  20. Reference analysis of the signal + background model in counting experiments

    Science.gov (United States)

    Casadei, D.

    2012-01-01

    The model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered from a Bayesian point of view. This is a widely used model for searches of rare or exotic events in the presence of a background source, as for example in the searches performed by high-energy physics experiments. Under the assumption of prior knowledge about the background yield, a reference prior is obtained for the signal alone and its properties are studied. Finally, the properties of the full solution, the marginal reference posterior, are illustrated with a few examples.
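
    A numerical sketch of the underlying likelihood, with a flat prior standing in for the reference prior derived in the record (the observed count and background yield are hypothetical):

```python
# Posterior shape for the signal s in n ~ Poisson(s + b) with known
# background b, on a grid, under an illustrative flat prior.
import numpy as np
from scipy.stats import poisson

n_obs, b = 7, 3.2                       # hypothetical data and background
s = np.linspace(0.0, 20.0, 2001)
post = poisson.pmf(n_obs, s + b)        # likelihood; flat prior => same shape
post /= post.sum() * (s[1] - s[0])      # normalize on the grid

print(f"posterior mode: s = {s[np.argmax(post)]:.2f} (n = {n_obs}, b = {b})")
```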

  1. Dual adaptive statistical approach for quantitative noise reduction in photon-counting medical imaging: application to nuclear medicine images.

    Science.gov (United States)

    Hannequin, Pascal Paul

    2015-06-07

    Noise reduction in photon-counting images remains challenging, especially at low count levels. We have developed an original procedure which associates two complementary filters using a Wiener-derived approach. This approach combines two statistically adaptive filters into a dual-weighted (DW) filter. The first one, a statistically weighted adaptive (SWA) filter, replaces the central pixel of a sliding window with a statistically weighted sum of its neighbors. The second one, a statistical and heuristic noise extraction (extended) (SHINE-Ext) filter, performs a discrete cosine transformation (DCT) using sliding blocks. Each block is reconstructed using its significant components, which are selected using tests derived from multiple linear regression (MLR). The two filters are weighted according to Wiener theory. This approach has been validated using a numerical phantom and a real planar Jaszczak phantom. It has also been illustrated using planar bone scintigraphy and myocardial single-photon emission computed tomography (SPECT) data. The performance of the filters has been tested using the mean normalized absolute error (MNAE) between the filtered images and the reference noiseless or high-count images. Results show that the proposed filters quantitatively decrease the MNAE in the images and thus increase the signal-to-noise ratio (SNR). This allows one to work with lower-count images. The SHINE-Ext filter is well suited to large images and low-variance areas. DW filtering is efficient for small images and in high-variance areas. The relative proportion of eliminated noise generally decreases when the count level increases. In practice, SHINE filtering alone is recommended when pixel spacing is less than one-quarter of the effective resolution of the system and/or the size of the objects of interest. It can also be used when the practical interest of high frequencies is low. In any case, DW filtering will be preferable. The proposed filters have been applied to nuclear
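
    A schematic sketch of the SWA-style step (replacing each pixel with a statistically weighted sum of its neighbors); the Poisson-compatibility weighting used here is an illustrative assumption, not the published filter:

```python
# Smooth a toy counting image with 3x3 windows, weighting each neighbor by
# how compatible it is with the center pixel under Poisson statistics.
import numpy as np

rng = np.random.default_rng(8)
img = rng.poisson(20.0, size=(64, 64)).astype(float)

pad = np.pad(img, 1, mode="edge")
out = np.zeros_like(img)
for i in range(img.shape[0]):
    for j in range(img.shape[1]):
        win = pad[i:i + 3, j:j + 3]
        c = img[i, j]
        # weight ~ exp(-(x - c)^2 / (2 var)), with var ~ x + c for counts
        w = np.exp(-((win - c) ** 2) / (2.0 * np.maximum(win + c, 1.0)))
        out[i, j] = np.sum(w * win) / np.sum(w)

print("variance before / after:", round(img.var(), 1), round(out.var(), 1))
```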

  2. A new model for the simplification of particle counting data

    Directory of Open Access Journals (Sweden)

    M. F. Fadal

    2012-06-01

    This paper proposes a three-parameter mathematical model to describe the particle size distribution in a water sample. The proposed model offers some conceptual advantages over two other models reported on previously, and also provides a better fit to the particle counting data obtained from 321 water samples taken over three years at a large South African drinking water supplier. Using the data from raw water samples taken from a moderately turbid, large surface impoundment, as well as samples from the same water after treatment, typical ranges of the model parameters are presented for both raw and treated water. Once calibrated, the model allows the calculation and comparison of total particle number and volumes over any randomly selected size interval of interest.

  3. Cosmological constraints with weak-lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, François; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-03-01

    Peak statistics in weak-lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. Next-generation galaxy surveys, with their advanced optics and large areas, will measure the cosmic weak-lensing signal with unprecedented precision. To prepare for these anticipated data sets, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w0de. In particular, we study how Camelus, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. Considering peaks with a signal-to-noise ratio ≥ 1, we measure the abundance histogram in a mock shear catalogue of approximately 5000 deg² using a multiscale mass-map filtering technique. We constrain the parameters of the mock survey using Camelus combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. Peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, as measured by the width ΔΣ8 of the 1σ contour. We find Σ8 = σ8(Ωm/0.27)^α = 0.77 (-0.05, +0.06) with α = 0.75 for a flat ΛCDM model. The strong bias indicates the need to better understand and control the model systematics before applying it to a real survey of this size or larger. We perform a calibration of the model and compare results to those from the two-point correlation functions ξ± measured on the same field. We calibrate the ξ± result as well, since its contours are also biased, although not as severely as for peaks. In this case, we find for peaks Σ8 = 0.76 (-0.03, +0.02) with α = 0.65, while for the combined ξ+ and ξ- statistics the values are Σ8 = 0.76 (-0.01, +0.02) and α = 0
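
    The approximate Bayesian computation step can be sketched with a toy rejection sampler; the simulator, priors, and tolerance are assumptions, not the Camelus pipeline:

```python
# ABC rejection: draw parameters from the prior, simulate the observable,
# and keep draws whose summary statistic falls within a tolerance.
import numpy as np

rng = np.random.default_rng(6)

def simulator(sigma8, omega_m):
    # toy stochastic stand-in for a peak-abundance summary statistic
    return rng.normal(sigma8 * (omega_m / 0.27) ** 0.7, 0.02)

obs, tol = 0.76, 0.01                     # observed summary, tolerance
accepted = []
for _ in range(50_000):
    s8 = rng.uniform(0.5, 1.1)            # flat priors (assumed)
    om = rng.uniform(0.1, 0.5)
    if abs(simulator(s8, om) - obs) < tol:
        accepted.append((s8, om))

print(f"accepted {len(accepted)} of 50000 draws")
```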

  4. Full-counting statistics and phase transition in an open quantum system of non-interacting electrons

    Science.gov (United States)

    Medvedyeva, Mariya; Kehrein, Stefan

    2014-03-01

    We develop a method for calculating the full-counting statistics for a non-interacting fermionic system coupled to memory-less reservoirs. The evolution of the system is described by the Lindblad equation. We introduce the counting field in the Lindblad equation, which yields the generating function and allows us to obtain all cumulants of the charge transport. In a uniform system the cumulants of order k are independent of the system size for systems longer than k+1 sites. The counting statistics from the Lindblad approach does not take into account interference in the reservoirs, which gives a decreased value of noise in comparison to the Green function approach, which describes phase-coherent leads. The two methods yield the same value for the current, due to current conservation. The Fano factors are different (and linearly related) and allow us to distinguish between memory-less and phase-coherent reservoirs. We also consider the influence of dissipation along the chain, allowing for both tunneling into and out of the chain along its length. Infinitesimally small dissipation along the chain induces a quantum phase transition which manifests itself as a discontinuity in transport properties and entropy.

  5. Statistical modelling of fish stocks

    DEFF Research Database (Denmark)

    Kvist, Trine

    1999-01-01

    for modelling the dynamics of a fish population is suggested. A new approach is introduced to analyse the sources of variation in age composition data, which is one of the most important sources of information in the cohort based models for estimation of stock abundances and mortalities. The approach combines...... and it is argued that an approach utilising stochastic differential equations might be advantageous in fish stock assessments....

  7. Statistical modelling for ship propulsion efficiency

    DEFF Research Database (Denmark)

    Petersen, Jóan Petur; Jacobsen, Daniel J.; Winther, Ole

    2012-01-01

    This paper presents a state-of-the-art systems approach to statistical modelling of fuel efficiency in ship propulsion, and also a novel and publicly available data set of high quality sensory data. Two statistical model approaches are investigated and compared: artificial neural networks...

  8. Statistical Models and Methods for Lifetime Data

    CERN Document Server

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition"An indispensable addition to any serious collection on lifetime data analysis and . . . a valuable contribution to the statistical literature. Highly recommended . . ."-Choice"This is an important book, which will appeal to statisticians working on survival analysis problems."-Biometrics"A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data . . . this is a highly competent and agreeable statistical textbook."-Statistics in MedicineThe statistical analysis of lifetime or response time data is a key tool in engineering,

  9. Actinic defect counting statistics over 1 cm² area of EUVL mask blank

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Seongtae; Lai, Chih-Wei; Rekawa, Seno; Walton, Chris W.; Bokor, Jeffrey

    2000-02-18

    As a continuation of comparison experiments between EUV inspection and visible inspection of defects on EUVL mask blanks, we report on the result of an experiment where the EUV defect inspection tool is used to perform at-wavelength defect counting over 1 cm² of EUVL mask blank. Initial EUV inspection found five defects over the scanned area, and the subsequent optical scattering inspection was able to detect all five of the defects. Therefore, if there are any defects that are only detectable by EUV inspection, their density is lower than of order unity per cm². An upgrade path to substantially increase the overall throughput of the EUV inspection system is also identified in the manuscript.

  10. Full counting statistics of transport electrons through a two-level quantum dot with spin–orbit coupling

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.M. [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China); Xue, H.B. [College of Physics and Optoelectronics, Taiyuan University of Technology, Taiyuan 030024 (China); Xue, N.T. [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China); Liang, J.-Q., E-mail: jqliang@sxu.edu.cn [Institute of Theoretical Physics and Department of Physics, Shanxi University, Taiyuan 030006 (China)

    2015-02-15

    We study the full counting statistics of transport electrons through a semiconductor two-level quantum dot with Rashba spin–orbit (SO) coupling, which acts as a nonabelian gauge field and thus induces electron transitions between the two levels along with spin flips. By means of the quantum master equation approach, shot noise and skewness are obtained at finite temperature with two-body Coulomb interaction. We particularly demonstrate the crucial effect of the SO coupling on the super-Poissonian fluctuation of transport electrons, in terms of which the SO coupling can be probed by the zero-frequency cumulants. The charge currents, however, are not sensitive to the SO coupling.

  11. Finite-time full counting statistics and factorial cumulants for transport through a quantum dot with normal and superconducting leads

    Science.gov (United States)

    Droste, Stephanie; Governale, Michele

    2016-04-01

    We study the finite-time full counting statistics for subgap transport through a single-level quantum dot tunnel-coupled to one normal and one superconducting lead. In particular, we determine the factorial and the ordinary cumulants both for finite times and in the long-time limit. We find that the factorial cumulants violate the sign criterion, indicating a non-binomial distribution, even in absence of Coulomb repulsion due to the presence of superconducting correlations. At short times the cumulants exhibit oscillations which are a signature of the coherent transfer of Cooper pairs between the dot and the superconductor.

  12. Statistical Modeling of Bivariate Data.

    Science.gov (United States)

    1982-08-01

    Keywords: joint density-quantile function, dependence-density, non-parametric bivariate density estimation, entropy, exponential models. Estimation... by autoregressive or exponential model estimators with maximum entropy properties is investigated in this thesis. The results provide important and useful procedures for nonparametric bivariate density estimation. The thesis discusses estimators of the entropy H(d)...

  13. Study of Distortions in Statistics of Counts in CCD Observations using the Fano Factor

    CERN Document Server

    Afanasieva, I V

    2016-01-01

    Factors distorting the statistics of photocounts when acquiring objects with low fluxes are considered here. Measurements of the Fano factor for existing CCD systems were conducted. The study allows one to assess the quality of the CCD video signal processing channel. An optimal strategy for faint-object observations is suggested.

  14. Modeling of heterotrophic bacteria counts in a water distribution system.

    Science.gov (United States)

    Francisque, Alex; Rodriguez, Manuel J; Miranda-Moreno, Luis F; Sadiq, Rehan; Proulx, François

    2009-03-01

    Heterotrophic plate count (HPC) constitutes a common indicator for monitoring of microbiological water quality in distribution systems (DS). This paper aims to identify factors explaining the spatiotemporal distribution of heterotrophic bacteria and to model their occurrence in the distribution system. The case under study is the DS of Quebec City, Canada. The study is based on a robust database resulting from a sampling campaign carried out in about 50 DS locations, monitored bi-weekly over a three-year period. Models for explaining and predicting HPC levels were based on both one-level and multi-level Poisson regression techniques. The latter take into account the nested structure of the data, the possible spatiotemporal correlation among HPC observations, and the fact that sampling points, months and/or distribution sub-systems may represent clusters. Models show that the best predictors for the spatiotemporal occurrence of HPC in the DS are free residual chlorine, which has an inverse relation with HPC levels, and water temperature and water ultraviolet absorbance, both of which have a positive impact on HPC levels. A sensitivity analysis based on the best-performing model (the two-level model) allowed for the identification of season-based strategies to reduce HPC levels.
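
    A one-level version of such a Poisson regression can be sketched with simulated data; the coefficients are assumptions chosen only to mimic the reported directions of effect:

```python
# Poisson GLM of HPC counts on chlorine (negative effect), temperature and
# UV absorbance (positive effects), then recover the assumed coefficients.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
chlorine = rng.uniform(0.05, 1.0, n)    # free residual chlorine [mg/L]
temp = rng.uniform(2.0, 24.0, n)        # water temperature [C]
uv = rng.uniform(0.01, 0.08, n)         # UV absorbance
eta = 2.0 - 2.5 * chlorine + 0.08 * temp + 10.0 * uv   # assumed effects
hpc = rng.poisson(np.exp(eta))

X = sm.add_constant(np.column_stack([chlorine, temp, uv]))
fit = sm.GLM(hpc, X, family=sm.families.Poisson()).fit()
print(fit.params)                       # roughly [2.0, -2.5, 0.08, 10.0]
```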

  15. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  16. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed using a statistical model of face shape, with pose parameters represented by trigonometric functions. The face shape statistical model is first built by analyzing the face shapes of different people under varying poses. Shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, the mapping function between face image and face pose is constructed by linearly relating the different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  17. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia

  18. Multistructure Statistical Model Applied To Factor Analysis

    Science.gov (United States)

    Bentler, Peter M.

    1976-01-01

    A general statistical model for the multivariate analysis of mean and covariance structures is described. Matrix calculus is used to develop the statistical aspects of one new special case in detail. This special case separates the confounding of principal components and factor analysis. (DEP)

  19. Topology for statistical modeling of petascale data.

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A&M University, College Station, TX)

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  20. Semantic Importance Sampling for Statistical Model Checking

    Science.gov (United States)

    2015-01-16

    ...an approach called Statistical Model Checking (SMC) [16], which relies on Monte Carlo based simulations to solve this verification task more scalably... Statistical model checking (SMC) is a prominent approach for rigorous analysis of stochastic systems using Monte Carlo simulations... Monte Carlo simulations, for computing the bounded probability that a specific event occurs during a stochastic system's execution. Estimating the

  1. Infinite Random Graphs as Statistical Mechanical Models

    DEFF Research Database (Denmark)

    Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria

    2011-01-01

    We discuss two examples of infinite random graphs obtained as limits of finite statistical mechanical systems: a model of two-dimensional discretized quantum gravity defined in terms of causal triangulated surfaces, and the Ising model on generic random trees. For the former model we describe...

  2. Probability and Statistics in Sensor Performance Modeling

    Science.gov (United States)

    2010-12-01

    Acoustic or electromagnetic waves are scattered by both objects and turbulent wind. A version of the Rice-Nakagami model (specifically with a... Gaussian, lognormal, exponential, gamma, and the transformed Rice-Nakagami, as well as a discrete model. (Other examples of statistical models

  3. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  4. Matrix Tricks for Linear Statistical Models

    CERN Document Server

    Puntanen, Simo; Styan, George PH

    2011-01-01

    In teaching linear statistical models to first-year graduate students or to final-year undergraduate students there is no way to proceed smoothly without matrices and related concepts of linear algebra; their use is really essential. Our experience is that making some particular matrix tricks very familiar to students can substantially increase their insight into linear statistical models (and also multivariate statistical analysis). In matrix algebra, there are handy, sometimes even very simple "tricks" which simplify and clarify the treatment of a problem - both for the student and

  5. Distributions with given marginals and statistical modelling

    CERN Document Server

    Fortiana, Josep; Rodriguez-Lallena, José

    2002-01-01

    This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.

  6. Using Poisson statistics to analyze supernova remnant emission in the low counts X-ray regime

    Science.gov (United States)

    Roper, Quentin Jeffrey

    We utilize a Poisson likelihood in a maximum likelihood statistical analysis to analyze X-ray spectroscopic data. Specifically, we examine four extragalactic supernova remnants (SNR): IKT 5 (SNR 0047-73.5), IKT 25 (SNR 0104-72.3), and DEM S 128 (SNR 0103-72.4), which are designated as Type Ia in the literature due to their spectra and morphology. This is troublesome because of their asymmetry, a trait not usually associated with young Type Ia remnants. We present Chandra X-ray Observatory data on these three remnants, and perform a maximum likelihood analysis on their spectra. We find that the X-ray emission is dominated by interactions with the interstellar medium. In spite of this, we find a significant Fe overabundance in all three remnants. Through examination of radio, optical, and infrared data, we conclude that these three remnants are likely not "classical" Type Ia SNR, but may be examples of so-called "prompt" Type Ia SNR. We detect potential point sources that may be members of the progenitor systems of both DEM S 128 and IKT 5, which could suggest a new subclass of prompt Type Ia SNR, Fe-rich CC remnants. In addition, we examine IKT 18. This remnant is positionally coincident with the X-ray point source HD 5980. Due to an outburst in 1994, in which its brightness changed by 3 magnitudes (corresponding to an increase in luminosity by a factor of 16), HD 5980 was classified as a luminous blue variable star. We examine this point source and the remnant IKT 18 in the X-ray, and find that its non-thermal photon index has decreased from 2002 to 2013, corresponding to a larger proportion of more energetic X-rays, which is unexpected.
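
    For a Poisson likelihood in the low-count regime, the usual maximum-likelihood fit statistic is the Cash statistic; the power-law spectrum and data below are assumptions for illustration:

```python
# Minimize the Cash statistic C = 2 * sum(m - n * ln m) for a toy
# power-law spectrum fitted to low-count Poisson data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
E = np.linspace(0.5, 8.0, 60)                 # energy bins [keV], assumed
model = lambda p: p[0] * E ** (-p[1])         # simple power law
counts = rng.poisson(model([40.0, 1.7]))      # synthetic low-count spectrum

def cash(p):
    if p[0] <= 0:                             # keep model rates positive
        return np.inf
    m = model(p)
    return 2.0 * np.sum(m - counts * np.log(m))

best = minimize(cash, x0=[30.0, 1.0], method="Nelder-Mead")
print("fitted (norm, index):", best.x)        # expect roughly (40, 1.7)
```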

  7. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I

  8. Statistical Modeling for Radiation Hardness Assurance

    Science.gov (United States)

    Ladbury, Raymond L.

    2014-01-01

    We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: What models are used, what errors exist in real test data, and what the model allows us to say about the DUT will be discussed. In addition, how to use other sources of data such as historical, heritage, and similar part and how to apply experience, physics, and expert opinion to the analysis will be covered. Also included will be concepts of Bayesian statistics, data fitting, and bounding rates.

  9. Simple statistical model for branched aggregates

    DEFF Research Database (Denmark)

    Lemarchand, Claire; Hansen, Jesper Schmidt

    2015-01-01

    We propose a statistical model that can reproduce the size distribution of any branched aggregate, including amylopectin, dendrimers, molecular clusters of monoalcohols, and asphaltene nanoaggregates. It is based on the conditional probability for one molecule to form a new bond with a molecule, given that it already has bonds with others. The model is applied here to asphaltene nanoaggregates observed in molecular dynamics simulations of Cooee bitumen. The variation with temperature of the probabilities deduced from this model is discussed in terms of statistical mechanics arguments. The relevance of the statistical model in the case of asphaltene nanoaggregates is checked by comparing the predicted value of the probability for one molecule to have exactly i bonds with the same probability directly measured in the molecular dynamics simulations. The agreement is satisfactory.

  10. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  11. Statistical Model Checking for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Du, Dehui; Larsen, Kim Guldstrand

    2012-01-01

    This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique ap...

  14. Dielectronic recombination rate in statistical model

    Science.gov (United States)

    Demura, A. V.; Leontyev, D. S.; Lisitsa, V. S.; Shurigyn, V. A.

    2016-12-01

    The dielectronic recombination rate of multielectron ions was calculated by means of the statistical approach. It is based on an idea of collective excitations of atomic electrons with the local plasma frequencies. These frequencies are expressed via the Thomas-Fermi model electron density distribution. The statistical approach provides fast computation of DR rates that are compared with the modern quantum mechanical calculations. The results are important for current studies of thermonuclear plasmas with the tungsten impurities.

  15. Using the negative binomial distribution to model overdispersion in ecological count data.

    Science.gov (United States)

    Lindén, Andreas; Mäntyniemi, Samu

    2011-07-01

    A Poisson process is a commonly used starting point for modeling stochastic variation of ecological count data around a theoretical expectation. However, data typically show more variation than implied by the Poisson distribution. Such overdispersion is often accounted for by using models with different assumptions about how the variance changes with the expectation. The choice of these assumptions can naturally have apparent consequences for statistical inference. We propose a parameterization of the negative binomial distribution, where two overdispersion parameters are introduced to allow for various quadratic mean-variance relationships, including the ones assumed in the most commonly used approaches. Using bird migration as an example, we present hypothetical scenarios on how overdispersion can arise due to sampling, flocking behavior or aggregation, environmental variability, or combinations of these factors. For all considered scenarios, mean-variance relationships can be appropriately described by the negative binomial distribution with two overdispersion parameters. To illustrate, we apply the model to empirical migration data with a high level of overdispersion, gaining clearly different model fits with different assumptions about mean-variance relationships. The proposed framework can be a useful approximation for modeling marginal distributions of independent count data in likelihood-based analyses.
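
    The proposed quadratic mean-variance relationship can be illustrated by mapping one such form, var = omega*mu + kappa*mu^2 (an assumption patterned on the abstract), onto the standard (n, p) negative binomial parameterization:

```python
# Build a scipy negative binomial with target mean mu and variance
# var = omega * mu + kappa * mu**2, via p = mu/var and n = mu**2/(var - mu).
import numpy as np
from scipy.stats import nbinom

def nb_from_mean_var(mu, omega, kappa):
    var = omega * mu + kappa * mu**2
    assert var > mu, "negative binomial requires var > mean"
    return nbinom(mu**2 / (var - mu), mu / var)

dist = nb_from_mean_var(mu=10.0, omega=1.5, kappa=0.2)
print(f"mean = {dist.mean():.1f}, var = {dist.var():.1f}")   # 10.0 and 35.0
```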

  16. CUSUM chart to monitor autocorrelated counts using Negative Binomial GARMA model.

    Science.gov (United States)

    Albarracin, Orlando Yesid Esparza; Alencar, Airlane Pereira; Lee Ho, Linda

    2017-01-01

    Cumulative sum control charts have been used for health surveillance due to their efficiency in quickly detecting small shifts in the monitored series. However, these charts may fail when data are autocorrelated. An alternative procedure is to build a control chart based on the residuals after fitting autoregressive moving average models, but these models usually assume a Gaussian distribution for the residuals. In practical health surveillance, count series can be modeled by Poisson or negative binomial regression, the latter to control overdispersion. To include serial correlations, generalized autoregressive moving average (GARMA) models are proposed. The main contribution of the current article is to measure the impact, in terms of average run length, on the performance of cumulative sum charts when the serial correlation is neglected in the regression model. Different statistics based on transformations, the deviance residual, and the likelihood ratio are used to build cumulative sum control charts to monitor counts with time-varying means, including trend and seasonal effects. The monitoring of the weekly number of hospital admissions due to respiratory diseases for people aged over 65 years in the city of São Paulo, Brazil, is considered as an illustration of the current method.
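
    A one-sided upper CUSUM recursion on standardized residuals can be sketched as follows; the reference value and decision limit are conventional textbook choices, not those of the article:

```python
# Run an upper CUSUM on standardized residuals from a fitted count model
# (simulated here) and flag the first alarm after an injected upward shift.
import numpy as np

rng = np.random.default_rng(5)
z = rng.normal(size=200)            # stand-in standardized residuals
z[150:] += 1.0                      # upward shift to be detected

k, h = 0.5, 4.0                     # reference value and decision limit
c = 0.0
for t, zt in enumerate(z):
    c = max(0.0, c + zt - k)        # one-sided CUSUM recursion
    if c > h:
        print(f"alarm at t = {t}, CUSUM = {c:.2f}")
        break
```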

  17. Bayesian dynamic modeling of time series of dengue disease case counts.

    Directory of Open Access Journals (Sweden)

    Daniel Adyro Martínez-Bello

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology employs dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random-walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random-walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results favored a model including first-order random-walk time-varying coefficients for the calendar trend and first-order random-walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage error for one- or two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease

  18. Growth curve models and statistical diagnostics

    CERN Document Server

    Pan, Jian-Xin

    2002-01-01

Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems over short time periods in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of growth-curve models, with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.

  19. Three Generative, Lexicalised Models for Statistical Parsing

    CERN Document Server

    Collins, M

    1997-01-01

    In this paper we first propose a new statistical parsing model, which is a generative model of lexicalised context-free grammar. We then extend the model to include a probabilistic treatment of both subcategorisation and wh-movement. Results on Wall Street Journal text show that the parser performs at 88.1/87.5% constituent precision/recall, an average improvement of 2.3% over (Collins 96).

  20. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Bremer, P. -T. [Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, the approach of the entire team, involving all three institutions, is based on the complementary techniques of combinatorial topology and statistical modelling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modelling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. The overall technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modelling, and (3) new integrated topological and statistical methods. Roughly speaking, the division of labor between our three groups (Sandia Labs in Livermore, Texas A&M in College Station, and U Utah in Salt Lake City) is as follows: the Sandia group focuses on statistical methods and their formulation in algebraic terms, and finds the application problems (and data sets) most relevant to this project; the Texas A&M group develops new algebraic geometry algorithms, in particular with fewnomial theory; and the Utah group develops new algorithms in computational topology via discrete Morse theory. However, we hasten to point out that our three groups stay in tight contact via videoconference every two weeks, so there is much synergy of ideas between the groups. The remainder of this document focuses on the contributions that had greater direct involvement from the team at the University of Utah in Salt Lake City.

  1. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability ...

  2. An R companion to linear statistical models

    CERN Document Server

    Hay-Jahans, Christopher

    2011-01-01

Focusing on user-developed programming, An R Companion to Linear Statistical Models serves two audiences: those who are familiar with the theory and applications of linear statistical models and wish to learn or enhance their skills in R, and those who are enrolled in an R-based course on regression and analysis of variance. For those who have never used R, the book begins with a self-contained introduction to R that lays the foundation for later chapters. The book includes extensive and carefully explained examples of how to write programs using the R programming language. These examples cover ...

  3. Statistical transmutation in doped quantum dimer models.

    Science.gov (United States)

    Lamas, C A; Ralko, A; Cabra, D C; Poilblanc, D; Pujol, P

    2012-07-06

    We prove a "statistical transmutation" symmetry of doped quantum dimer models on the square, triangular, and kagome lattices: the energy spectrum is invariant under a simultaneous change of statistics (i.e., bosonic into fermionic or vice versa) of the holes and of the signs of all the dimer resonance loops. This exact transformation enables us to define the duality equivalence between doped quantum dimer Hamiltonians and provides the analytic framework to analyze dynamical statistical transmutations. We investigate numerically the doping of the triangular quantum dimer model with special focus on the topological Z(2) dimer liquid. Doping leads to four (instead of two for the square lattice) inequivalent families of Hamiltonians. Competition between phase separation, superfluidity, supersolidity, and fermionic phases is investigated in the four families.

  4. STATISTICAL MODELS OF REPRESENTING INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2016-07-01

This article, entitled Statistical Models of Representing Intellectual Capital, approaches and analyses the concept of intellectual capital, as well as the main models which can support entrepreneurs/managers in evaluating and quantifying the advantages of intellectual capital. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this article we survey the classical static models: Sveiby, Edvinsson, Balanced Scorecard, as well as the canonical model of intellectual capital. Among the group of static models for evaluating organisational intellectual capital, the canonical model stands out. This model enables the structuring of organisational intellectual capital into human capital, structural capital and relational capital. Although the model is widely used, it is a static one and can thus introduce a series of errors into the process of evaluation, because the three entities mentioned above are not independent with respect to their contents, as any logic of structuring complex entities requires.

  5. Simulation on Poisson and negative binomial models of count road accident modeling

    Science.gov (United States)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

Accident count data have often been shown to exhibit overdispersion. In addition, the data may contain excess zero counts. A simulation study was conducted to create scenarios in which accidents happen at a T-junction, with the assumption that the dependent variable of the generated data follows a certain distribution, namely the Poisson or negative binomial distribution, with sample sizes ranging from n=30 to n=500. The study objective was accomplished by fitting a Poisson regression, a negative binomial regression, and a hurdle negative binomial model to the simulated data. Model validity was compared, and the simulation results show that, for each sample size, not every model fits the data well even when the data were generated from that model's own distribution, especially when the sample size is large. Furthermore, larger sample sizes produce more zero accident counts in the dataset.
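
    A hedged sketch of such a comparison, assuming overdispersed counts generated from a gamma-Poisson (negative binomial) mixture and fits judged by AIC; the covariate, parameters, and sample sizes are illustrative, and the hurdle model is omitted for brevity:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
for n in (30, 100, 500):
    x = rng.uniform(0, 1, n)
    X = sm.add_constant(x)
    mu = np.exp(0.5 + 1.2 * x)
    alpha = 1.0                          # overdispersion parameter
    y = rng.poisson(rng.gamma(1 / alpha, alpha * mu))   # NB via gamma-Poisson
    pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()
    print(n, round(pois.aic, 1), round(nb.aic, 1))      # NB usually fits better
```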

  6. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Rojas, Maurice [Texas A & M Univ., College Station, TX (United States)

    2014-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled "Topology for Statistical Modeling of Petascale Data", funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program.

  7. Statistical Modeling Efforts for Headspace Gas

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Brian Phillip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-17

    The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO2 in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
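
    The report does not spell out its algorithm, so the following is only a generic sketch of an adaptive Metropolis random-walk sampler, with a made-up log-normal posterior standing in for the concentration model; nudging the proposal scale toward a target acceptance rate during burn-in is one common adaptation rule:

```python
import numpy as np

def log_post(theta):
    # hypothetical log-posterior: log-concentration ~ N(log 370, 0.25^2)
    return -0.5 * ((theta - np.log(370.0)) / 0.25) ** 2

rng = np.random.default_rng(3)
theta, scale = np.log(300.0), 1.0
samples, accepted = [], 0
for i in range(1, 20001):
    prop = theta + scale * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta, accepted = prop, accepted + 1
    if i <= 5000 and i % 100 == 0:             # adapt only during burn-in
        scale *= np.exp(accepted / i - 0.44)   # target ~44% acceptance
    elif i > 5000:
        samples.append(theta)

print(np.exp(np.mean(samples)))                # posterior mean concentration (ppm)
```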

  8. Nonperturbative approach to the modified statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Magdy, M.A.; Bekmezci, A.; Sever, R. [Middle East Technical Univ., Ankara (Turkey)

    1993-12-01

The modified form of the statistical model is used without making any perturbation. The mass spectra of the lowest S, P and D levels of the $(Q\bar{Q})$ and the non-self-conjugate $(Q\bar{q})$ mesons are studied with the Song-Lin potential. The authors' results are in good agreement with the experimental and theoretical findings.

  9. Statistical Model Checking for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Du, Dehui; Larsen, Kim Guldstrand

    2012-01-01

This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique applied for implementing this semantics in the UPPAAL-SMC simulation engine. We report on two applications of the resulting tool-set coming from systems biology and energy-aware buildings.

  10. Statistical modeling of space shuttle environmental data

    Science.gov (United States)

    Tubbs, J. D.; Brewer, D. W.

    1983-01-01

Statistical models which use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine whether unequal shape parameters are necessary in a bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.

  11. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importance ...

  12. Statistical physical models of cellular motility

    Science.gov (United States)

    Banigan, Edward J.

    Cellular motility is required for a wide range of biological behaviors and functions, and the topic poses a number of interesting physical questions. In this work, we construct and analyze models of various aspects of cellular motility using tools and ideas from statistical physics. We begin with a Brownian dynamics model for actin-polymerization-driven motility, which is responsible for cell crawling and "rocketing" motility of pathogens. Within this model, we explore the robustness of self-diffusiophoresis, which is a general mechanism of motility. Using this mechanism, an object such as a cell catalyzes a reaction that generates a steady-state concentration gradient that propels the object in a particular direction. We then apply these ideas to a model for depolymerization-driven motility during bacterial chromosome segregation. We find that depolymerization and protein-protein binding interactions alone are sufficient to robustly pull a chromosome, even against large loads. Next, we investigate how forces and kinetics interact during eukaryotic mitosis with a many-microtubule model. Microtubules exert forces on chromosomes, but since individual microtubules grow and shrink in a force-dependent way, these forces lead to bistable collective microtubule dynamics, which provides a mechanism for chromosome oscillations and microtubule-based tension sensing. Finally, we explore kinematic aspects of cell motility in the context of the immune system. We develop quantitative methods for analyzing cell migration statistics collected during imaging experiments. We find that during chronic infection in the brain, T cells run and pause stochastically, following the statistics of a generalized Levy walk. These statistics may contribute to immune function by mimicking an evolutionarily conserved efficient search strategy. Additionally, we find that naive T cells migrating in lymph nodes also obey non-Gaussian statistics. Altogether, our work demonstrates how physical

  13. Neutrino event counts from Type Ia supernova models

    Science.gov (United States)

    Nagaraj, Gautam; Scholberg, Kate

    2016-01-01

Core collapse supernovae (SNe) are widely known to be among the universe's primary neutrino factories, releasing ~99% of their energy, or ~10^53 ergs, in the form of these tiny leptons. On the other hand, less than 4% of the energy of Type Ia SNe is released via neutrinos, making Ia SNe impossible to detect (through neutrino observations) at typical supernova distances. For this reason, neutrino signatures from these explosions have very rarely been modeled. We ran time-sliced fluences from non-oscillation pure deflagration and delayed detonation (DDT) Ia models by Odrzywolek and Plewa (2011) through SNOwGLoBES, a software package that calculates event rates and other observed quantities of supernova neutrinos in various detectors. We determined Ia neutrino event rates in Hyper-K, a proposed water Cherenkov detector, JUNO, a scintillator detector under construction, and DUNE, a proposed argon detector, and identified criteria to distinguish between the two models (pure deflagration and DDT) based on data from a real supernova (statistically represented by a Poisson distribution around the expected result). We found that up to distances of 8.00, 1.54, and 2.37 kpc (subject to change based on oscillation effects and modified detector efficiencies), we can discern the explosion mechanism with ≥90% confidence in Hyper-K, JUNO, and DUNE, respectively, thus learning more about Ia progenitors.
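
    A hedged illustration of this discrimination criterion, assuming made-up expected event counts for the two explosion models and classification by the Poisson likelihood ratio; the real analysis uses time-sliced, detector-specific rates from SNOwGLoBES:

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
mu_deflagration, mu_ddt = 12.0, 20.0     # hypothetical expected event counts

n_trials = 100_000
obs = rng.poisson(mu_ddt, n_trials)      # true mechanism: delayed detonation
# pick whichever model assigns the observation the higher Poisson likelihood
pick_ddt = poisson.logpmf(obs, mu_ddt) > poisson.logpmf(obs, mu_deflagration)
print("fraction correctly identified:", pick_ddt.mean())
```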

  14. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.

  15. Pitfalls in statistical landslide susceptibility modelling

    Science.gov (United States)

    Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut

    2010-05-01

The use of statistical methods is a well-established approach to predicting landslide occurrence probabilities and assessing landslide susceptibility. This is achieved by applying statistical methods relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods in order to gain further explanatory insight into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered at adequate spatial scales. This set should be checked for multicollinearity in order to facilitate the interpretation of model response curves. Model quality assessment measures how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples or a possible
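
    As a concrete example of one of the methods named above, a minimal sketch of boosted regression trees on a synthetic landslide inventory; the predictors, effect sizes, and the AUC check of model discrimination are illustrative only:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 2000
slope = rng.uniform(0, 45, n)            # slope angle (degrees)
curvature = rng.normal(0, 1, n)          # plan curvature (standardized)
wetness = rng.uniform(2, 14, n)          # topographic wetness index
logit = -6 + 0.12 * slope + 0.5 * curvature + 0.15 * wetness
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))   # landslide occurrence

X = np.column_stack([slope, curvature, wetness])
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
brt = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
brt.fit(Xtr, ytr)
# model discrimination, one of the quality criteria discussed above
print("AUC:", roc_auc_score(yte, brt.predict_proba(Xte)[:, 1]))
```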

  16. Equilibrium statistical mechanics of lattice models

    CERN Document Server

    Lavis, David A

    2015-01-01

Most interesting and difficult problems in equilibrium statistical mechanics concern models which exhibit phase transitions. For graduate students and more experienced researchers this book provides an invaluable reference source of approximate and exact solutions for a comprehensive range of such models. Part I contains background material on classical thermodynamics and statistical mechanics, together with a classification and survey of lattice models. The geometry of phase transitions is described and scaling theory is used to introduce critical exponents and scaling laws. An introduction is given to finite-size scaling, conformal invariance and Schramm-Loewner evolution. Part II contains accounts of classical mean-field methods. The parallels between Landau expansions and catastrophe theory are discussed and Ginzburg-Landau theory is introduced. The extension of mean-field theory to higher orders is explored using the Kikuchi-Hijmans-De Boer hierarchy of approximations. In Part III the use of algebraic ...

  17. Statistical shape and appearance models of bones.

    Science.gov (United States)

    Sarkalkan, Nazli; Weinans, Harrie; Zadpoor, Amir A

    2014-03-01

    When applied to bones, statistical shape models (SSM) and statistical appearance models (SAM) respectively describe the mean shape and mean density distribution of bones within a certain population as well as the main modes of variations of shape and density distribution from their mean values. The availability of this quantitative information regarding the detailed anatomy of bones provides new opportunities for diagnosis, evaluation, and treatment of skeletal diseases. The potential of SSM and SAM has been recently recognized within the bone research community. For example, these models have been applied for studying the effects of bone shape on the etiology of osteoarthritis, improving the accuracy of clinical osteoporotic fracture prediction techniques, design of orthopedic implants, and surgery planning. This paper reviews the main concepts, methods, and applications of SSM and SAM as applied to bone.
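
    A schematic sketch of how such a shape model is commonly built, assuming the landmark configurations are already aligned: PCA of the stacked coordinates yields a mean shape plus principal modes of variation, and plausible new shapes are generated by moving along those modes. The landmark data below are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(6)
n_bones, n_landmarks = 40, 30
mean_shape = rng.normal(0, 10, (n_landmarks, 2))
mode = rng.normal(0, 1, (n_landmarks, 2))        # one "true" mode of variation
shapes = mean_shape + rng.normal(0, 1, (n_bones, 1, 1)) * mode

X = shapes.reshape(n_bones, -1)                  # one row per aligned shape
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
var = s ** 2 / (n_bones - 1)                     # variance along each mode
print("variance explained by first mode:", var[0] / var.sum())

b = np.array([2.0 * np.sqrt(var[0])])            # +2 SD along the first mode
new_shape = (mu + b @ Vt[:1]).reshape(n_landmarks, 2)   # plausible new bone
```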

  18. Statistical Compressed Sensing of Gaussian Mixture Models

    CERN Document Server

    Yu, Guoshen

    2011-01-01

A novel framework of compressed sensing, namely statistical compressed sensing (SCS), is introduced; it aims at efficiently sampling a collection of signals that follow a statistical distribution and at achieving accurate reconstruction on average. SCS based on Gaussian models is investigated in depth. For signals that follow a single Gaussian model, Gaussian or Bernoulli sensing matrices with O(k) measurements suffice, considerably fewer than the O(k log(N/k)) required by conventional CS based on sparse models, where N is the signal dimension; moreover, an optimal decoder can be implemented via linear filtering, significantly faster than the pursuit decoders applied in conventional CS. Under these conditions, the error of SCS is shown to be tightly upper bounded by a constant times the best k-term approximation error, with overwhelming probability. The failure probability is also significantly smaller than that of conventional sparsity-oriented CS. Stronger yet simpler results further show that for any sensing matrix, the error of Gaussian SCS is upper bounded ...

  19. Counting Tensor Model Observables and Branched Covers of the 2-Sphere

    CERN Document Server

    Geloun, Joseph Ben

    2013-01-01

    Lattice gauge theories of permutation groups with a simple topological action (henceforth permutation-TFTs) have recently found several applications in the combinatorics of quantum field theories (QFTs). They have been used to solve counting problems of Feynman graphs in QFTs and ribbon graphs of large $N$, often revealing inter-relations between different counting problems. In another recent development, tensor theories generalizing matrix theories have been actively developed as models of random geometry in three or more dimensions. Here, we apply permutation-TFT methods to count gauge invariants for tensor models (colored as well as non-colored), exhibiting a relationship with counting problems of branched covers of the 2-sphere, where the rank $d$ of the tensor gets related to a number of branch points. We give explicit generating functions for the relevant counting and describe algorithms for the enumeration of the invariants. As well as the classic count of Hurwitz equivalence classes of branched covers...

  20. Effect of finite Coulomb interaction on full counting statistics of electronic transport through single-molecule magnet

    Energy Technology Data Exchange (ETDEWEB)

    Xue Haibin, E-mail: xhb98326110@163.co [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Nie, Y.-H., E-mail: nieyh@sxu.edu.c [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China); Li, Z.-J.; Liang, J.-Q. [Institute of Theoretical Physics, Shanxi University, Taiyuan, Shanxi 030006 (China)

    2011-01-17

We study the full counting statistics (FCS) in a single-molecule magnet (SMM) with finite Coulomb interaction U. For finite U the FCS, differing from the U → ∞ case, shows a symmetric gate-voltage dependence when the coupling strengths with the two electrodes are interchanged, which can be observed experimentally simply by reversing the bias voltage. Moreover, we find that the effect of finite U on shot noise depends on the internal level structure of the SMM and on the coupling asymmetry of the SMM with the two electrodes as well. When the coupling of the SMM with the incident electrode is stronger than that with the outgoing electrode, the super-Poissonian shot noise in the sequential tunneling regime appears for relatively small gate voltage and relatively large finite U, and does not for U → ∞; it occurs at relatively large gate voltage for the opposite coupling case. The formation mechanism of the super-Poissonian shot noise can be qualitatively attributed to the competition between fast and slow transport channels.

  1. Full counting statistics of phonon-assisted Andreev tunneling through a quantum dot coupled to normal and superconducting leads

    Science.gov (United States)

    Dong, Bing; Ding, G. H.; Lei, X. L.

    2017-01-01

We present a theoretical investigation of the full counting statistics of Andreev tunneling through a quantum dot (QD) embedded between superconducting (SC) and normal leads in the presence of a strong on-site electron-phonon interaction, using the nonequilibrium Green function method. For this purpose, we generalize the dressed tunneling approximation (DTA), recently developed for dealing with inelastic tunneling in a normal QD system, to the Andreev transport problem. This method takes account of vibrational effects in the evaluation of the electronic tunneling self-energy, in contrast with other simple approaches, and meanwhile allows us to derive an explicit analytical formula for the cumulant generating function in the subgap region. We then analyze the interplay of polaronic and SC proximity effects on the Andreev reflection spectrum, current-voltage characteristics, and current fluctuations of the hybrid system. Our main findings include: (1) no phonon side peaks in the linear Andreev conductance; (2) a negative differential conductance stemming from the suppressed Andreev reflection spectrum; (3) a novel inelastic resonant peak in the differential conductance due to phonon-assisted Andreev reflection; (4) enhancement or suppression of shot noise for the symmetric or asymmetric tunnel-coupling system, respectively.

  2. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  3. Modeling Conservative Updates in Multi-Hash Approximate Count Sketches

    OpenAIRE

    2012-01-01

Multi-hash-based count sketches are fast and memory-efficient probabilistic data structures that are widely used in scalable online traffic monitoring applications. Their accuracy significantly improves with an optimization, called conservative update, which is especially effective when the aim is to discriminate a relatively small number of heavy hitters in a traffic stream consisting of an extremely large number of flows. Despite its widespread application, a thorough understanding ...
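
    A compact sketch of a multi-hash count sketch with conservative update, using assumed illustrative sizes and a salted hash: on each insert only the counters currently equal to the minimum estimate are raised, which curbs the overestimation of small flows:

```python
import hashlib

class ConservativeCountSketch:
    def __init__(self, depth=4, width=1024):
        self.depth, self.width = depth, width
        self.table = [[0] * width for _ in range(depth)]

    def _cells(self, key):
        for row in range(self.depth):
            h = hashlib.blake2b(key.encode(), salt=bytes([row]) * 16).digest()
            yield row, int.from_bytes(h[:8], "big") % self.width

    def add(self, key):
        cells = list(self._cells(key))
        est = min(self.table[r][c] for r, c in cells)
        for r, c in cells:                 # conservative update: raise only
            if self.table[r][c] == est:    # the minimal counters
                self.table[r][c] = est + 1

    def estimate(self, key):
        return min(self.table[r][c] for r, c in self._cells(key))

cms = ConservativeCountSketch()
for _ in range(1000):
    cms.add("heavy-flow")
cms.add("mouse-flow")
print(cms.estimate("heavy-flow"), cms.estimate("mouse-flow"))
```

    A plain count-min update would increment every hashed counter on every insert; the conservative rule increments only those that constrain the current estimate, so small flows colliding with heavy hitters are overestimated far less.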

  4. Statistical modeling of geopressured geothermal reservoirs

    Science.gov (United States)

    Ansari, Esmail; Hughes, Richard; White, Christopher D.

    2017-06-01

    Identifying attractive candidate reservoirs for producing geothermal energy requires predictive models. In this work, inspectional analysis and statistical modeling are used to create simple predictive models for a line drive design. Inspectional analysis on the partial differential equations governing this design yields a minimum number of fifteen dimensionless groups required to describe the physics of the system. These dimensionless groups are explained and confirmed using models with similar dimensionless groups but different dimensional parameters. This study models dimensionless production temperature and thermal recovery factor as the responses of a numerical model. These responses are obtained by a Box-Behnken experimental design. An uncertainty plot is used to segment the dimensionless time and develop a model for each segment. The important dimensionless numbers for each segment of the dimensionless time are identified using the Boosting method. These selected numbers are used in the regression models. The developed models are reduced to have a minimum number of predictors and interactions. The reduced final models are then presented and assessed using testing runs. Finally, applications of these models are offered. The presented workflow is generic and can be used to translate the output of a numerical simulator into simple predictive models in other research areas involving numerical simulation.

  5. Statistical Language Model for Chinese Text Proofreading

    Institute of Scientific and Technical Information of China (English)

    张仰森; 曹元大

    2003-01-01

Statistical language modeling techniques are investigated in order to construct a language model for Chinese text proofreading. After the defects of the n-gram model are analyzed, a novel statistical language model for Chinese text proofreading is proposed. This model takes full account of the information located before and after the target word w_i, and of the relationship between non-adjacent words w_i and w_j in the linguistic environment (LE). First, the word association degree between w_i and w_j is defined using a distance-weighted factor, where w_j is l words apart from w_i in the LE; then the Bayes formula is used to calculate the LE-related degree of the word w_i; and lastly, the LE-related degree is taken as the criterion for predicting the reasonableness of the word w_i appearing in its context. Comparing the proposed model with the traditional n-gram model in a Chinese automatic text error detection system, the experimental results show that the error detection recall and precision rates of the system are improved.
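
    As a toy sketch of the general idea only (the paper's exact formulas are not reproduced here), the snippet below scores a target word by distance-weighted association with the other words in its linguistic environment, so that non-adjacent words also contribute; the counts and the 1/distance weighting are invented placeholders:

```python
from collections import Counter

# hypothetical association statistics gathered from a training corpus
pair_counts = Counter({("open", "file"): 50, ("file", "save"): 30})
word_counts = Counter({"open": 200, "file": 120, "save": 90})
TOTAL = 10_000                               # assumed corpus size

def assoc(wi, wj, dist):
    # distance-weighted association ratio; 1/dist is the weighting factor
    joint = pair_counts[(wi, wj)] + pair_counts[(wj, wi)] + 1
    ratio = joint * TOTAL / ((word_counts[wi] + 1) * (word_counts[wj] + 1))
    return ratio / dist

def le_related_degree(sentence, i, window=3):
    # average association of word i with words up to `window` positions away
    scores = [assoc(sentence[i], sentence[j], abs(i - j))
              for j in range(max(0, i - window),
                             min(len(sentence), i + window + 1))
              if j != i]
    return sum(scores) / len(scores) if scores else 0.0

print(le_related_degree(["open", "file", "save"], 1))
```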

  6. Statistics, Computation, and Modeling in Cosmology

    Science.gov (United States)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

Current and future ground and space based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams. The group will use subsampling approaches and fractional factorial designs to statistically and

  7. A quantile count model of water depth constraints on Cape Sable seaside sparrows

    Science.gov (United States)

    Cade, B.S.; Dong, Q.

    2008-01-01

1. A quantile regression model for counts of breeding Cape Sable seaside sparrows Ammodramus maritimus mirabilis (L.) as a function of water depth and previous year abundance was developed based on extensive surveys, 1992-2005, in the Florida Everglades. The quantile count model extends linear quantile regression methods to discrete response variables, providing a flexible alternative to discrete parametric distributional models, e.g. Poisson, negative binomial and their zero-inflated counterparts. 2. Estimates from our multiplicative model demonstrated that negative effects of increasing water depth in breeding habitat on sparrow numbers were dependent on recent occupation history. Upper 10th percentiles of counts (one to three sparrows) decreased with increasing water depth from 0 to 30 cm when sites were not occupied in previous years. However, upper 40th percentiles of counts (one to six sparrows) decreased with increasing water depth for sites occupied in previous years. 3. Greatest decreases (-50% to -83%) in upper quantiles of sparrow counts occurred as water depths increased from 0 to 15 cm when previous year counts were ≥ 1, but a small proportion of sites (5-10%) held at least one sparrow even as water depths increased to 20 or 30 cm. 4. A zero-inflated Poisson regression model provided estimates of conditional means that also decreased with increasing water depth, but rates of change were lower and decreased with increasing previous year counts compared to the quantile count model. Quantiles computed for the zero-inflated Poisson model enhanced interpretation of this model but had greater lack-of-fit for water depths > 0 cm and previous year counts ≥ 1, conditions where the negative effects of water depth were readily apparent and better fitted by the quantile count model.
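
    One way to make the quantile count idea concrete is the jittering construction of Machado and Santos Silva (2005), on which quantile regression for counts is commonly based; the sketch below uses synthetic data with an assumed water-depth effect, adds uniform noise to the counts, applies the log transform for the chosen quantile, and fits a linear quantile regression:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 500
depth = rng.uniform(0, 30, n)                # water depth (cm)
counts = rng.poisson(np.exp(1.0 - 0.05 * depth))

tau = 0.9                                    # upper 10th percentile of counts
z = counts + rng.uniform(0, 1, n)            # jitter counts to continuous values
y = np.log(np.maximum(z - tau, 1e-5))        # transform used where z > tau
df = pd.DataFrame({"y": y, "depth": depth})
fit = smf.quantreg("y ~ depth", df).fit(q=tau)
print(fit.params)                            # depth coefficient should be negative
```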

  8. EM Adaptive LASSO-A Multilocus Modeling Strategy for Detecting SNPs Associated with Zero-inflated Count Phenotypes.

    Science.gov (United States)

    Mallick, Himel; Tiwari, Hemant K

    2016-01-01

Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon that the phenotypes contain an enormous number of zeros due to the presence of excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice.
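
    For concreteness, a small sketch of fitting one of the four models named above (ZIP) with statsmodels' zero-inflated Poisson class; the genotype predictor, effect sizes, and zero-inflation fraction are simulated placeholders, and the article's adaptive LASSO step is not reproduced:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(8)
n = 1000
snp = rng.binomial(2, 0.3, n)                # genotype coded 0/1/2
X = sm.add_constant(snp.astype(float))

mu = np.exp(0.5 + 0.4 * snp)                 # count part depends on the SNP
structural_zero = rng.uniform(size=n) < 0.3  # 30% excess zeros
y = np.where(structural_zero, 0, rng.poisson(mu))

zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)),
                              inflation='logit').fit(disp=False)
print(zip_fit.params)                        # inflation and count coefficients
```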

  9. EM Adaptive LASSO – A Multilocus Modeling Strategy for Detecting SNPs Associated With Zero-inflated Count Phenotypes

    Directory of Open Access Journals (Sweden)

    Himel eMallick

    2016-03-01

Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon that the phenotypes contain an enormous number of zeros due to the presence of excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice.

  10. Statistical assessment of predictive modeling uncertainty

    Science.gov (United States)

    Barzaghi, Riccardo; Marotta, Anna Maria

    2017-04-01

When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto- and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ² analysis in which both data and model uncertainties are taken into account, the model's estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area extending from the Calabrian Arc to the Alpine domain is compared with that estimated from GPS velocities, taking into account the model uncertainty through its covariance structure and the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ² values with better statistical significance, and may help identify the best-fitting geophysical models more sharply.
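
    A schematic numpy sketch of the test described above, in which the model covariance is added to the data covariance before forming the χ² statistic; the residuals and covariance matrices are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 50                                       # GPS velocity components
residual = rng.normal(0, 1, n)               # data minus model prediction

C_data = np.diag(rng.uniform(0.5, 1.5, n))   # covariance of the GPS estimates
A = rng.normal(0, 0.2, (n, n))
C_model = A @ A.T + 0.1 * np.eye(n)          # fitted model covariance (SPD)

# chi^2 = r^T (C_data + C_model)^(-1) r, compared against chi^2_n quantiles
chi2 = residual @ np.linalg.solve(C_data + C_model, residual)
print(chi2 / n)                              # reduced chi-square statistic
```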

  11. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

The interannual variability of the sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is a field widely discussed by the scientific community, with results that fail to be satisfactory owing to the difficulty dynamical models have in reproducing the behavior of the Inter Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim of complementing and enhancing the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond its application to the prediction of rainfall in West Africa, the model's use extends to a range of oceanic, atmospheric and health-related parameters influenced by the sea surface temperature as a defining factor of variability.

  12. Spatial Statistical Procedures to Validate Input Data in Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.

    2006-01-01

    Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.

  13. Spatial Statistical Procedures to Validate Input Data in Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence Livermore National Laboratory

    2006-01-27

    Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy-related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the above-mentioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.

  14. Infinite Random Graphs as Statistical Mechanical Models

    DEFF Research Database (Denmark)

    Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria

    2011-01-01

We discuss two examples of infinite random graphs obtained as limits of finite statistical mechanical systems: a model of two-dimensional discretized quantum gravity defined in terms of causal triangulated surfaces, and the Ising model on generic random trees. For the former model we describe a relation to the so-called uniform infinite tree, and results on the Hausdorff and spectral dimension of two-dimensional space-time obtained in B. Durhuus, T. Jonsson, J.F. Wheater, J. Stat. Phys. 139, 859 (2010) are briefly outlined. For the latter we discuss results on the absence of spontaneous magnetization and argue that, in the generic case, the values of the Hausdorff and spectral dimension of the underlying infinite trees are not influenced by the coupling to an Ising model in a constant magnetic field (B. Durhuus, G.M. Napolitano, in preparation).

  15. A survey of statistical network models

    CERN Document Server

    Goldenberg, Anna; Fienberg, Stephen E; Airoldi, Edoardo M

    2009-01-01

Networks are ubiquitous in science and have become a focal point for discussion in everyday life. Formal statistical models for the analysis of network data have emerged as a major topic of interest in diverse areas of study, and most of these involve a form of graphical representation. Probability models on graphs date back to 1959. Along with empirical studies in social psychology and sociology from the 1960s, these early works generated an active network community and a substantial literature in the 1970s. This effort moved into the statistical literature in the late 1970s and 1980s, and the past decade has seen a burgeoning network literature in statistical physics and computer science. The growth of the World Wide Web and the emergence of online networking communities such as Facebook, MySpace, and LinkedIn, and a host of more specialized professional network communities, has intensified interest in the study of networks and network data. Our goal in this review is to provide the reader with an entry point ...

  16. Statistical Modelling of the Soil Dielectric Constant

    Science.gov (United States)

    Usowicz, Boguslaw; Marczewski, Wojciech; Bogdan Usowicz, Jerzy; Lipiec, Jerzy

    2010-05-01

The dielectric constant of soil is a physical property that is very sensitive to water content. It underpins several electrical techniques for determining the water content by direct means (TDR, FDR, and others based on electrical conductance and/or capacitance effects) and by indirect remote sensing (RS) methods. This work is devoted to a particular statistical manner of modelling the dielectric constant as a property accounting for a wide range of specific soil compositions, porosities, and mass densities over the unsaturated water content range. Usually, such models are determined for a few particular soil types, and when the soil type changes one needs to switch the model to another type or to adjust it by parametrizing the soil compounds; this makes it difficult to compare and transfer results between models. The presented model was developed for a generic representation of soil as a hypothetical mixture of spheres, each representing a soil fraction in its proper phase state. The model generates a serial-parallel mesh of conductive and capacitive paths, which is analysed for its total conductive or capacitive property. The model was first developed to determine the thermal conductivity, and is now extended to the dielectric constant by analysing the capacitive mesh. The analysis proceeds by statistical means obeying the physical laws related to the serial-parallel branching of the representative electrical mesh. Physical relevance of the analysis is established electrically, but the definition of the electrical mesh is controlled statistically by the parametrization of compound fractions, by determining the number of representative spheres per unit volume per fraction, and by determining the number of fractions. In that way the model is capable of covering the properties of nearly all possible soil types, in all phase states, within the Lorenz and Knudsen conditions. In effect the model allows generating a hypothetical representative of

  17. The power of statistical tests using field trial count data of non-target organisms in enviromental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study.

  18. The power of statistical tests using field trial count data of non-target organisms in enviromental risk assessment of genetically modified plants

    NARCIS (Netherlands)

    Voet, van der H.; Goedhart, P.W.

    2015-01-01

Publications on power analyses for field trial count data comparing transgenic and conventional crops have reported widely varying requirements for the replication needed to obtain statistical tests with adequate power. These studies are critically reviewed and complemented with a new simulation study.

  19. Electronic noise modeling in statistical iterative reconstruction.

    Science.gov (United States)

    Xu, Jingyan; Tsui, Benjamin M W

    2009-06-01

    We consider electronic noise modeling in tomographic image reconstruction when the measured signal is the sum of a Gaussian distributed electronic noise component and another random variable whose log-likelihood function satisfies a certain linearity condition. Examples of such likelihood functions include the Poisson distribution and an exponential dispersion (ED) model that can approximate the signal statistics in integration mode X-ray detectors. We formulate the image reconstruction problem as a maximum-likelihood estimation problem. Using an expectation-maximization approach, we demonstrate that a reconstruction algorithm can be obtained following a simple substitution rule from the one previously derived without electronic noise considerations. To illustrate the applicability of the substitution rule, we present examples of a fully iterative reconstruction algorithm and a sinogram smoothing algorithm both in transmission CT reconstruction when the measured signal contains additive electronic noise. Our simulation studies show the potential usefulness of accurate electronic noise modeling in low-dose CT applications.

  20. Statistical model with a standard Γ distribution

    Science.gov (United States)

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-07-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ . We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ . Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ) , where particles exchange energy in a space with an effective dimension D(λ) .
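
    A brief simulation sketch, assuming the microscopic law is the standard kinetic-exchange update with fixed saving propensity λ (the abstract does not state the rule explicitly); the stationary distribution is then fit with a Gamma, whose shape can be compared with the value 1 + 3λ/(1-λ) reported for this class of models:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
N, lam, steps = 1000, 0.5, 500_000
x = np.ones(N)                               # every agent starts with one unit

for _ in range(steps):
    i, j = rng.integers(N, size=2)
    if i == j:
        continue
    eps = rng.uniform()
    pool = (1 - lam) * (x[i] + x[j])         # amount put up for exchange
    x[i], x[j] = lam * x[i] + eps * pool, lam * x[j] + (1 - eps) * pool

shape, loc, scale = stats.gamma.fit(x, floc=0)
print("fitted Gamma shape:", shape)          # ~ 1 + 3*lam/(1-lam) = 4 for lam=0.5
```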

  1. Statistical model with a standard Gamma distribution

    Science.gov (United States)

    Chakraborti, Anirban; Patriarca, Marco

    2005-03-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T (λ), where particles exchange energy in a space with an effective dimension D (λ).

  2. Statistical Decision-Tree Models for Parsing

    CERN Document Server

    Magerman, D M

    1995-01-01

Syntactic natural language parsers have shown themselves to be inadequate for processing highly-ambiguous large-vocabulary text, as is evidenced by their poor performance on domains like the Wall Street Journal, and by the movement away from parsing-based approaches to text-processing in general. In this paper, I describe SPATTER, a statistical parser based on decision-tree learning techniques which constructs a complete parse for every sentence and achieves accuracy rates far better than any published result. This work is based on the following premises: (1) grammars are too complex and detailed to develop manually for most interesting domains; (2) parsing models must rely heavily on lexical and contextual information to analyze sentences accurately; and (3) existing n-gram modeling techniques are inadequate for parsing models. In experiments comparing SPATTER with IBM's computer manuals parser, SPATTER significantly outperforms the grammar-based parser. Evaluating SPATTER against the Penn Treebank Wall Street Journal corpus ...

  3. Statistical Model Checking for Product Lines

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2016-01-01

We report on the suitability of statistical model checking for the analysis of quantitative properties of product line models by an extended treatment of earlier work by the authors. The type of analysis that can be performed includes the likelihood of specific product behaviour, the expected average cost of products (in terms of the attributes of the products' features) and the probability of features to be (un)installed at runtime. The product lines must be modelled in QFLan, which extends the probabilistic feature-oriented language PFLan with novel quantitative constraints among features and on behaviour and with advanced feature installation options. QFLan is a rich process-algebraic specification language whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and probabilistic ...

  4. ARSENIC CONTAMINATION IN GROUNDWATER: A STATISTICAL MODELING

    Directory of Open Access Journals (Sweden)

    Palas Roy

    2013-01-01

    Full Text Available High arsenic in natural groundwater in most of the tubewells of the Purbasthali-Block II area of Burdwan district (W.B., India) has recently come into focus as a serious environmental concern. This paper illustrates the statistical modeling of arsenic-contaminated groundwater to identify the interrelation of the arsenic content with other groundwater parameters, so that the arsenic contamination level can be predicted by analyzing only those parameters. Multivariate analysis of groundwater samples collected from 132 tubewells of this contaminated region shows that three parameters are significantly related to the arsenic level. Based on these relationships, a multiple linear regression model has been developed that estimates the arsenic contamination from measurements of these three predictor parameters in the contaminated aquifer. This model could also serve as a suggestive tool when designing arsenic removal schemes for any affected groundwater.

  5. Challenges in Dental Statistics: Data and Modelling

    Directory of Open Access Journals (Sweden)

    Domenica Matranga

    2013-03-01

    Full Text Available The aim of this work is to present the reflections and proposals derived from the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona (Italy on 28th September 2011. STATDENT began as a forum of comparison and discussion for statisticians working in the field of dental research in order to suggest new and improve existing biostatistical and clinical epidemiological methods. During the meeting, we dealt with very important topics of statistical methodology for the analysis of dental data, covering the analysis of hierarchically structured and over-dispersed data, the issue of calibration and reproducibility, as well as some problems related to survey methodology, such as the design and construction of unbiased statistical indicators and of well conducted clinical trials. This paper gathers some of the methodological topics discussed during the meeting, concerning multilevel and zero-inflated models for the analysis of caries data and methods for the training and calibration of raters in dental epidemiology.

  6. Statistical Model Checking for Biological Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2014-01-01

    Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic ... proved very useful for identifying interesting properties of biological systems. Our aim is to offer the best of the two worlds: optimal domain-specific interfaces and formalisms suited to biology, combined with powerful SMC analysis techniques for stochastic and hybrid systems. This goal is obtained ...

  7. Statistical shape and appearance models in osteoporosis.

    Science.gov (United States)

    Castro-Mateos, Isaac; Pozo, Jose M; Cootes, Timothy F; Wilkinson, J Mark; Eastell, Richard; Frangi, Alejandro F

    2014-06-01

    Statistical models (SMs) of shape (SSM) and appearance (SAM) have been acquiring popularity in medical image analysis since they were introduced in the early 1990s. They have been primarily used for segmentation, but they are also a powerful tool for 3D reconstruction and classification. All these tasks may be required in the osteoporosis domain, where fracture detection and risk estimation are key to reducing the mortality and/or morbidity of this bone disease. In this article, we review the different applications of SSMs and SAMs in the context of osteoporosis, and conclude with a discussion of their advantages and disadvantages for this application.

  8. A Statistical Model of Skewed Associativity

    OpenAIRE

    Michaud, Pierre

    2002-01-01

    This paper presents a statistical model of set-associativity, victim caching and skewed-associativity, with an emphasis on skewed-associativity. We show that set-associativity is not efficient when the working-set size is close to the cache size. We refer to this as the unit working-set problem. We show that victim-caching is not a practical solution to the unit working-set problem either, although victim caching emulates full associativity for working-sets much larger than the victim buffe...

  9. Local influence diagnostics for hierarchical count data models with overdispersion and excess zeros.

    Science.gov (United States)

    Rakhmawati, Trias Wahyuni; Molenberghs, Geert; Verbeke, Geert; Faes, Christel

    2016-11-01

    We consider models for hierarchical count data, subject to overdispersion and/or excess zeros. Molenberghs et al. () and Molenberghs et al. () extend the Poisson-normal generalized linear-mixed model by including gamma random effects to accommodate overdispersion. Excess zeros are handled using either a zero-inflation or a hurdle component. These models were studied by Kassahun et al. (). While flexible, they are quite elaborate in parametric specification and therefore model assessment is imperative. We derive local influence measures to detect and examine influential subjects, that is, subjects who have undue influence on either the fit of the model as a whole, or on specific important sub-vectors of the parameter vector. The latter include the fixed effects for the Poisson and for the excess-zeros components, the variance components for the normal random effects, and the parameters describing gamma random effects, included to accommodate overdispersion. Interpretable influence components are derived. The method is applied to data from a longitudinal clinical trial involving patients with epileptic seizures. Even though the data were extensively analyzed in earlier work, the insight gained from the proposed diagnostics, statistically and clinically, is considerable. Possibly, a small but important subgroup of patients has been identified. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. A Model Comparison for Count Data with a Positively Skewed Distribution with an Application to the Number of University Mathematics Courses Completed

    Science.gov (United States)

    Liou, Pey-Yan

    2009-01-01

    The current study examines three regression models for analyzing count data: OLS (ordinary least squares) linear regression, Poisson regression, and negative binomial regression. Simulation results show that the OLS regression model performed better than the others, since it did not produce more false statistically significant relationships than…
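
    A minimal sketch of this kind of three-way comparison, assuming the statsmodels package and simulated overdispersed counts in place of the course-completion data:

        # Hedged sketch: comparing OLS, Poisson, and negative binomial fits on
        # simulated overdispersed counts (all data here are illustrative).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 500
        x = rng.normal(size=n)
        X = sm.add_constant(x)
        mu = np.exp(0.3 + 0.5 * x)
        # Gamma-Poisson mixture -> negative binomial counts (overdispersed)
        y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

        ols = sm.OLS(y, X).fit()
        poi = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        nb = sm.NegativeBinomial(y, X).fit(disp=False)

        for name, res in [("OLS", ols), ("Poisson", poi), ("NegBin", nb)]:
            print(f"{name:8s} AIC = {res.aic:.1f}")
        # With overdispersion the NB model typically has the lowest AIC.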

  11. Statistical pairwise interaction model of stock market

    Science.gov (United States)

    Bury, Thomas

    2013-03-01

    Financial markets are a classical example of complex systems, as they comprise many interacting stocks. As such, we can obtain a surprisingly good description of their structure by making the rough simplification of binary daily returns. Spin-glass models have been applied and gave some valuable results, but at the price of restrictive assumptions on the market dynamics; alternatively, agent-based models rely on rules designed to recover some empirical behaviors. Here we show that the pairwise model is a statistically consistent model with the observed first and second moments of the stocks' orientations, without making such restrictive assumptions. This is done with an approach based only on empirical data of price returns. Our data analysis of six major indices suggests that the actual interaction structure may be thought of as an Ising model on a complex network with interaction strengths scaling as the inverse of the system size. This has potentially important implications since many properties of such a model are already known and some techniques of spin-glass theory can be straightforwardly applied. Typical behaviors, such as multiple equilibria or metastable states, different characteristic time scales, spatial patterns, and order-disorder transitions, could find an explanation in this picture.
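
    As an illustration of the inference step, the following hedged sketch binarizes (synthetic) returns and recovers naive mean-field Ising couplings from the first and second moments; it is not the paper's code, and the data are simulated rather than real index returns.

        # Illustrative sketch: naive mean-field inverse Ising on binarized
        # daily returns. `returns` stands in for real market data.
        import numpy as np

        rng = np.random.default_rng(2)
        corr = 0.3 * np.ones((6, 6)) + 0.7 * np.eye(6)   # fake correlation
        returns = rng.normal(size=(2500, 6)) @ np.linalg.cholesky(corr).T

        s = np.where(returns >= 0, 1, -1)     # binary daily orientations
        m = s.mean(axis=0)                    # first moments
        C = np.cov(s, rowvar=False)           # connected second moments

        # Naive mean-field inversion: off-diagonal couplings J_ij = -(C^-1)_ij
        J = -np.linalg.inv(C)
        np.fill_diagonal(J, 0.0)
        h = np.arctanh(m) - J @ m             # mean-field external fields
        print(np.round(J, 3))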

  12. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM < 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
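
    The distribution comparison itself reduces to a two-sample Kolmogorov-Smirnov test; a minimal sketch with placeholder slip-rate arrays (not the study's data) might look like:

        # Minimal sketch of a two-sample Kolmogorov-Smirnov comparison using
        # scipy; the arrays are hypothetical, not real geodetic slip rates.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        geologic_rates = rng.normal(25.0, 5.0, size=15)   # mm/yr, invented
        predicted_rates = rng.normal(28.0, 5.0, size=15)  # model output, invented

        stat, p = stats.ks_2samp(geologic_rates, predicted_rates)
        reject = p < 0.05  # reject the model at the alpha = 0.05 level
        print(f"KS statistic = {stat:.3f}, p = {p:.3f}, reject = {reject}")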

  13. Projecting Policy Effects with Statistical Models

    Directory of Open Access Journals (Sweden)

    Christopher Sims

    1988-03-01

    Full Text Available This paper attempts to briefly discuss the current frontiers in quantitative modeling for forecasting and policy analysis. It does so by summarizing some recent developments in three areas: reduced form forecasting models; theoretical models including elements of stochastic optimization; and identification. In the process, the paper tries to provide some remarks on the direction we seem to be headed.

  14. Revealed Preference and Effectiveness of Public Investment in Ecological River Restoration Projects: An Application of the Count Data Model

    Directory of Open Access Journals (Sweden)

    Yoon Lee

    2016-04-01

    Full Text Available Ecological river restoration projects aim to revitalize healthy and self-sustaining river systems that can provide irreplaceable benefits to human society. Cheonggyecheon and Anyangcheon are two sites of recent river restoration projects in Korea. To assess the economic value of the two rivers, count data were collected for the individual travel cost method (ITCM) used in this study. Given the nature of count data, five statistical models were used in the analysis: the Poisson, the negative binomial, the zero-truncated Poisson, the zero-truncated negative binomial, and the negative binomial model adjusted for both truncation and endogenous stratification. Empirical results showed that the regressors were statistically significant and corresponded to conventional consumer theory. Since the collected count data indicated over-dispersion and endogenous stratification, the adjusted negative binomial was selected as the optimal model to analyze the recreational value of Cheonggyecheon and Anyangcheon. Estimates of the annual economic value of the two river restoration projects were approximately US $170.1 million and US $50.5 million, respectively.

  15. Statistical Mechanical Models of Integer Factorization Problem

    Science.gov (United States)

    Nakajima, Chihiro H.; Ohzeki, Masayuki

    2017-01-01

    We formulate the integer factorization problem as the search for the ground state of a statistical mechanical Hamiltonian. The first passage time required to find a correct divisor of a composite number signifies the exponential computational hardness. The analysis of the density of states of two macroscopic quantities, i.e., the energy and the Hamming distance from the correct solutions, leads to the conclusion that the ground state (correct solution) is completely isolated from the other low-energy states, with the distance being proportional to the system size. In addition, the profile of the microcanonical entropy of the model has two peculiar features that are each related to two marked changes in the energy region sampled via Monte Carlo simulation or simulated annealing. Hence, we find a peculiar first-order phase transition in our model.
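
    A toy sketch of this viewpoint, assuming the simple energy E(x, y) = (N − xy)² rather than the paper's Hamiltonian, searches for the ground state (E = 0) by simulated annealing over candidate divisor bits; the schedule and sizes are illustrative and may need tuning or restarts.

        # Toy sketch: factoring as a ground-state search, purely illustrative.
        import numpy as np

        rng = np.random.default_rng(9)
        N = 143                    # 11 * 13, a small composite number
        nbits = 4                  # bits per candidate divisor (above bit 0)

        def energy(x, y):
            return (N - x * y) ** 2

        def value(b):              # candidate divisors are kept odd
            return 1 + 2 * int("".join(map(str, b)), 2)

        bits = rng.integers(0, 2, size=2 * nbits)
        for beta in np.repeat(np.linspace(1e-4, 1e-2, 200), 50):  # anneal
            trial = bits.copy()
            trial[rng.integers(2 * nbits)] ^= 1    # single bit flip
            dE = (energy(value(trial[:nbits]), value(trial[nbits:]))
                  - energy(value(bits[:nbits]), value(bits[nbits:])))
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                bits = trial

        x, y = value(bits[:nbits]), value(bits[nbits:])
        print(f"found {x} * {y} = {x * y} (target {N}), E = {energy(x, y)}")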

  16. Statistical model semiquantitatively approximates arabinoxylooligosaccharides' structural diversity

    DEFF Research Database (Denmark)

    Dotsenko, Gleb; Nielsen, Michael Krogsgaard; Lange, Lene

    2016-01-01

    A statistical model describing the random distribution of substituted xylopyranosyl residues in arabinoxylooligosaccharides is suggested and compared with existing experimental data. Structural diversity of arabinoxylooligosaccharides of various length, originating from different arabinoxylans (wheat flour arabinoxylan (arabinose/xylose, A/X = 0.47); grass arabinoxylan (A/X = 0.24); wheat straw arabinoxylan (A/X = 0.15); and hydrothermally pretreated wheat straw arabinoxylan (A/X = 0.05)), is semiquantitatively approximated using the proposed model. The suggested approach can be applied not only for prediction and quantification of arabinoxylooligosaccharides' structural diversity, but also for estimating the yield and selecting the optimal source of arabinoxylan for production of arabinoxylooligosaccharides with desired structural features.

  17. Time series count data models: an empirical application to traffic accidents.

    Science.gov (United States)

    Quddus, Mohammed A

    2008-09-01

    Count data are primarily categorised as cross-sectional, time series, and panel. Over the past decade, Poisson and Negative Binomial (NB) models have been used widely to analyse cross-sectional and time series count data, and random effect and fixed effect Poisson and NB models have been used to analyse panel count data. However, recent literature suggests that although the underlying distributional assumptions of these models are appropriate for cross-sectional count data, they are not capable of taking into account the effect of serial correlation often found in pure time series count data. Real-valued time series models, such as the autoregressive integrated moving average (ARIMA) model, introduced by Box and Jenkins have been used in many applications over the last few decades. However, when modelling non-negative integer-valued data such as traffic accidents at a junction over time, Box and Jenkins models may be inappropriate. This is mainly due to the normality assumption of errors in the ARIMA model. Over the last few years, a new class of time series models known as integer-valued autoregressive (INAR) Poisson models has been studied by many authors. This class of models is particularly applicable to the analysis of time series count data as these models hold the properties of Poisson regression, are able to deal with serial correlation, and therefore offer an alternative to the real-valued time series models. The primary objective of this paper is to introduce the class of INAR models for the time series analysis of traffic accidents in Great Britain. Different types of time series count data are considered: aggregated time series data where both the spatial and temporal units of observation are relatively large (e.g., Great Britain and years) and disaggregated time series data where both the spatial and temporal units are relatively small (e.g., congestion charging zone and months). The performance of the INAR models is compared with the class of Box and ...
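
    The simplest member of this class is the INAR(1) process, in which the previous count is carried forward by binomial thinning and integer-valued Poisson innovations are added; a short simulation sketch (with arbitrary parameters, not values estimated from the accident data) is:

        # Sketch of an INAR(1) process: binomial thinning + Poisson innovations.
        import numpy as np

        rng = np.random.default_rng(4)
        alpha, lam, T = 0.6, 2.0, 300  # thinning prob., innovation mean, length

        x = np.empty(T, dtype=int)
        x[0] = rng.poisson(lam / (1 - alpha))   # start near the stationary mean
        for t in range(1, T):
            survivors = rng.binomial(x[t - 1], alpha)  # thinning alpha ∘ x[t-1]
            x[t] = survivors + rng.poisson(lam)        # integer innovations

        print(x[:20])
        # Unlike ARIMA, every value is a non-negative integer, and the series
        # has Poisson-like marginals with lag-1 autocorrelation alpha.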

  18. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.

  19. MSMBuilder: Statistical Models for Biomolecular Dynamics.

    Science.gov (United States)

    Harrigan, Matthew P; Sultan, Mohammad M; Hernández, Carlos X; Husic, Brooke E; Eastman, Peter; Schwantes, Christian R; Beauchamp, Kyle A; McGibbon, Robert T; Pande, Vijay S

    2017-01-10

    MSMBuilder is a software package for building statistical models of high-dimensional time-series data. It is designed with a particular focus on the analysis of atomistic simulations of biomolecular dynamics such as protein folding and conformational change. MSMBuilder is named for its ability to construct Markov state models (MSMs), a class of models that has gained favor among computational biophysicists. In addition to both well-established and newer MSM methods, the package includes complementary algorithms for understanding time-series data such as hidden Markov models and time-structure based independent component analysis. MSMBuilder boasts an easy to use command-line interface, as well as clear and consistent abstractions through its Python application programming interface. MSMBuilder was developed with careful consideration for compatibility with the broader machine learning community by following the design of scikit-learn. The package is used primarily by practitioners of molecular dynamics, but is just as applicable to other computational or experimental time-series measurements. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  20. ZERODUR strength modeling with Weibull statistical distributions

    Science.gov (United States)

    Hartmann, Peter

    2016-07-01

    The decisive influence on breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential if micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process. Here only the depths of the micro cracks are relevant. In any case presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two or three parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a
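
    In scipy terms, the two- and three-parameter choices correspond to pinning or freeing the location (threshold) parameter of the Weibull fit; a hedged sketch on synthetic strengths (not ZERODUR data):

        # Hedged example: two- vs three-parameter Weibull fits with scipy.
        import numpy as np
        from scipy import stats

        # Synthetic strengths with a 30 MPa threshold below which nothing breaks
        stress = 30.0 + stats.weibull_min.rvs(1.8, scale=55.0, size=200,
                                              random_state=42)

        # Two-parameter fit: location (threshold) pinned at zero
        c2, _, s2 = stats.weibull_min.fit(stress, floc=0.0)
        # Three-parameter fit: threshold estimated from the data
        c3, loc3, s3 = stats.weibull_min.fit(stress)

        print(f"2-param: shape={c2:.2f}, scale={s2:.1f}")
        print(f"3-param: shape={c3:.2f}, threshold={loc3:.1f}, scale={s3:.1f}")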

  1. Poisson distribution and process as a well-fitting pattern for counting variables in biologic models

    Directory of Open Access Journals (Sweden)

    Lucietta Betti

    2012-09-01

    Full Text Available One of the major criticisms directed at basic research on high dilution effects is the lack of a steady statistical approach; therefore, it seems crucial to fix some milestones in the statistical analysis of this kind of experimentation. Since plant research in homeopathy has been recently developed and one of the most used models is based on in vitro seed germination, here we propose a statistical approach focused on the Poisson distribution, which satisfactorily fits the number of non-germinated seeds. The Poisson distribution is a discrete-valued model often used in statistics to represent the number X of specific events (telephone calls, industrial machine failures, genetic mutations, etc.) that occur in a fixed period of time, supposing that the instantaneous probability of occurrence of such events is constant. If we denote by λ the average number of events that occur within the fixed period, the probability of observing exactly k events is: P(k) = e^-λ λ^k / k!, k = 0, 1, 2, … This distribution is commonly used when dealing with rare events, in the sense that it has to be almost impossible to have two events at the same time. The Poisson distribution is the basic model of the so-called Poisson process, which is a counting process N(t), where t is a time parameter, having these properties: the process starts at zero, N(0) = 0; the increments are independent; the number of events that occur in a period of time dt follows a Poisson distribution with parameter proportional to dt; the waiting time, i.e. the time between one event and the next, follows an exponential distribution. In a series of experiments performed by our research group [1], [2], [3], [4] we tried to apply this distribution to the number X of non-germinated seeds out of a fixed number N* of seeds in a Petri dish (usually N* = 33 or N* = 36). The goodness-of-fit was checked by different tests (Kolmogorov distance and chi-squared), as well as ...
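
    A minimal goodness-of-fit sketch in the spirit described above, with invented per-dish counts; in practice, cells with small expected counts would be pooled before the chi-squared test.

        # Sketch: chi-squared goodness-of-fit of a Poisson model to counts of
        # non-germinated seeds per dish (data invented for illustration).
        import numpy as np
        from scipy import stats

        counts = np.array([2, 4, 3, 5, 1, 3, 2, 4, 6, 3,
                           2, 5, 4, 3, 2, 1, 3, 4, 2, 3])
        lam = counts.mean()              # ML estimate of the Poisson mean

        k = np.arange(0, 9)
        expected = len(counts) * stats.poisson.pmf(k, lam)
        expected[-1] += len(counts) * stats.poisson.sf(k[-1], lam)  # tail mass
        observed = np.array([(counts == v).sum() for v in k], dtype=float)
        observed[-1] += (counts > k[-1]).sum()

        # One df lost for the estimated mean, one for the total constraint
        chi2 = ((observed - expected) ** 2 / expected).sum()
        p = stats.chi2.sf(chi2, df=len(k) - 2)
        print(f"chi2 = {chi2:.2f}, p = {p:.3f}")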

  2. Statistical model for OCT image denoising

    KAUST Repository

    Li, Muxingzi

    2017-08-01

    Optical coherence tomography (OCT) is a non-invasive technique with a large array of applications in clinical imaging and biological tissue visualization. However, the presence of speckle noise affects the analysis of OCT images and their diagnostic utility. In this article, we introduce a new OCT denoising algorithm. The proposed method is founded on a numerical optimization framework based on maximum-a-posteriori estimate of the noise-free OCT image. It combines a novel speckle noise model, derived from local statistics of empirical spectral domain OCT (SD-OCT) data, with a Huber variant of total variation regularization for edge preservation. The proposed approach exhibits satisfying results in terms of speckle noise reduction as well as edge preservation, at reduced computational cost.

  3. Physical and Statistical Modeling of Saturn's Troposphere

    Science.gov (United States)

    Yanamandra-Fisher, Padmavati A.; Braverman, Amy J.; Orton, Glenn S.

    2002-12-01

    The 5.2-μm atmospheric window on Saturn is dominated by thermal radiation and weak gaseous absorption, with a 20% contribution from sunlight reflected from clouds. The striking variability displayed by Saturn's clouds at 5.2 μm and the detection of PH3 (an atmospheric tracer) variability near or below the 2-bar level and possibly at lower pressures provide salient constraints on the dynamical organization of Saturn's atmosphere by constraining the strength of vertical motions at two levels across the disk. We analyse the 5.2-μm spectra of Saturn by utilising two independent methods: (a) physical models based on the relevant atmospheric parameters and (b) statistical analysis, based on principal components analysis (PCA), to determine the influence of the variation of phosphine and the opacity of clouds deep within Saturn's atmosphere to understand the dynamics in its atmosphere.

  4. New advances in statistical modeling and applications

    CERN Document Server

    Santos, Rui; Oliveira, Maria; Paulino, Carlos

    2014-01-01

    This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

  5. A generalized mathematical model to determine the turning movement counts at roundabouts

    Directory of Open Access Journals (Sweden)

    Al-Sayed Ahmed Al-Sobky

    2014-09-01

    Full Text Available Traffic turning movement counts at roundabouts are among the key inputs required for roundabout assessment, control and management. Traditionally, direct counting is conducted to track each vehicle from entry through circulation until exit. This counting may be difficult and costly due to the size of the roundabout, vision obstacles, and the continuous traffic flow. Many researchers have tried to avoid the tracking problem by counting only at entries and exits and then estimating the movements from historical data, which unfortunately affects the results. Other researchers reduced the tracking problem by counting some turning movements in addition to the entry and exit counts, then calculating the remaining movements mathematically. This approach is practical and accurate; however, it has been applied only in limited cases. In this paper, a generalized mathematical model is developed to calculate the most difficult movements from the easiest ones, determined according to the size of the monitoring area. The developed model can be used to calculate the turning movements, including u-turns, for roundabouts with any number of legs. The model is presented in O–D matrix form to be practical and user-friendly, and it was validated against reference count data with satisfactory results.

  6. Regression models for categorical, count, and related variables an applied approach

    CERN Document Server

    Hoffmann, John P, Dr

    2016-01-01

    Social science and behavioral science students and researchers are often confronted with data that are categorical, count a phenomenon, or have been collected over time. Sociologists examining the likelihood of interracial marriage, political scientists studying voting behavior, criminologists counting the number of offenses people commit, health scientists studying the number of suicides across neighborhoods, and psychologists modeling mental health treatment success are all interested in outcomes that are not continuous. Instead, they must measure and analyze these events and phenomena in a

  7. What every radiochemist should know about statistics

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, W.L.

    1994-04-01

    Radionuclide decay and measurement with appropriate counting instruments is one of the few physical processes for which exact mathematical/probabilistic models are available. This paper discusses statistical procedures associated with display and analysis of radionuclide counting data that derive from these exact models. For low count situations the attractiveness of fixed-count-random-time procedures is discussed.
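
    One way to see the appeal of fixed-count-random-time procedures: presetting the number of counts N fixes the relative precision of the rate estimate at about 1/√N regardless of how low the rate is. A small simulation sketch (all numbers illustrative):

        # Sketch: in a Poisson process, the waiting time for N events is
        # Gamma(N, 1/rate), so the estimator rate_hat = N / T has relative
        # standard deviation ~ 1/sqrt(N), whatever the rate.
        import numpy as np

        rng = np.random.default_rng(6)
        rate, n_preset, reps = 0.5, 100, 10_000  # counts/s, preset N, trials

        T = rng.gamma(shape=n_preset, scale=1.0 / rate, size=reps)
        rate_hat = n_preset / T

        rel_sd = rate_hat.std() / rate_hat.mean()
        print(f"relative sd = {rel_sd:.3f} "
              f"(approx 1/sqrt(N) = {n_preset ** -0.5:.3f})")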

  8. A Statistical Model for Regional Tornado Climate Studies.

    Science.gov (United States)

    Jagger, Thomas H; Elsner, James B; Widen, Holly M

    2015-01-01

    Tornado reports are locally rare, often clustered, and of variable quality, making it difficult to use them directly to describe regional tornado climatology. Here a statistical model is demonstrated that overcomes some of these difficulties and produces a smoothed regional-scale climatology of tornado occurrences. The model is applied to data aggregated at the level of counties. These data include annual population, annual tornado counts and an index of terrain roughness. The model has a term to capture the smoothed frequency relative to the state average. The model is used to examine whether terrain roughness is related to tornado frequency and whether there are differences in tornado activity by County Warning Area (CWA). A key finding is that tornado reports increase by 13% for a two-fold increase in population across Kansas after accounting for improvements in rating procedures. Independent of this relationship, tornadoes have been increasing at an annual rate of 1.9%. Another finding is the pattern of correlated residuals showing more Kansas tornadoes in a corridor of counties running roughly north to south across the west central part of the state consistent with the dryline climatology. The model is significantly improved by adding terrain roughness. The effect amounts to an 18% reduction in the number of tornadoes for every ten-meter increase in elevation standard deviation. The model indicates that tornadoes are 51% more likely to occur in counties served by the CWAs of DDC and GID than elsewhere in the state. Flexibility of the model is illustrated by fitting it to data from Illinois, Mississippi, South Dakota, and Ohio.

  9. A Statistical Model for Regional Tornado Climate Studies.

    Directory of Open Access Journals (Sweden)

    Thomas H Jagger

    Full Text Available Tornado reports are locally rare, often clustered, and of variable quality, making it difficult to use them directly to describe regional tornado climatology. Here a statistical model is demonstrated that overcomes some of these difficulties and produces a smoothed regional-scale climatology of tornado occurrences. The model is applied to data aggregated at the level of counties. These data include annual population, annual tornado counts and an index of terrain roughness. The model has a term to capture the smoothed frequency relative to the state average. The model is used to examine whether terrain roughness is related to tornado frequency and whether there are differences in tornado activity by County Warning Area (CWA). A key finding is that tornado reports increase by 13% for a two-fold increase in population across Kansas after accounting for improvements in rating procedures. Independent of this relationship, tornadoes have been increasing at an annual rate of 1.9%. Another finding is the pattern of correlated residuals showing more Kansas tornadoes in a corridor of counties running roughly north to south across the west central part of the state consistent with the dryline climatology. The model is significantly improved by adding terrain roughness. The effect amounts to an 18% reduction in the number of tornadoes for every ten-meter increase in elevation standard deviation. The model indicates that tornadoes are 51% more likely to occur in counties served by the CWAs of DDC and GID than elsewhere in the state. Flexibility of the model is illustrated by fitting it to data from Illinois, Mississippi, South Dakota, and Ohio.

  10. Counting Penguins.

    Science.gov (United States)

    Perry, Mike; Kader, Gary

    1998-01-01

    Presents an activity on the simplification of penguin counting by employing the basic ideas and principles of sampling to teach students to understand and recognize its role in statistical claims. Emphasizes estimation, data analysis and interpretation, and central limit theorem. Includes a list of items for classroom discussion. (ASK)

  11. Modelling earthquake interaction and seismicity statistics

    Science.gov (United States)

    Steacy, S.; Hetherington, A.

    2009-04-01

    The effects of earthquake interaction and fault complexity on seismicity statistics are investigated in a 3D model composed of a number of cellular automata (each representing an individual fault) distributed in a volume. Each automaton is assigned a fractal distribution of strength. Failure occurs when the 3D Coulomb stress on any cell exceeds its strength and stress transfer during simulated earthquake rupture is via nearest-neighbor rules formulated to give realistic stress concentrations. An event continues until all neighboring cells whose stresses exceed their strengths have ruptured and the size of the event is determined from its area and stress drop. Long-range stress interactions are computed following the termination of simulated ruptures using a boundary element code. In practice, these stress perturbations are only computed for events above a certain size (e.g. a threshold length of 10 km) and stresses are updated on nearby structures. Events which occur as a result of these stress interactions are considered to be "triggered" earthquakes and they, in turn, can trigger further seismic activity. The threshold length for computing interaction stresses is a free parameter and hence interaction can be "turned off" by setting this to an unrealistically high value. We consider 3 synthetic fault networks of increasing degrees of complexity - modelled on the North Anatolian fault system, the structures in the San Francisco Bay Area, and the Southern California fault network. We find that the effect of interaction is dramatically different in networks of differing complexity. In the North Anatolian analogue, for example, interaction leads to a decreased number of events, increased b-values, and an increase in recurrence intervals. In the Bay Area model, by contrast, we observe that interaction increases the number of events, decreases the b-values, and has little effect on recurrence intervals. For all networks, we find that interaction can activate mis

  12. The additive nonparametric and semiparametric Aalen model as the rate function for a counting process

    DEFF Research Database (Denmark)

    Scheike, Thomas Harder

    2002-01-01

    We use the additive risk model of Aalen (Aalen, 1980) as a model for the rate of a counting process. Rather than specifying the intensity, that is the instantaneous probability of an event conditional on the entire history of the relevant covariates and counting processes, we present a model for the rate function, i.e., the instantaneous probability of an event conditional on only a selected set of covariates. When the rate function for the counting process is of Aalen form we show that the usual Aalen estimator can be used and gives almost unbiased estimates. The usual martingale-based variance estimator is incorrect and an alternative estimator should be used. We also consider the semi-parametric version of the Aalen model as a rate model (McKeague and Sasieni, 1994) and show that the standard errors that are computed based on an assumption of intensities are incorrect and give a different ...

  13. Statistical models and regularization strategies in statistical image reconstruction of low-dose X-ray computed tomography: a survey

    CERN Document Server

    Zhang, Hao; Ma, Jianhua; Lu, Hongbing; Liang, Zhengrong

    2014-01-01

    Statistical image reconstruction (SIR) methods have shown potential to substantially improve the image quality of low-dose X-ray computed tomography (CT) as compared to the conventional filtered back-projection (FBP) method for various clinical tasks. According to the maximum a posteriori (MAP) estimation, the SIR methods can be typically formulated by an objective function consisting of two terms: (1) data-fidelity (or equivalently, data-fitting or data-mismatch) term modeling the statistics of projection measurements, and (2) regularization (or equivalently, prior or penalty) term reflecting prior knowledge or expectation on the characteristics of the image to be reconstructed. Existing SIR methods for low-dose CT can be divided into two groups: (1) those that use calibrated transmitted photon counts (before log-transform) with penalized maximum likelihood (pML) criterion, and (2) those that use calibrated line-integrals (after log-transform) with penalized weighted least-squares (PWLS) criterion. Accurate s...

  14. The Perception-Action Model: Counting Computational Mechanisms

    DEFF Research Database (Denmark)

    Grünbaum, Thor

    2016-01-01

    Milner and Goodale’s Two Visual Systems Hypothesis (TVSH) is regarded as common ground in recent discussions of visual consciousness. A central part of TVSH is a functional model of vision and action (a functional perception-action model, PAM for short). In this paper, I provide a brief overview of these current discussions and argue that PAM is ambiguous between a strong and a weak version. I argue that, given a standard way of individuating computational mechanisms, the available evidence cannot be used to distinguish between these versions. This not only has consequences for philosophical theories of the role of visual consciousness but also has implications for the role of experimental evidence in model testing in cognitive neuroscience.

  15. Pathway Model and Nonextensive Statistical Mechanics

    Science.gov (United States)

    Mathai, A. M.; Haubold, H. J.; Tsallis, C.

    2015-12-01

    The established technique of eliminating upper or lower parameters in a general hypergeometric series is profitably exploited to create pathways among confluent hypergeometric functions, binomial functions, Bessel functions, and exponential series. One such pathway, from the mathematical statistics point of view, results in distributions which naturally emerge within nonextensive statistical mechanics and Beck-Cohen superstatistics, as pursued in generalizations of Boltzmann-Gibbs statistics.

  16. Statistical Ensemble Theory of Gompertz Growth Model

    Directory of Open Access Journals (Sweden)

    Takuya Yamano

    2009-11-01

    Full Text Available An ensemble formulation for the Gompertz growth function within the framework of statistical mechanics is presented, where the two growth parameters are assumed to be statistically distributed. The growth can be viewed as a self-referential process, which enables us to use the Bose-Einstein statistics picture. The analytical entropy expression pertaining to the law can be obtained in terms of the growth velocity distribution as well as the Gompertz function itself for the whole process.

  17. The Perception-Action Model: Counting Computational Mechanisms

    DEFF Research Database (Denmark)

    Grünbaum, Thor

    2016-01-01

    Milner and Goodale’s Two Visual Systems Hypothesis (TVSH) is regarded as common ground in recent discussions of visual consciousness. A central part of TVSH is a functional model of vision and action (a functional perception-action model, PAM for short). In this paper, I provide a brief overview of these current discussions and argue that PAM is ambiguous between a strong and a weak version. I argue that, given a standard way of individuating computational mechanisms, the available evidence cannot be used to distinguish between these versions. This not only has consequences for philosophical theories of the role of visual consciousness but also has implications for the role of experimental evidence in model testing in cognitive neuroscience.

  18. Integer Representations towards Efficient Counting in the Bit Probe Model

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Greve, Mark; Pandey, Vineet

    2011-01-01

    We consider the problem of representing numbers in close to optimal space and supporting increment, decrement, addition and subtraction operations efficiently. We study the problem in the bit probe model and analyse the number of bits read and written to perform the operations, both in the worst case and in the average case. A counter is space-optimal if it represents any number in the range [0, ..., 2^n − 1] using exactly n bits. We provide a space-optimal counter which supports increment and decrement operations by reading at most n − 1 bits and writing at most 3 bits in the worst case. ... We define the efficiency of a counter representing numbers in the range [0, ..., L] as the ratio between L + 1 and 2^n. We present various representations that achieve different trade-offs between the read and write complexities and the efficiency. We also give another representation of integers that uses n + O(log n) bits to represent integers in the range [0, ..., 2^n − 1 ...
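
    For contrast with these results, a plain binary counter reads and writes up to n bits in the worst case (the all-ones carry chain); the toy function below just tallies the probes, as a baseline illustration rather than the paper's construction.

        # Baseline bit-probe cost of a standard binary increment: up to n reads
        # and n writes in the worst case (carry propagates through all bits).
        def increment(bits):            # bits[0] is the least significant bit
            reads = writes = 0
            for i in range(len(bits)):
                reads += 1
                if bits[i] == 0:
                    bits[i] = 1
                    writes += 1
                    return reads, writes
                bits[i] = 0             # carry onward
                writes += 1
            return reads, writes        # counter wrapped around to zero

        print(increment([1, 1, 1, 0]))  # 7 -> 8: reads 4 bits, writes 4 bits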

  19. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. Rich varieties of advanced and recent statistical models are available in open source software (one of them being R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and makes models easier to compare in order to find the most appropriate model for the data.

  20. Power-Counting Theorem for Non-Local Matrix Models and Renormalisation

    Science.gov (United States)

    Grosse, Harald; Wulkenhaar, Raimar

    2005-02-01

    Solving the exact renormalisation group equation à la Wilson-Polchinski perturbatively, we derive a power-counting theorem for general matrix models with arbitrarily non-local propagators. The power-counting degree is determined by two scaling dimensions of the cut-off propagator and various topological data of ribbon graphs. As a necessary condition for the renormalisability of a model, the two scaling dimensions have to be large enough relative to the dimension of the underlying space. In order to have a renormalisable model one needs additional locality properties—typically arising from orthogonal polynomials—which relate the relevant and marginal interaction coefficients to a finite number of base couplings. The main application of our power-counting theorem is the renormalisation of field theories on noncommutative R^D in matrix formulation.

  1. Multiplicity Counting

    Energy Technology Data Exchange (ETDEWEB)

    Geist, William H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-01

    This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: the effective 240Pu (240Pu-eff) mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); an introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
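
    The link from the measured multiplicity distribution to singles, doubles, and triples runs through its reduced factorial moments (gate-fraction and efficiency corrections are omitted here); a sketch with a purely hypothetical distribution:

        # Sketch: reduced factorial moments of a measured multiplicity
        # distribution P(n); singles/doubles/triples rates are proportional
        # to the first three (corrections omitted, numbers hypothetical).
        import numpy as np

        # P(n): probability of registering n neutrons in a coincidence gate
        P = np.array([0.60, 0.25, 0.10, 0.04, 0.01])
        n = np.arange(len(P))

        m1 = (n * P).sum()                            # ~ singles
        m2 = (n * (n - 1) * P).sum() / 2.0            # ~ doubles
        m3 = (n * (n - 1) * (n - 2) * P).sum() / 6.0  # ~ triples

        print(f"singles ∝ {m1:.3f}, doubles ∝ {m2:.3f}, triples ∝ {m3:.4f}")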

  2. Parseq: reconstruction of microbial transcription landscape from RNA-Seq read counts using state-space models.

    Science.gov (United States)

    Mirauta, Bogdan; Nicolas, Pierre; Richard, Hugues

    2014-05-15

    The most common RNA-Seq strategy consists of random shearing, amplification and high-throughput sequencing of the RNA fraction. Methods to analyze transcription level variations along the genome from the read count profiles generated by the RNA-Seq protocol are needed. We developed a statistical approach to estimate the local transcription levels and to identify transcript borders. This transcriptional landscape reconstruction relies on a state-space model to describe transcription level variations in terms of abrupt shifts and more progressive drifts. A new emission model is introduced to capture not only the read count variance inside a transcript but also its short-range autocorrelation and the fraction of positions with zero counts. The estimation relies on a particle Gibbs algorithm whose running time makes it more suited to microbial genomes. The approach outperformed read-overlapping strategies on synthetic and real microbial datasets. A program named Parseq is available at: http://www.lgm.upmc.fr/parseq/. bogdan.mirauta@upmc.fr Supplementary data are available at Bioinformatics online.

  3. Statistical Model Checking of Rich Models and Properties

    DEFF Research Database (Denmark)

    Poulsen, Danny Bøgsted

    Software is in increasing fashion embedded within safety- and business-critical processes of society. Errors in these embedded systems can lead to human casualties or severe monetary loss. Model checking technology has proven formal methods capable of finding and correcting errors in software. However, software is approaching the boundary in terms of the complexity and size that model checking can handle. Furthermore, software systems are nowadays more frequently interacting with their environment, hence accurately modelling such systems requires modelling the environment as well - resulting in undecidability issues for the traditional model checking approaches. Statistical model checking has proven itself a valuable supplement to model checking, and this thesis is concerned with extending this software validation technique to stochastic hybrid systems. The thesis consists of two parts: the first part ...

  4. Modeling time-series count data: the unique challenges facing political communication studies.

    Science.gov (United States)

    Fogarty, Brian J; Monogan, James E

    2014-05-01

    This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher.

  5. Latent segmentation based count models: Analysis of bicycle safety in Montreal and Toronto.

    Science.gov (United States)

    Yasmin, Shamsunnahar; Eluru, Naveen

    2016-10-01

    The study contributes to the literature on bicycle safety by building on the traditional count regression models to investigate factors affecting bicycle crashes at the Traffic Analysis Zone (TAZ) level. The TAZ is a traffic-related geographic entity which is most frequently used as the spatial unit for macroscopic crash risk analysis. In conventional count models, the impact of exogenous factors is restricted to be the same across the entire region. However, it is possible that the influence of exogenous factors might vary across different TAZs. To accommodate the potential variation in the impact of exogenous factors we formulate latent segmentation based count models. Specifically, we formulate and estimate latent segmentation based Poisson (LP) and latent segmentation based Negative Binomial (LNB) models to study bicycle crash counts. In our latent segmentation approach, we allow for more than two segments and also consider a large set of variables in segmentation and segment specific models. The formulated models are estimated using bicycle-motor vehicle crash data from the Island of Montreal and City of Toronto for the years 2006 through 2010. The TAZ level variables considered in our analysis include accessibility measures, exposure measures, sociodemographic characteristics, socioeconomic characteristics, road network characteristics and built environment. A policy analysis is also conducted to illustrate the applicability of the proposed model for planning purposes. This macro-level research would assist decision makers, transportation officials and community planners to make informed decisions to proactively improve bicycle safety - a prerequisite to promoting a culture of active transportation.

  6. Integer Set Compression and Statistical Modeling

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from such statistics, and explore the effects of permuting the enumeration order based on element probabilities ...

  7. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  8. BAYESIAN SPATIAL-TEMPORAL MODELING OF ECOLOGICAL ZERO-INFLATED COUNT DATA.

    Science.gov (United States)

    Wang, Xia; Chen, Ming-Hui; Kuo, Rita C; Dey, Dipak K

    2015-01-01

    A Bayesian hierarchical model is developed for count data with spatial and temporal correlations as well as excessive zeros, uneven sampling intensities, and inference on missing spots. Our contribution is to develop a model on zero-inflated count data that provides flexibility in modeling spatial patterns in a dynamic manner and also improves the computational efficiency via dimension reduction. The proposed methodology is of particular importance for studying species presence and abundance in the field of ecological sciences. The proposed model is employed in the analysis of the survey data by the Northeast Fisheries Sciences Center (NEFSC) for estimation and prediction of the Atlantic cod in the Gulf of Maine - Georges Bank region. Model comparisons based on the deviance information criterion and the log predictive score show the improvement by the proposed spatial-temporal model.
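
    The zero-inflated Poisson building block underlying such hierarchical models mixes structural zeros with ordinary Poisson counts; a minimal simulation sketch with illustrative parameters:

        # Minimal sketch of the zero-inflated Poisson (ZIP) building block;
        # the parameter values are illustrative only.
        import numpy as np

        rng = np.random.default_rng(7)
        pi0, lam, n = 0.4, 3.0, 5000  # extra-zero prob., Poisson mean, size

        structural_zero = rng.random(n) < pi0
        y = np.where(structural_zero, 0, rng.poisson(lam, size=n))

        # ZIP pmf: P(0) = pi0 + (1 - pi0) e^{-lam}; P(k) = (1 - pi0) Pois(k; lam)
        p0 = pi0 + (1 - pi0) * np.exp(-lam)
        print(f"observed P(0) = {(y == 0).mean():.3f}, ZIP P(0) = {p0:.3f}")
        print(f"mean = {y.mean():.2f} vs (1 - pi0)*lam = {(1 - pi0) * lam:.2f}")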

  9. The beta-binomial convolution model for 2 × 2 tables with missing cell counts

    NARCIS (Netherlands)

    Eisinga, Rob

    2009-01-01

    This paper considers the beta-binomial convolution model for the analysis of 2×2 tables with missing cell counts. We discuss maximum likelihood (ML) parameter estimation using the expectation–maximization algorithm and study information loss relative to complete data estimators. We also examine bias o...

  10. Statistical Compressive Sensing of Gaussian Mixture Models

    CERN Document Server

    Yu, Guoshen

    2010-01-01

    A new framework of compressive sensing (CS), namely statistical compressive sensing (SCS), that aims at efficiently sampling a collection of signals that follow a statistical distribution and achieving accurate reconstruction on average, is introduced. For signals following a Gaussian distribution, with Gaussian or Bernoulli sensing matrices of O(k) measurements, considerably smaller than the O(k log(N/k)) required by conventional CS, where N is the signal dimension, and with an optimal decoder implemented with linear filtering, significantly faster than the pursuit decoders applied in conventional CS, the error of SCS is shown to be tightly upper bounded by a constant times the k-best term approximation error, with overwhelming probability. The failure probability is also significantly smaller than that of conventional CS. Stronger yet simpler results further show that for any sensing matrix, the error of Gaussian SCS is upper bounded by a constant times the k-best term approximation with probability one, and the ...

  11. Comparison of Statistical Models for Regional Crop Trial Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qun-yuan; KONG Fan-ling

    2002-01-01

    Based on the review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, was proposed, and the predictive precision of these models was compared by cross-validation on an example dataset. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.

  12. Enhanced surrogate models for statistical design exploiting space mapping technology

    DEFF Research Database (Denmark)

    Koziel, Slawek; Bandler, John W.; Mohamed, Achmed S.;

    2005-01-01

    We present advances in microwave and RF device modeling exploiting Space Mapping (SM) technology. We propose new SM modeling formulations utilizing input mappings, output mappings, frequency scaling and quadratic approximations. Our aim is to enhance circuit models for statistical analysis...

  13. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar...

  14. Spot counting on fluorescence in situ hybridization in suspension images using Gaussian mixture model

    Science.gov (United States)

    Liu, Sijia; Sa, Ruhan; Maguire, Orla; Minderman, Hans; Chaudhary, Vipin

    2015-03-01

    Cytogenetic abnormalities are important diagnostic and prognostic criteria for acute myeloid leukemia (AML). A flow cytometry-based imaging approach for FISH in suspension (FISH-IS) was established that enables automated analysis of a several-log-magnitude higher number of cells compared to microscopy-based approaches. Rotational positioning of cells can occur, leading to discordance between spot counts. As a solution to counting errors from overlapping spots, this study proposes a Gaussian mixture model (GMM)-based classification method. The Akaike information criterion (AIC) and Bayesian information criterion (BIC) of the GMM are used as global image features for this classification method. Via a random forest classifier, the results show that the proposed method is able to detect closely overlapping spots that cannot be separated by existing image-segmentation-based spot detection methods. The experimental results show that the proposed method yields a significant improvement in spot counting accuracy.
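
    A minimal sketch of the core device (fitting Gaussian mixtures to spot coordinates and recording AIC/BIC as image-level features), using scikit-learn's GaussianMixture; the library choice is ours, and the data below are simulated, not FISH-IS images.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Toy 2-D data: two closely overlapping "spots"
spots = np.vstack([rng.normal([0.0, 0.0], 0.5, (200, 2)),
                   rng.normal([1.0, 0.0], 0.5, (200, 2))])

# Fit GMMs with 1..4 components; the AIC/BIC curves serve as global features
for k in range(1, 5):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(spots)
    print(k, round(gmm.aic(spots), 1), round(gmm.bic(spots), 1))
```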

  15. Modeling Systematic Change in Stopover Duration Does Not Improve Bias in Trends Estimated from Migration Counts.

    Directory of Open Access Journals (Sweden)

    Tara L Crewe

    Full Text Available The use of counts of unmarked migrating animals to monitor long term population trends assumes independence of daily counts and a constant rate of detection. However, migratory stopovers often last days or weeks, violating the assumption of count independence. Further, a systematic change in stopover duration will result in a change in the probability of detecting individuals once, but also in the probability of detecting individuals on more than one sampling occasion. We tested how variation in stopover duration influenced accuracy and precision of population trends by simulating migration count data with a known constant rate of population change and by allowing daily probability of survival (an index of stopover duration) to remain constant, or to vary randomly, cyclically, or increase linearly over time by various levels. Using simulated datasets with a systematic increase in stopover duration, we also tested whether any resulting bias in population trend could be reduced by modeling the underlying source of variation in detection, or by subsampling data to every three or five days to reduce the incidence of recounting. Mean bias in population trend did not differ significantly from zero when stopover duration remained constant or varied randomly over time, but bias and the detection of false trends increased significantly with a systematic increase in stopover duration. Importantly, an increase in stopover duration over time resulted in a compounding effect on counts due to the increased probability of detection and of recounting on subsequent sampling occasions. Under this scenario, bias in population trend could not be modeled using a covariate for stopover duration alone. Rather, to improve inference drawn about long term population change using counts of unmarked migrants, analyses must include a covariate for stopover duration, as well as incorporate sampling modifications (e.g., subsampling) to reduce the probability that individuals will ...

  16. Statistical models of shape optimisation and evaluation

    CERN Document Server

    Davies, Rhodri; Taylor, Chris

    2014-01-01

    Deformable shape models have wide application in computer vision and biomedical image analysis. This book addresses a key issue in shape modelling: establishment of a meaningful correspondence between a set of shapes. Full implementation details are provided.

  17. Power-counting theorem for non-local matrix models and renormalisation

    CERN Document Server

    Grosse, H; Grosse, Harald; Wulkenhaar, Raimar

    2003-01-01

    Solving the exact renormalisation group equation a la Wilson-Polchinski perturbatively, we derive a power-counting theorem for general matrix models with arbitrarily non-local propagators. The power-counting degree is determined by three different scaling dimensions of the cut-off propagator and various topological data of ribbon graphs. The main application is the renormalisation problem of field theories on noncommutative R^D written in matrix formulation. It turns out that the propagator for the real scalar field has anomalous scaling dimensions, which for D>2 result in arbitrarily high power-counting degrees of divergence. This feature is known as UV/IR-mixing, which we conclude to emerge in any non-local matrix model with anomalous scaling dimensions of the propagator. Models in which the propagator has regular scaling dimensions are for D=2,4 power-counting renormalisable but acquire due to non-locality an infinite number of relevant or marginal interactions. By a reduction-of-couplings mechanism it is ...

  18. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Full Text Available Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the most widely accepted statistical model in industry; however, it was derived from a micrometer technology, which does not guarantee reliability for nanometric manufacturing processes. This work considers three of the most relevant statistical models in industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.

  19. Statistics-based investigation on typhoon transition modeling

    DEFF Research Database (Denmark)

    Zhang, Shuoyun; Nishijima, Kazuyoshi

    The present study revisits the statistical modeling of typhoon transition. The objective of the study is to provide insights on plausible statistical typhoon transition models based on extensive statistical analysis. First, the correlation structures of the typhoon transition are estimated in terms ... and the seasonality are taken into account by developing the models for different spatial grids and seasons separately. An appropriate size of spatial grids is investigated. The statistical characteristics of the random residual terms in the models are also examined. Finally, Monte Carlo simulations are performed ...

  20. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over ...

  1. Statistical Tests for Mixed Linear Models

    CERN Document Server

    Khuri, André I; Sinha, Bimal K

    2011-01-01

    An advanced discussion of linear models with mixed or random effects. In recent years a breakthrough has occurred in our ability to draw inferences from exact and optimum tests of variance component models, generating much research activity that relies on linear models with mixed and random effects. This volume covers the most important research of the past decade as well as the latest developments in hypothesis testing. It compiles all currently available results in the area of exact and optimum tests for variance component models and offers the only comprehensive treatment for these models a...

  2. Estimation for zero-inflated over-dispersed count data model with missing response.

    Science.gov (United States)

    Mian, Rajibul; Paul, Sudhir

    2016-12-30

    In this paper, we develop an estimation procedure for the parameters of a zero-inflated over-dispersed/under-dispersed count model in the presence of missing responses. In particular, we deal with a zero-inflated extended negative binomial model in the presence of missing responses. A weighted expectation-maximization algorithm is used for maximum likelihood estimation of the parameters involved. Simulations are conducted to study the properties of the estimators. Robustness of the procedure is shown when the count data follow other over-dispersed models, such as the log-normal mixture of the Poisson distribution, or even a zero-inflated Poisson model. An illustrative example and a discussion leading to some conclusions are given. Copyright © 2016 John Wiley & Sons, Ltd.
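
    For the complete-data case, a zero-inflated negative binomial model can be fitted in Python with statsmodels, as sketched below; the weighted EM procedure for missing responses developed in the paper is not reproduced here, and the simulated data are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(2)
n_obs = 500
x = rng.standard_normal(n_obs)
X = sm.add_constant(x)

# Simulate zero-inflated negative binomial counts with mean exp(0.5 + 0.8 x)
mu = np.exp(0.5 + 0.8 * x)
nb = rng.negative_binomial(2, 2.0 / (2.0 + mu))   # NB draw with mean mu
y = np.where(rng.random(n_obs) < 0.3, 0, nb)      # 30% structural zeros

model = ZeroInflatedNegativeBinomialP(y, X, exog_infl=np.ones((n_obs, 1)))
res = model.fit(maxiter=200, disp=0)
print(res.params)
```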

  3. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the...

  4. Statistical models for nuclear decay from evaporation to vaporization

    CERN Document Server

    Cole, A J

    2000-01-01

    Elements of equilibrium statistical mechanics: Introduction. Microstates and macrostates. Sub-systems and convolution. The Boltzmann distribution. Statistical mechanics and thermodynamics. The grand canonical ensemble. Equations of state for ideal and real gases. Pseudo-equilibrium. Statistical models of nuclear decay. Nuclear physics background: Introduction. Elements of the theory of nuclear reactions. Quantum mechanical description of scattering from a potential. Decay rates and widths. Level and state densities in atomic nuclei. Angular momentum in quantum mechanics. History of statistical

  5. Bayesian prediction of spatial count data using generalized linear mixed models

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Waagepetersen, Rasmus Plenge

    2002-01-01

    Spatial weed count data are modeled and predicted using a generalized linear mixed model combined with a Bayesian approach and Markov chain Monte Carlo. Informative priors for a data set with sparse sampling are elicited using a previously collected data set with extensive sampling. Furthermore, we demonstrate that so-called Langevin-Hastings updates are useful for efficient simulation of the posterior distributions, and we discuss computational issues concerning prediction.

  6. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...

  7. Traffic Analysis Zones - TRAFFIC_COUNTS_INDOTMODEL_IN: Traffic Counts on Roadways (from INDOT Statewide Travel Demand MODEL version 4, ISTDM4) in Indiana (Indiana Department of Transportation, Line Shapefile)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — TRAFFIC_COUNTS_INDOTMODEL_IN is a line shapefile that shows traffic counts (factored to year 2000) for the roadways in the Indiana Statewide Travel Demand Model,...

  8. Spherical collapse model and cluster number counts in power-law f(T) gravity

    Science.gov (United States)

    Malekjani, M.; Basilakos, S.; Heidari, N.

    2017-04-01

    We study the spherical collapse model in the framework of the spatially flat power-law f(T) ∝ (−T)^b gravity model. We find that the linear and non-linear growth of spherical overdensities in this particular f(T) model is affected by the power-law parameter b. Finally, we compute the predicted number counts of virialized haloes in order to distinguish the current f(T) model from the expectations of the concordance Λ cosmology. Specifically, the present analysis suggests that the f(T) gravity model with positive (negative) b predicts more (less) virialized objects with respect to those of Λ cold dark matter.

  9. Using Count Data and Ordered Models in National Forest Recreation Demand Analysis

    Science.gov (United States)

    Simões, Paula; Barata, Eduardo; Cruz, Luis

    2013-11-01

    This research addresses the need to improve our knowledge of the demand for national forests for recreation and offers an in-depth data analysis supported by the complementary use of count data and ordered models. From a policy-making perspective, while count data models enable the estimation of monetary welfare measures, ordered models allow for wider use of the database and provide a more flexible analysis of the data. The main purpose of this article is to analyse individual forest recreation demand and to derive a measure of its current use value. To allow a more complete analysis of the forest recreation demand structure, the econometric approach supplements the use of count data models with ordered category models, using data obtained by means of an on-site survey in the Bussaco National Forest (Portugal). Overall, both models reveal that travel cost and substitute prices are important explanatory variables, visits are a normal good, and demographic variables seem to have no influence on demand. In particular, estimated price and income elasticities of demand are quite low. Accordingly, it is possible to argue that travel cost (price) in isolation may be expected to have a low impact on visitation levels.

  10. Statistical analysis for count data: use of healthcare services applications

    Directory of Open Access Journals (Sweden)

    Aarón Salinas-Rodríguez

    2009-10-01

    Full Text Available OBJECTIVE: To describe some of the statistical models for the study of count variables in the context of the use of health services. MATERIAL AND METHODS: We used the Seguro Popular Evaluation Survey (2005-2006) to estimate the effect of Seguro Popular on the frequency of use of outpatient health services, using Poisson, negative binomial, zero-inflated negative binomial, and hurdle negative binomial regression models. We used the Akaike Information Criterion (AIC) to define the best model. RESULTS: Results show that the best statistical approach to model the use of health services is the hurdle model, taking into account both the main theoretical assumptions and the statistical results of the AIC. DISCUSSION: The modelling of count data requires statistical models that account for data dispersion; in the presence of an excess of zeros, the hurdle model is an appropriate statistical option.
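
    An AIC-based comparison of competing count models, as described here, can be sketched with statsmodels; note that statsmodels provides Poisson, negative binomial, and zero-inflated variants but no hurdle model, so this sketch covers only part of the menu compared in the paper, and the data are simulated.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(3)
n = 1000
x = rng.standard_normal(n)
X = sm.add_constant(x)
# Zero-heavy "utilisation" counts (hypothetical data)
y = np.where(rng.random(n) < 0.4, 0, rng.poisson(np.exp(0.3 + 0.5 * x)))

fits = {
    "Poisson": sm.Poisson(y, X).fit(disp=0),
    "NegBin":  sm.NegativeBinomial(y, X).fit(disp=0),
    "ZIP":     ZeroInflatedPoisson(y, X).fit(disp=0),
}
for name, res in fits.items():
    print(f"{name:8s} AIC = {res.aic:.1f}")   # smaller AIC = preferred model
```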

  11. A cascaded model of spectral distortions due to spectral response effects and pulse pileup effects in a photon-counting x-ray detector for CT

    Energy Technology Data Exchange (ETDEWEB)

    Cammin, Jochen, E-mail: jcammin1@jhmi.edu, E-mail: ktaguchi@jhmi.edu; Taguchi, Katsuyuki, E-mail: jcammin1@jhmi.edu, E-mail: ktaguchi@jhmi.edu [Division of Medical Imaging Physics, The Russell H. Morgan Department of Radiology and Radiological Science, Johns Hopkins University School of Medicine, Baltimore, Maryland 21287 (United States); Xu, Jennifer [Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21287 (United States); Barber, William C.; Iwanczyk, Jan S.; Hartsough, Neal E. [DxRay, Inc., Northridge, California 91324 (United States)

    2014-04-15

    ... The agreement between the x-ray spectra calculated by the cascaded SRE+PPE model and the measured spectra was evaluated for various levels of deadtime loss ratios (DLR) and incident spectral shapes, realized using different attenuators, in terms of the weighted coefficient of variation (COV_W), i.e., the root mean square difference weighted by the statistical errors of the data and divided by the mean. Results: At low count rates, when DLR < 10%, the distorted spectra measured by the DXMCT-1 were in agreement with those calculated by SRE only, with COV_W's less than 4%. At higher count rates, the measured spectra were also in agreement with the ones calculated by the cascaded SRE+PPE model; with PMMA as attenuator, COV_W was 5.6% at a DLR of 22% and as small as 6.7% for a DLR as high as 55%. Conclusions: The x-ray spectra calculated by the proposed model agreed with the measured spectra over a wide range of count rates and spectral shapes. The SRE model predicted the distorted, recorded spectra at low count rates over various types and thicknesses of attenuators. The study also validated the hypothesis that the complex spectral distortions in a PCD can be adequately modeled by cascading the count-rate-independent SRE and the count-rate-dependent PPE.

  12. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  13. Extragalactic millimeter-wave point-source catalog, number counts and statistics from 771 deg² of the SPT-SZ survey

    Energy Technology Data Exchange (ETDEWEB)

    Mocanu, L. M.; Crawford, T. M.; Benson, B. A.; Bleem, L. E.; Carlstrom, J. E.; Chang, C. L.; Crites, A. T. [Kavli Institute for Cosmological Physics, University of Chicago, Chicago, IL 60637 (United States); Vieira, J. D. [California Institute of Technology, Pasadena, CA 91125 (United States); Aird, K. A. [University of Chicago, Chicago, IL 60637 (United States); Aravena, M. [European Southern Observatory, Alonso de Córdova 3107, Vitacura Santiago (Chile); Austermann, J. E.; Everett, W. B.; Halverson, N. W. [Department of Astrophysical and Planetary Sciences and Department of Physics, University of Colorado, Boulder, CO 80309 (United States); Béthermin, M. [Laboratoire AIM-Paris-Saclay, CEA/DSM/Irfu-CNRS-Université Paris Diderot, CEA-Saclay, Orme des Merisiers, F-91191 Gif-sur-Yvette (France); Bothwell, M. [Cavendish Laboratory, University of Cambridge, 19 J.J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Chapman, S. [Department of Physics and Atmospheric Science, Dalhousie University, Halifax NS B3H 3J5 (Canada); Cho, H.-M. [NIST Quantum Devices Group, Boulder, CO 80305 (United States); De Haan, T.; Dobbs, M. A. [Department of Physics, McGill University, Montreal, Quebec H3A 2T8 (Canada); George, E. M., E-mail: lmocanu@uchicago.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States); and others

    2013-12-10

    We present a point-source catalog from 771 deg² of the South Pole Telescope Sunyaev-Zel'dovich survey at 95, 150, and 220 GHz. We detect 1545 sources above 4.5σ significance in at least one band. Based on their relative brightness between survey bands, we classify the sources into two populations, one dominated by synchrotron emission from active galactic nuclei, and one dominated by thermal emission from dust-enshrouded star-forming galaxies. We find 1238 synchrotron and 307 dusty sources. We cross-match all sources against external catalogs and find 189 unidentified synchrotron sources and 189 unidentified dusty sources. The dusty sources without counterparts are good candidates for high-redshift, strongly lensed submillimeter galaxies. We derive number counts for each population from 1 Jy down to roughly 11, 4, and 11 mJy at 95, 150, and 220 GHz. We compare these counts with galaxy population models and find that none of the models we consider for either population provide a good fit to the measured counts in all three bands. The disparities imply that these measurements will be an important input to the next generation of millimeter-wave extragalactic source population models.

  14. Functional summary statistics for the Johnson-Mehl model

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    ... of functional summary statistics. This paper therefore invents four functional summary statistics adapted to the Johnson-Mehl model, with two of them based on the second-order properties and the other two on the nuclei-boundary distances for the associated Johnson-Mehl tessellation. The functional summary statistics' theoretical properties are investigated, non-parametric estimators are suggested, and their usefulness for model checking is examined in a simulation study. The functional summary statistics are also used for checking fitted parametric Johnson-Mehl models for a neurotransmitter dataset.

  15. A new model to predict weak-lensing peak counts II. Parameter constraint strategies

    CERN Document Server

    Lin, Chieh-An

    2015-01-01

    Peak counts have been shown to be an excellent tool to extract the non-Gaussian part of the weak-lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analyses. In this work, we explore and compare various strategies for constraining parameters using our model, focusing on the matter density $\Omega_\mathrm{m}$ and the density fluctuation amplitude $\sigma_8$. First, we examine the impact of the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique which makes a weaker assumption than the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. We find that neglecting the CDC ...
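
    The accept-reject sampling at the heart of ABC is easy to sketch. The forward model below is a deliberately crude stand-in for the authors' stochastic peak-count model; the bin structure, tolerance, and parameter ranges are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_peaks(omega_m, sigma_8):
    """Toy stand-in for a stochastic forward model: peak counts in 3 S/N bins."""
    lam = 100.0 * sigma_8 ** 2 * (omega_m / 0.3) * np.array([1.0, 0.5, 0.2])
    return rng.poisson(lam)

observed = simulate_peaks(0.30, 0.80)          # pretend these are the data

# ABC rejection: keep parameter draws whose simulated summary is close to the data
accepted = []
for _ in range(20000):
    om, s8 = rng.uniform(0.1, 0.5), rng.uniform(0.5, 1.1)
    if np.linalg.norm(simulate_peaks(om, s8) - observed) < 10.0:
        accepted.append((om, s8))
print(len(accepted), np.mean(accepted, axis=0))
```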

  16. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.

  17. Statistical modelling of fine red wine production

    OpenAIRE

    María Rosa Castro; Marcelo Eduardo Echegaray; Rosa Ana Rodríguez; Stella Maris Udaquiola

    2010-01-01

    Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production regarding the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such model will be used for predicting precise future values and confidence intervals for determined quantities of crushed grapes. Data from a vineyard in the province of San Juan was ...

  18. On the Logical Development of Statistical Models.

    Science.gov (United States)

    1983-12-01

    ... parameters t2. Type I models include scalar and vectorial probability distributions. Usually, the noise has an expected value equal to zero, so that ... qualitative variables. As might be expected, the vectorial representation of all these types of models lagged behind the scalar forms. The first ... (1978). "Modelos con parametros variables en el analisis de series temporales" [Models with variable parameters in time series analysis], Qüestiió, 4, 2, 75-87. [25] Seal, H. L. (1967). "The historical ...

  19. Book review: Statistical Analysis and Modelling of Spatial Point Patterns

    DEFF Research Database (Denmark)

    Møller, Jesper

    2009-01-01

    Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912......Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912...

  20. Examining secular trend  and seasonality in count data using dynamic generalized linear modelling

    DEFF Research Database (Denmark)

    Lundbye-Christensen, Søren; Dethlefsen, Claus; Gorst-Rasmussen, Anders

    ... series regression model for Poisson counts. It differs in allowing the regression coefficients to vary gradually over time in a random fashion. Data: In the period January 1980 to 1999, 17,989 incidents of acute myocardial infarction were recorded in the county of Northern Jutland, Denmark. Records were updated daily. Results: The model with a seasonal pattern and an approximately linear trend was fitted to the data, and diagnostic plots indicate a good model fit. The analysis with the dynamic model revealed peaks coinciding with influenza epidemics. On average, the peak-to-trough ratio is estimated ...

  1. A statistical model of facial attractiveness.

    Science.gov (United States)

    Said, Christopher P; Todorov, Alexander

    2011-09-01

    Previous research has identified facial averageness and sexual dimorphism as important factors in facial attractiveness. The averageness and sexual dimorphism accounts provide important first steps in understanding what makes faces attractive, and should be valued for their parsimony. However, we show that they explain relatively little of the variance in facial attractiveness, particularly for male faces. As an alternative to these accounts, we built a regression model that defines attractiveness as a function of a face's position in a multidimensional face space. The model provides much more predictive power than the averageness and sexual dimorphism accounts and reveals previously unreported components of attractiveness. The model shows that averageness is attractive in some dimensions but not in others and resolves previous contradictory reports about the effects of sexual dimorphism on the attractiveness of male faces.

  2. Fluctuations of offshore wind generation: Statistical modelling

    DEFF Research Database (Denmark)

    Pinson, Pierre; Christensen, Lasse E.A.; Madsen, Henrik

    2007-01-01

    The magnitude of power fluctuations at large offshore wind farms has a significant impact on the control and management strategies of their power output. If focusing on the minute scale, one observes successive periods with smaller and larger power fluctuations. It seems that different regimes ... production averaged at a 1, 5, and 10-minute rate. The exercise consists of one-step-ahead forecasting of these time series with the various regime-switching models. It is shown that the MSAR model, for which the succession of regimes is represented by a hidden Markov chain, significantly outperforms ...

  3. Statistical modelling of traffic safety development

    DEFF Research Database (Denmark)

    Christens, Peter

    2004-01-01

    - Statistisk modellering af trafik uheld [Statistical modelling of traffic accidents], Trafikdage på Aalborg Universitet, 2001. - Sociale karakteristika hos trafikofre [Social characteristics of traffic victims], Danish Transport Research Institute, 2001. - Models for traffic accidents, FERSI Young Researchers' Seminar, 2001. - Evaluation of the Danish Automatic Mobile Speed Camera Project ... In 2000, traffic accidents killed over 40,000 people in the EU and injured over 1.7 million. In Denmark in 2001 there were 6,861 police-reported traffic accidents with personal injury. These resulted in 4,519 slightly injured, 3,946 seriously injured, and 431 killed. The general aim of this research work is to improve ...

  4. Exponential order statistic models of software reliability growth

    Science.gov (United States)

    Miller, D. R.

    1986-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, non-identically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto logarithmic, and power-law models are all special cases of exponential order statistic models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
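
    A minimal simulation of this model class (failure times as the order statistics of independent, non-identically distributed exponentials) follows; the per-fault detection rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 50                                   # number of latent faults
rates = 0.05 * 0.9 ** np.arange(N)       # hypothetical per-fault detection rates

# Each fault is detected at an exponential time with its own rate;
# the observed failure times are the order statistics of these draws.
detection_times = rng.exponential(1.0 / rates)
failure_times = np.sort(detection_times)
print(failure_times[:10])
```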

  5. Statistical Modeling of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer sizes of the generated data sets make efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatiotemporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second statistical modeling technique (called univariate goodness-of-fit modeler) uses the Anderson-Darling goodness-of-fit method on systematic partitions of the data. Finally, AQSim's third statistical modeling technique (called multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.

  6. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  7. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way, computational models in neuroscience are not only explanatory frameworks, but become powerful ...

  8. Statistical Model of the 3-D Braided Composites Strength

    Institute of Scientific and Technical Information of China (English)

    XIAO Laiyuan; ZUO Weiwei; CAI Ganwei; LIAO Daoxun

    2007-01-01

    Based on the statistical model for the tensile strength of unidirectional composite materials and the stress analysis of 3-D braided composites, a new method is proposed to calculate the tensile statistical strength of 3-D braided composites. With this method, the strength of 3-D braided composites can be calculated with high accuracy, and the statistical parameters of 3-D braided composites can be determined. The numerical results show that the tensile statistical strength of 3-D braided composites can be predicted using this method.

  9. Eigenfunction statistics in the localized Anderson model

    CERN Document Server

    Killip, R

    2006-01-01

    We consider the localized region of the Anderson model and study the distribution of eigenfunctions simultaneously in space and energy. In a natural scaling limit, we prove convergence to a Poisson process. This provides a counterpoint to recent work, which proves repulsion of the localization centres in a subtly different regime.

  10. Statistical modelling of fine red wine production

    Directory of Open Access Journals (Sweden)

    María Rosa Castro

    2010-05-01

    Full Text Available Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production regarding the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such a model will be used for predicting precise future values and confidence intervals for determined quantities of crushed grapes. Data from a vineyard in the province of San Juan was thus used in this work. The sampling coefficient of correlation was calculated and a dispersion diagram was then constructed; this indicated a linear relationship between the litres of wine obtained and the kilograms of crushed grape. Two linear models were then adopted and variance analysis was carried out because the data came from normal populations having the same variance. The most appropriate model was obtained from this analysis; it was validated with experimental values, a good approximation being obtained.

  11. Structured Statistical Models of Inductive Reasoning

    Science.gov (United States)

    Kemp, Charles; Tenenbaum, Joshua B.

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…

  12. Probing NWP model deficiencies by statistical postprocessing

    DEFF Research Database (Denmark)

    Rosgaard, Martin Haubjerg; Nielsen, Henrik Aalborg; Nielsen, Torben S.

    2016-01-01

    ... numerical weather prediction (NWP) model generating global weather forecasts four times daily, with numerous users worldwide. The analysis is based on two years of hourly wind speed time series measured at three locations: offshore, in coastal and flat terrain, and inland in complex topography, respectively ...

  13. Network Data: Statistical Theory and New Models

    Science.gov (United States)

    2016-02-17

    Using AERONET DRAGON Campaign Data, IEEE Transactions on Geoscience and Remote Sensing, (08 2015). doi: 10.1109/TGRS.2015.2395722 ... are not viable, i.e. the fruit fly dies after the knock-out of the gene. Further examination of the ftz-stained embryos indicates that the lack of ... our approach for spatial gene expression analysis for early-stage fruit fly embryos; we are in the process of extending it to model later-stage gene ...

  14. Using observation-level random effects to model overdispersion in count data in ecology and evolution

    Directory of Open Access Journals (Sweden)

    Xavier A. Harrison

    2014-10-01

    Full Text Available Overdispersion is common in models of count data in ecology and evolutionary biology, and can occur due to missing covariates, non-independent (aggregated) data, or an excess frequency of zeroes (zero-inflation). Accounting for overdispersion in such models is vital, as failing to do so can lead to biased parameter estimates and false conclusions regarding hypotheses of interest. Observation-level random effects (OLRE), where each data point receives a unique level of a random effect that models the extra-Poisson variation present in the data, are commonly employed to cope with overdispersion in count data. However, studies investigating the efficacy of observation-level random effects as a means to deal with overdispersion are scarce. Here I use simulations to show that in cases where overdispersion is caused by random extra-Poisson noise, or aggregation in the count data, observation-level random effects yield more accurate parameter estimates compared to when overdispersion is simply ignored. Conversely, OLRE fail to reduce bias in zero-inflated data, and in some cases increase bias at high levels of overdispersion. There was a positive relationship between the magnitude of overdispersion and the degree of bias in parameter estimates. Critically, the simulations reveal that failing to account for overdispersion in mixed models can erroneously inflate measures of explained variance (r²), which may lead to researchers overestimating the predictive power of variables of interest. This work suggests that use of observation-level random effects provides a simple and robust means to account for overdispersion in count data, but also that their ability to minimise bias is not uniform across all types of overdispersion and must be applied judiciously.
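
    OLRE themselves are usually fitted with mixed-model software (e.g. lme4 in R). The Python sketch below only illustrates detecting the extra-Poisson variation that motivates them, via the Pearson chi-square per residual degree of freedom of a Poisson GLM; all data are simulated, with a multiplicative gamma random effect.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 400
x = rng.standard_normal(n)
X = sm.add_constant(x)

# Counts with extra-Poisson noise: multiplicative gamma effect with mean 1
mu = np.exp(1.0 + 0.5 * x) * rng.gamma(shape=2.0, scale=0.5, size=n)
y = rng.poisson(mu)

res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
# A ratio well above 1 signals overdispersion
print(res.pearson_chi2 / res.df_resid)
```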

  15. Behavioral and Statistical Models of Educational Inequality

    DEFF Research Database (Denmark)

    Holm, Anders; Breen, Richard

    2016-01-01

    This article addresses the question of how students and their families make educational decisions. We describe three types of behavioral model that might underlie decision-making, and we show that they have consequences for what decisions are made. Our study, thus, has policy implications if we...... wish to encourage students and their families to make better educational choices. We also establish the conditions under which empirical analysis can distinguish between the three sorts of decision-making, and we illustrate our arguments using data from the National Educational Longitudinal Study....

  17. A new model to predict weak-lensing peak counts III. Filtering technique comparisons

    CERN Document Server

    Lin, Chieh-An; Pires, Sandrine

    2016-01-01

    This is the third in a series of papers that develop a new and flexible model to predict weak-lensing (WL) peak counts, which have been shown to be a very valuable non-Gaussian probe of cosmology. In this paper, we compare the cosmological information extracted from WL peak counts using different filtering techniques of the galaxy shear data, including linear filtering with a Gaussian and two compensated filters (the starlet wavelet and the aperture mass), and the nonlinear filtering method MRLens. We present improvements to our model that account for realistic survey conditions, which are masks, shear-to-convergence transformations, and non-constant noise. We create simulated peak counts from our stochastic model, from which we obtain constraints on the matter density $\\Omega_\\mathrm{m}$, the power spectrum normalization $\\sigma_8$, and the dark-energy parameter $w_0^\\mathrm{de}$. We use two methods for parameter inference, a copula likelihood, and approximate Bayesian computation (ABC). We measure the conto...

  18. Detecting liquid threats with x-ray diffraction imaging (XDi) using a hybrid approach to navigate trade-offs between photon count statistics and spatial resolution

    Science.gov (United States)

    Skatter, Sondre; Fritsch, Sebastian; Schlomka, Jens-Peter

    2016-05-01

    The performance limits were explored for an X-ray-diffraction-based explosives detection system for baggage scanning. This XDi system offers 4D imaging that comprises three spatial dimensions, with voxel sizes on the order of ~(0.5 cm)³, and one spectral dimension for material discrimination. Because only a very small number of photons is observed for an individual voxel, material discrimination cannot work reliably at the voxel level. Therefore, an initial 3D reconstruction is performed, which allows the identification of objects of interest. Combining all the measured photons that scattered within an object, more reliable spectra are determined at the object level. As a case study we looked at two liquid materials, one threat and one innocuous, with very similar spectral characteristics, but with a 15% difference in electron density. Simulations showed that Poisson statistics alone reduce the material discrimination performance to undesirable levels when the photon counts drop to 250. When additional, uncontrolled variation sources are considered, the photon count plays a less dominant role in detection performance, but limits the performance also for photon counts of 500 and higher. Experimental data confirmed the presence of such non-Poisson variation sources in the XDi prototype system as well, which suggests that the present system can still be improved without necessarily increasing the photon flux, but by better controlling and accounting for these variation sources. When the classification algorithm was allowed to use spectral differences in the experimental data, the discrimination between the two materials improved significantly, proving the potential of X-ray diffraction also for liquid materials.
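
    The dominant role of photon-count statistics in two-material discrimination can be illustrated with a Poisson log-likelihood-ratio classifier. The spectral shape, bin layout, and the 15% contrast (applied here to total intensity) are toy assumptions, not XDi system parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
bins = 16
base = np.exp(-0.5 * ((np.arange(bins) - 7) / 3.0) ** 2)   # toy spectral shape

def error_rate(total_counts, trials=5000):
    lam_a = total_counts * base / base.sum()   # material A
    lam_b = 1.15 * lam_a                       # material B: 15% more intense
    errors = 0
    for _ in range(trials):
        truth_is_b = rng.random() < 0.5
        counts = rng.poisson(lam_b if truth_is_b else lam_a)
        # Poisson log-likelihood ratio between the two hypotheses
        llr = counts @ (np.log(lam_b) - np.log(lam_a)) - (lam_b - lam_a).sum()
        errors += (llr > 0) != truth_is_b
    return errors / trials

for n_photons in (250, 500, 1000):
    print(n_photons, error_rate(n_photons))
```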

  19. Statistical modelling in biostatistics and bioinformatics selected papers

    CERN Document Server

    Peng, Defen

    2014-01-01

    This book presents selected papers on statistical model development related mainly to the fields of Biostatistics and Bioinformatics. The coverage of the material falls squarely into the following categories: (a) Survival analysis and multivariate survival analysis, (b) Time series and longitudinal data analysis, (c) Statistical model development and (d) Applied statistical modelling. Innovations in statistical modelling are presented throughout each of the four areas, with some intriguing new ideas on hierarchical generalized non-linear models and on frailty models with structural dispersion, just to mention two examples. The contributors include distinguished international statisticians such as Philip Hougaard, John Hinde, Il Do Ha, Roger Payne and Alessandra Durio, among others, as well as promising newcomers. Some of the contributions have come from researchers working in the BIO-SI research programme on Biostatistics and Bioinformatics, centred on the Universities of Limerick and Galway in Ireland and fu...

  20. Isoscaling in Statistical Sequential Decay Model

    Institute of Scientific and Technical Information of China (English)

    TIAN Wen-Dong; SU Qian-Min; WANG Hong-Wei; WANG Kun; YAN Ting-ZHi; MA Yu-Gang; CAI Xiang-Zhou; FANG De-Qing; GUO Wei; MA Chun-Wang; LIU Gui-Hua; SHEN Wen-Qing; SHI Yu

    2007-01-01

    A sequential decay model is used to study isoscaling, i.e. the factorization of the isotope ratios from sources of different isospins and sizes, over a broad range of excitation energies, into fugacity terms of proton and neutron number, R21(N, Z) = Y2(N, Z)/Y1(N, Z) = C exp(αN + βZ). It is found that the isoscaling parameters α and β have a strong dependence on the isospin difference of the equilibrated source and the excitation energy; no significant influence of the source size on α and β has been observed. It is found that α and β decrease with the excitation energy and are linear functions of 1/T and Δ(Z/A)² or Δ(N/A)² of the sources. The symmetry energy coefficient Csym is constrained from the relationship of α to the source Δ(Z/A)², and of β to the source Δ(N/A)².

  1. Comparison of Primary Models to Predict Microbial Growth by the Plate Count and Absorbance Methods.

    Science.gov (United States)

    Pla, María-Leonor; Oltra, Sandra; Esteban, María-Dolores; Andreu, Santiago; Palop, Alfredo

    2015-01-01

    The selection of a primary model to describe microbial growth in predictive food microbiology often appears to be subjective. The objective of this research was to check the performance of different mathematical models in predicting growth parameters, both by absorbance and plate count methods. For this purpose, growth curves of three different microorganisms (Bacillus cereus, Listeria monocytogenes, and Escherichia coli) grown under the same conditions, but each with different initial concentrations, were analysed. When measuring the microbial growth of each microorganism by optical density, almost all models provided quite high goodness of fit (r² > 0.93) for all growth curves. The growth rate remained approximately constant for all growth curves of each microorganism when considering one growth model, but differences were found among models. The three-phase linear model provided the lowest variation in growth rate values for all three microorganisms. The Baranyi model gave marginally higher variation, despite a much better overall fit. When measuring microbial growth by plate count, similar results were obtained. These results provide insight into predictive microbiology and will help food microbiologists and researchers to choose the proper primary growth predictive model.
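
    As an example of fitting a primary growth model of the kind compared above, the sketch below fits the modified Gompertz model (Zwietering parameterisation) to hypothetical log-count data with scipy; the Baranyi and three-phase linear models would be fitted analogously.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """Modified Gompertz model: A = asymptote, mu = max growth rate, lam = lag."""
    e = np.exp(1.0)
    return A * np.exp(-np.exp(mu * e / A * (lam - t) + 1.0))

# Hypothetical log10(N/N0) plate-count data over 24 h
t = np.linspace(0, 24, 13)
y = gompertz(t, 6.0, 0.8, 4.0) + np.random.default_rng(8).normal(0, 0.1, t.size)

popt, _ = curve_fit(gompertz, t, y, p0=[6.0, 1.0, 3.0])
print("A = %.2f, mu = %.2f, lag = %.2f" % tuple(popt))
```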

  2. Process Model Construction and Optimization Using Statistical Experimental Design,

    Science.gov (United States)

    1988-04-01

    Memo No. 88-442, March 1988. "Process Model Construction and Optimization Using Statistical Experimental Design" by Emanuel Sachs, Assistant Professor, and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic ...

  3. Antibiotic Resistances in Livestock: A Comparative Approach to Identify an Appropriate Regression Model for Count Data

    Directory of Open Access Journals (Sweden)

    Anke Hüls

    2017-05-01

    Full Text Available Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcome. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process, comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model), and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential to evaluate which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence and associated factors for the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistances in livestock. After model comparison based on evaluation of model predictions, Akaike information criterion, and Pearson residuals, the hurdle model was here judged to be the most appropriate ...

  4. Daisy Models Semi-Poisson statistics and beyond

    CERN Document Server

    Hernández-Saldaña, H; Seligman, T H

    1999-01-01

    Semi-Poisson statistics are shown to be obtained by removing every other number from a random sequence. Retaining every (r+1)-th level, we obtain a family of sequences which we call daisy models. Their statistical properties coincide with those of Bogomolny's nearest-neighbour interaction Coulomb gas if the inverse temperature coincides with the integer r. In particular, the case r=2 closely reproduces the statistics of quasi-optimal solutions of the traveling salesman problem.
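
    The construction is straightforward to reproduce numerically: thin a Poissonian (uncorrelated) level sequence by keeping every (r+1)-th level and look at the unfolded spacings. For r = 1 the spacing density should approach the semi-Poisson form P(s) = 4 s exp(-2s); the sequence length below is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(9)
levels = np.sort(rng.uniform(0.0, 1.0, 200000))   # Poissonian level sequence

r = 1
retained = levels[::r + 1]          # keep every (r+1)-th level
s = np.diff(retained)
s /= s.mean()                       # unfold to unit mean spacing

# Compare the spacing histogram with the semi-Poisson prediction 4 s exp(-2 s)
hist, edges = np.histogram(s, bins=50, range=(0, 4), density=True)
centers = 0.5 * (edges[1:] + edges[:-1])
print("max deviation:", np.max(np.abs(hist - 4 * centers * np.exp(-2 * centers))))
```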

  5. Development of statistical models for data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Downham, D.Y.

    2000-07-01

    Incidents that cause, or could cause, injury to personnel, and that satisfy specific criteria, are reported to the Offshore Safety Division (OSD) of the Health and Safety Executive (HSE). The underlying purpose of this report is to improve ways of quantifying risk, a recommendation in Lord Cullen's report into the Piper Alpha disaster. Records of injuries and hydrocarbon releases from 1 January 1991 to 31 March 1996 are analysed, because the reporting of incidents was standardised after 1990. Models are identified for risk assessment and some are applied. The appropriate analyses of one or two factors (or variables) are tests of uniformity or of independence. Radar graphs are used to represent some temporal variables. Cusums are applied for the analysis of incident frequencies over time, and could be applied for regular monitoring. Log-linear models for Poisson-distributed data are identified as being suitable for identifying 'non-random' combinations of more than two factors. Some questions cannot be addressed with the available data: for example, more data are needed to assess the risk of injury per employee in a time interval. If the questions are considered sufficiently important, resources could be assigned to obtain the data. Some of the main results from the analyses are as follows: the cusum analyses identified a change-point at the end of July 1993, when the reported number of injuries reduced by 40%. Injuries were more likely to occur between 8am and 12am or between 2pm and 5pm than at other times: between 2pm and 3pm the number of injuries was almost twice the average and more than threefold the smallest. No seasonal effects in the numbers of injuries were identified. Three-day injuries occurred more frequently on the 5th, 6th and 7th days into a tour of duty than on other days. Three-day injuries occurred less frequently on the 13th and 14th days of a tour of duty. An injury classified as 'lifting or craning' was ...
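
    The cusum monitoring suggested here takes only a few lines; the monthly counts below are simulated with a hypothetical change-point so the kink in the cusum is visible.

```python
import numpy as np

rng = np.random.default_rng(10)
# Hypothetical monthly injury counts with a rate drop after month 30
counts = np.concatenate([rng.poisson(10, 30), rng.poisson(6, 30)])

# Cusum of deviations from the overall mean; a change in slope marks
# a change-point in the incident rate
cusum = np.cumsum(counts - counts.mean())
print("estimated change-point near month", int(np.argmax(cusum)))
```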

  6. Integrating count and detection–nondetection data to model population dynamics

    Science.gov (United States)

    Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan

    2017-01-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.
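
    The core device, one abundance model feeding two observation models, can be sketched as a joint likelihood that marginalises site abundance. This is a simplified, static version of the dynamic model described above, with hypothetical parameter values and data.

```python
import numpy as np
from scipy.stats import poisson, binom

def joint_loglik(lam, p, counts, detections, N_max=100):
    """Counts y ~ Binomial(N, p) and detections z ~ Bernoulli(1-(1-p)^N),
    sharing site abundance N ~ Poisson(lam); N is marginalised over 0..N_max."""
    Ns = np.arange(N_max + 1)
    prior = poisson.pmf(Ns, lam)

    ll = 0.0
    for y in counts:                 # count surveys
        ll += np.log(np.sum(binom.pmf(y, Ns, p) * prior))
    for z in detections:             # detection-nondetection surveys
        p_det = 1.0 - (1.0 - p) ** Ns
        ll += np.log(np.sum((p_det if z else 1.0 - p_det) * prior))
    return ll

print(joint_loglik(3.0, 0.4, counts=[1, 2, 0, 3], detections=[1, 1, 0, 1]))
```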

  7. Mixed deterministic statistical modelling of regional ozone air pollution

    KAUST Repository

    Kalenderski, Stoitchko Dimitrov

    2011-03-17

    We develop a physically motivated statistical model for regional ozone air pollution by separating the ground-level pollutant concentration field into three components, namely: transport, local production and large-scale mean trend mostly dominated by emission rates. The model is novel in the field of environmental spatial statistics in that it is a combined deterministic-statistical model, which gives a new perspective to the modelling of air pollution. The model is presented in a Bayesian hierarchical formalism, and explicitly accounts for advection of pollutants, using the advection equation. We apply the model to a specific case of regional ozone pollution-the Lower Fraser valley of British Columbia, Canada. As a predictive tool, we demonstrate that the model vastly outperforms existing, simpler modelling approaches. Our study highlights the importance of simultaneously considering different aspects of an air pollution problem as well as taking into account the physical bases that govern the processes of interest. © 2011 John Wiley & Sons, Ltd..

  8. The Abelian Embedding Formulation of the Stueckelberg Model and its Power-counting Renormalizable Extension

    CERN Document Server

    Quadri, A

    2006-01-01

    We elucidate the geometry of the polynomial formulation of the non-abelian Stueckelberg mechanism. We show that a natural off-shell nilpotent BRST differential exists, allowing one to implement the constraint on the sigma field by means of BRST techniques. This is achieved by extending the ghost sector by an additional U(1) factor (abelian embedding). An important consequence is that a further BRST-invariant but not gauge-invariant mass term can be written for the non-abelian gauge fields. Like all versions of the Stueckelberg theory, the abelian embedding formulation yields a non-power-counting renormalizable theory in D=4. We then derive its natural power-counting renormalizable extension and show that the physical spectrum contains a physical massive scalar particle. Physical unitarity is also established. This model implements the spontaneous symmetry breaking in the abelian embedding formalism.

  9. Solar models of low neutrino-counting rate - The depleted Maxwellian tail

    Science.gov (United States)

    Clayton, D. D.; Dwek, E.; Newman, M. J.; Talbot, R. J., Jr.

    1975-01-01

    Evolutionary sequences for the sun are presented which confirm that the Cl-37 neutrino counting rate will be greatly reduced if the high-energy tail of the Maxwellian distribution of relative energies is progressively depleted. Thermonuclear reaction rates and pressure are reevaluated for a distribution function modified by the correction factor suggested by Clayton (1974), and the effect of the results on solar models calculated with a simple Henyey code is discussed. It is shown that if the depletion is characterized by a certain exponential dependence on the distribution function, the counting rate will fall below 1 SNU for a distribution function of not less than 0.01. Suggestions are made for measuring the distribution function in the sun by means of neutrino spectroscopy and photography.

  10. The Importance of Statistical Modeling in Data Analysis and Inference

    Science.gov (United States)

    Rollins, Derrick, Sr.

    2017-01-01

    Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…

  11. An ENSO-Forecast Independent Statistical Model for the Prediction of Annual Atlantic Tropical Cyclone Frequency in April

    Directory of Open Access Journals (Sweden)

    Kenny Xie

    2014-01-01

    Full Text Available Statistical models for preseason prediction of annual Atlantic tropical cyclone (TC) and hurricane counts generally include El Niño/Southern Oscillation (ENSO) forecasts as a predictor. As a result, the predictions from such models are often contaminated by the errors in ENSO forecasts. In this study, it is found that the latent heat flux (LHF) over the Eastern Tropical Pacific (ETP, defined as the region 0°–5°N, 115°–125°W) in spring is negatively correlated with the annual Atlantic TC and hurricane counts. By using stepwise backward elimination regression, it is further shown that the March value of ETP LHF is a better predictor than the spring or summer ENSO index for Atlantic TC counts. Leave-one-out cross validation indicates that the annual Atlantic TC counts predicted by this ENSO-independent statistical model show a remarkable correlation with the actual TC counts (R=0.72; P value <0.01). For Atlantic hurricanes, the predictions using March ETP LHF and summer (July–September) ENSO indices show only minor differences except in moderate to strong El Niño years. Thus, March ETP LHF is an excellent predictor for seasonal Atlantic TC prediction and a viable alternative to using the ENSO index for Atlantic hurricane prediction.
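
    Leave-one-out cross validation of a single-predictor model of this kind can be sketched as follows; the predictor and count values are purely illustrative, not the ETP LHF data.

```python
import numpy as np

# Hypothetical predictor (e.g. a March flux anomaly) and annual counts;
# values are illustrative only.
x = np.array([-1.2, -0.5, 0.3, 1.1, 0.8, -0.9, 0.1, 1.5, -0.4, 0.6])
y = np.array([18, 15, 12, 9, 10, 16, 13, 8, 14, 11])

# Leave-one-out cross validation of a one-predictor linear model.
preds = np.empty_like(y, dtype=float)
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    slope, intercept = np.polyfit(x[mask], y[mask], 1)  # fit without year i
    preds[i] = slope * x[i] + intercept                 # predict held-out year

r = np.corrcoef(preds, y)[0, 1]
print(f"LOOCV correlation between predicted and observed counts: r = {r:.2f}")
```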

  12. Gravitational wave source counts at high redshift and in models with extra dimensions

    Science.gov (United States)

    García-Bellido, Juan; Nesseris, Savvas; Trashorras, Manuel

    2016-07-01

    Gravitational wave (GW) source counts have been recently shown to be able to test how gravitational radiation propagates with the distance from the source. Here, we extend this formalism to cosmological scales, i.e. the high redshift regime, and we discuss the complications of applying this methodology to high redshift sources. We also allow for models with compactified extra dimensions like in the Kaluza-Klein model. Furthermore, we also consider the case of intermediate redshifts, i.e. 0 < z ≲ 1, where we show it is possible to find an analytical approximation for the source counts dN/d(S/N). This can be done in terms of cosmological parameters, such as the matter density Ωm,0 of the cosmological constant model or the cosmographic parameters for a general dark energy model. Our analysis is as general as possible, but it depends on two important factors: a source model for the black hole binary mergers and the GW source to galaxy bias. This methodology also allows us to obtain the higher order corrections of the source counts in terms of the signal-to-noise S/N. We then forecast the sensitivity of future observations in constraining GW physics but also the underlying cosmology by simulating sources distributed over a finite range of signal-to-noise, with the number of sources ranging from 10 to 500 as expected from future detectors. We find that with 500 events it will be possible to provide constraints on the matter density parameter at present Ωm,0 on the order of a few percent and with the precision growing fast with the number of events. In the case of extra dimensions we find that depending on the degeneracies of the model, with 500 events it may be possible to provide stringent limits on the existence of the extra dimensions if the aforementioned degeneracies can be broken.

  13. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects that are more stable over time than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  14. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. The statistical multipath parameters such as path arrival time, magnitude and interval for each category are analyzed to build the model. Each generated channel based on the proposed statistical model represents a different realisation of a PLC network. Simulation results in the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, presents the same PLC channel features as the underlying transmission line theory model. Furthermore, two potential application scenarios are d...

  15. Isospin dependence of nuclear multifragmentation in statistical model

    Institute of Scientific and Technical Information of China (English)

    张蕾; 谢东珠; 张艳萍; 高远

    2011-01-01

    The evolution of nuclear disintegration mechanisms with increasing excitation energy, from compound nucleus to multifragmentation, has been studied by using the Statistical Multifragmentation Model (SMM) within a micro-canonical ensemble. We discuss the o

  16. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  17. Statistical modeling of a considering work-piece

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2008-10-01

    Full Text Available This article presents stochastic predictive models for properly controlling the independent variables of the drilling operation, using a combined approach of statistical design and Response Surface Methodology (RSM).

  18. A nonextensive statistical model for the nucleon structure function

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos

    2013-03-01

    We studied an application of nonextensive thermodynamics to describe the structure function of the nucleon, in a model where the usual Fermi-Dirac and Bose-Einstein energy distributions were replaced by the equivalent functions of the q-statistics. The parameters of the model are given by an effective temperature T, the q parameter (from Tsallis statistics), and two chemical potentials given by the corresponding up (u) and down (d) quark normalization in the nucleon.
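
    A sketch of one common convention for the q-generalized Fermi-Dirac factor follows; the exact functional form used by the authors may differ.

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.asarray(1.0 + (1.0 - q) * x, dtype=float)
    out = np.zeros_like(base)
    pos = base > 0                      # Tsallis cut-off: e_q(x) = 0 elsewhere
    out[pos] = base[pos] ** (1.0 / (1.0 - q))
    return out

def fermi_dirac_q(E, mu, T, q):
    """q-generalized Fermi-Dirac occupation (one common convention;
    not necessarily the exact form in the record above)."""
    return 1.0 / (q_exp((E - mu) / T, q) + 1.0)

E = np.linspace(0.0, 2.0, 5)
print(fermi_dirac_q(E, mu=1.0, T=0.1, q=1.05))
```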

  19. Model of risk assessment under ballistic statistical tests

    Science.gov (United States)

    Gabrovski, Ivan; Karakaneva, Juliana

    The material presents the application of a mathematical method for risk assessment under statistical determination of the ballistic limits of the protection equipment. The authors have implemented a mathematical model based on Pearson's criteria. The software implementation of the model allows evaluation of the V50 indicator and assessment of the reliability of the statistical hypothesis. The results supply the specialists with information about the interval estimates of the probability determined during the testing process.

  20. Influence of Point Count Length and Repeated Visits on Habitat Model Performance

    Science.gov (United States)

    Randy Dettmers; David A. Buehler; John G. Bartlett; Nathan A. Klaus

    1999-01-01

    Point counts are commonly used to monitor bird populations, and a substantial amount of research has investigated how conducting counts for different lengths of time affects the accuracy of these counts and the subsequent ability to monitor changes in population trends. However, little work has been done to assess how changes in count duration affect bird-habitat...

  1. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    Science.gov (United States)

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik

  2. Use of Poisson spatiotemporal regression models for the Brazilian Amazon Forest: malaria count data.

    Science.gov (United States)

    Achcar, Jorge Alberto; Martinez, Edson Zangiacomi; Souza, Aparecida Doniseti Pires de; Tachibana, Vilma Mayumi; Flores, Edilson Ferreira

    2011-01-01

    Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. We used Poisson spatiotemporal regression models to analyze the Brazilian Amazon forest malaria count for the period from 1999 to 2008. In this study, we included some covariates that could be important in the yearly prediction of malaria, such as deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples for the joint posterior distribution of interest. The discrimination of different models was also discussed. The model proposed here suggests that deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.

  3. Use of Poisson spatiotemporal regression models for the Brazilian Amazon Forest: malaria count data

    Directory of Open Access Journals (Sweden)

    Jorge Alberto Achcar

    2011-12-01

    Full Text Available INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze the Brazilian Amazon forest malaria count for the period from 1999 to 2008. In this study, we included some covariates that could be important in the yearly prediction of malaria, such as deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples for the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.
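
    A minimal non-spatial analogue of the Poisson regression used in the two records above, fitted by classical maximum likelihood on simulated data; the paper's model additionally includes spatial and temporal random effects estimated by MCMC.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
deforestation = rng.uniform(0, 1, n)   # stand-ins for the covariates
density = rng.uniform(0, 1, n)         # discussed above (simulated data,
hdi = rng.uniform(0, 1, n)             # not the Brazilian Amazon registers)

X = sm.add_constant(np.column_stack([deforestation, density, hdi]))
true_beta = np.array([1.0, 0.8, 0.5, -0.6])
y = rng.poisson(np.exp(X @ true_beta))  # counts with a log link

# Classical Poisson GLM; coefficients should recover true_beta roughly.
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(fit.params)
```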

  4. The impact of missing data in a generalized integer-valued autoregression model for count data.

    Science.gov (United States)

    Alosh, Mohamed

    2009-11-01

    The impact of the missing data mechanism on estimates of model parameters for continuous data has been extensively investigated in the literature. In comparison, minimal research has been carried out for the impact of missing count data. The focus of this article is to investigate the impact of missing data on a transition model, termed the generalized autoregressive model of order 1 for longitudinal count data. The model has several features, including modeling dependence and accounting for overdispersion in the data, that make it appealing for the clinical trial setting. Furthermore, the model can be viewed as a natural extension of the commonly used log-linear model. Following introduction of the model and discussion of its estimation we investigate the impact of different missing data mechanisms on estimates of the model parameters through a simulation experiment. The findings of the simulation experiment show that, as in the case of normally distributed data, estimates under the missing completely at random (MCAR) and missing at random (MAR) mechanisms are close to their analogue for the full dataset and that the missing not at random (MNAR) mechanism has the greatest bias. Furthermore, estimates based on imputing the last observed value carried forward (LOCF) for missing data under the MAR assumption are similar to those of the MAR. This latter finding might be attributed to the Markov property underlying the model and to the high level of dependence among successive observations used in the simulation experiment. Finally, we consider an application of the generalized autoregressive model to a longitudinal epilepsy dataset analyzed in the literature.
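
    Transition models for counts are commonly built on binomial thinning; the following is a minimal INAR(1) simulation with a missing-completely-at-random mechanism, as an illustration rather than the author's exact generalized model.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_inar1(n, alpha, lam):
    """INAR(1): X_t = alpha o X_{t-1} + eps_t, where 'o' is binomial
    thinning and eps_t ~ Poisson(lam)."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))   # start near stationarity
    for t in range(1, n):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x

x = simulate_inar1(500, alpha=0.5, lam=2.0)
# MCAR: delete each observation independently with probability 0.2.
observed = np.where(rng.uniform(size=x.size) < 0.2, np.nan, x.astype(float))
print(f"mean of full series: {x.mean():.2f}, "
      f"mean of observed values: {np.nanmean(observed):.2f}")
```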

  5. Galaxy Modelling - II. Multi-Wavelength Faint Counts from a Semi-Analytic Model of Galaxy Formation

    CERN Document Server

    Devriendt, J E G

    2000-01-01

    (Abridged) This paper predicts self-consistent faint galaxy counts from the UV to the submm wavelength range. The STARDUST spectral energy distributions described in Devriendt et al. (1999) are embedded within the explicit cosmological framework of a simple semi-analytic model of galaxy formation and evolution. We build a class of models which capture the luminosity budget of the universe through faint galaxy counts and redshift distributions in the whole wavelength range spanned by our spectra. In contrast with a rather stable behaviour in the optical and even in the far-IR, the submm counts are dramatically sensitive to variations in the cosmological parameters and changes in the star formation history. Faint submm counts are more easily accommodated within an open universe with a low value of $\\Omega_0$, or a flat universe with a non-zero cosmological constant. This study illustrates the implementation of multi-wavelength spectra into a semi-analytic model. In spite of its simplicity, it already provides f...

  6. A whole blood model of thrombocytopenia that controls platelet count and hematocrit.

    Science.gov (United States)

    Bercovitz, R S; Brenner, M K; Newman, D K

    2016-10-01

    In patients with thrombocytopenia, it can be difficult to predict a patient's bleeding risk based on platelet count alone. Platelet reactivity may provide additional information; however, current clinical assays cannot reliably assess platelet function in the setting of thrombocytopenia. New methods to study platelet reactivity in thrombocytopenic samples are needed. In this study, we sought to develop a laboratory model of thrombocytopenia using blood from healthy subjects that preserves the whole blood environment and reproducibly produces samples with a specific platelet count and hematocrit. We compared the activation state of unstimulated and agonist-stimulated platelets in thrombocytopenic samples derived from this method with normocytic controls. Whole blood was diluted with autologous red blood cell concentrate and platelet-poor plasma, which were obtained via centrifugation, in specific ratios to attain a final sample with a predetermined platelet count and hematocrit. P-selectin exposure and GPIIbIIIa activation in unstimulated platelets and platelets stimulated with collagen-related peptide (CRP) or adenosine diphosphate (ADP) in thrombocytopenic samples and the normocytic control from which they were derived were quantified by flow cytometry. Our methodology reliably produced thrombocytopenic samples with a platelet count ≤50,000/μL and an accurately and precisely controlled hematocrit. P-selectin exposure and GPIIbIIIa activation on unstimulated platelets or on ADP- or CRP-stimulated platelets did not differ in thrombocytopenic samples compared to normocytic controls. We describe a new method for creating thrombocytopenic blood that can be used to better understand the contributions of platelet number and function to hemostasis.
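
    The dilution step reduces to solving linear mass-balance equations for the component volume fractions; a sketch with hypothetical component values (real platelet counts and hematocrits vary by donor and preparation).

```python
import numpy as np

# Hypothetical component properties: whole blood, RBC concentrate,
# platelet-poor plasma.
plt_wb, plt_rbc, plt_ppp = 250e3, 0.0, 5e3   # platelets/uL
hct_wb, hct_rbc, hct_ppp = 0.42, 0.80, 0.0   # hematocrit fraction

target_plt, target_hct = 50e3, 0.40

# Volume fractions (f_wb, f_rbc, f_ppp) satisfying the mass balances:
# platelet count, hematocrit, and fractions summing to one.
A = np.array([[plt_wb, plt_rbc, plt_ppp],
              [hct_wb, hct_rbc, hct_ppp],
              [1.0,    1.0,    1.0]])
b = np.array([target_plt, target_hct, 1.0])
f = np.linalg.solve(A, b)
print(f"whole blood {f[0]:.2f}, RBC concentrate {f[1]:.2f}, PPP {f[2]:.2f}")
```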

  7. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  8. Variable selection for distribution-free models for longitudinal zero-inflated count responses.

    Science.gov (United States)

    Chen, Tian; Wu, Pan; Tang, Wan; Zhang, Hui; Feng, Changyong; Kowalski, Jeanne; Tu, Xin M

    2016-07-20

    Zero-inflated count outcomes arise quite often in research and practice. Parametric models such as the zero-inflated Poisson and zero-inflated negative binomial are widely used to model such responses. Like most parametric models, they are quite sensitive to departures from assumed distributions. Recently, new approaches have been proposed to provide distribution-free, or semi-parametric, alternatives. These methods extend the generalized estimating equations to provide robust inference for population mixtures defined by zero-inflated count outcomes. In this paper, we propose methods to extend smoothly clipped absolute deviation (SCAD)-based variable selection methods to these new models. Variable selection has been gaining popularity in modern clinical research studies, as determining differential treatment effects of interventions for different subgroups has become the norm, rather than the exception, in the era of patient-centered outcome research. Such moderation analysis in general creates many explanatory variables in regression analysis, and the advantages of SCAD-based methods over their traditional counterparts render them a great choice for addressing these important and timely issues in clinical research. We illustrate the proposed approach with both simulated and real study data. Copyright © 2016 John Wiley & Sons, Ltd.

  9. The LZIP: A Bayesian latent factor model for correlated zero-inflated counts.

    Science.gov (United States)

    Neelon, Brian; Chung, Dongjun

    2017-03-01

    Motivated by a study of molecular differences among breast cancer patients, we develop a Bayesian latent factor zero-inflated Poisson (LZIP) model for the analysis of correlated zero-inflated counts. The responses are modeled as independent zero-inflated Poisson distributions conditional on a set of subject-specific latent factors. For each outcome, we express the LZIP model as a function of two discrete random variables: the first captures the propensity to be in an underlying "at-risk" state, while the second represents the count response conditional on being at risk. The latent factors and loadings are assigned conditionally conjugate gamma priors that accommodate overdispersion and dependence among the outcomes. For posterior computation, we propose an efficient data-augmentation algorithm that relies primarily on easily sampled Gibbs steps. We conduct simulation studies to investigate both the inferential properties of the model and the computational capabilities of the proposed sampling algorithm. We apply the method to an analysis of breast cancer genomics data from The Cancer Genome Atlas. © 2016, The International Biometric Society.
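
    For reference, the zero-inflated Poisson observation model underlying this and the preceding records mixes a structural-zero ("not at risk") state with a Poisson count; a minimal sketch:

```python
import numpy as np
from scipy.stats import poisson

def zip_pmf(k, pi, lam):
    """Zero-inflated Poisson: with probability pi the subject is in the
    structural-zero state, otherwise the count is Poisson(lam)."""
    k = np.asarray(k)
    pmf = (1 - pi) * poisson.pmf(k, lam)
    return np.where(k == 0, pi + pmf, pmf)

ks = np.arange(6)
print(zip_pmf(ks, pi=0.4, lam=2.5))  # note the inflated mass at zero
```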

  10. Electron impact ionization of tungsten ions in a statistical model

    Science.gov (United States)

    Demura, A. V.; Kadomtsev, M. B.; Lisitsa, V. S.; Shurygin, V. A.

    2015-01-01

    The statistical model for calculations of the electron impact ionization cross sections of multielectron ions is developed for the first time. The model is based on the idea of collective excitations of atomic electrons with the local plasma frequency, while the Thomas-Fermi model is used for atomic electrons density distribution. The electron impact ionization cross sections and related ionization rates of tungsten ions from W+ up to W63+ are calculated and then compared with the vast collection of modern experimental and modeling results. The reasonable correspondence between experimental and theoretical data demonstrates the universal nature of statistical approach to the description of atomic processes in multielectron systems.

  11. An Order Statistics Approach to the Halo Model for Galaxies

    Science.gov (United States)

    Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.

    2017-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the `central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-predicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the Halo Model for galaxies with more physically motivated galaxy formation models.
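
    The basic order-statistics step, drawing group luminosities from a common p(L) and labelling the brightest as the central, can be sketched as follows; the gamma distribution here is only a convenient stand-in for the universal luminosity function, not the form used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_group_luminosities(n_gal, l_star=1.0):
    """Draw n_gal luminosities from an assumed p(L) (a gamma stand-in);
    under the order-statistics hypothesis the brightest is the central."""
    lum = rng.gamma(shape=0.8, scale=l_star, size=n_gal)
    lum.sort()
    return lum[-1], lum[:-1]          # (central, satellites)

# Group richness varies with halo mass; here it is simply Poisson + 1.
centrals = [sample_group_luminosities(n)[0] for n in rng.poisson(10, 1000) + 1]
print(f"mean central luminosity: {np.mean(centrals):.2f} L*")
```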

  12. Equilibrium Statistical-Thermal Models in High-Energy Physics

    CERN Document Server

    Tawfik, Abdel Nasser

    2014-01-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path of constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe formulated in 1948 an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process that likely reflects many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analysed the high-energy phenomena using all tools of statistical physics and introduced the concept of limiting temperature based on the statistical bootstrap model. It turns out quite often that many-par...

  13. Gravitational wave source counts at high redshift and in models with extra dimensions

    CERN Document Server

    García-Bellido, Juan; Trashorras, Manuel

    2016-01-01

    Gravitational wave (GW) source counts have been recently shown to be able to test how gravitational radiation propagates with the distance from the source. Here, we extend this formalism to cosmological scales, i.e. the high redshift regime, and we also allow for models with large or compactified extra dimensions like in the Kaluza-Klein (KK) model. We found that in the high redshift regime one would potentially expect two windows where observations above the minimum signal-to-noise threshold can be made, assuming there are no higher order corrections in the redshift dependence of the signal-to-noise $S/N(z)$ for the expected prediction. Furthermore, we also considered the case of intermediate redshifts, i.e. $0 < z \\lesssim 1$, where we show it is possible to find an analytical approximation for the source counts $\\frac{dN}{d(S/N)}$ in terms of the cosmological parameters, like the matter density $\\Omega_{m,0}$ in the cosmological constant model and also the cosmographic parameters $(q_0,j_0,s_0)$ for a general ...

  14. A Stellar Population Synthesis Model for the Study of Ultraviolet Star Counts of the Galaxy

    CERN Document Server

    Pradhan, Ananta C; Robin, A C; Ghosh, S K; Vickers, John J

    2014-01-01

    GALEX, the first all sky imaging UV satellite, has imaged a large part of the sky providing an excellent opportunity for studying UV star counts. The aim of our study is to investigate in detail the observed UV star counts obtained by GALEX vis-a-vis the model simulated catalogs produced by the Besancon model of stellar population synthesis in various Galactic directions, and to explore the potential for studying the structure of our Galaxy from images in multiple NUV and FUV filters of the forthcoming Ultraviolet Imaging Telescope (UVIT) to be flown onboard ASTROSAT. We have upgraded the Besancon model of stellar population synthesis to include the UV bands of GALEX and UVIT. Depending on the availability of contiguous GALEX, SDSS, WISE and 2MASS overlapping regions, we have chosen a set of 19 GALEX fields which spread over a range of Galactic directions. We cross-matched GALEX sources with the WISE+2MASS and SDSS catalogs and UV stars in the GALEX catalog are identified by choosing a suitable IR colour, J -...

  15. How Managed Care Affects Medicaid Utilization : A Synthetic Difference-in-Difference Zero-Inflated Count Model

    NARCIS (Netherlands)

    Freund, D.A.; Kniesner, T.J.; LoSasso, A.T.

    1996-01-01

    We develop a synthetic difference-in-differences statistical design to apply to experimental data for adult women living in Hennepin County, Minnesota, to estimate the impact of Medicaid managed care on various modes of medical care use. Because the outcomes of interest are utilization counts with ma

  16. Microbiological quality and somatic cell count in bulk milk of dromedary camels (Camelus dromedarius): descriptive statistics, correlations, and factors of variation.

    Science.gov (United States)

    Nagy, P; Faye, B; Marko, O; Thomas, S; Wernery, U; Juhasz, J

    2013-09-01

    The objectives of the present study were to monitor the microbiological quality and somatic cell count (SCC) of bulk tank milk at the world's first large-scale camel dairy farm for a 2-yr period, to compare the results of 2 methods for the enumeration of SCC, to evaluate correlation among milk quality indicators, and to determine the effect of specific factors (year, season, stage of lactation, and level of production) on milk quality indicators. The study was conducted from January 2008 to January 2010. Total viable count (TVC), coliform count (CC), California Mastitis Test (CMT) score, and SCC were determined from daily bulk milk samples. Somatic cell count was measured by using a direct microscopic method and with an automatic cell counter. In addition, production parameters [total daily milk production (TDM, kg), number of milking camels (NMC), average milk per camel (AMC, kg)] and stage of lactation (average postpartum days, PPD) were recorded for each test day. A strong correlation (r=0.33) was found between the 2 methods for SCC enumeration; however, values derived using the microscopic method were higher. The geometric means of SCC and TVC were 394×10(3) cells/mL and 5,157 cfu/mL during the observation period, respectively. Somatic cell count was >500×10(3) cells/mL on 14.6% (106/725) and TVC was >10×10(3) cfu/mL on 4.0% (30/742) of the test days. Both milk quality indicators had a distinct seasonal pattern. For log SCC, the mean was lowest in summer and highest in autumn. The seasonal pattern of log TVC was slightly different, with the lowest values being recorded during the spring. The monthly mean TVC pattern showed a clear difference between years. Coliform count was <10 cfu/mL in most of the samples (709/742, 95.6%). A positive correlation was found between log SCC and log TVC (r=0.32), between log SCC and CMT score (r=0.26), and between log TVC and CC in yr 1 (r=0.30). All production parameters and stage of lactation showed strong seasonal

  17. Statistical Model and the mesonic-baryonic transition region

    CERN Document Server

    Oeschler, H.; Redlich, K.; Wheaton, S.

    2009-01-01

    The statistical model assuming chemical equilibrium and local strangeness conservation describes most of the observed features of strange particle production from SIS up to RHIC. Deviations are found, as the maximum in the measured K+/pi+ ratio is much sharper than in the model calculations. At the incident energy of the maximum, the statistical model shows that freeze-out changes regime from one dominated by baryons at the lower energies toward one dominated by mesons. It will be shown how deviations from the usual freeze-out curve influence the various particle ratios. Furthermore, other observables also exhibit changes just in this energy regime.

  18. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  19. A statistical model for the excitation of cavities through apertures

    CERN Document Server

    Gradoni, Gabriele; Anlage, Steven M; Ott, Edward

    2015-01-01

    In this paper, a statistical model for the coupling of electromagnetic radiation into enclosures through apertures is presented. The model gives a unified picture bridging deterministic theories of aperture radiation, and statistical models necessary for capturing the properties of irregular shaped enclosures. A Monte Carlo technique based on random matrix theory is used to predict and study the power transmitted through the aperture into the enclosure. Universal behavior of the net power entering the aperture is found. Results are of interest for predicting the coupling of external radiation through openings in irregular enclosures and reverberation chambers.

  20. Multiple commodities in statistical microeconomics: Model and market

    Science.gov (United States)

    Baaquie, Belal E.; Yu, Miao; Du, Xin

    2016-11-01

    A statistical generalization of microeconomics has been made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provides strong support for the statistical microeconomic description of commodity prices. The case of multiple commodities is studied and a parsimonious generalization of the single commodity model is made for the multiple commodities case. Market data shows that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics, and is independent of the mainstream formulation of microeconomics.

  1. Probabilistic Quantitative Precipitation Forecasting Using Ensemble Model Output Statistics

    CERN Document Server

    Scheuerer, Michael

    2013-01-01

    Statistical post-processing of dynamical forecast ensembles is an essential component of weather forecasting. In this article, we present a post-processing method that generates full predictive probability distributions for precipitation accumulations based on ensemble model output statistics (EMOS). We model precipitation amounts by a generalized extreme value distribution that is left-censored at zero. This distribution permits modelling precipitation on the original scale without prior transformation of the data. A closed form expression for its continuous rank probability score can be derived and permits computationally efficient model fitting. We discuss an extension of our approach that incorporates further statistics characterizing the spatial variability of precipitation amounts in the vicinity of the location of interest. The proposed EMOS method is applied to daily 18-h forecasts of 6-h accumulated precipitation over Germany in 2011 using the COSMO-DE ensemble prediction system operated by the Germa...
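
    The left-censored predictive distribution places a point mass at zero precipitation; below is a sketch of evaluating it with scipy. The parameter values are illustrative and the EMOS link from ensemble statistics to the GEV parameters is omitted; note that scipy's shape parameter c corresponds to -ξ in the usual GEV convention.

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative GEV parameters; in EMOS these would be linked to
# ensemble statistics via fitted regression coefficients.
c, loc, scale = -0.2, 1.0, 2.0

# Left-censoring at zero: all mass below zero is the "dry" probability.
p_dry = genextreme.cdf(0.0, c, loc=loc, scale=scale)
print(f"P(no precipitation) = {p_dry:.3f}")

# Predictive probability of exceeding a threshold, e.g. 5 mm in 6 h:
print(f"P(precip > 5 mm) = {genextreme.sf(5.0, c, loc=loc, scale=scale):.3f}")
```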

  2. Speech emotion recognition based on statistical pitch model

    Institute of Scientific and Technical Information of China (English)

    WANG Zhiping; ZHAO Li; ZOU Cairong

    2006-01-01

    A modified Parzen-window method, which keeps high resolution at low frequencies and smoothness at high frequencies, is proposed to obtain the statistical model. Then, a gender classification method utilizing the statistical model is proposed, which achieves 98% accuracy in gender classification when long sentences are processed. By separating the male and female voices, the mean and standard deviation of speech training samples with different emotions are used to create the corresponding emotion models. Then the Bhattacharyya distances between the test sample and the statistical pitch models are utilized for emotion recognition in speech. The normalization of pitch for male and female voices is also considered, in order to map them into a uniform space. Finally, a speech emotion recognition experiment based on K Nearest Neighbor shows that a correct rate of 81% is achieved, compared with only 73.85% if the traditional parameters are utilized.

  3. What is the meaning of the statistical hadronization model?

    CERN Document Server

    Becattini, F

    2005-01-01

    The statistical model of hadronization succeeds in reproducing particle abundances and transverse momentum spectra in high energy collisions of elementary particles as well as of heavy ions. Despite its apparent success, the interpretation of these results is controversial and the validity of the approach very often questioned. In this paper, we would like to summarize the whole issue by first outlining a basic formulation of the model and then comment on the main criticisms and different kinds of interpretations, with special emphasis on the so-called "phase space dominance". While the ultimate answer to the question why the statistical model works should certainly be pursued, we stress that it is a priority to confirm or disprove the fundamental scheme of the statistical model by performing some detailed tests on the rates of exclusive channels at lower energy.

  4. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    The last years have seen the advent and development of many devices able to record and store an always increasing amount of complex and high dimensional data; 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real time financial data, system control datasets. The analysis of this data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici

  5. In all likelihood statistical modelling and inference using likelihood

    CERN Document Server

    Pawitan, Yudi

    2001-01-01

    Based on a course in the theory of statistics, this text concentrates on what can be achieved using the likelihood/Fisherian method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates, to complex studies that require generalised linear or semiparametric mode

  6. Binary and Ternary Fission Within the Statistical Model

    Science.gov (United States)

    Adamian, Gurgen G.; Andreev, Alexander V.; Antonenko, Nikolai V.; Scheid, Werner

    The binary and ternary nuclear fission are treated within the statistical model. At the scission point we calculate the potentials as functions of the deformations of the fragments in the dinuclear model. The potentials give the mass and charge distributions of the fission fragments. The ternary fission is assumed to occur during the binary fission.

  7. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...

  8. Statistical model of the classification of shale in a hydrocyclone

    Energy Technology Data Exchange (ETDEWEB)

    Lopachenok, L.V.; Punin, A.E.; Belyanin, Yu.I.; Proskuryakov, V.A.

    1977-10-01

    The mathematical model obtained by experimental and statistical methods for the classification of shale in a hydrocyclone is adequate for a real industrial-scale process, as indicated by the statistical analysis carried out for it, and together with the material-balance relationships it permits the calculation of the engineering parameters for any classification conditions within the region of the factor space investigated, as well as the search for the optimum conditions for the industrial realization of the process.

  9. General Linear Models: An Integrated Approach to Statistics

    OpenAIRE

    Andrew Faulkner; Sylvain Chartier

    2008-01-01

    Generally, in psychology, the various statistical analyses are taught independently from each other. As a consequence, students struggle to learn new statistical analyses in contexts that differ from their textbooks. This paper gives a short introduction to the general linear model (GLM), in which it is shown that ANOVA (one-way, factorial, repeated measure and analysis of covariance) is simply a multiple correlation/regression analysis (MCRA). Generalizations to other cases, such as multiv...

  10. A model of the high count rate performance of NaI(Tl)-based PET detectors

    Energy Technology Data Exchange (ETDEWEB)

    Wear, J.A.; Karp, J.S.; Freifelder, R. [Univ. of Pennsylvania, Philadelphia, PA (United States). Dept. of Radiology; Mankoff, D.A. [Univ. of Washington, Seattle, WA (United States). Dept. of Radiology; Muehllehner, G. [UGM Medical Systems, Philadelphia, PA (United States)

    1998-06-01

    A detailed model of the response of large-area NaI(Tl) detectors used in PET and their triggering and data acquisition electronics has been developed. This allows one to examine the limitations of the imaging system's performance due to degradation in the detector performance from light pile-up and deadtime from triggering and event processing. Comparisons of simulation results to measurements from the HEAD PENN-PET scanner have been performed to validate the Monte Carlo model. The model was then used to predict improvements in the high count rate performance of the HEAD PENN-PET scanner using different signal integration times, light response functions, and detectors.
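
    Deadtime losses of the kind modelled above are classically summarized by the paralyzable and non-paralyzable formulas m = n·exp(-nτ) and m = n/(1 + nτ); a sketch with an assumed effective deadtime (not the value used for the PENN-PET model):

```python
import numpy as np

def observed_rate(true_rate, tau, paralyzable=True):
    """Classic deadtime models for count-rate loss: m = n*exp(-n*tau)
    (paralyzable) or m = n/(1 + n*tau) (non-paralyzable)."""
    n = np.asarray(true_rate, dtype=float)
    return n * np.exp(-n * tau) if paralyzable else n / (1.0 + n * tau)

rates = np.array([1e5, 5e5, 1e6, 2e6])   # true singles rates (cps)
tau = 240e-9                              # assumed effective deadtime (s)
print(observed_rate(rates, tau))          # paralyzable losses grow fastest
```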

  11. An accurate simulation model for single-photon avalanche diodes including important statistical effects

    Science.gov (United States)

    Qiuyang, He; Yue, Xu; Feifei, Zhao

    2013-10-01

    An accurate and complete circuit simulation model for single-photon avalanche diodes (SPADs) is presented. The derived model is not only able to simulate the static DC and dynamic AC behaviors of an SPAD operating in Geiger-mode, but also can emulate the second breakdown and the forward bias behaviors. In particular, it considers important statistical effects, such as dark-counting and after-pulsing phenomena. The developed model is implemented using the Verilog-A description language and can be directly performed in commercial simulators such as Cadence Spectre. The Spectre simulation results give a very good agreement with the experimental results reported in the open literature. This model shows a high simulation accuracy and very fast simulation rate.

  12. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  13. What counts in preschool number knowledge? A Bayes factor analytic approach toward theoretical model development.

    Science.gov (United States)

    Mou, Yi; Berteletti, Ilaria; Hyde, Daniel C

    2017-09-06

    Preschool children vary tremendously in their numerical knowledge, and these individual differences strongly predict later mathematics achievement. To better understand the sources of these individual differences, we measured a variety of cognitive and linguistic abilities motivated by previous literature to be important and then analyzed which combination of these variables best explained individual differences in actual number knowledge. Through various data-driven Bayesian model comparison and selection strategies on competing multiple regression models, our analyses identified five variables of unique importance to explaining individual differences in preschool children's symbolic number knowledge: knowledge of the count list, nonverbal approximate numerical ability, working memory, executive conflict processing, and knowledge of letters and words. Furthermore, our analyses revealed that knowledge of the count list, likely a proxy for explicit practice or experience with numbers, and nonverbal approximate numerical ability were much more important to explaining individual differences in number knowledge than general cognitive and language abilities. These findings suggest that children use a diverse set of number-specific, general cognitive, and language abilities to learn about symbolic numbers, but the contribution of number-specific abilities may overshadow that of more general cognitive abilities in the learning process. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Structural Characterization and Statistical-Mechanical Model of Epidermal Patterns.

    Science.gov (United States)

    Chen, Duyu; Aw, Wen Yih; Devenport, Danelle; Torquato, Salvatore

    2016-12-06

    In proliferating epithelia of mammalian skin, cells of irregular polygon-like shapes pack into complex, nearly flat two-dimensional structures that are pliable to deformations. In this work, we employ various sensitive correlation functions to quantitatively characterize structural features of evolving packings of epithelial cells across length scales in mouse skin. We find that the pair statistics in direct space (correlation function) and Fourier space (structure factor) of the cell centroids in the early stages of embryonic development show structural directional dependence (statistical anisotropy), which is a reflection of the fact that cells are stretched, which promotes uniaxial growth along the epithelial plane. In the late stages, the patterns tend toward statistically isotropic states, as cells attain global polarization and epidermal growth shifts to produce the skin's outer stratified layers. We construct a minimalist four-component statistical-mechanical model involving effective isotropic pair interactions consisting of hard-core repulsion and extra short-range soft-core repulsion beyond the hard core, whose length scale is roughly the same as the hard core. The model parameters are optimized to match the sample pair statistics in both direct and Fourier spaces. By doing this, the parameters are biologically constrained. In contrast with many vertex-based models, our statistical-mechanical model does not explicitly incorporate information about the cell shapes and interfacial energy between cells; nonetheless, our model predicts essentially the same polygonal shape distribution and size disparity of cells found in experiments, as measured by Voronoi statistics. Moreover, our simulated equilibrium liquid-like configurations are able to match other nontrivial unconstrained statistics, which is a testament to the power and novelty of the model. The array of structural descriptors that we deploy enable us to distinguish between normal, mechanically

  15. Improving Trust and Reputation Modeling in E-Commerce Using Agent Lifetime and Transaction Count

    Science.gov (United States)

    Cormier, Catherine; Tran, Thomas T.

    Effective and reliable trust and reputation modeling systems are central to the success of decentralized e-commerce systems where autonomous agents are relied upon to conduct commercial transactions. However, the subjective and social-based qualities that are inherent to trust and reputation introduce many complexities into the development of a reliable model. Existing research has successfully demonstrated how trust systems can be decentralized and has illustrated the importance of sharing trust information, or rather, modeling reputation. Still, few models have provided a solution for developing an initial set of advisors from whom to solicit reputation rankings, or have taken into account all of the social criteria used to determine trustworthiness. To meet these objectives, we propose the use of two new parameters in trust and reputation modeling: agent lifetime and total transaction count. We describe a model that employs these parameters to calculate an agent’s seniority, then apply this information when selecting agents for soliciting and ranking reputation information. Experiments using this model are described. The results are then presented and discussed to evaluate the effect of using these parameters in reputation modeling. We also discuss the value of our particular model in contrast with related work and conclude with directions for future research.

  16. Statistical Design Model (SDM) of satellite thermal control subsystem

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the subsystem whose main task is to keep the satellite components at their survival and operating temperatures. The capability of satellite thermal control plays a key role in satisfying the satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, due to the lack of information provided by companies and designers, this subsystem still does not have a specific design process, although it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem by using the SDM design method. This method analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, and then we extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. Input parameters of the method are the mass, mission, and lifetime of the satellite. For this purpose, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is investigated. In the next part, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified using a case study: comparisons between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results proved the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  17. Statistical Inference of Biometrical Genetic Model With Cultural Transmission.

    Science.gov (United States)

    Guo, Xiaobo; Ji, Tian; Wang, Xueqin; Zhang, Heping; Zhong, Shouqiang

    2013-01-01

    Twin and family studies establish the foundation for studying the genetic, environmental and cultural transmission effects for phenotypes. In this work, we make use of the well established statistical methods and theory for mixed models to assess cultural transmission in twin and family studies. Specifically, we address two critical yet poorly understood issues: the model identifiability in assessing cultural transmission for twin and family data and the biases in the estimates when sub-models are used. We apply our models and theory to two real data sets. A simulation is conducted to verify the bias in the estimates of genetic effects when the working model is a sub-model.

  18. Analyzing sickness absence with statistical models for survival data

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars;

    2007-01-01

    absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study...... involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random effects proportional hazards models. Data from a study of the relation...... between the psychosocial work environment and sickness absence were used to illustrate the results. RESULTS: Standard methods were found to underestimate true effect sizes by approximately one-tenth [method i] and one-third [method ii] and to have lower statistical power than frailty models. CONCLUSIONS...

  19. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  1. Modern statistical models for forensic fingerprint examinations: a critical review.

    Science.gov (United States)

    Abraham, Joshua; Champod, Christophe; Lennard, Chris; Roux, Claude

    2013-10-10

    Over the last decade, the development of statistical models in support of forensic fingerprint identification has been the subject of increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process within or in addition to the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and theoretical perspective. This includes analysis of models of two different methodologies: Probability of Random Correspondence (PRC) models that focus on calculating probabilities of the occurrence of fingerprint configurations for a given population, and Likelihood Ratio (LR) models which use analysis of corresponding features of fingerprints to derive a likelihood value representing the evidential weighting for a potential source.

  2. Growth Curve Models and Applications : Indian Statistical Institute

    CERN Document Server

    2017-01-01

    Growth curve models in longitudinal studies are widely used to model population size, body height, biomass, fungal growth, and other variables in the biological sciences, but these statistical methods for modeling growth curves and analyzing longitudinal data also extend to general statistics, economics, public health, demographics, epidemiology, SQC, sociology, nano-biotechnology, fluid mechanics, and other applied areas. There is no one-size-fits-all approach to growth measurement. The selected papers in this volume build on presentations from the GCM workshop held at the Indian Statistical Institute, Giridih, on March 28-29, 2016. They represent recent trends in GCM research on different subject areas, both theoretical and applied. This book includes tools and possibilities for further work through new techniques and modification of existing ones. The volume includes original studies, theoretical findings and case studies from a wide range of applied work, and these contributions have been externally r...

  3. Statistical Modeling for Wind-Temperature Meteorological Elements in Troposphere

    CERN Document Server

    Virtser, A; Golbraikh, E

    2010-01-01

    A comprehensive statistical model for vertical profiles of the horizontal wind and temperature throughout the troposphere is presented. The model is based on radiosonde measurements of wind and temperature over several years. The profiles measured under quite different atmospheric conditions exhibit qualitative similarity, and a proper choice of reference scales for the wind, temperature and altitude levels allows the measurement data to be considered as realizations of a random process with universal characteristics: means, basic functions and parameters of standard distributions for the transform coefficients of the Principal Component Analysis. The features of the atmospheric conditions are described by statistical characteristics of the wind-temperature ensemble of dimensional reference scales. The high effectiveness of the proposed approach is provided by the similarity of wind-temperature vertical profiles, which allows the statistical modeling to be carried out in the low-dimension space of the dimensional ...

  4. Sensitivity Analysis and Statistical Convergence of a Saltating Particle Model

    CERN Document Server

    Maldonado, S

    2016-01-01

    Saltation models provide considerable insight into near-bed sediment transport. This paper outlines a simple, efficient numerical model of stochastic saltation, which is validated against previously published experimental data on saltation in a channel of nearly horizontal bed. Convergence tests are systematically applied to ensure the model is free from statistical errors emanating from the number of particle hops considered. Two criteria for statistical convergence are derived; according to the first criterion, at least $10^3$ hops appear to be necessary for convergent results, whereas $10^4$ saltations seem to be the minimum required in order to achieve statistical convergence in accordance with the second criterion. Two empirical formulae for lift force are considered: one dependent on the slip (relative) velocity of the particle multiplied by the vertical gradient of the horizontal flow velocity component; the other dependent on the difference between the squares of the slip velocity components at the to...

  5. Computationally efficient statistical differential equation modeling using homogenization

    Science.gov (United States)

    Hooten, Mevin B.; Garlick, Martha J.; Powell, James A.

    2013-01-01

    Statistical models using partial differential equations (PDEs) to describe dynamically evolving natural systems are appearing in the scientific literature with some regularity in recent years. Often such studies seek to characterize the dynamics of temporal or spatio-temporal phenomena such as invasive species, consumer-resource interactions, community evolution, and resource selection. Specifically, in the spatial setting, data are often available at varying spatial and temporal scales. Additionally, the necessary numerical integration of a PDE may be computationally infeasible over the spatial support of interest. We present an approach to impose computationally advantageous changes of support in statistical implementations of PDE models and demonstrate its utility through simulation using a form of PDE known as “ecological diffusion.” We also apply a statistical ecological diffusion model to a data set involving the spread of mountain pine beetle (Dendroctonus ponderosae) in Idaho, USA.

  6. Non-Markovian spin-resolved counting statistics and an anomalous relation between autocorrelations and cross correlations in a three-terminal quantum dot

    Science.gov (United States)

    Luo, JunYan; Yan, Yiying; Huang, Yixiao; Yu, Li; He, Xiao-Ling; Jiao, HuJun

    2017-01-01

    We investigate the noise correlations of spin and charge currents through an electron spin resonance (ESR)-pumped quantum dot, which is tunnel coupled to three electrodes maintained at an equivalent chemical potential. A recursive scheme is employed with inclusion of the spin degrees of freedom to account for the spin-resolved counting statistics in the presence of non-Markovian effects due to coupling with a dissipative heat bath. For symmetric spin-up and spin-down tunneling rates, an ESR-induced spin flip mechanism generates a pure spin current without an accompanying net charge current. The stochastic tunneling of spin carriers, however, produces universal shot noises of both charge and spin currents, revealing the effective charge and spin units of quasiparticles in transport. In the case of very asymmetric tunneling rates for opposite spins, an anomalous relationship between noise autocorrelations and cross correlations is revealed, where super-Poissonian autocorrelation is observed in spite of a negative cross correlation. Remarkably, with strong dissipation strength, non-Markovian memory effects give rise to a positive cross correlation of the charge current in the absence of a super-Poissonian autocorrelation. These unique noise features may offer essential methods for exploiting internal spin dynamics and various quasiparticle tunneling processes in mesoscopic transport.

  7. LETTER: Statistical physics of the Schelling model of segregation

    Science.gov (United States)

    Dall'Asta, L.; Castellano, C.; Marsili, M.

    2008-07-01

    We investigate the static and dynamic properties of a celebrated model of social segregation, providing a complete explanation of the mechanisms leading to segregation both in one- and two-dimensional systems. Standard statistical physics methods shed light on the rich phenomenology of this simple model, exhibiting static phase transitions typical of kinetic constrained models, non-trivial coarsening like in driven-particle systems and percolation-related phenomena.
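
    As a point of reference for readers, a toy one-dimensional Schelling dynamics can be simulated in a few lines; the ring geometry, the two-site neighbourhood and the tolerance value below are illustrative choices, not the parameters studied in the paper.

      import random

      def schelling_step(grid, tolerance=0.5):
          """One sweep of a minimal two-type Schelling model on a ring.

          Agents (+1/-1) jump to a random empty site (0) when the fraction
          of like neighbours among their occupied neighbours drops below
          `tolerance`.
          """
          n = len(grid)
          empty = [i for i, s in enumerate(grid) if s == 0]
          for i in range(n):
              s = grid[i]
              if s == 0 or not empty:
                  continue
              neigh = [grid[(i - 1) % n], grid[(i + 1) % n]]
              occupied = [x for x in neigh if x != 0]
              if occupied and sum(x == s for x in occupied) / len(occupied) < tolerance:
                  j = random.choice(empty)
                  grid[j], grid[i] = s, 0
                  empty[empty.index(j)] = i   # site i is now the empty one

      random.seed(0)
      grid = [random.choice([1, -1, 0]) for _ in range(60)]
      for _ in range(200):
          schelling_step(grid)
      print("".join({1: "A", -1: "B", 0: "."}[s] for s in grid))

    Even this stripped-down version shows unhappy agents vacating mixed neighbourhoods and like-typed domains coarsening over time, the phenomenology the letter analyzes with statistical physics tools.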

  8. Understanding and forecasting polar stratospheric variability with statistical models

    Directory of Open Access Journals (Sweden)

    C. Blume

    2012-02-01

    The variability of the north-polar stratospheric vortex is a prominent aspect of the middle atmosphere. This work investigates a wide class of statistical models with respect to their ability to model geopotential and temperature anomalies, representing variability in the polar stratosphere. Four partly nonstationary, nonlinear models are assessed: linear discriminant analysis (LDA); a cluster method based on finite elements (FEM-VARX); a neural network, namely a multi-layer perceptron (MLP); and support vector regression (SVR). These methods model time series by incorporating all significant external factors simultaneously, including ENSO, QBO, the solar cycle, volcanoes, etc., to then quantify their statistical importance. We show that variability in reanalysis data from 1980 to 2005 is successfully modeled. FEM-VARX and MLP even satisfactorily forecast the period from 2005 to 2011. However, internal variability remains that cannot be statistically forecasted, such as the unexpected major warming in January 2009. Finally, the statistical model with the best generalization performance is used to predict a vortex breakdown in late January, early February 2012.

  9. Hubble space telescope counts of elliptical galaxies constraints on cosmological models?

    CERN Document Server

    Driver, S P; Phillipps, S; Bristow, P D; Driver, Simon P; Windhorst, Rogier A; Phillipps, Steven; Bristow, Paul D

    1995-01-01

    The interpretation of galaxy number counts in terms of cosmological models is fraught with difficulty due to uncertainties in the overall galaxy population (mix of morphological types, luminosity functions etc.) and in the observations (loss of low surface brightness images, image blending etc.). Many of these can be overcome if we use deep high resolution imaging of a single class of high surface brightness galaxies, whose evolution is thought to be fairly well understood. This is now possible by selecting elliptical and S0 galaxies using Hubble Space Telescope images from the Medium Deep Survey and other ultradeep WFPC2 images. In the present paper, we examine whether such data can be used to discriminate between open and closed universes, or between conventional cosmological models and those dominated by a cosmological constant. We find, based on the currently available data, that unless elliptical galaxies undergo very strong merging since z \\sim 1 (and/or very large errors exist in the morphological clas...

  10. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    Science.gov (United States)

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models for observations, conditioned on pose. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.

  11. A statistical shape model of the human second cervical vertebra.

    Science.gov (United States)

    Clogenson, Marine; Duff, John M; Luethi, Marcel; Levivier, Marc; Meuli, Reto; Baur, Charles; Henein, Simon

    2015-07-01

    Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. The input dataset is composed of manually segmented anonymized patient computerized tomography (CT) scans. The alignment of the different datasets is done with the procrustes alignment on surface models, and then, the registration is cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which includes the variability of the C2. The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.

  12. Calculation of precise firing statistics in a neural network model

    Science.gov (United States)

    Cho, Myoung Won

    2017-08-01

    A precise prediction of neural firing dynamics is requisite to understand the function of, and the learning process in, a biological neural network that operates on exact spike timings. Fundamentally, the prediction of firing statistics is a delicate many-body problem, because the firing probability of a neuron at a given time is determined by a summation over all effects from past firing states. A neural network model based on the Feynman path integral formulation was recently introduced. In this paper, we present several methods to calculate firing statistics in that model. We apply the methods to some cases and compare the theoretical predictions with simulation results.

  13. The Statistical Modeling of the Trends Concerning the Romanian Population

    Directory of Open Access Journals (Sweden)

    Gabriela OPAIT

    2014-11-01

    This paper presents the statistical modeling of the resident population of Romania, that is, the total Romanian population, by means of the Least Squares Method. A country develops by increasing its population, and hence its workforce, which is a factor influencing the growth of the Gross Domestic Product (GDP). The Least Squares Method is a statistical technique for determining the trend line that best fits a model.
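
    For concreteness, a least-squares trend line can be fitted in a few lines; the yearly figures below are hypothetical placeholders, since the paper's actual population series is not reproduced here.

      import numpy as np

      # Hypothetical yearly observations standing in for the paper's
      # Romanian resident-population series (millions of inhabitants).
      years = np.array([2008, 2009, 2010, 2011, 2012, 2013], dtype=float)
      population = np.array([20.6, 20.4, 20.3, 20.1, 20.0, 19.9])

      # Least squares fits y = a*t + b by minimizing the sum of squared
      # residuals; np.polyfit solves the resulting normal equations.
      a, b = np.polyfit(years, population, deg=1)
      print(f"trend: population ≈ {a:.3f} * year + {b:.1f}")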

  14. Conceptualizations of Personality Disorders with the Five Factor Model-Count and Empathy Traits

    Science.gov (United States)

    Kajonius, Petri J.; Dåderman, Anna M.

    2017-01-01

    Previous research has long advocated that emotional and behavioral disorders are related to general personality traits, such as the Five Factor Model (FFM). The addition of section III in the latest "Diagnostic and Statistical Manual of Mental Disorders" (DSM) recommends that extremity in personality traits together with maladaptive…

  15. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  16. Schedulability of Herschel revisited using statistical model checking

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2015-01-01

    Schedulability analysis is a main concern for several embedded applications due to their safety-critical nature. The classical method of response time analysis provides an efficient technique used in industrial practice. However, the method is based on conservative assumptions related to execution...... to obtain some guarantee on the (un)schedulability of the model even in the presence of undecidability. Two methods are considered: symbolic model checking and statistical model checking. Since the model uses stop-watches, the reachability problem becomes undecidable so we are using an over......-approximation technique. We can safely conclude that the system is schedulable for varying values of BCET. For the cases where deadlines are violated, we use polyhedra to try to confirm the witnesses. Our alternative method to confirm non-schedulability uses statistical model-checking (SMC) to generate counter...

  17. Reference analysis of the signal + background model in counting experiments II. Approximate reference prior

    Science.gov (United States)

    Casadei, D.

    2014-10-01

    The objective Bayesian treatment of a model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered. It is shown that the reference prior for the parameter of interest (the signal intensity) can be well approximated by the widely (ab)used flat prior only when the expected background is very high. On the other hand, a very simple approximation (the limiting form of the reference prior for perfect prior background knowledge) can be safely used over a large portion of the background parameter space. The resulting approximate reference posterior is a Gamma density whose parameters are related to the observed counts. This limiting form is simpler than the result obtained with a flat prior, with the additional advantage of representing a much closer approximation to the reference posterior in all cases. Hence such a limiting prior should be considered a better default or conventional prior than the uniform prior. On the computing side, it is shown that a 2-parameter fitting function is able to reproduce extremely well the reference prior for any background prior. Thus, it can be useful in applications requiring the evaluation of the reference prior a very large number of times.
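
    One way to realize the limiting form numerically: we read the abstract as saying that, with perfectly known expected background b and n observed counts, the prior pi(s) proportional to (s + b)^(-1/2) yields a posterior in which x = s + b follows a Gamma(n + 1/2, 1) density truncated to x >= b. That parameterization is our reading, stated here as an assumption, so the sketch below should be checked against the paper before use.

      import numpy as np
      from scipy import stats
      from scipy.integrate import trapezoid

      def signal_posterior(n_obs, b, s_grid):
          """Approximate reference posterior for the signal intensity s,
          assuming the limiting prior pi(s) ~ (s + b)**(-1/2) for a
          perfectly known background b (an assumption; see lead-in)."""
          x = s_grid + b
          dens = stats.gamma.pdf(x, a=n_obs + 0.5, scale=1.0)
          return dens / trapezoid(dens, s_grid)   # renormalize on s >= 0

      s = np.linspace(0.0, 30.0, 2001)
      post = signal_posterior(n_obs=10, b=3.2, s_grid=s)
      print("posterior mean of s:", trapezoid(s * post, s))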

  18. A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects

    Directory of Open Access Journals (Sweden)

    Shuai Luo

    2016-02-01

    Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater into electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made to investigate the processes involved in electricity generation and to improve BES performance for practical applications. However, many parameters can potentially affect these processes, making optimization of system performance hard to achieve. Mathematical models, including engineering models and statistical models, are powerful tools for understanding the interactions among the parameters in BES and for optimizing BES configuration and operation. This review paper introduces and discusses recent developments in BES modeling from engineering and statistical perspectives, including analysis of model structure, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating engineering and statistical modeling strategies to improve model accuracy for BES development.

  19. Statistical Models for Tornado Climatology: Long and Short-Term Views

    CERN Document Server

    Elsner, James B; Fricker, Tyler

    2016-01-01

    This paper estimates local tornado risk from records of past events using statistical models. First, a spatial model is fit to the tornado counts aggregated in counties with terms that control for changes in observational practices over time. Results provide a long-term view of risk that delineates the main tornado corridors in the United States where the expected annual rate exceeds two tornadoes per 10,000 square km. A few counties in the Texas Panhandle and central Kansas have annual rates that exceed four tornadoes per 10,000 square km. Refitting the model after removing the least damaging tornadoes from the data (EF0) produces a similar map but with the greatest tornado risk shifted south and eastward. Second, a space-time model is fit to the counts aggregated in raster cells with terms that control for changes in climate factors. Results provide a short-term view of risk. The short-term view identifies the shift of tornado activity away from the Ohio Valley under El Niño conditions and away from the S...

  20. Development of 3D statistical mandible models for cephalometric measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Goo; Yi, Won Jin; Hwang, Soon Jung; Choi, Soon Chul; Lee, Sam Sun; Heo, Min Suk; Huh, Kyung Hoe; Kim, Tae Il [School of Dentistry, Seoul National University, Seoul (Korea, Republic of); Hong, Helen; Yoo, Ji Hyun [Division of Multimedia Engineering, Seoul Women' s University, Seoul (Korea, Republic of)

    2012-09-15

    The aim of this study was to provide sex-matched three-dimensional (3D) statistical shape models of the mandible, which would provide cephalometric parameters for 3D treatment planning and cephalometric measurements in orthognathic surgery. The subjects used to create the 3D shape models of the mandible included 23 males and 23 females. The mandibles were segmented semi-automatically from 3D facial CT images. Each individual mandible shape was reconstructed as a 3D surface model, which was parameterized to establish correspondence between different individual surfaces. The principal component analysis (PCA) applied to all mandible shapes produced a mean model and characteristic models of variation. The cephalometric parameters were measured directly from the mean models to evaluate the 3D shape models. The means of the measured parameters were compared with those from other conventional studies. The male and female 3D statistical mean models were developed from 23 individual mandibles, respectively. The male and female characteristic shapes of variation produced by PCA showed a large variability included in the individual mandibles. The cephalometric measurements from the developed models were very close to those from some conventional studies. We described the construction of 3D mandibular shape models and presented the application of the 3D mandibular template in cephalometric measurements. Optimal reference models determined from variations produced by PCA could be used for craniofacial patients with various types of skeletal shape.
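
    The PCA step generalizes readily; the sketch below builds a statistical shape model from already-corresponded, already-aligned shape vectors, which is the situation the paper reaches after its parameterization and Procrustes stages. The toy data and all names are hypothetical.

      import numpy as np

      def build_ssm(shapes):
          """PCA shape model from an (n_subjects, n_points*3) array of
          corresponded, aligned surface coordinates."""
          mean = shapes.mean(axis=0)
          centered = shapes - mean
          # SVD of the centered data matrix yields the modes of variation.
          _, sing, modes = np.linalg.svd(centered, full_matrices=False)
          variances = sing**2 / (len(shapes) - 1)
          return mean, modes, variances

      def sample_shape(mean, modes, variances, coeffs):
          """Synthesize a shape as mean + sum_k c_k * sqrt(var_k) * mode_k."""
          return mean + (coeffs * np.sqrt(variances)) @ modes

      rng = np.random.default_rng(0)
      shapes = rng.normal(size=(23, 300))     # toy stand-in for 23 mandibles
      mean, modes, var = build_ssm(shapes)
      new_shape = sample_shape(mean, modes, var, rng.normal(size=var.size))

    Sampling mode coefficients from standard normals, as in the last line, is the usual way such a model generates plausible new shapes for templates or semi-automatic segmentation.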

  1. Modelling geographical graduate job search using circular statistics

    NARCIS (Netherlands)

    Faggian, Alessandra; Corcoran, Jonathan; McCann, Philip

    2013-01-01

    Theory suggests that the spatial patterns of migration flows are contingent both on individual human capital and underlying geographical structures. Here we demonstrate these features by using circular statistics in an econometric modelling framework applied to the flows of UK university graduates.

  2. Interactive comparison of hypothesis tests for statistical model checking

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; Reijsbergen, D.P.; Scheinhardt, Willem R.W.

    2015-01-01

    We present a web-based interactive comparison of hypothesis tests as are used in statistical model checking, providing users and tool developers with more insight into their characteristics. Parameters can be modified easily and their influence is visualized in real time; an integrated simulation

  3. Validation of Models : Statistical Techniques and Data Availability

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1999-01-01

    This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability three situations are distinguished (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real

  4. Statistical Modeling for Radiation Hardness Assurance: Toward Bigger Data

    Science.gov (United States)

    Ladbury, R.; Campola, M. J.

    2015-01-01

    New approaches to statistical modeling in radiation hardness assurance are discussed. These approaches yield quantitative bounds on flight-part radiation performance even in the absence of conventional data sources. This allows the analyst to bound radiation risk at all stages and for all decisions in the RHA process. It also allows optimization of RHA procedures for the project's risk tolerance.

  5. Nowcasting GDP Growth: statistical models versus professional analysts

    NARCIS (Netherlands)

    J.M. de Winter (Jasper)

    2016-01-01

    This thesis contains four chapters that cast new light on the ability of professional analysts and statistical models to assess economic growth in the current quarter (nowcast) and its development in the near future. This is not a trivial issue. An accurate assessment of the current

  6. Hypersonic Vehicle Tracking Based on Improved Current Statistical Model

    Directory of Open Access Journals (Sweden)

    He Guangjun

    2013-11-01

    A new method for tracking near-space hypersonic vehicles is put forward. In light of hypersonic vehicles' characteristics, we improve the current statistical model through online identification of the maneuvering frequency. A Monte Carlo simulation is used to analyze the performance of the method. The results show that the improved method exhibits very good tracking performance in comparison with the old method.

  7. Hierarchical modelling for the environmental sciences statistical methods and applications

    CERN Document Server

    Clark, James S

    2006-01-01

    New statistical tools are changing the way in which scientists analyze and interpret data and models. Hierarchical Bayes and Markov Chain Monte Carlo methods for analysis provide a consistent framework for inference and prediction where information is heterogeneous and uncertain, processes are complicated, and responses depend on scale. Nowhere are these methods more promising than in the environmental sciences.

  8. Octet magnetic Moments and their sum rules in statistical model

    CERN Document Server

    Batra, M

    2013-01-01

    The statistical model is implemented to find the magnetic moments of all octet baryons. Well-known sum rules such as the GMO and CG sum rules have been checked in order to verify the consistency of our approach. The small discrepancy between the results suggests the importance of SU(3) symmetry breaking.

  9. Environmental Concern and Sociodemographic Variables: A Study of Statistical Models

    Science.gov (United States)

    Xiao, Chenyang; McCright, Aaron M.

    2007-01-01

    Studies of the social bases of environmental concern over the past 30 years have produced somewhat inconsistent results regarding the effects of sociodemographic variables, such as gender, income, and place of residence. The authors argue that model specification errors resulting from violation of two statistical assumptions (interval-level…

  10. Statistical sampling and modelling for cork oak and eucalyptus stands

    NARCIS (Netherlands)

    Paulo, M.J.

    2002-01-01

    This thesis focuses on the use of modern statistical methods to solve problems on sampling, optimal cutting time and agricultural modelling in Portuguese cork oak and eucalyptus stands. The results are contained in five chapters that have been submitted for publication as scientific manuscripts. The

  11. Monte-Carlo simulation-based statistical modeling

    CERN Document Server

    Chen, John

    2017-01-01

    This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.

  12. Workshop on Model Uncertainty and its Statistical Implications

    CERN Document Server

    1988-01-01

    In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.

  13. Statistical mechanics models for motion and force planning

    Science.gov (United States)

    Rodriguez, G.

    1990-01-01

    The models of statistical mechanics provide an alternative to the methods of classical mechanics more traditionally used in robotics. They have the potential to: improve the analysis of object collisions; handle kinematic and dynamic contact interactions within the same framework; and reduce the need for perfect deterministic world model information. The statistical mechanics models characterize the state of the system as a probability density function (p.d.f.) whose time evolution is governed by a partial differential equation subject to boundary and initial conditions. The boundary conditions when rigid objects collide reflect the conservation of momentum. The models are being developed for embedding in remote semi-autonomous systems that need to reason about and interact with a multiobject environment.

  14. An Order Statistics Approach to the Halo Model for Galaxies

    CERN Document Server

    Paul, Niladri; Sheth, Ravi K

    2016-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models -- one in which this luminosity function $p(L)$ is universal -- naturally produces a number of features associated with previous analyses based on the 'central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function $p(L|m)$, is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-pre...
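
    The generative step of such an order-statistics model is simple enough to sketch directly; the Poisson occupation and the power-law stand-in for the universal $p(L)$ below are illustrative assumptions, not the paper's fitted luminosity function.

      import numpy as np

      rng = np.random.default_rng(1)

      def draw_group(mean_n):
          """Draw one group's luminosities under the order-statistics model:
          N ~ Poisson(mean_n) draws from a universal p(L), with the
          brightest labelled the central and the rest satellites."""
          n = max(rng.poisson(mean_n), 1)
          lum = 1.0 + rng.pareto(a=1.5, size=n)   # hypothetical power-law p(L)
          lum.sort()
          return lum[-1], lum[:-1]                # central, satellites

      centrals = [draw_group(mean_n=20)[0] for _ in range(5000)]
      print("mean central luminosity:", np.mean(centrals))

    Because the central is the maximum of N draws, its mean luminosity rises monotonically with the occupation number, which is the qualitative behaviour the abstract highlights.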

  15. Statistical models describing the energy signature of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Thavlov, Anders

    2010-01-01

    Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building, i.......e. the heat dynamics of the building, have been developed. The models can be used to obtain rather detailed knowledge of the energy performance of the building and to optimize the control of the energy consumption for heating, which will be vital in conditions with increasing fluctuation of the energy supply...... or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as UA...

  16. Applying the luminosity function statistics in the fireshell model

    Science.gov (United States)

    Rangel Lemos, L. J.; Bianco, C. L.; Ruffini, R.

    2015-12-01

    This work applies luminosity function (LF) statistics to the data of BATSE, GBM/Fermi and BAT/Swift. The LF is a strong statistical tool for extracting useful information from astrophysical samples, and the key point of this statistical analysis lies in the detector sensitivity, which we have analyzed carefully. We applied the tool of LF statistics to three GRB classes predicted by the Fireshell model. By LF statistics, we produced predicted distributions of peak flux $N(F^{\mathrm{ph}}_{\mathrm{pk}})$, redshift $N(z)$ and peak luminosity $N(L_{\mathrm{pk}})$ for the three GRB classes predicted by the Fireshell model; we also used three GRB rates. We looked for differences among the distributions, and in fact we found them. We performed a comparison between the predicted and observed distributions (with and without redshifts), for which we had to build a list of 217 GRBs with known redshifts. Our goal is to turn GRBs into standard candles; one alternative is to find a correlation between the isotropic luminosity and the Band peak spectral energy ($L_{\mathrm{iso}}$-$E_{\mathrm{pk}}$).

  17. Forecasting model of Corylus, Alnus, and Betula pollen concentration levels using spatiotemporal correlation properties of pollen count.

    Science.gov (United States)

    Nowosad, Jakub; Stach, Alfred; Kasprzyk, Idalia; Weryszko-Chmielewska, Elżbieta; Piotrowska-Weryszko, Krystyna; Puc, Małgorzata; Grewling, Łukasz; Pędziszewska, Anna; Uruska, Agnieszka; Myszkowska, Dorota; Chłopek, Kazimiera; Majkowska-Wojciechowska, Barbara

    The aim of the study was to create and evaluate models for predicting high levels of daily pollen concentration of Corylus, Alnus, and Betula using a spatiotemporal correlation of pollen count. For each taxon, a high pollen count level was established according to the first allergy symptoms during exposure. The dataset was divided into a training set and a test set, using a stratified random split. For each taxon and city, the model was built using a random forest method. Corylus models performed poorly. However, the study revealed the possibility of predicting with substantial accuracy the occurrence of days with high pollen concentrations of Alnus and Betula using past pollen count data from monitoring sites. These results can be used for building (1) simpler models, which require data only from aerobiological monitoring sites, and (2) combined meteorological and aerobiological models for predicting high levels of pollen concentration.
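
    A stratified split plus a random forest, as the abstract describes, takes only a few lines with scikit-learn; the synthetic feature layout and the threshold below are placeholders for the study's lagged, multi-site pollen counts and its symptom-based level.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      # Toy stand-in: rows are days, columns are recent pollen counts at
      # this and other monitoring sites (the paper's spatiotemporal lags).
      X = rng.poisson(lam=30, size=(500, 6)).astype(float)
      # Label days whose (noisy) pollen level exceeds a symptom threshold.
      y = X[:, 0] * 0.5 + X[:, 1] * 0.3 + rng.normal(0, 5, 500) > 25

      X_tr, X_te, y_tr, y_te = train_test_split(
          X, y, test_size=0.3, stratify=y, random_state=0)
      clf = RandomForestClassifier(n_estimators=300, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))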

  18. Statistical multiscale image segmentation via Alpha-stable modeling

    OpenAIRE

    Wan, Tao; Canagarajah, CN; Achim, AM

    2007-01-01

    This paper presents a new statistical image segmentation algorithm, in which the texture features are modeled by symmetric alpha-stable (SαS) distributions. These features are efficiently combined with the dominant color feature to perform automatic segmentation. First, the image is roughly segmented into textured and nontextured regions using the dual-tree complex wavelet transform (DT-CWT) with the sub-band coefficients modeled as SαS random variables. A multiscale segmentation is ...

  19. Generalized statistical model for multicomponent adsorption equilibria on zeolites

    Energy Technology Data Exchange (ETDEWEB)

    Rota, R.; Gamba, G.; Paludetto, R.; Carra, S.; Morbidelli, M. (Dipartimento di Chimica Fisica Applicata, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (IT))

    1988-05-01

    The statistical thermodynamic approach to multicomponent adsorption equilibria on zeolites has been extended to nonideal systems, through the correction of cross coefficients characterizing the interaction between unlike molecules. Estimation of the model parameters requires experimental binary equilibrium data. Comparisons with the classical model based on adsorbed solution theory are reported for three nonideal ternary systems. The two approaches provide comparable results in the simulation of binary and ternary adsorption equilibrium data at constant temperature and pressure.

  20. Development of 3D statistical mandible models for cephalometric measurements

    OpenAIRE

    2012-01-01

    Purpose The aim of this study was to provide sex-matched three-dimensional (3D) statistical shape models of the mandible, which would provide cephalometric parameters for 3D treatment planning and cephalometric measurements in orthognathic surgery. Materials and Methods The subjects used to create the 3D shape models of the mandible included 23 males and 23 females. The mandibles were segmented semi-automatically from 3D facial CT images. Each individual mandible shape was reconstructed as a ...

  1. Bregman divergence as general framework to estimate unnormalized statistical models

    CERN Document Server

    Gutmann, Michael

    2012-01-01

    We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to one, respectively. We prove that recent estimation methods such as noise-contrastive estimation, ratio matching, and score matching belong to the proposed framework, and explain their interconnection based on supervised learning. Further, we discuss the role of boosting in unsupervised learning.

  2. A Bayesian model for censored positive count data in evaluating breast cancer progression.

    Science.gov (United States)

    Yeh, Hung-Wen; Jiang, Yu; Garrard, Lili; Lei, Yang; Gajewski, Byron

    2013-01-01

    Basic science researchers transplant human cancer tissues from patients with ductal carcinoma in situ (DCIS) to animals and observe the progression of the disease. Successful transplants show invasion of human tissues across mammary ducts in animal fat pads and cause DCIS-like lesions in one or more ducts. In this work, we consider data from a recent publication of breast cancer research where positive counts of affected ducts may be subject to censoring. We fit the data with zero-truncated Poisson (ZTP) models with an informative gamma prior. Owing to the zero-truncation and right censoring, the posterior distributions may not be of conventional gamma form and are estimated through Markov chain Monte Carlo and grid approximation. For each of the two cell lines, we fit a model with group-specific parameters for DCIS subtypes classified by cell surface biomarkers, and another model with a homogeneous parameter across groups. Models are compared by the Deviance Information Criterion (DIC). For the chosen prior parameter values, the Bayes estimates are comparable to the maximum likelihood estimates, and the DIC favors the simpler model in both cell lines.
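
    The grid-approximation route is easy to illustrate for the uncensored case; the counts, the Gamma(2, 1) prior and the grid below are hypothetical, and the censoring that the paper also handles is deliberately omitted from this sketch.

      import numpy as np
      from scipy import stats
      from scipy.integrate import trapezoid

      def ztp_grid_posterior(counts, a0, b0, grid):
          """Grid-approximate posterior for the rate of a zero-truncated
          Poisson with a Gamma(a0, b0) prior (censoring omitted)."""
          log_post = stats.gamma.logpdf(grid, a=a0, scale=1.0 / b0)
          for k in counts:
              # ZTP log-pmf: Poisson log-pmf minus log P(K >= 1).
              log_post += stats.poisson.logpmf(k, grid) - np.log1p(-np.exp(-grid))
          post = np.exp(log_post - log_post.max())
          return post / trapezoid(post, grid)

      counts = [1, 2, 1, 4, 3, 1, 2]        # hypothetical affected-duct counts
      lam = np.linspace(0.01, 10.0, 2000)
      post = ztp_grid_posterior(counts, a0=2.0, b0=1.0, grid=lam)
      print("posterior mean:", trapezoid(lam * post, lam))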

  3. A modelling framework to optimize timing of haulout counts for estimating harbour seal (Phoca vitulina) abundance

    Directory of Open Access Journals (Sweden)

    Michelle Cronin

    2010-09-01

    The time of year and day, the state of the tide and prevailing environmental conditions significantly influence seal haulout behaviour. Understanding these effects is fundamentally important in deriving accurate estimates of harbour seal abundance from haulout data. We present a modelling approach to assess the influence of these variables on seals' haulout behaviour and, by identifying the combination of covariates during which seal abundance is highest, predict the optimal time and conditions for future surveys. Count data of harbour seals at haulouts in southwest Ireland collected during 2003-2005 were included in mixed additive models together with environmental covariates, including season, time of day and weather conditions. The models show that maximum abundance at haulout sites occurred during midday periods in August and in late afternoon/early evening during September. Accurate national and local population estimates are essential for the effective monitoring of the conservation status of the species and for the identification, management and monitoring of Special Areas of Conservation (SAC) in accordance with the EU Habitats Directive. Our model-based approach provides a useful tool for optimising the timing of harbour seal surveys in Ireland, and the modelling framework is useful for predicting optimal survey periods for other protected, endangered or significant species worldwide.
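
    A simplified, fixed-effects stand-in for such models can be written as a Poisson GLM with a smooth term for time of day; the simulated records, the spline basis and the covariate set below are all illustrative, and the paper's mixed (random-effects) structure is not reproduced.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 400
      df = pd.DataFrame({"hour": rng.uniform(0, 24, n),
                         "month": rng.integers(1, 13, n)})
      # Hypothetical counts peaking around midday.
      rate = np.exp(2 + 0.8 * np.sin(np.pi * df["hour"] / 12))
      df["count"] = rng.poisson(rate)

      # Poisson GLM with a B-spline in hour and a month factor.
      model = smf.glm("count ~ bs(hour, df=5) + C(month)", data=df,
                      family=sm.families.Poisson()).fit()

      # Predict across the observed range to locate the best survey hour.
      hours = np.linspace(df["hour"].min(), df["hour"].max(), 97)
      pred = model.predict(pd.DataFrame({"hour": hours, "month": 8}))
      print("best survey hour:", float(hours[np.argmax(pred.values)]))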

  4. Improving EWMA Plans for Detecting Unusual Increases in Poisson Counts

    Directory of Open Access Journals (Sweden)

    R. S. Sparks

    2009-01-01

    An adaptive exponentially weighted moving average (EWMA) plan is developed for signalling unusually high incidence when monitoring a time series of nonhomogeneous daily disease counts. A Poisson transitional regression model is used to fit the background/expected trend in counts and provides “one-day-ahead” forecasts of the next day's count. Departures of counts from their forecasts are monitored. The paper outlines an approach for improving early outbreak data signals by dynamically adjusting the exponential weights to be efficient at signalling local persistent high-side changes. We emphasise outbreak signals in steady-state situations, that is, changes that occur after the EWMA statistic has run through several in-control counts.
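
    A non-adaptive version of this monitoring loop is easy to sketch; the fixed weight, the 3-sigma limit and the simulated background below are illustrative simplifications of the paper's dynamically adjusted weights.

      import numpy as np

      def ewma_monitor(counts, expected, lam=0.2, limit=3.0):
          """Flag days where the EWMA of standardized one-day-ahead forecast
          errors exceeds `limit` times its asymptotic standard deviation."""
          z = (counts - expected) / np.sqrt(expected)   # Poisson standardization
          sigma = np.sqrt(lam / (2.0 - lam))            # asymptotic EWMA std dev
          ewma, flags = 0.0, []
          for zt in z:
              ewma = lam * zt + (1 - lam) * ewma
              flags.append(ewma > limit * sigma)
          return np.array(flags)

      rng = np.random.default_rng(0)
      expected = 20 + 5 * np.sin(np.arange(120) / 10)   # hypothetical trend
      counts = rng.poisson(expected).astype(float)
      counts[100:] += 12                                # injected outbreak
      alarms = ewma_monitor(counts, expected)
      print("first alarm on day:", int(np.argmax(alarms)))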

  5. Observer-model optimization of X-ray system in photon-counting breast imaging

    Science.gov (United States)

    Cederström, Björn; Fredenberg, Erik; Lundqvist, Mats; Ericson, Tove; Åslund, Magnus

    2011-08-01

    An ideal-observer model is applied to optimize the design of an X-ray tube intended for use in a multi-slit scanning photon-counting mammography system. The design is such that the anode and the heel effect are reversed and the projected focal spot is smallest at the chest wall. Using linear systems theory, detectability and dose efficiency for a 0.1-mm disk are calculated for different focal spot sizes and anode angles. It is shown that the image acquisition time can be reduced by about 25% with spatial resolution and dose efficiency improved near the chest wall and worsened further away. The image quality is significantly more homogeneous than for the conventional anode orientation, both with respect to noise and detectability of a small object. With the tube rotated 90°, dose efficiency can be improved by 20% for a fixed image acquisition time.

  6. Bayesian latent variable models for hierarchical clustered count outcomes with repeated measures in microbiome studies.

    Science.gov (United States)

    Xu, Lizhen; Paterson, Andrew D; Xu, Wei

    2017-04-01

    Motivated by the multivariate nature of microbiome data with hierarchical taxonomic clusters, counts that are often skewed and zero inflated, and repeated measures, we propose a Bayesian latent variable methodology to jointly model multiple operational taxonomic units within a single taxonomic cluster. This novel method can incorporate both negative binomial and zero-inflated negative binomial responses, and can account for serial and familial correlations. We develop a Markov chain Monte Carlo algorithm that is built on a data augmentation scheme using Pólya-Gamma random variables. Hierarchical centering and parameter expansion techniques are also used to improve the convergence of the Markov chain. We evaluate the performance of our proposed method through extensive simulations. We also apply our method to a human microbiome study.

  7. Statistical modelling of transcript profiles of differentially regulated genes

    Directory of Open Access Journals (Sweden)

    Sergeant Martin J

    2008-07-01

    Background: The vast quantities of gene expression profiling data produced in microarray studies, and the more precise quantitative PCR, are often not statistically analysed to their full potential. Previous studies have summarised gene expression profiles using simple descriptive statistics, basic analysis of variance (ANOVA) and the clustering of genes based on simple models fitted to their expression profiles over time. We report the novel application of statistical non-linear regression modelling techniques to describe the shapes of expression profiles for the fungus Agaricus bisporus, quantified by PCR, and for E. coli and Rattus norvegicus, using microarray technology. The use of parametric non-linear regression models provides a more precise description of expression profiles, reducing the "noise" of the raw data to produce a clear "signal" given by the fitted curve, and describing each profile with a small number of biologically interpretable parameters. This approach then allows the direct comparison and clustering of the shapes of response patterns between genes and potentially enables a greater exploration and interpretation of the biological processes driving gene expression. Results: Quantitative reverse transcriptase PCR-derived time-course data of genes were modelled. "Split-line" or "broken-stick" regression identified the initial time of gene up-regulation, enabling the classification of genes into those with primary and secondary responses. Five-day profiles were modelled using the biologically oriented critical exponential curve, y(t) = A + (B + Ct)R^t + ε. This non-linear regression approach allowed the expression patterns for different genes to be compared in terms of curve shape, time of maximal transcript level and the decline and asymptotic response levels. Three distinct regulatory patterns were identified for the five genes studied. Applying the regression modelling approach to microarray-derived time course data
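
    Fitting the critical exponential curve is a standard nonlinear least-squares task; the sketch below uses scipy's curve_fit on hypothetical five-day data, with starting values and bounds chosen so that 0 < R < 1 gives decay toward the asymptote A.

      import numpy as np
      from scipy.optimize import curve_fit

      def critical_exponential(t, A, B, C, R):
          """The paper's curve: y(t) = A + (B + C*t) * R**t."""
          return A + (B + C * t) * R**t

      # Hypothetical five-day expression profile (arbitrary units).
      t = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([1.0, 3.1, 4.0, 3.6, 2.9, 2.4])

      popt, pcov = curve_fit(critical_exponential, t, y,
                             p0=(2.0, -1.0, 2.0, 0.6),
                             bounds=((-10, -10, -10, 0.01),
                                     (10, 10, 10, 0.99)))
      A, B, C, R = popt
      fitted = critical_exponential(t, *popt)
      print(f"asymptote A = {A:.2f}, peak near day {t[np.argmax(fitted)]:.0f}")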

  8. Advances on statistical/thermodynamical models for unpolarized structure functions

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro

    2013-03-01

    During the eighties and nineties, many statistical/thermodynamical models were proposed to describe the nucleons' structure functions and the distribution of the quarks in the hadrons. Most of these models describe the quarks and gluons inside the nucleon as a Fermi gas and a Bose gas, respectively, confined in an MIT bag [1] with continuous energy levels; other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, such as the sea asymmetries $\bar{d}/\bar{u}$ and $\bar{d}-\bar{u}$.

  9. STATISTICAL MODELS FOR SEMI-RIGID NEMATIC POLYMERS

    Institute of Scientific and Technical Information of China (English)

    WANG Xinjiu

    1995-01-01

    Semi-rigid liquid crystal polymers are a class of liquid crystal polymers distinct from the long rigid-rod polymers to which the well-known Onsager and Flory theories apply. In this paper, three statistical models for semi-rigid nematic polymers are addressed: the elastically jointed rod model, the worm-like chain model, and the non-homogeneous chain model. The nematic-isotropic transition temperature is examined, and the pseudo-second transition temperature is expressed analytically. Comparisons with experiments were made and agreement was found.

  10. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the method of calculating information gain in stochastic processes presented by Vere-Jones, the relation between information gain and probability gain, which is widely used in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, a prediction method for the stress release model is obtained based on the inverse function simulation method for stochastic variables.
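
    For orientation, the two quantities can be written in standard point-process notation; the conventions below follow the usual Vere-Jones-style formulation and are our gloss rather than equations quoted from the paper:

      G(t) = \frac{\lambda(t \mid H_t)}{\lambda_0}, \qquad I = \mathrm{E}\left[ \ln G(t) \right],

    where $\lambda(t \mid H_t)$ is the model's conditional intensity given the history $H_t$, $\lambda_0$ is the unconditional (reference Poisson) rate, $G$ is the probability gain, and $I$ is the expected information gain per event; a yearly probability gain then aggregates $G$ over a one-year prediction window.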

  11. Frequentist comparison of CMB local extrema statistics in the five-year WMAP data with two anisotropic cosmological models

    CERN Document Server

    Hou, Zhen; Górski, K M; Groeneboom, N E; Eriksen, H K

    2009-01-01

    We present local extrema studies of two models that introduce a preferred direction into the observed cosmic microwave background (CMB) temperature field. In particular, we make a frequentist comparison of the one- and two-point statistics for the dipole modulation and ACW models with data from the five-year Wilkinson Microwave Anisotropy Probe (WMAP). This analysis is motivated by previously revealed anomalies in the WMAP data, and particularly the difference in the statistical nature of the temperature anisotropies when analysed in hemispherical partitions. The analysis of the one-point statistics indicates that the previously determined hemispherical variance difficulties can be apparently overcome by a dipole modulation field, but new inconsistencies arise if the mean and the l-dependence of the statistics are considered. The two-point correlation functions of the local extrema, the temperature pair product and the point-point spatial pair-count, demonstrate that the impact of such a modulation is to over...

  12. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.

  13. Stochastic geometry, spatial statistics and random fields models and algorithms

    CERN Document Server

    2015-01-01

    Providing a graduate level introduction to various aspects of stochastic geometry, spatial statistics and random fields, this volume places a special emphasis on fundamental classes of models and algorithms as well as on their applications, for example in materials science, biology and genetics. This book has a strong focus on simulations and includes extensive codes in Matlab and R, which are widely used in the mathematical community. It can be regarded as a continuation of the recent volume 2068 of Lecture Notes in Mathematics, where other issues of stochastic geometry, spatial statistics and random fields were considered, with a focus on asymptotic methods.

  14. Level statistics of a pseudo-Hermitian Dicke model.

    Science.gov (United States)

    Deguchi, Tetsuo; Ghosh, Pijush K; Kudo, Kazue

    2009-08-01

    A non-Hermitian operator that is related to its adjoint through a similarity transformation is defined as a pseudo-Hermitian operator. We study the level statistics of a pseudo-Hermitian Dicke Hamiltonian that undergoes a quantum phase transition (QPT). We find that the level-spacing distribution of this Hamiltonian near the integrable limit is close to a Poisson distribution, while it follows a Wigner distribution for the parameter ranges in which the Hamiltonian is nonintegrable. We show that the assertion, made in the context of the standard Dicke model, that QPT is a precursor to a change in the level statistics is not valid in general.

  15. Convex Combination of Multiple Statistical Models with Application to VAD

    DEFF Research Database (Denmark)

    Petsatodis, Theodoros; Boukis, Christos; Talantzis, Fotios

    2011-01-01

    This paper proposes a robust Voice Activity Detector (VAD) based on the observation that the distribution of speech captured with far-field microphones is highly varying, depending on the noise and reverberation conditions. The proposed VAD employs a convex combination scheme comprising three statistical distributions - a Gaussian, a Laplacian, and a two-sided Gamma - to effectively model captured speech. This scheme shows increased ability to adapt to dynamic acoustic environments. The contribution of each distribution to this convex combination is automatically adjusted based on the statistical...
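
    The record above is truncated, but the combination scheme it describes can be sketched. Below is a minimal, hypothetical reading in Python: each frame is scored under the three candidate densities and the convex weights are nudged toward the best-scoring component. The exponentiated-gradient update and all parameter values are assumptions for illustration, not the paper's algorithm.

      import numpy as np
      from scipy import stats

      # Hypothetical sketch, not the paper's algorithm: score one frame of
      # samples under three candidate densities and nudge the convex weights
      # toward the component that explains the frame best.
      components = [stats.norm(scale=1.0),
                    stats.laplace(scale=1.0),
                    stats.gamma(a=0.5, scale=2.0)]  # folded to two sides below

      def frame_loglik(x):
          """Per-component mean log-likelihood of one frame of samples x."""
          return np.array([
              components[0].logpdf(x).mean(),
              components[1].logpdf(x).mean(),
              components[2].logpdf(np.abs(x)).mean() - np.log(2),  # two-sided Gamma
          ])

      def update_weights(w, loglik, eta=0.2):
          """Exponentiated-gradient step that keeps w on the simplex."""
          w = w * np.exp(eta * loglik)
          return w / w.sum()

      rng = np.random.default_rng(0)
      w = np.ones(3) / 3
      frame = rng.laplace(size=256)            # toy stand-in for a speech frame
      w = update_weights(w, frame_loglik(frame))
      print(w)                                 # weight drifts toward the Laplacian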

  16. General Linear Models: An Integrated Approach to Statistics

    Directory of Open Access Journals (Sweden)

    Andrew Faulkner

    2008-09-01

    Generally, in psychology, the various statistical analyses are taught independently from each other. As a consequence, students struggle to learn new statistical analyses in contexts that differ from their textbooks. This paper gives a short introduction to the general linear model (GLM), in which it is shown that ANOVA (one-way, factorial, repeated measures) and the analysis of covariance are simply special cases of multiple correlation/regression analysis (MCRA). Generalizations to other cases, such as multivariate and nonlinear analysis, are also discussed. It can easily be shown that every popular linear analysis can be derived from understanding MCRA.
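
    As a quick illustration of the record's central claim, the following sketch (synthetic data; statsmodels and scipy assumed available) shows that the one-way ANOVA F statistic equals the overall F test of an OLS regression on dummy-coded group membership.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from scipy import stats

      # Toy data: three groups, as in a one-way ANOVA.
      rng = np.random.default_rng(1)
      df = pd.DataFrame({
          "y": np.concatenate([rng.normal(m, 1.0, 20) for m in (0.0, 0.5, 1.0)]),
          "group": np.repeat(["a", "b", "c"], 20),
      })

      # Classical one-way ANOVA...
      f_anova, _ = stats.f_oneway(*(g["y"].values for _, g in df.groupby("group")))

      # ...gives the same F statistic as OLS regression on dummy-coded groups.
      fit = smf.ols("y ~ C(group)", data=df).fit()
      print(f_anova, fit.fvalue)   # identical up to floating-point error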

  17. Statistical skull models from 3D X-ray images

    CERN Document Server

    Berar, Maxime; Bailly, G.; Desvignes, Michel; Payan, Yohan

    2006-01-01

    We present two statistical models of the skull and mandible built upon an elastic registration method for 3D meshes. The aim of this work is to relate degrees of freedom of skull anatomy, as static relations are of main interest for anthropology and legal medicine. Statistical models can effectively provide reconstructions together with statistical precision. In our applications, patient-specific meshes of the skull and the mandible are high-density meshes extracted from 3D CT scans. All patient-specific meshes are registered in a subject-shared reference system using our 3D-to-3D elastic matching algorithm. Registration is based on minimizing a distance, defined on the vertices, between the high-density mesh and a shared low-density mesh, in a multi-resolution approach. A Principal Component Analysis is performed on the normalized registered data to build a statistical linear model of skull and mandible shape variation. The accuracy of the reconstruction is under a millimetre in the shape...

  18. On Wiener filtering and the physics behind statistical modeling.

    Science.gov (United States)

    Marbach, Ralf

    2002-01-01

    The closed-form solution of the so-called statistical multivariate calibration model is given in terms of the pure component spectral signal, the spectral noise, and the signal and noise of the reference method. The "statistical" calibration model is shown to be as much grounded on the physics of the pure component spectra as any of the "physical" models. There are no fundamental differences between the two approaches since both are merely different attempts to realize the same basic idea, viz., the spectrometric Wiener filter. The concept of the application-specific signal-to-noise ratio (SNR) is introduced, which is a combination of the two SNRs from the reference and the spectral data. Both are defined and the central importance of the latter for the assessment and development of spectroscopic instruments and methods is explained. Other statistics like the correlation coefficient, prediction error, slope deficiency, etc., are functions of the SNR. Spurious correlations and other practically important issues are discussed in quantitative terms. Most important, it is shown how to use a priori information about the pure component spectra and the spectral noise in an optimal way, thereby making the distinction between statistical and physical calibrations obsolete and combining the best of both worlds. Companies and research groups can use this article to realize significant savings in cost and time for development efforts.

  19. Statistical traffic modeling of MPEG frame size: Experiments and Analysis

    Directory of Open Access Journals (Sweden)

    Haniph A. Latchman

    2009-12-01

    For guaranteed quality of service (QoS) and sufficient bandwidth in a communication network providing integrated multimedia services, it is important to obtain an analytical and tractable model of the compressed MPEG data. This paper presents a statistical approach to a group-of-pictures (GOP) MPEG frame-size model to increase network traffic performance in a communication network. We extract MPEG frame data from commercial DVD movies and build probability histograms to analyze the statistical characteristics of the MPEG frame data. Six candidate probability distributions are considered, and their parameters are obtained from the empirical data using maximum likelihood estimation (MLE). This paper shows that the lognormal distribution is the best-fitting model for MPEG-2 total frame data.
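
    The distribution-fitting step described here is easy to sketch. Below is a minimal illustration, assuming frame sizes are available as a 1-D array and showing only three candidate families (the record does not list all six); scipy's fit performs the MLE, and the log-likelihoods can be compared across candidates.

      import numpy as np
      from scipy import stats

      # Stand-in data: the paper's DVD frame sizes are not available here.
      rng = np.random.default_rng(0)
      frame_sizes = rng.lognormal(mean=9.0, sigma=0.6, size=5000)

      # Three of the candidate families; scipy's fit() computes the MLE.
      candidates = {"lognormal": stats.lognorm,
                    "gamma": stats.gamma,
                    "weibull": stats.weibull_min}
      for name, dist in candidates.items():
          params = dist.fit(frame_sizes, floc=0)
          loglik = dist.logpdf(frame_sizes, *params).sum()
          print(f"{name:10s} log-likelihood = {loglik:.1f}")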

  20. Statistical 3D damage accumulation model for ion implant simulators

    CERN Document Server

    Hernandez-Mangas, J M; Enriquez, L E; Bailon, L; Barbolla, J; Jaraiz, M

    2003-01-01

    A statistical 3D damage accumulation model, based on the modified Kinchin-Pease formula, for ion implant simulation has been included in our physically based ion implantation code. It has only one fitting parameter for electronic stopping and uses 3D electron density distributions for different types of targets including compound semiconductors. Also, a statistical noise reduction mechanism based on the dose division is used. The model has been adapted to be run under parallel execution in order to speed up the calculation in 3D structures. Sequential ion implantation has been modelled including previous damage profiles. It can also simulate the implantation of molecular and cluster projectiles. Comparisons of simulated doping profiles with experimental SIMS profiles are presented. Also comparisons between simulated amorphization and experimental RBS profiles are shown. An analysis of sequential versus parallel processing is provided.

  2. Experimental, statistical, and biological models of radon carcinogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Cross, F.T.

    1991-09-01

    Risk models developed for underground miners have not been consistently validated in studies of populations exposed to indoor radon. Imprecision in risk estimates results principally from differences between exposures in mines as compared to domestic environments and from uncertainties about the interaction between cigarette-smoking and exposure to radon decay products. Uncertainties in extrapolating miner data to domestic exposures can be reduced by means of a broad-based health effects research program that addresses the interrelated issues of exposure, respiratory tract dose, carcinogenesis (molecular/cellular and animal studies, plus developing biological and statistical models), and the relationship of radon to smoking and other copollutant exposures. This article reviews experimental animal data on radon carcinogenesis observed primarily in rats at Pacific Northwest Laboratory. Recent experimental and mechanistic carcinogenesis models of exposures to radon, uranium ore dust, and cigarette smoke are presented with statistical analyses of animal data. 20 refs., 1 fig.

  3. Statistical model selection with “Big Data”

    Directory of Open Access Journals (Sweden)

    Jurgen A. Doornik

    2015-12-01

    Big Data offer potential benefits for statistical modelling, but confront problems including an excess of false positives, mistaking correlations for causes, ignoring sampling biases, and selecting by inappropriate methods. We consider the many important requirements when searching for a data-based relationship using Big Data, and the possible role of Autometrics in that context. Paramount considerations include embedding relationships in general initial models, possibly restricting the number of variables to be selected over by non-statistical criteria (the formulation problem); using good quality data on all variables, analyzed with tight significance levels by a powerful selection procedure, retaining available theory insights (the selection problem), while testing for relationships being well specified and invariant to shifts in explanatory variables (the evaluation problem); and using a viable approach that resolves the computational problem of immense numbers of possible models.

  4. A statistical model for characterization of histopathology images

    Science.gov (United States)

    Álvarez, Pablo; Castro, Guatizalema; Corredor, Germán; Romero, Eduardo

    2015-01-01

    Accessing information of interest in collections of histopathology images is a challenging task. To address this issue, previous works have designed search strategies based on keywords and low-level features. However, those methods have proven insufficient or impractical for this purpose. Alternative low-level features such as cell area, distance among cells, and cell density are directly associated with simple histological concepts and could serve as good descriptors. In this paper, a statistical model is adapted to represent the distribution of the areas occupied by cells, for use in whole histopathology image characterization. This novel descriptor facilitates the design of metrics based on distribution parameters and also provides new elements for better image understanding. The proposed model was validated using image processing and statistical techniques. Results showed low error rates, demonstrating the accuracy of the model.

  5. A greenhouse gas monitoring and modelling system for Switzerland: The CarboCount CH project

    Science.gov (United States)

    Brunner, Dominik; Buchmann, Nina; Eugster, Werner; Seneviratne, Sonia; Davin, Edouard; Gruber, Nicolas; Leuenberger, Markus; Bey, Isabelle; Bamberger, Ines; Henne, Stephan; Liu, Yu; Mystakidis, Stefanos; Oney, Brian; Roches, Anne

    2014-05-01

    CarboCount CH is a collaborative project of six research institutes in Switzerland. It investigates human-related emissions and natural exchange between the atmosphere and the biosphere of the two most important long-lived greenhouse gases carbon dioxide (CO2) and methane (CH4) at the regional scale with a special focus on Switzerland. For this purpose, four new measurement sites have been established including a 210 m tall tower at Beromünster, a water reservoir tower in flat terrain at Gimmiz, and two mountain sites at Lägern (856 m a.s.l.) and Früebüel (977 m a.s.l.). All sites were equipped with high-precision instruments for continuous measurements of CO2, CH4, and partially CO. The continuous CO measurements as well as bi-weekly 14CO2 samples at the tall tower site help to distinguish between anthropogenic and biogenic contributions to the observed CO2 concentrations. All data are transferred to the central processing facility at Empa where the calibrated data are uploaded to a database and made remotely accessible to all partners. The network is complemented by flux measurements of the Swiss Fluxnet network and other existing sites with CO2 and/or CH4 measurements including the high altitude GAW site Jungfraujoch. The four CarboCount CH sites have been operating reliably and almost continuously for more than one year now. For data interpretation and top-down flux estimation, two separate atmospheric transport and inverse modeling systems are being developed within the project. The first one uses the new tracer transport module of the regional numerical weather prediction model COSMO together with an Ensemble Kalman filter scheme. The second framework is based on backward simulations with the Lagrangian transport model FLEXPART-COSMO. Anthropogenic a priori emissions are obtained from newly developed high-resolution (500 m x 500 m) inventories of diurnally and seasonally varying CO2 and CH4 emissions in Switzerland, merged with European and global

  6. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the Functional Mock-up Interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.

  7. Improved head-driven statistical models for natural language parsing

    Institute of Scientific and Technical Information of China (English)

    袁里驰

    2013-01-01

    Head-driven statistical models for natural language parsing are the most representative lexicalized syntactic parsing models, but they only utilize semantic dependency between words and do not incorporate other semantic information such as semantic collocation and semantic category. Some improvements on this distinctive parser are presented. Firstly, "valency" is an essential semantic feature of words: once the valency of a word is determined, its collocations are clear and the sentence structure can be directly derived. Thus, a syntactic parsing model combining valency structure with semantic dependency is proposed on the basis of head-driven statistical syntactic parsing models. Secondly, semantic role labeling (SRL) is necessary for deep natural language processing, so an integrated parsing approach is proposed to integrate semantic parsing into the syntactic parsing process. Experiments are conducted for the refined statistical parser. The results show that 87.12% precision and 85.04% recall are obtained, and the F-measure is improved by 5.68% compared with the head-driven parsing model introduced by Collins.

  8. Counting carbohydrates

    Science.gov (United States)

    Carb counting; carbohydrate-controlled diet; diabetic diet; diabetes - counting carbohydrates ... Many foods contain carbohydrates (carbs), including: fruit and fruit juice; cereal, bread, pasta, and rice; milk and milk products, soy milk; beans, legumes, ...

  9. Seal Counts

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Database of seal counts from aerial photography. Counts by image, site, species, and date are stored in the database along with information on entanglements and...

  10. Platelet Count

    Science.gov (United States)

    ... their spleen removed surgically; use of birth control pills (oral contraceptives). Some conditions may cause a temporary (transitory) increased ... increased platelet counts include estrogen and birth control pills (oral contraceptives). Mildly decreased platelet counts may be seen in ...

  11. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    Science.gov (United States)

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
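
    For reference, the two preferred monthly statistics are simple to compute; the sketch below, on invented observed/predicted runoff values, shows the Nash-Sutcliffe efficiency and the regression R².

      import numpy as np

      def nash_sutcliffe(obs, sim):
          """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean
          the model predicts no better than the observed mean."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # Invented monthly runoff depths (mm), for illustration only.
      observed = np.array([12.0, 30.5, 55.1, 40.2, 20.3, 8.7])
      predicted = np.array([10.2, 28.9, 60.4, 37.5, 22.8, 9.9])
      print(nash_sutcliffe(observed, predicted))
      print(np.corrcoef(observed, predicted)[0, 1] ** 2)   # regression R^2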

  12. Editorial to: Six papers on Dynamic Statistical Models

    DEFF Research Database (Denmark)

    2014-01-01

    The following six papers are based on invited lectures at the satellite meeting held at the University of Copenhagen before the 58th World Statistics Congress of the International Statistical Institute in Dublin in 2011. At the invitation of the Bernoulli Society, the satellite meeting ... areas working with frontier research topics in statistics for dynamic models. This issue of SJS contains a quite diverse collection of six papers from the conference: Spectral Estimation of Covolatility from Noisy Observations Using Local Weights (Markus Bibinger and Markus Reiß); One-Way Anova ... of Copenhagen Program of Excellence and Elsevier. We would also like to thank the authors for contributing interesting papers, the referees for their helpful reports, and the present and previous editors of SJS for their support of the publication of the papers from the satellite meeting.

  13. Physics-based statistical learning approach to mesoscopic model selection

    Science.gov (United States)

    Taverniers, Søren; Haut, Terry S.; Barros, Kipton; Alexander, Francis J.; Lookman, Turab

    2015-11-01

    In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various model complexities is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
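
    The selection principle described here, judging complexity by predictive error on held-out data rather than by training fit, can be illustrated generically. The sketch below (polynomial regression on toy data, not the paper's sGLE setting) picks model complexity by k-fold cross-validated error.

      import numpy as np

      # Generic illustration of the selection principle, not the paper's sGLE
      # code: choose model complexity by cross-validated predictive error.
      rng = np.random.default_rng(2)
      x = np.linspace(-1.0, 1.0, 60)
      y = np.sin(2 * x) + rng.normal(0.0, 0.2, x.size)   # toy "training" data

      def cv_mse(degree, k=5):
          """k-fold cross-validated mean squared error of a polynomial fit."""
          idx = rng.permutation(x.size)
          errs = []
          for fold in np.array_split(idx, k):
              train = np.setdiff1d(idx, fold)
              coef = np.polyfit(x[train], y[train], degree)
              errs.append(np.mean((np.polyval(coef, x[fold]) - y[fold]) ** 2))
          return np.mean(errs)

      for degree in (1, 3, 5, 9):
          print(degree, cv_mse(degree))   # held-out error favors moderate complexity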

  14. Nuclear EMC effect in non-extensive statistical model

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos

    2013-05-01

    In the present work, we attempt to describe the nuclear EMC effect by using the proton structure functions obtained from the non-extensive statistical quark model. We recall that this model has three fundamental variables: the temperature T, the radius, and the Tsallis parameter q. By combining different small changes in these variables, good agreement with the experimental data can be obtained. Another interesting point of the model is that it allows phenomenological interpretation, for instance, keeping q constant while changing the radius and the temperature, or changing the radius and q while keeping the temperature fixed.

  15. New statistical lattice model with double honeycomb symmetry

    Science.gov (United States)

    Naji, S.; Belhaj, A.; Labrim, H.; Bhihi, M.; Benyoussef, A.; El Kenz, A.

    2014-04-01

    Inspired by the connection between Lie symmetries and two-dimensional materials, we propose a new statistical lattice model based on a double hexagonal structure appearing in the G2 symmetry. We first construct an Ising-1/2 model, with spin values σ = ±1, exhibiting such a symmetry. The corresponding ground state shows ferromagnetic, antiferromagnetic, partial ferrimagnetic, and topological ferrimagnetic phases depending on the exchange couplings. Then, we examine the phase diagrams and the magnetization using the mean-field approximation (MFA). Among other results, it is suggested that the present model could be situated between systems involving the triangular and the single hexagonal lattice geometries.

  16. Statistical shape model with random walks for inner ear segmentation

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    Cochlear implants can restore hearing to completely or partially deaf patients. The intervention planning can be aided by providing a patient-specific model of the inner ear. Such a model has to be built from high-resolution images with accurate segmentations; thus, a precise segmentation is required. We propose a new framework for segmentation of micro-CT cochlear images using random walks combined with a statistical shape model (SSM). The SSM allows us to constrain the less contrasted areas and ensures valid inner ear shape outputs. Additionally, a topology preservation method is proposed...

  17. Think continuous: Markovian Gaussian models in spatial statistics

    CERN Document Server

    Simpson, Daniel; Rue, Håvard

    2011-01-01

    Gaussian Markov random fields (GMRFs) are frequently used as computationally efficient models in spatial statistics. Unfortunately, it has traditionally been difficult to link GMRFs with the more traditional Gaussian random field models, as the Markov property is difficult to deploy in continuous space. Following the pioneering work of Lindgren et al. (2011), we expound on the link between Markovian Gaussian random fields and GMRFs. In particular, we discuss the theoretical and practical aspects of fast computation with continuously specified Markovian Gaussian random fields, as well as the advantages they offer in terms of clear, parsimonious, and interpretable models of anisotropy and non-stationarity.

  18. Factors influencing adoption of farm management practices in three agrobiodiversity hotspots in India: an analysis using the Count Data Model

    Directory of Open Access Journals (Sweden)

    Prabhakaran T. Raghu

    2014-07-01

    Sustainable agricultural practices require, among other factors, adoption of improved nutrient management techniques, pest mitigation technology, and soil conservation measures. Such improved management practices can be tools for enhancing crop productivity. Data on micro-level farm management practices from developing countries are either scarce or unavailable, despite the importance of their policy implications with regard to resource allocation. The present study investigates the adoption of farm management practices and the factors influencing the adoption behavior of farm households in three agrobiodiversity hotspots in India: Kundra block in the Koraput district of Odisha, Meenangadi panchayat in the Wayanad district of Kerala, and Kolli Hills in the Namakkal district of Tamil Nadu. Information on farm management practices was collected from November 2011 to February 2012 from 3845 households, of which data from 2726 farm households were used for analysis. The three most popular farm management practices adopted by farmers include: application of chemical fertilizers, farmyard manure, and green manure for managing nutrients; application of chemical pesticides, inter-cropping, and mixed cropping for mitigating pests; and contour bunds, grass bunds, and trenches for soil conservation. A negative binomial count data regression model was used to estimate the factors influencing farmers' decision-making on farm management practices. The regression results indicate that receiving information from agricultural extension services is statistically significant and positively related to the adoption of farm management practices. Another key finding is the negative relationship between cultivation of local varieties and adoption of farm management practices.
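
    A negative binomial count regression of this kind can be sketched as follows; the covariate names and simulated data are invented for illustration and are not the study's survey fields (statsmodels assumed available).

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Invented covariates and simulated counts; these only mirror the shape
      # of the analysis (a count of adopted practices per household).
      rng = np.random.default_rng(42)
      n = 500
      df = pd.DataFrame({
          "extension_contact": rng.integers(0, 2, n),   # 1 = received advice
          "local_varieties": rng.integers(0, 2, n),     # 1 = grows local varieties
          "farm_size_ha": rng.gamma(2.0, 1.0, n),
      })
      mu = np.exp(-0.5 + 0.6 * df.extension_contact
                  - 0.3 * df.local_varieties + 0.1 * df.farm_size_ha)
      df["n_practices"] = rng.negative_binomial(2.0, (2.0 / (2.0 + mu)).to_numpy())

      model = smf.negativebinomial(
          "n_practices ~ extension_contact + local_varieties + farm_size_ha", df)
      print(model.fit(disp=0).summary())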

  19. The computational hardness of counting in two-spin models on d-regular graphs

    CERN Document Server

    Sly, Allan

    2012-01-01

    The class of two-spin systems contains several important models, including random independent sets and the Ising model of statistical physics. We show that for both the hard-core (independent set) model and the anti-ferromagnetic Ising model with arbitrary external field, it is NP-hard to approximate the partition function or approximately sample from the model on d-regular graphs when the model has non-uniqueness on the d-regular tree. Together with results of Jerrum-Sinclair, Weitz, and Sinclair-Srivastava-Thurley giving FPRAS's for all other two-spin systems except at the uniqueness threshold, this gives an almost complete classification of the computational complexity of two-spin systems on bounded-degree graphs. Our proof establishes that the normalized log-partition function of any two-spin system on bipartite locally tree-like graphs converges to a limiting "free energy density" which coincides with the (non-rigorous) Bethe prediction of statistical physics. We use this result to characterize the lo...

  20. Spatio-temporal statistical models with applications to atmospheric processes

    Energy Technology Data Exchange (ETDEWEB)

    Wikle, C.K.

    1996-12-31

    This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues considered have not been considered in the context of these methods and several open questions are posed. The first paper attempts to determine a means of characterizing the semiannual oscillation (SAO) spatial variation in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500 hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies. It was concluded that the mechanism for the SAO in the northern hemisphere is a result of land-sea contrasts. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in the lower stratosphere over the equatorial Pacific. Advanced cyclostationary time series techniques were used for the analysis. It was found that there are significant twice-yearly peaks in MRGW activity. Analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that attempts to consider the influence of both temporal and spatial variability. This method is mainly concerned with prediction in space and time, and provides a spatially descriptive and temporally dynamic model.

  2. Statistics of a neuron model driven by asymmetric colored noise.

    Science.gov (United States)

    Müller-Hansen, Finn; Droste, Felix; Lindner, Benjamin

    2015-02-01

    Irregular firing of neurons can be modeled as a stochastic process. Here we study the perfect integrate-and-fire neuron driven by dichotomous noise, a Markovian process that jumps between two states (i.e., possesses non-Gaussian statistics) and exhibits nonvanishing temporal correlations (i.e., represents a colored noise). Specifically, we consider asymmetric dichotomous noise with two different transition rates. Using a first-passage-time formulation, we derive exact expressions for the probability density and the serial correlation coefficient of the interspike interval (the time interval between two subsequent neural action potentials) and for the power spectrum of the spike train. Furthermore, we extend the model by including additional Gaussian white noise, and we give approximations for the interspike interval (ISI) statistics in this case. Numerical simulations are used to validate the exact analytical results for pure dichotomous noise, and to test the approximations of the ISI statistics when Gaussian white noise is included. The results may help to understand how correlations and asymmetry of noise and signals in nerve cells shape neuronal firing statistics.
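
    The model described here is straightforward to simulate. The following sketch (Euler stepping, arbitrary parameter values) integrates a perfect integrate-and-fire neuron driven by asymmetric two-state (telegraph) noise and estimates the ISI mean, coefficient of variation, and lag-1 serial correlation, i.e., the quantities the paper derives exactly.

      import numpy as np

      # Euler simulation with arbitrary parameters: perfect integrate-and-fire,
      # dV/dt = mu + eta(t), where eta jumps between +a and -b with switching
      # rates k_plus (out of the +a state) and k_minus (out of the -b state).
      rng = np.random.default_rng(3)
      dt, mu, a, b = 1e-3, 1.0, 0.8, 0.8
      k_plus, k_minus = 20.0, 5.0                 # asymmetric transition rates
      v, state, t, t_last, isis = 0.0, +1, 0.0, 0.0, []

      for _ in range(500_000):
          t += dt
          rate = k_plus if state == +1 else k_minus
          if rng.random() < rate * dt:            # two-state (telegraph) noise
              state = -state
          v += (mu + (a if state == +1 else -b)) * dt
          if v >= 1.0:                            # threshold crossing: spike
              isis.append(t - t_last)
              t_last, v = t, 0.0                  # reset

      isis = np.array(isis)
      print(isis.mean(), isis.std() / isis.mean())      # mean ISI and CV
      print(np.corrcoef(isis[:-1], isis[1:])[0, 1])     # lag-1 serial correlation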

  3. RANDOM SYSTEMS OF HARD PARTICLES: MODELS AND STATISTICS

    Institute of Scientific and Technical Information of China (English)

    Dietrich Stoyan

    2002-01-01

    This paper surveys models and statistical properties of random systems of hard particles. Such systems appear frequently in materials science, biology, and elsewhere. In mathematical-statistical investigations, simulations of such structures play an important role. In these simulations various methods and models are applied, namely the RSA model, sedimentation and collective rearrangement algorithms, molecular dynamics, and Monte Carlo methods such as the Metropolis-Hastings algorithm. The statistical description of real and simulated particle systems uses ideas of the mathematical theories of random sets and point processes. This leads to characteristics such as volume fraction or porosity, covariance, contact distribution functions, and specific connectivity number from the random set approach, and intensity, pair correlation function, and mark correlation functions from the point process approach. Some of them can be determined stereologically using planar sections, while others can only be obtained using three-dimensional data and 3D image analysis. They are valuable tools for fitting models to empirical data and, consequently, for understanding various materials, biological structures, porous media, and other practically important spatial structures.

  4. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Bayesian statistical methods are attracting rapidly growing interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs in the economic model.

  5. Statistical model of clutter suppression in tissue harmonic imaging

    Science.gov (United States)

    Yan, Xiang; Hamilton, Mark F.

    2011-01-01

    A statistical model is developed for the suppression of clutter in tissue harmonic imaging (THI). Tissue heterogeneity is modeled as a random phase screen that is characterized by its correlation length and variance. With the autocorrelation function taken to be Gaussian and for small variance, statistical solutions are derived for the mean intensities at the fundamental and second-harmonic frequencies in the field of a focused sound beam that propagates through the phase screen. The statistical solutions are verified by comparison with ensemble averaging of direct numerical simulations. The model demonstrates that THI reduces the aberration clutter appearing in the focal region regardless of the depth of the aberrating layer, with suppression of the clutter most effective when the layer is close to the source. The model is also applied to the reverberation clutter that is transmitted forward along the axis of the beam. As with aberration clutter, suppression of such reverberation clutter by THI is most pronounced when the tissue heterogeneity is located close to the source.

  6. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests on sandstones. The centroidal coordinates of pores and the statistical characteristics of pore distance, quantity, and size, together with their probability density functions, were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distributions of pore position, distance, and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on pore position and distribution that the series of random numbers defined. On the basis of this modelling, Brazilian split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure, and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, the failure mode of material elements, and the inosculation of failed elements.

  7. Real-Time Statistical Modeling of Blood Sugar.

    Science.gov (United States)

    Otoom, Mwaffaq; Alshraideh, Hussam; Almasaeid, Hisham M; López-de-Ipiña, Diego; Bravo, José

    2015-10-01

    Diabetes is a chronic disease that imposes various types of cost worldwide. One major challenge in the control of diabetes is the real-time determination of the proper insulin dose. In this paper, we develop a prototype for real-time blood sugar control, integrated with the cloud. Our system controls blood sugar by observing the blood sugar level and accordingly determining the appropriate insulin dose based on the patient's historical data, all in real time and automatically. To determine the appropriate insulin dose, we propose two statistical models for modeling blood sugar profiles, namely ARIMA and a Markov-based model. Our experiments evaluating the performance of the two models show that the ARIMA model outperforms the Markov-based model in terms of prediction accuracy.
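
    As a hedged sketch of the ARIMA alternative, the snippet below fits an ARIMA(1,1,1) to a synthetic glucose series and makes a one-step-ahead forecast; the order and the 5-minute sampling are assumptions for illustration, not the paper's fitted configuration.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      # Synthetic glucose readings; (1, 1, 1) and 5-minute sampling are
      # illustrative assumptions, not the paper's fitted configuration.
      rng = np.random.default_rng(7)
      glucose = 120 + np.cumsum(rng.normal(0.0, 3.0, 200))   # toy mg/dL series
      series = pd.Series(
          glucose, index=pd.date_range("2015-01-01", periods=200, freq="5min"))

      fit = ARIMA(series, order=(1, 1, 1)).fit()
      print(fit.forecast(steps=1))   # one-step-ahead prediction of next reading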

  8. Can spatial statistical river temperature models be transferred between catchments?

    Science.gov (United States)

    Jackson, Faye L.; Fryer, Robert J.; Hannah, David M.; Malcolm, Iain A.

    2017-09-01

    There has been increasing use of spatial statistical models to understand and predict river temperature (Tw) from landscape covariates. However, it is not financially or logistically feasible to monitor all rivers and the transferability of such models has not been explored. This paper uses Tw data from four river catchments collected in August 2015 to assess how well spatial regression models predict the maximum 7-day rolling mean of daily maximum Tw (Twmax) within and between catchments. Models were fitted for each catchment separately using (1) landscape covariates only (LS models) and (2) landscape covariates and an air temperature (Ta) metric (LS_Ta models). All the LS models included upstream catchment area and three included a river network smoother (RNS) that accounted for unexplained spatial structure. The LS models transferred reasonably to other catchments, at least when predicting relative levels of Twmax. However, the predictions were biased when mean Twmax differed between catchments. The RNS was needed to characterise and predict finer-scale spatially correlated variation. Because the RNS was unique to each catchment and thus non-transferable, predictions were better within catchments than between catchments. A single model fitted to all catchments found no interactions between the landscape covariates and catchment, suggesting that the landscape relationships were transferable. The LS_Ta models transferred less well, with particularly poor performance when the relationship with the Ta metric was physically implausible or required extrapolation outside the range of the data. A single model fitted to all catchments found catchment-specific relationships between Twmax and the Ta metric, indicating that the Ta metric was not transferable. These findings improve our understanding of the transferability of spatial statistical river temperature models and provide a foundation for developing new approaches for predicting Tw at unmonitored locations across

  9. Statistical mechanics models for multimode lasers and random lasers

    CERN Document Server

    Antenucci, F; Berganza, M Ibáñez; Marruzzo, A; Leuzzi, L

    2015-01-01

    We review recent statistical mechanical approaches to multimode laser theory. The theory has proved very effective in describing standard lasers. We discuss the mean-field theory of passive mode locking and developments based on Monte Carlo simulations and the cavity method to study the role of the frequency-matching condition. The status of a complete theory of multimode lasing in open and disordered cavities is discussed, and the derivation of the general statistical models in this framework is presented. When light propagates in a disordered medium, the system can be analyzed via the replica method. For high degrees of disorder and nonlinearity, a glassy behavior is expected at the lasing threshold, providing a suggestive link between glasses and photonics. We describe in detail the results for the general Hamiltonian model in the mean-field approximation and mention an available test for replica symmetry breaking from intensity spectra measurements. Finally, we summarize some perspectives still open for such...

  10. Passive Target Tracking Based on Current Statistical Model

    Institute of Scientific and Technical Information of China (English)

    DENG Xiao-long; XIE Jian-ying; YANG Yu-pu

    2005-01-01

    Bearing-only passive tracking is regarded as a hard nonlinear tracking problem for which no completely satisfactory solution exists so far. Based on the current statistical model, a novel solution to this problem utilizing the particle filter (PF) and the unscented Kalman filter (UKF) is proposed. The new solution adopts data fusion from two observers to increase the observability of passive tracking. It applies a residual resampling step to reduce the degeneracy of the PF, and it introduces Markov chain Monte Carlo (MCMC) methods to reduce the effect of "sample impoverishment". Based on the current statistical model, the EKF, the UKF, and particle filters with various proposal distributions are compared in passive tracking experiments with two observers. The simulation results demonstrate the good performance of the proposed filtering methods with these novel techniques.

  11. Statistical detection of structural damage based on model reduction

    Institute of Scientific and Technical Information of China (English)

    Tao YIN; Heung-fai LAM; Hong-ping ZHU

    2009-01-01

    This paper proposes a statistical method for damage detection based on the finite element (FE) model reduction technique, which utilizes measured modal data with a limited number of sensors. A deterministic damage detection process is formulated based on the model reduction technique. The probabilistic process is integrated into the deterministic damage detection process using a perturbation technique, resulting in a statistical structural damage detection method. This is achieved by deriving the first- and second-order partial derivatives of uncertain parameters, such as the elasticity of the damaged member, with respect to the measurement noise, which allows the expectation and covariance matrix of the uncertain parameters to be calculated. Besides the theoretical development, this paper reports numerical verification of the proposed method using a portal frame example and Monte Carlo simulation.

  12. Statistical inference to advance network models in epidemiology.

    Science.gov (United States)

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  13. Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.

    Directory of Open Access Journals (Sweden)

    John K Hillier

    Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth are fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.
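
    A minimal caricature of the preferred 'stochastic instability' idea, random multiplicative growth and shrinking superposed on exponential growth, is sketched below; parameters are arbitrary, and the snippet only demonstrates how such dynamics produce a right-skewed size-frequency distribution, not the paper's calibrated model.

      import numpy as np

      # Caricature only: heights grow exponentially on average while random
      # multiplicative gains and losses act at each step; parameters arbitrary.
      rng = np.random.default_rng(5)
      n_bedforms, n_steps, growth, noise = 10_000, 200, 0.01, 0.08
      h = np.full(n_bedforms, 1.0)                 # initial heights (m)
      for _ in range(n_steps):
          h *= np.exp(growth + noise * rng.normal(size=n_bedforms))

      print(np.percentile(h, [5, 50, 95]))         # right-skewed size distribution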

  14. Statistical Quark Model for the Nucleon Structure Function

    Science.gov (United States)

    Mirez, Carlos; Tomio, Lauro; Trevisan, Luis A.; Frederico, Tobias

    2009-06-01

    A statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d̄/ū, and also the ratio d/u, inside the nucleon. After adjusting a temperature parameter by the Gottfried sum rule violation, and chemical potentials by the valence up and down quark normalizations, the results are compared with the available experimental data.

  15. A statistical mechanics model of carbon nanotube macro-films

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Carbon nanotube macro-films are two-dimensional films with micrometer thickness and centimeter-by-centimeter in-plane dimensions. These carbon nanotube macroscopic assemblies have recently attracted significant attention from the materials and mechanics communities because they can be easily handled and tailored to meet specific engineering needs. This paper reports experimental methods for the preparation and characterization of single-walled carbon nanotube macro-films, and a statistical mechanics model on ...

  16. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Science.gov (United States)

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within-subject parameter variability (WSV) in pharmacometric models. WSV has previously been modeled successfully using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to further explore the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of the simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
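
    The IOV idea, a random effect redrawn at each occasion on top of the usual inter-individual variability, can be illustrated by simulation. The sketch below generates Poisson counts under invented parameter values; it is a simplified stand-in, not the published model code.

      import numpy as np

      # Invented parameter values; not the published model code. The log-rate
      # carries inter-individual variability (IIV) plus a random effect that
      # is redrawn at every occasion (the IOV idea, simplified).
      rng = np.random.default_rng(11)
      n_subjects, n_occasions, days_per_occ = 50, 6, 7
      tv_lambda, omega_iiv, omega_iov = 1.5, 0.4, 0.3

      counts = np.empty((n_subjects, n_occasions, days_per_occ), dtype=int)
      for i in range(n_subjects):
          eta_iiv = rng.normal(0.0, omega_iiv)        # fixed within a subject
          for occ in range(n_occasions):
              eta_iov = rng.normal(0.0, omega_iov)    # redrawn each occasion
              lam = tv_lambda * np.exp(eta_iiv + eta_iov)
              counts[i, occ] = rng.poisson(lam, days_per_occ)

      flat = counts.reshape(n_subjects, -1)
      # within-subject variance exceeds the Poisson expectation (the mean):
      print(flat.mean(axis=1).mean(), flat.var(axis=1).mean())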

  17. Physical-Statistical Model of Thermal Conductivity of Nanofluids

    Directory of Open Access Journals (Sweden)

    B. Usowicz

    2014-01-01

    A physical-statistical model for predicting the effective thermal conductivity of nanofluids is proposed. The volumetric unit of nanofluid in the model consists of solid, liquid, and gas particles and is treated as a system made up of regular geometric figures, spheres, filling the volumetric unit in layers. The model assumes that connections between layers of spheres and between neighbouring spheres within a layer are represented by serial and parallel connections of thermal resistors, respectively. The model is expressed in terms of the thermal resistance of nanoparticles and fluids and the multinomial distribution of particles in the nanofluid. Results for the predicted and measured effective thermal conductivity of several nanofluids (Al2O3/ethylene glycol-based and Al2O3/water-based; CuO/ethylene glycol-based and CuO/water-based; and TiO2/ethylene glycol-based) are presented. The physical-statistical model shows reasonably good agreement with the experimental results and gives more accurate predictions of the effective thermal conductivity of nanofluids than existing classical models.

  18. The Ising model in physics and statistical genetics.

    Science.gov (United States)

    Majewski, J; Li, H; Ott, J

    2001-10-01

    Interdisciplinary communication is becoming a crucial component of the present scientific environment. Theoretical models developed in diverse disciplines often may be successfully employed in solving seemingly unrelated problems that can be reduced to similar mathematical formulation. The Ising model has been proposed in statistical physics as a simplified model for analysis of magnetic interactions and structures of ferromagnetic substances. Here, we present an application of the one-dimensional, linear Ising model to affected-sib-pair (ASP) analysis in genetics. By analyzing simulated genetic data, we show that the simplified Ising model with only nearest-neighbor interactions between genetic markers has statistical properties comparable to much more complex algorithms used in genetic analysis, such as those implemented in the Allegro and Mapmaker-Sibs programs. We also adapt the model to include epistatic interactions and to demonstrate its usefulness in detecting modifier loci with weak individual genetic contributions. A reanalysis of data on type 1 diabetes detects several susceptibility loci not previously found by other methods of analysis.
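
    For orientation, the sketch below computes the free energy per site of the one-dimensional nearest-neighbour Ising model by the standard transfer-matrix method; the parameters are illustrative and unrelated to the ASP application.

    ```python
    # Minimal sketch: 1D nearest-neighbour Ising model via the transfer matrix.
    import numpy as np

    def free_energy_per_site(J, h, beta):
        spins = np.array([1.0, -1.0])
        # symmetric transfer matrix T[s, s'] with field shared between sites
        T = np.exp(beta * (J * np.outer(spins, spins)
                           + 0.5 * h * (spins[:, None] + spins[None, :])))
        lam_max = np.linalg.eigvalsh(T).max()
        # in the thermodynamic limit only the dominant eigenvalue survives
        return -np.log(lam_max) / beta

    print(free_energy_per_site(J=1.0, h=0.1, beta=1.0))
    ```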

  19. Statistical mechanics of the Huxley-Simmons model.

    Science.gov (United States)

    Caruel, M; Truskinovsky, L

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from the muscle power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.
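
    A minimal sketch of the statistical-mechanical viewpoint, with assumed parameter names and values rather than the paper's calibration: each HS element is a two-state (pre/post power stroke) unit in series with a linear spring, and equilibrium occupancies follow from Boltzmann weights.

    ```python
    # Minimal sketch: equilibrium occupancy and tension of independent
    # two-state HS-like elements held at elongation y (illustrative units).
    import numpy as np

    kB_T = 4.0    # pN*nm, thermal energy scale
    k    = 2.0    # pN/nm, stiffness of the elastic element
    a    = 10.0   # nm, power-stroke size
    v0   = 20.0   # pN*nm, energy bias of the pre-power-stroke state

    def post_stroke_fraction(y):
        e_pre  = 0.5 * k * y**2 + v0
        e_post = 0.5 * k * (y + a)**2
        w_pre, w_post = np.exp(-e_pre / kB_T), np.exp(-e_post / kB_T)
        return w_post / (w_pre + w_post)   # Boltzmann equilibrium

    def mean_tension(y):
        p = post_stroke_fraction(y)
        return (1 - p) * k * y + p * k * (y + a)

    for y in np.linspace(-15, 5, 5):
        print(f"y = {y:6.1f} nm, tension = {mean_tension(y):7.2f} pN")
    ```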

  2. A generalized statistical model for the size distribution of wealth

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find excellent agreement with the data, superior to that of any other model known in the literature.
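
    For reference, a minimal sketch of the κ-generalized survival function that underlies this line of work; the parameter values below are illustrative only.

    ```python
    # Minimal sketch of the kappa-generalized survival function.
    import numpy as np

    def exp_kappa(x, kappa):
        """kappa-deformed exponential; reduces to exp(x) as kappa -> 0."""
        return (np.sqrt(1.0 + (kappa * x)**2) + kappa * x)**(1.0 / kappa)

    def ccdf_kappa(x, alpha, beta, kappa):
        """P(X > x) = exp_kappa(-(x/beta)**alpha): exponential-like bulk,
        Pareto power-law tail with exponent alpha/kappa."""
        return exp_kappa(-(x / beta)**alpha, kappa)

    x = np.array([1e3, 1e4, 1e5, 1e6])
    print(ccdf_kappa(x, alpha=2.0, beta=2e4, kappa=0.75))
    ```

    The deformation is what makes one function fit both regimes: the bulk behaves like a stretched exponential, while the tail decays as a power law with exponent α/κ.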

  3. BOX-COX transformation and random regression models for fecal egg count data

    Directory of Open Access Journals (Sweden)

    Marcos Vinicius Silva

    2012-01-01

    Accurate genetic evaluation of livestock is based on appropriate modeling of phenotypic measurements. In ruminants, fecal egg count (FEC) is commonly used to measure resistance to nematodes. FEC values are not normally distributed, and logarithmic transformations have been used to achieve normality before analysis. However, the transformed data are often still not normally distributed, especially when the data are extremely skewed. A series of repeated FEC measurements may provide information about the population dynamics of a group or individual. A total of 6,375 FEC measures were obtained for 410 animals between 1992 and 2003 from the Beltsville Agricultural Research Center Angus herd. The original data were transformed using an extension of the Box-Cox transformation to approach normality and to estimate (co)variance components. We also propose using random regression models (RRM) for genetic and non-genetic studies of FEC. Phenotypes were analyzed using RRM and restricted maximum likelihood. Among the different orders of Legendre polynomials used, those with more parameters (order 4) fitted the FEC data best. The results indicate that the Box-Cox family of transformations was effective in reducing skewness and kurtosis and dramatically increased heritability estimates, and that FEC measurements obtained between weeks 12 and 26 of a 26-week experimental challenge period are genetically correlated.
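
    A minimal sketch of the transformation step alone (the study's pipeline, including the Box-Cox extension and the RRM fitting, is far richer): scipy's boxcox chooses the power parameter by maximum likelihood.

    ```python
    # Minimal sketch: Box-Cox transform of skewed, strictly positive data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    fec = rng.lognormal(mean=3.0, sigma=1.2, size=500) + 1.0  # skewed counts

    transformed, lam = stats.boxcox(fec)   # lambda chosen by max likelihood
    print(f"lambda = {lam:.3f}")
    print("skewness before:", stats.skew(fec).round(2),
          "after:", stats.skew(transformed).round(2))
    ```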

  4. A Statistical Model for In Vivo Neuronal Dynamics.

    Directory of Open Access Journals (Sweden)

    Simone Carlo Surace

    Single neuron models have a long tradition in computational neuroscience. Detailed biophysical models such as the Hodgkin-Huxley model, as well as simplified neuron models such as the class of integrate-and-fire models, relate the input current to the membrane potential of the neuron. These types of models have been extensively fitted to in vitro data, where the input current is controlled. They are, however, of little use when it comes to characterizing intracellular in vivo recordings, since the input to the neuron is not known. Here we propose a novel single neuron model that characterizes the statistical properties of in vivo recordings. More specifically, we propose a stochastic process where the subthreshold membrane potential follows a Gaussian process and the spike emission intensity depends nonlinearly on the membrane potential as well as the spiking history. We first show that the model has a rich dynamical repertoire, since it can capture arbitrary subthreshold autocovariance functions, firing-rate adaptation, as well as arbitrary shapes of the action potential. We then show that this model can be efficiently fitted to data without overfitting. We finally show that this model can be used to characterize, and therefore precisely compare, various intracellular in vivo recordings from different animals and experimental conditions.
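
    In the spirit of the proposed model, though with assumed functional forms rather than the paper's fitted ones, the sketch below drives spikes from an Ornstein-Uhlenbeck subthreshold potential (one example of a Gaussian process) through an exponential spike intensity.

    ```python
    # Minimal sketch: OU membrane potential + nonlinear spike intensity.
    import numpy as np

    rng = np.random.default_rng(2)
    dt, n_steps = 1e-3, 20000               # 20 s at 1 kHz
    tau, sigma, v_rest = 0.02, 3.0, -65.0   # OU time constant (s), sd (mV), rest
    v_th, beta, rate0 = -60.0, 0.5, 10.0    # intensity parameters (illustrative)

    v = np.empty(n_steps); v[0] = v_rest
    spikes = []
    for t in range(1, n_steps):
        v[t] = v[t-1] + (v_rest - v[t-1]) / tau * dt \
               + sigma * np.sqrt(2 * dt / tau) * rng.normal()
        lam = rate0 * np.exp(beta * (v[t] - v_th))   # exponential link
        if rng.random() < 1.0 - np.exp(-lam * dt):   # Bernoulli step of the
            spikes.append(t * dt)                    # conditional point process

    print(f"{len(spikes)} spikes in {n_steps * dt:.0f} s")
    ```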

  5. Spatio-energetic cross-talks in photon counting detectors: detector model and correlated Poisson data generator

    Science.gov (United States)

    Taguchi, Katsuyuki; Polster, Christoph; Lee, Okkyun; Kappler, Steffen

    2016-03-01

    An x-ray photon interacts with photon counting detectors (PCDs) and generates an electron charge cloud, or multiple clouds. The clouds (and thus the photon energy) may be split between two adjacent PCD pixels when the interaction occurs near pixel boundaries, producing a count at both of the two pixels. This is called double-counting with charge sharing. The output of an individual PCD pixel is a Poisson-distributed integer count; however, the outputs of adjacent pixels are correlated due to double-counting. The major problems are the lack of a detector noise model for the spatio-energetic crosstalk and the lack of an efficient simulation tool. Monte Carlo simulation can accurately simulate these phenomena and produce noisy data; however, it is not computationally efficient. In this study, we developed a new detector model and implemented it in an efficient software simulator which uses a Poisson random number generator to produce correlated noisy integer counts. The detector model takes the following effects into account: (1) detection efficiency and incomplete charge collection; (2) photoelectric effect with total absorption; (3) photoelectric effect with fluorescence x-ray emission and re-absorption; (4) photoelectric effect with fluorescence x-ray emission which leaves the PCD completely; and (5) electric noise. The model produced a total detector spectrum similar to previous MC simulation data. The model can be used to predict spectra and correlations under various settings. The simulated noisy data demonstrated the expected performance: (a) the data were integers; (b) the mean and covariance matrix were close to the target values; (c) noisy data generation was very efficient.
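
    The double-counting mechanism can be illustrated with a minimal sketch, not the published detector model: events landing near a pixel border are counted by both neighbouring pixels, so the two Poisson count streams acquire a positive covariance.

    ```python
    # Minimal sketch: correlated Poisson counts from shared boundary events.
    import numpy as np

    rng = np.random.default_rng(3)
    lam, p_share, n_frames = 100.0, 0.15, 200000

    n_events = rng.poisson(lam, size=n_frames)      # photons per frame
    n_shared = rng.binomial(n_events, p_share)      # split across the border
    n_solo_a = rng.binomial(n_events - n_shared, 0.5)
    counts_a = n_solo_a + n_shared                  # pixel A: own + shared
    counts_b = (n_events - n_shared - n_solo_a) + n_shared

    cov = np.cov(counts_a, counts_b)
    print("mean:", counts_a.mean().round(1), "covariance:", cov[0, 1].round(1))
    # off-diagonal covariance ~ lam * p_share, purely from double-counting
    ```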

  6. Calculation of statistical entropic measures in a model of solids

    CERN Document Server

    Sanudo, Jaime

    2012-01-01

    In this work, a one-dimensional model of crystalline solids based on the Dirac comb limit of the Kronig-Penney model is considered. From the wave functions of the valence electrons, we calculate a statistical measure of complexity and the Fisher-Shannon information for the lower-energy electronic bands appearing in the system. All these magnitudes exhibit an extremal value for solids having half-filled bands, a configuration where high conductivity is generally attained in real solids, as happens with the monovalent metals.

  7. Linguistically motivated statistical machine translation models and algorithms

    CERN Document Server

    Xiong, Deyi

    2015-01-01

    This book provides a wide variety of algorithms and models to integrate linguistic knowledge into Statistical Machine Translation (SMT). It helps advance conventional SMT to linguistically motivated SMT by enhancing the following three essential components: translation, reordering and bracketing models. It also serves the purpose of promoting the in-depth study of the impacts of linguistic knowledge on machine translation. Finally it provides a systematic introduction of Bracketing Transduction Grammar (BTG) based SMT, one of the state-of-the-art SMT formalisms, as well as a case study of linguistically motivated SMT on a BTG-based platform.

  8. Efficient Parallel Statistical Model Checking of Biochemical Networks

    Directory of Open Access Journals (Sweden)

    Paolo Ballarini

    2009-12-01

    We consider the problem of verifying stochastic models of biochemical networks against behavioral properties expressed in temporal logic terms. Exact probabilistic verification approaches, such as CSL/PCTL model checking, are undermined by a huge computational demand which rules them out for most real case studies. Less demanding approaches, such as statistical model checking, estimate the likelihood that a property is satisfied by sampling executions of the stochastic model. We propose a methodology for efficiently estimating the likelihood that an LTL property P holds for a stochastic model of a biochemical network. As with other statistical verification techniques, the methodology we propose uses a stochastic simulation algorithm for generating execution samples; however, there are three key aspects that improve the efficiency. First, the sample generation is driven by on-the-fly verification of P, which results in optimal overall simulation time. Second, the confidence interval estimation for the probability of P holding is based on an efficient variant of the Wilson method, which ensures faster convergence. Third, the whole methodology is designed in a parallel fashion, and a prototype software tool has been implemented that performs the sampling/verification process in parallel on an HPC architecture.
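
    A minimal sketch of the Wilson score interval mentioned above, applied to the estimated probability that a property holds given k satisfying runs out of n samples (the paper uses an efficient variant; this is the textbook form).

    ```python
    # Minimal sketch: Wilson score interval for a sampled probability.
    import math

    def wilson_interval(k, n, z=1.96):
        p_hat = k / n
        denom = 1.0 + z**2 / n
        centre = (p_hat + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    # e.g. property satisfied in 970 of 1000 simulated executions
    print(wilson_interval(970, 1000))
    ```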

  9. Non-gaussianity and Statistical Anisotropy in Cosmological Inflationary Models

    CERN Document Server

    Valenzuela-Toledo, Cesar A

    2010-01-01

    We study the statistical descriptors for some cosmological inflationary models that allow us to get large levels of non-gaussianity and violations of statistical isotropy. Basically, we study two different classes of models: a model that includes only scalar field perturbations, specifically a subclass of small-field slow-roll models of inflation with canonical kinetic terms, and models that admit both vector and scalar field perturbations. We study the former to show that it is possible to attain very high, including observable, values for the levels of non-gaussianity f_NL and τ_NL in the bispectrum B_ζ and trispectrum T_ζ of the primordial curvature perturbation ζ, respectively. Such a result is obtained by taking care of loop corrections in the spectrum P_ζ, the bispectrum B_ζ and the trispectrum T_ζ. Sizeable values for f_NL and τ_NL arise even if ζ is generated during inflation. For the latter we study the spectrum P_ζ, bispectrum B_ζ and trispectrum T_ζ ...

  10. Anyonic behavior of an intermediate-statistics fermion gas model.

    Science.gov (United States)

    Algin, Abdullah; Irk, Dursun; Topcu, Gozde

    2015-06-01

    We study the high-temperature behavior of an intermediate-statistics fermionic gas model whose quantum statistical properties enable us to effectively deduce the details of both the interaction among deformed (quasi)particles and their anyonic behavior. Starting with a deformed fermionic grand partition function, we calculate, in the thermodynamic limit, several thermostatistical functions of the model, such as the internal energy and the entropy, by means of a formalism of the fermionic q-calculus. For high temperatures, a virial expansion of the equation of state for the system is obtained in two and three dimensions, and the first five virial coefficients are derived in terms of the model deformation parameter q. From the effects of the fermionic deformation, it is found that the model parameter q interpolates completely between bosonlike and fermionic systems via the behavior of the third and fifth virial coefficients in both two and three spatial dimensions, and in addition it effectively characterizes the interaction among quasifermions. Our results reveal that the present deformed (quasi)fermion model could be very efficient and effective in accounting for the nonlinear behaviors in interacting composite particle systems.

  11. A statistical permafrost distribution model for the European Alps

    Directory of Open Access Journals (Sweden)

    L. Boeckli

    2011-05-01

    Permafrost distribution modeling in densely populated mountain regions is an important task for supporting infrastructure construction and for assessing climate change effects on permafrost and related natural systems. In order to analyze permafrost distribution and evolution on an Alpine-wide scale, one consistent model for the entire domain is needed.

    We present a statistical permafrost model for the entire Alps based on rock glacier inventories and rock surface temperatures. Starting from an integrated model framework, two different sub-models were developed, one for debris-covered areas (debris model) and one for steep rock faces (rock model). For the debris model, a generalized linear mixed-effect model (GLMM) was used to predict the probability of a rock glacier being intact as opposed to relict. The model is based on the explanatory variables mean annual air temperature (MAAT), potential incoming solar radiation (PISR) and the mean annual sum of precipitation (PRECIP), and achieves excellent discrimination (area under the receiver-operating characteristic, AUROC = 0.91). Surprisingly, the probability of a rock glacier being intact is positively associated with increasing PRECIP for given MAAT and PISR conditions. The rock model was calibrated with mean annual rock surface temperatures (MARST) and is based on MAAT and PISR. The linear regression achieves a root mean square error (RMSE) of 1.6 °C. The final model combines the two sub-models and accounts for the different scales used for model calibration. Further steps to transfer this model into a map-based product are outlined.
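
    A minimal sketch of the debris sub-model's flavour on simulated inputs (the paper fits a GLMM with random effects; plain logistic regression is shown here, and all coefficients and ranges are assumptions):

    ```python
    # Minimal sketch: logistic model for P(rock glacier is intact).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 2000
    maat = rng.uniform(-6, 3, n)         # mean annual air temperature, deg C
    pisr = rng.uniform(100, 350, n)      # potential incoming solar radiation
    precip = rng.uniform(500, 2500, n)   # mean annual precipitation, mm

    logit_p = -0.8 * maat - 0.01 * pisr + 0.001 * precip - 0.5  # assumed truth
    intact = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack([maat, pisr, precip]))
    fit = sm.GLM(intact, X, family=sm.families.Binomial()).fit()
    print(fit.params.round(3))   # recovers the assumed coefficients
    ```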

  12. The influence of design characteristics on statistical inference in nonlinear estimation: A simulation study based on survival data and hazard modeling

    DEFF Research Database (Denmark)

    Andersen, J.S.; Bedaux, J.J.M.; Kooijman, S.A.L.M.;

    2000-01-01

    This paper describes the influence of design characteristics on the statistical inference for an ecotoxicological hazard-based model using simulated survival data. The design characteristics of interest are the number and spacing of observations (counts) in time, the number and spacing of exposure...

  13. The Impact of Statistical Leakage Models on Design Yield Estimation

    Directory of Open Access Journals (Sweden)

    Rouwaida Kanj

    2011-01-01

    Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics are often of interest, such as the average leakage current or the average read delay. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable accurate modeling of such events. Extremely leaky devices can inflict functionality fails. The plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals. The implications of such approximations on tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective method for accurate statistical leakage modeling.
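
    As one example of a closed-form approximation of the kind reviewed here, the sketch below applies Fenton-Wilkinson moment matching, fitting a single lognormal to the exact mean and variance of the sum, and compares a tail probability against Monte Carlo. The abstract does not name this method, so it is shown purely as an illustration.

    ```python
    # Minimal sketch: Fenton-Wilkinson approximation of a sum of lognormals.
    import numpy as np
    from scipy.stats import lognorm

    rng = np.random.default_rng(5)
    mu = np.array([-1.0, -0.5, 0.2])     # log-domain means of 3 leaky devices
    sigma = np.array([0.8, 0.6, 1.0])    # log-domain standard deviations

    # exact mean and variance of the sum of independent lognormals
    m = np.exp(mu + sigma**2 / 2).sum()
    v = ((np.exp(sigma**2) - 1) * np.exp(2 * mu + sigma**2)).sum()

    # single lognormal with the same mean and variance
    sigma_fw = np.sqrt(np.log(1 + v / m**2))
    mu_fw = np.log(m) - sigma_fw**2 / 2

    samples = np.exp(rng.normal(mu[:, None], sigma[:, None],
                                size=(3, 100000))).sum(axis=0)
    print("MC P(sum > 10):", (samples > 10).mean())
    print("FW P(sum > 10):", lognorm(sigma_fw, scale=np.exp(mu_fw)).sf(10))
    ```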

  14. Statistical models of video structure for content analysis and characterization.

    Science.gov (United States)

    Vasconcelos, N; Lippman, A

    2000-01-01

    Content structure plays an important role in the understanding of video. In this paper, we argue that knowledge about structure can be used both as a means to improve the performance of content analysis and to extract features that convey semantic information about the content. We introduce statistical models for two important components of this structure, shot duration and activity, and demonstrate the usefulness of these models with two practical applications. First, we develop a Bayesian formulation for the shot segmentation problem that is shown to extend the standard thresholding model in an adaptive and intuitive way, leading to improved segmentation accuracy. Second, by applying the transformation into the shot duration/activity feature space to a database of movie clips, we also illustrate how the Bayesian model captures semantic properties of the content. We suggest ways in which these properties can be used as a basis for intuitive content-based access to movie libraries.

  15. Liver recognition based on statistical shape model in CT images

    Science.gov (United States)

    Xiang, Dehui; Jiang, Xueqing; Shi, Fei; Zhu, Weifang; Chen, Xinjian

    2016-03-01

    In this paper, an automatic method is proposed to recognize the liver in clinical 3D CT images. The proposed method makes effective use of a statistical shape model of the liver. Our approach consists of three main parts: (1) model training, in which shape variability is captured using principal component analysis of the manual annotations; (2) model localization, in which a fast Euclidean distance transformation based method localizes the liver in CT images; (3) liver recognition, in which the initial mesh is locally and iteratively adapted to the liver boundary, constrained by the trained shape model. We validate our algorithm on a dataset which consists of 20 3D CT images obtained from different patients. The average ARVD was 8.99%, the average ASSD was 2.69 mm, the average RMSD was 4.92 mm, the average MSD was 28.841 mm, and the average MSD was 13.31%.
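
    A minimal sketch of the model-training part (1): learning a statistical shape model, mean shape plus principal modes, from aligned landmark vectors via PCA. The data below are random stand-ins for annotated liver surfaces.

    ```python
    # Minimal sketch: PCA-based statistical shape model from landmark sets.
    import numpy as np

    rng = np.random.default_rng(6)
    n_shapes, n_landmarks = 20, 500
    shapes = rng.normal(size=(n_shapes, 3 * n_landmarks))  # aligned (x,y,z)

    mean_shape = shapes.mean(axis=0)
    X = shapes - mean_shape
    # PCA via SVD of the centred data matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    eigvals = s**2 / (n_shapes - 1)

    k = np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95) + 1  # 95% var
    modes = Vt[:k]                               # principal modes of variation
    b = rng.normal(scale=np.sqrt(eigvals[:k]))   # plausible shape coefficients
    new_shape = mean_shape + b @ modes           # model-constrained instance
    print(k, new_shape.shape)
    ```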

  16. Estimating Predictive Variance for Statistical Gas Distribution Modelling

    Science.gov (United States)

    Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo

    2009-05-01

    Recent publications in statistical gas distribution modelling have proposed algorithms that model the mean and variance of a distribution. This paper argues that estimating the predictive concentration variance is not merely a gradual improvement but rather a significant step to advance the field. This is, first, because such models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, because estimating the predictive variance makes it possible to evaluate the model quality in terms of the data likelihood. This offers a solution to the problem of ground truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta-parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.

  17. Statistical Inference for Partially Linear Regression Models with Measurement Errors

    Institute of Scientific and Technical Information of China (English)

    Jinhong YOU; Qinfeng XU; Bin ZHOU

    2008-01-01

    In this paper, the authors investigate three aspects of statistical inference for partially linear regression models where some covariates are measured with errors. Firstly, a bandwidth selection procedure is proposed, which is a combination of the difference-based technique and the GCV method. Secondly, a goodness-of-fit test procedure is proposed, which is an extension of the generalized likelihood technique. Thirdly, a variable selection procedure for the parametric part is provided based on nonconcave penalization and corrected profile least squares. As in "Variable selection via nonconcave penalized likelihood and its oracle properties" (J. Amer. Statist. Assoc., 96, 2001, 1348-1360), it is shown that the resulting estimator has an oracle property with a proper choice of regularization parameters and penalty function. Simulation studies are conducted to illustrate the finite sample performance of the proposed procedures.

  18. The Statistical Multifragmentation Model with Skyrme Effective Interactions

    CERN Document Server

    Souza, S R; Donangelo, R; Lynch, W G; Steiner, A W; Tsang, M B

    2009-01-01

    The Statistical Multifragmentation Model is modified to incorporate the Helmholtz free energies calculated in the finite temperature Thomas-Fermi approximation using Skyrme effective interactions. In this formulation, the density of the fragments at the freeze-out configuration corresponds to the equilibrium value obtained in the Thomas-Fermi approximation at the given temperature. The behavior of the nuclear caloric curve at constant volume is investigated in the micro-canonical ensemble and a plateau is observed for excitation energies between 8 and 10 MeV per nucleon. A kink in the caloric curve is found at the onset of this gas transition, indicating the existence of a small excitation energy region with negative heat capacity. In contrast to previous statistical calculations, this situation takes place even in this case in which the system is constrained to fixed volume. The observed phase transition takes place at approximately constant entropy. The charge distribution and other observables also turn ou...

  19. The Statistical Multifragmentation Model with Skyrme Effective Interactions

    CERN Document Server

    Carlson, B V; Donangelo, R; Lynch, W G; Steiner, A W; Tsang, M B

    2010-01-01

    The Statistical Multifragmentation Model is modified to incorporate Helmholtz free energies calculated in the finite temperature Thomas-Fermi approximation using Skyrme effective interactions. In this formulation, the density of the fragments at the freeze-out configuration corresponds to the equilibrium value obtained in the Thomas-Fermi approximation at the given temperature. The behavior of the nuclear caloric curve, at constant volume, is investigated in the micro-canonical ensemble and a plateau is observed for excitation energies between 8 and 10 MeV per nucleon. A small kink in the caloric curve is found at the onset of this gas transition, indicating the existence of negative heat capacity, even in this case in which the system is constrained to a fixed volume, in contrast to former statistical calculations.

  20. WE-A-201-02: Modern Statistical Modeling.

    Science.gov (United States)

    Niemierko, A

    2016-06-01

    Chris Marshall: Memorial Introduction Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the "big tent" vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that "Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]". Don developed an interest in chemistry at school by "reading a book" - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University) where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL, where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit, working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus. Don served for several years as a consultant to the Nuclear Regulatory

  1. Modeling phenotypic plasticity in growth trajectories: a statistical framework.

    Science.gov (United States)

    Wang, Zhong; Pang, Xiaoming; Wu, Weimiao; Wang, Jianxin; Wang, Zuoheng; Wu, Rongling

    2014-01-01

    Phenotypic plasticity, that is, multiple phenotypes produced by a single genotype in response to environmental change, has been thought to play an important role in evolution and speciation. Historically, knowledge about phenotypic plasticity has resulted from the analysis of static traits measured at a single time point. New insight into the adaptive nature of plasticity can be gained by an understanding of how organisms alter their developmental processes in a range of environments. Recent advances in statistical modeling of functional data and developmental genetics allow us to construct a dynamic framework of plastic response in developmental form and pattern. Under this framework, development, genetics, and evolution can be synthesized through statistical bridges to better address how evolution results from phenotypic variation in the process of development via genetic alterations.

  2. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
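
    A minimal sketch of the additive Holt-Winters component on a synthetic residual signal (not orbital data); it uses statsmodels' ExponentialSmoothing, which is an assumption about tooling, not the authors' implementation.

    ```python
    # Minimal sketch: additive Holt-Winters forecast of a periodic residual.
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(7)
    t = np.arange(400)
    period = 50
    # residual between "truth" and a simplified analytical propagation:
    residual = 0.02 * t + np.sin(2 * np.pi * t / period) \
               + 0.1 * rng.normal(size=t.size)

    fit = ExponentialSmoothing(residual, trend="add",
                               seasonal="add", seasonal_periods=period).fit()
    forecast = fit.forecast(100)   # predict the missing dynamics ahead
    print(forecast[:5].round(3))
    ```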

  3. Statistical Process Control of a Kalman Filter Model

    Science.gov (United States)

    Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A.

    2014-01-01

    For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters, and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. The statistical tests include, beside standard tests, the convergence of the standard deviations of the system state components and the normal distribution of residuals. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations. PMID:25264959
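
    One common statistical test on a Kalman filter, sketched minimally below for a scalar random-walk model: the normalized innovation squared should be chi-square distributed if the filter model is consistent. This is an illustration of the testing idea, not the procedures of the paper.

    ```python
    # Minimal sketch: 1D Kalman filter with a chi-square innovation test.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(8)
    q, r = 0.01, 0.25                  # process and measurement noise variances
    x_true, x_est, p_est = 0.0, 0.0, 1.0
    nis = []
    for _ in range(500):
        x_true += rng.normal(0, np.sqrt(q))       # random-walk state
        z = x_true + rng.normal(0, np.sqrt(r))    # noisy measurement
        p_pred = p_est + q                        # predict
        s = p_pred + r                            # innovation variance
        innov = z - x_est
        nis.append(innov**2 / s)                  # normalized innovation squared
        k_gain = p_pred / s                       # update
        x_est += k_gain * innov
        p_est = (1 - k_gain) * p_pred

    lo, hi = chi2.ppf([0.025, 0.975], df=1)
    frac_in = np.mean([(lo < v < hi) for v in nis])
    print(f"fraction of NIS inside 95% bounds: {frac_in:.2f}")  # ~0.95
    ```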

  5. A Statistical Quality Model for Data-Driven Speech Animation.

    Science.gov (United States)

    Ma, Xiaohan; Deng, Zhigang

    2012-11-01

    In recent years, data-driven speech animation approaches have achieved significant successes in terms of animation quality. However, how to automatically evaluate the realism of novel synthesized speech animations has been an important yet unsolved research problem. In this paper, we propose a novel statistical model (called SAQP) to automatically predict the quality of speech animations synthesized on the fly by various data-driven techniques. Its essential idea is to construct a phoneme-based Speech Animation Trajectory Fitting (SATF) metric to describe speech animation synthesis errors and then build a statistical regression model to learn the association between the obtained SATF metric and the objective speech animation synthesis quality. Through delicately designed user studies, we evaluate the effectiveness and robustness of the proposed SAQP model. To the best of our knowledge, this work is the first-of-its-kind quantitative quality model for data-driven speech animation. We believe it is an important first step toward removing a critical technical barrier to applying data-driven speech animation techniques in numerous online or interactive talking avatar applications.

  6. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as better scheduling of fossil-fuelled power plants and a better position on electricity spot markets. In this paper, prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares, and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
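
    A minimal sketch of the recursive least squares idea with a forgetting factor, correcting a (simulated) NWP wind prediction against local observations; the regression structure and numbers are illustrative assumptions.

    ```python
    # Minimal sketch: adaptive MOS correction via recursive least squares.
    import numpy as np

    rng = np.random.default_rng(9)

    def rls_step(theta, P, x, y, lam=0.99):
        """One RLS update with forgetting factor lam."""
        Px = P @ x
        k = Px / (lam + x @ Px)                # gain vector
        theta = theta + k * (y - x @ theta)    # coefficient update
        P = (P - np.outer(k, Px)) / lam        # covariance update
        return theta, P

    theta, P = np.zeros(2), np.eye(2) * 100.0
    for _ in range(1000):
        w_nwp = rng.uniform(2, 15)                       # NWP wind speed
        w_obs = 0.8 * w_nwp + 1.5 + rng.normal(0, 0.5)   # local observation
        x = np.array([w_nwp, 1.0])                       # [prediction, offset]
        theta, P = rls_step(theta, P, x, w_obs)

    print(theta.round(2))   # should approach [0.8, 1.5]
    ```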

  8. A statistical model for interpreting computerized dynamic posturography data

    Science.gov (United States)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.

  9. Reference analysis of the signal + background model in counting experiments II. Approximate reference prior

    CERN Document Server

    Casadei, Diego

    2014-01-01

    The objective Bayesian treatment of a model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered. It is shown that the reference prior for the parameter of interest (the signal intensity) is well approximated by the widely (ab)used flat prior only when the expected background is very high. For a large portion of the background parameter space, a very simple approximation (the asymptotic form of the reference prior in the limit of perfect prior background knowledge) can be safely used. In all cases, this approximation outperforms the uniform prior. When the asymptotic prior is not good enough, a simple 1-parameter fitting function is often sufficient to obtain an objective Bayesian solution. Otherwise, it is shown that a 2-parameter fitting function is able to reproduce the reference prior in all other cases. The latter is also useful for speeding up the computing time, which can be useful in a...
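
    For context, a minimal sketch of the flat-prior baseline that the reference prior is compared against: with known background b and n observed counts, the posterior for the signal intensity s is proportional to the Poisson likelihood.

    ```python
    # Minimal sketch: flat-prior posterior for a Poisson counting experiment.
    import numpy as np
    from scipy.stats import poisson

    def posterior_flat_prior(s_grid, n, b):
        like = poisson.pmf(n, s_grid + b)        # L(s) = Pois(n | s + b)
        return like / np.trapz(like, s_grid)     # flat prior: posterior ∝ L

    s = np.linspace(0, 30, 3001)
    post = posterior_flat_prior(s, n=5, b=3.2)

    # 95% credible upper limit on the signal intensity
    cdf = np.cumsum(post) * (s[1] - s[0])
    print("95% upper limit on s:", s[np.searchsorted(cdf, 0.95)].round(2))
    ```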

  11. RM-structure alignment based statistical machine translation model

    Institute of Scientific and Technical Information of China (English)

    Sun Jiadong; Zhao Tiejun

    2008-01-01

    A novel model based on structure alignments is proposed for statistical machine translation in this paper. The meta-structure and the sequence of meta-structures for a parse tree are defined. During the translation process, a parse tree is decomposed to deal with structure divergence, and alignments can be constructed at different levels of recombination of meta-structure (RM). This method can perform structure mapping across sub-tree structures between languages. As a result, we get not only the translation for the target language, but also the sequence of meta-structures of its parse tree at the same time. Experiments show that the model, in the framework of a log-linear model, has better generative ability and significantly outperforms Pharaoh, a phrase-based system.

  12. STATISTICAL ANALYSIS OF THE TM- MODEL VIA BAYESIAN APPROACH

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2012-11-01

    The method of paired comparisons calls for the comparison of treatments presented in pairs to judges who prefer the better one based on their sensory evaluations. Thurstone (1927) and Mosteller (1951) employ the method of maximum likelihood to estimate the parameters of the Thurstone-Mosteller model for paired comparisons. A Bayesian analysis of the said model using the non-informative reference (Jeffreys) prior is presented in this study. The posterior estimates (means and joint modes) of the parameters and the posterior probabilities comparing the two parameters are obtained for the analysis. The predictive probabilities that one treatment (Ti) is preferred to any other treatment (Tj) in a future single comparison are also computed. In addition, graphs of the marginal posterior distributions of the individual parameters are drawn. The appropriateness of the model is also tested using the chi-square test statistic.

  13. Dynamic statistical models of biological cognition: insights from communications theory

    Science.gov (United States)

    Wallace, Rodrick

    2014-10-01

    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  14. Exploiting linkage disequilibrium in statistical modelling in quantitative genomics

    DEFF Research Database (Denmark)

    Wang, Lei

    Alleles at two loci are said to be in linkage disequilibrium (LD) when they are correlated or statistically dependent. Genomic prediction and gene mapping rely on the existence of LD between genetic markers and causal variants of complex traits. In the first part of the thesis, a novel method to quantify and visualize local variation in LD along chromosomes is described, and applied to characterize LD patterns at the local and genome-wide scale in three Danish pig breeds. In the second part, different ways of taking LD into account in genomic prediction models are studied. One approach is to use the recently proposed antedependence models, which treat neighbouring marker effects as correlated; another approach involves the use of haplotype block information derived using the program Beagle. The overall conclusion is that taking LD information into account in genomic prediction models potentially improves...
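
    For readers new to LD, a minimal sketch of the classic r² measure between two biallelic loci, computed from (randomly generated) phased haplotypes:

    ```python
    # Minimal sketch: pairwise linkage disequilibrium D and r^2.
    import numpy as np

    rng = np.random.default_rng(10)
    n = 1000
    # correlated haplotypes: the allele at locus B often copies locus A
    a = rng.binomial(1, 0.4, n)
    b = np.where(rng.random(n) < 0.8, a, rng.binomial(1, 0.4, n))

    p_a, p_b = a.mean(), b.mean()
    p_ab = np.mean((a == 1) & (b == 1))
    D = p_ab - p_a * p_b                      # disequilibrium coefficient
    r2 = D**2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
    print(f"D = {D:.3f}, r^2 = {r2:.3f}")
    ```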

  15. Modeling, dependence, classification, united statistical science, many cultures

    CERN Document Server

    Parzen, Emanuel

    2012-01-01

    Breiman (2001) proposed to statisticians awareness of two cultures: 1. Parametric modeling culture, pioneered by R.A. Fisher and Jerzy Neyman; 2. Algorithmic predictive culture, pioneered by machine learning research. Parzen (2001), as part of discussing Breiman (2001), proposed that researchers be aware of many cultures, including the focus of our research: 3. Nonparametric, quantile-based, information-theoretic modeling. Our research seeks to unify statistical problem solving in terms of comparison density, copula density, measures of dependence, correlation, information, and new measures (called LP score comoments) that apply to long-tailed distributions without finite second-order moments. A very important goal is to unify methods for discrete and continuous random variables. We are actively developing these ideas, which have a history of many decades, since Parzen (1979, 1983) and Eubank et al. (1987). Our research extends these methods to modern high-dimensional data modeling.

  16. Discrete dynamical models: combinatorics, statistics and continuum approximations

    CERN Document Server

    Kornyak, Vladimir V

    2015-01-01

    This essay advocates the view that any problem that has meaningful empirical content can be formulated in constructive, more definitely, finite terms. We consider combinatorial models of dynamical systems and approaches to the statistical description of such models. We demonstrate that many concepts of continuous physics, such as continuous symmetries, the principle of least action, Lagrangians, and deterministic evolution equations, can be obtained from combinatorial structures as a result of the large-number approximation. We propose a constructive description of quantum behavior that provides, in particular, a natural explanation of the appearance of complex numbers in the formalism of quantum mechanics. Some approaches to the construction of discrete models of quantum evolution that involve gauge connections are discussed.

  17. Statistical mechanics of Monod-Wyman-Changeux (MWC) models.

    Science.gov (United States)

    Marzen, Sarah; Garcia, Hernan G; Phillips, Rob

    2013-05-13

    The 50th anniversary of the classic Monod-Wyman-Changeux (MWC) model provides an opportunity to survey the broader conceptual and quantitative implications of this quintessential biophysical model. With the use of statistical mechanics, the mathematical implementation of the MWC concept links problems that seem otherwise to have no ostensible biological connection, including ligand-receptor binding, ligand-gated ion channels, chemotaxis, chromatin structure and gene regulation. Hence, a thorough mathematical analysis of the MWC model can illuminate the performance limits of a number of unrelated biological systems in one stroke. The goal of our review is twofold. First, we describe in detail the general physical principles that are used to derive the activity of MWC molecules as a function of their regulatory ligands. Second, we illustrate the power of ideas from information theory and dynamical systems for quantifying how well the output of MWC molecules tracks their sensory input, giving a sense of the "design" constraints faced by these receptors.
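
    A minimal sketch of the central formula of this framework, the MWC activity curve: the probability of the active conformation as a function of ligand concentration. Parameter names and values are illustrative.

    ```python
    # Minimal sketch: two-state MWC activity as a function of ligand c.
    import numpy as np

    def mwc_active_prob(c, n=4, K_A=1.0, K_I=100.0, delta_eps=5.0):
        """c: ligand concentration; K_A, K_I: dissociation constants in the
        active/inactive states; delta_eps: inactive-active energy gap (kT)."""
        active = (1 + c / K_A) ** n
        inactive = np.exp(delta_eps) * (1 + c / K_I) ** n
        return active / (active + inactive)

    for c in [0.01, 0.1, 1.0, 10.0, 100.0]:
        print(f"c = {c:>6}: p_active = {mwc_active_prob(c):.3f}")
    ```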

  18. A new modeling and simulation method for important statistical performance prediction of single photon avalanche diode detectors

    Science.gov (United States)

    Xu, Yue; Xiang, Ping; Xie, Xiaopeng; Huang, Yang

    2016-06-01

    This paper presents a new modeling and simulation method to predict the important statistical performance of single photon avalanche diode (SPAD) detectors, including photon detection efficiency (PDE), dark count rate (DCR) and afterpulsing probability (AP). Three local electric field models are derived for the PDE, DCR and AP calculations, which show analytical dependence of key parameters such as avalanche triggering probability, impact ionization rate and electric field distributions that can be directly obtained from Geiger mode Technology Computer Aided Design (TCAD) simulation. The model calculation results are proven to be in good agreement with the reported experimental data in the open literature, suggesting that the proposed modeling and simulation method is very suitable for the prediction of SPAD statistical performance.

  19. Symmetry Energy Effects in a Statistical Multifragmentation Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lei; GAO Yuan; ZHANG Hong-Fei; CHEN Xi-Meng; YU Mei-Ling; LI Jun-Qing

    2011-01-01

    The symmetry energy effects on the nuclear disintegration mechanisms of the neutron-rich system (A0 = 200, Z0 = 78) are studied in the framework of the statistical multifragmentation model (SMM) within its micro-canonical ensemble. A modified symmetry energy term taking the volume and surface asymmetry into consideration is adopted instead of the original fixed value in the standard SMM model. The results indicate that, when the volume and surface asymmetries are considered, the neutron-rich system shifts from evaporation to a fission-like process earlier, at lower excitation energies, than in the original standard SMM model, and its mass distribution has larger probabilities in the medium-heavy nuclei range, so that the system breaks up more evenly. When the excitation energy becomes higher, the volume and surface asymmetry lead to a smaller average multiplicity.

  20. Masked areas in shear peak statistics. A forward modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-03-09

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.