WorldWideScience

Sample records for channel statistical model

  1. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. The statistical multipath parameters such as path arrival time, magnitude and interval for each category are analyzed to build the model. Each generated channel based on the proposed statistical model represents a different realisation of a PLC network. Simulation results in both the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, presents PLC channel features similar to those of the underlying transmission line theory model. Furthermore, two potential application scenarios are d...
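
    The fitted arrival-rate, magnitude and decay parameters are not given in the record above, but the generation step it describes can be sketched as follows (Python/NumPy; every numeric value below is an illustrative placeholder, not a value from the paper): draw Poisson path arrivals, attach decaying random magnitudes and random phases, and evaluate the resulting frequency response.

```python
import numpy as np

def draw_plc_channel(rate=5e6, decay=2e-7, t_max=2e-6, rng=None):
    """Draw one multipath channel realisation: Poisson path arrivals,
    exponentially decaying mean magnitude, uniform random phase.
    All parameter values here are illustrative, not from the paper."""
    rng = np.random.default_rng(rng)
    arrivals, t = [], 0.0
    while True:                                   # Poisson arrivals until t_max
        t += rng.exponential(1.0 / rate)
        if t > t_max:
            break
        arrivals.append(t)
    arrivals = np.asarray(arrivals)
    # Path magnitudes decay with delay; log-normal spread around the mean.
    mags = np.exp(-arrivals / decay) * rng.lognormal(0.0, 0.5, arrivals.size)
    phases = rng.uniform(0.0, 2.0 * np.pi, arrivals.size)
    return arrivals, mags * np.exp(1j * phases)

delays, gains = draw_plc_channel(rng=1)
# Frequency response on a 1-30 MHz grid (a typical broadband PLC band).
f = np.linspace(1e6, 30e6, 512)
H = (gains[None, :] * np.exp(-2j * np.pi * f[:, None] * delays[None, :])).sum(axis=1)
```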

  2. Statistic Model Based Dynamic Channel Compensation for Telephony Speech Recognition

    Institute of Scientific and Technical Information of China (English)

    ZHANG Huayun; HAN Zhaobing; XU Bo

    2004-01-01

    The degradation of speech recognition performance in real-life environments and over transmission channels is a major obstacle for many speech-based applications around the world, especially when nonstationary noise and changing channels exist. Previous works have shown that the main reason for this performance degradation is the variational mismatch caused by different telephone channels between the testing and training sets. In this paper, we propose a statistical-model-based implementation to dynamically compensate this mismatch. Firstly, we focus on a Maximum-likelihood (ML) estimation algorithm for telephone channels. In experiments on Mandarin Large vocabulary continuous speech recognition (LVCSR) over telephone lines, the Character error rate (CER) decreases by more than 20%. The average delay is about 300-400 ms. Secondly, we extend it by introducing a phone-conditioned prior statistical model for the channels and applying the Maximum a posteriori (MAP) estimation technique. Compared to the ML-based method, the MAP-based algorithm tracks the variations within channels more effectively. The average delay of the algorithm is decreased to 200 ms. An additional 7-8% relative CER reduction is observed in LVCSR.

  3. A new simple model for composite fading channels: Second order statistics and channel capacity

    KAUST Repository

    Yilmaz, Ferkan

    2010-09-01

    In this paper, we introduce the most general composite fading distribution to model the envelope and the power of the received signal in fading channels such as millimeter-wave (60 GHz and above) and free-space optical channels, which we term the extended generalized-K (EGK) composite fading distribution. We obtain the second-order statistics of the received signal envelope characterized by the EGK composite fading distribution. Expressions for the probability density function, cumulative distribution function, level crossing rate and average fade duration, moments, amount of fading and average capacity are derived. Numerical and computer simulation examples validate the accuracy of the presented mathematical analysis. © 2010 IEEE.
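
    Since the EGK distribution arises as a product of two generalized-Gamma variates, its envelope/power statistics can be cross-checked by Monte Carlo rather than with the closed-form expressions of the paper. A minimal sketch (Python with NumPy/SciPy; the shape parameters and average SNR are illustrative assumptions):

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(0)
n = 200_000
# EGK power gain ~ product of two generalized-Gamma variates
# (one for shadowing, one for multipath). Shapes/scales are illustrative.
shadow = gengamma.rvs(a=2.0, c=1.0, scale=1.0, size=n, random_state=rng)
multipath = gengamma.rvs(a=1.5, c=0.8, scale=1.0, size=n, random_state=rng)
gamma_inst = shadow * multipath                       # instantaneous power gain
snr_db = 10.0                                         # average SNR (assumed)
snr = 10 ** (snr_db / 10) * gamma_inst / gamma_inst.mean()
avg_capacity = np.log2(1.0 + snr).mean()              # ergodic capacity, bit/s/Hz
amount_of_fading = gamma_inst.var() / gamma_inst.mean() ** 2
print(f"average capacity ~ {avg_capacity:.2f} bit/s/Hz, AoF ~ {amount_of_fading:.2f}")
```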

  4. Statistical Channel Model for 60 GHz WLAN Systems in Conference Room Environment

    Directory of Open Access Journals (Sweden)

    A. Khoryaev

    2011-06-01

    Full Text Available In this work, a methodology of statistical channel modeling for 60 GHz WLAN systems is proposed and a channel model for the office conference room environment is developed. The proposed methodology takes into account the most important properties of the indoor 60 GHz propagation channel, such as the large propagation loss and the need for the WLAN stations to use steerable directional antennas, the quasi-optical propagation nature, the clustering structure of the channel, and the significant impact of the polarization characteristics. A general mathematical structure of the channel model that supports all the described 60 GHz propagation channel properties is suggested. Then the conference room scenario for 60 GHz WLAN systems is introduced. Development of the inter-cluster, intra-cluster, and polarization impact modeling parameters is considered in detail, first explaining the methodology used for each channel modeling aspect and then its application to the conference room scenario. The raw data for the channel model development include the experimental results [1], [2] and ray-tracing simulations for the conference room scenario. The proposed channel modeling methodology and the developed conference room channel model were adopted by the IEEE 802.11ad committee for 60 GHz WLAN systems standardization.

  5. A Novel Statistical Channel Model for Turbulence-Induced Fading in Free-Space Optical Systems

    CERN Document Server

    Aminikashani, Mohammadreza; Kavehrad, Mohsen

    2015-01-01

    In this paper, we propose a new probability distribution function which accurately describes turbulence-induced fading under a wide range of turbulence conditions. The proposed model, termed Double Generalized Gamma (Double GG), is based on a doubly stochastic theory of scintillation and is developed via the product of two Generalized Gamma (GG) distributions. The proposed Double GG distribution generalizes many existing turbulence channel models and provides an excellent fit to the published plane and spherical wave simulation data. Using this new statistical channel model, we derive closed-form expressions for the outage probability and the average bit error rate, as well as the corresponding asymptotic expressions, of free-space optical communication systems over turbulence channels. We demonstrate that our derived expressions cover many existing results in the literature earlier reported for Gamma-Gamma, Double-Weibull and K channels as special cases.

  6. Statistical mechanical analysis of the Kronecker channel model for MIMO wireless communication

    CERN Document Server

    Hatabu, Atsushi; Kabashima, Yoshiyuki

    2009-01-01

    The Kronecker channel model of wireless communication is analyzed using statistical mechanics methods. In the model, spatial proximities among transmission/reception antennas are taken into account as certain correlation matrices, which generally yield non-trivial dependence among symbols to be estimated. This prevents accurate assessment of the communication performance by naively using a previously developed analytical scheme based on a matrix integration formula. In order to resolve this difficulty, we develop a formalism that can formally handle the correlations in Kronecker models based on the known scheme. Unfortunately, direct application of the developed scheme is, in general, practically difficult. However, the formalism is still useful, indicating that the effect of the correlations generally increases after the fourth order with respect to the correlation strength. Therefore, the known analytical scheme offers a good approximation in performance evaluation when the correlation strength is sufficiently s...

  7. Lagrangian statistics in turbulent channel flow: implications for Lagrangian stochastic models

    Science.gov (United States)

    Stelzenmuller, Nickolas; Polanco, Juan Ignacio; Vinkovic, Ivana; Mordant, Nicolas

    2016-11-01

    Lagrangian acceleration and velocity correlations in statistically one-dimensional turbulence are presented in the context of the development of Lagrangian stochastic models of inhomogeneous turbulent flows. These correlations are measured experimentally by 3D PTV in a high aspect ratio water channel at Reτ = 1450, and numerically from DNS performed at the same Reynolds number. Lagrangian timescales, key components of Lagrangian stochastic models, are extracted from acceleration and velocity autocorrelations. The evolution of these timescales as a function of distance to the wall is presented, and compared to similar quantities measured in homogeneous isotropic turbulence. A strong dependence of all Lagrangian timescales on wall distance is present across the width of the channel. Significant cross-correlations are observed between the streamwise and wall-normal components of both acceleration and velocity. Lagrangian stochastic models of this flow must therefore retain dependence on the wall-normal coordinate and the components of acceleration and velocity, resulting in significantly more complex models than those used for homogeneous isotropic turbulence. We gratefully acknowledge funding from the Agence Nationale de la Recherche, LabEx Tec 21, and CONICYT Becas Chile.
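
    As a rough illustration of how such Lagrangian timescales are extracted, the sketch below (Python/NumPy) computes the velocity autocorrelation along particle trajectories and integrates it up to its first zero crossing; synthetic Ornstein-Uhlenbeck trajectories stand in for the PTV/DNS data.

```python
import numpy as np

def lagrangian_timescale(u, dt):
    """u: array (n_particles, n_times) of one velocity component along
    particle trajectories. Returns the autocorrelation and the integral
    timescale (integrated up to the first zero crossing)."""
    u = u - u.mean(axis=1, keepdims=True)
    n_t = u.shape[1]
    rho = np.empty(n_t)
    var = (u * u).mean()
    for lag in range(n_t):
        rho[lag] = (u[:, : n_t - lag] * u[:, lag:]).mean() / var
    crossing = np.argmax(rho < 0) if (rho < 0).any() else n_t
    T_L = np.trapz(rho[:crossing], dx=dt)
    return rho, T_L

# Synthetic Ornstein-Uhlenbeck "trajectories" as a stand-in for real data.
rng = np.random.default_rng(2)
dt, tau, n_p, n_t = 1e-3, 0.05, 200, 1000
u = np.zeros((n_p, n_t))
for k in range(1, n_t):
    u[:, k] = u[:, k - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.normal(size=n_p)
rho, T_L = lagrangian_timescale(u, dt)
print(f"estimated Lagrangian timescale ~ {T_L:.3f} s (true tau = {tau} s)")
```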

  8. Bayesian Statistical Inference in Ion-Channel Models with Exact Missed Event Correction.

    Science.gov (United States)

    Epstein, Michael; Calderhead, Ben; Girolami, Mark A; Sivilotti, Lucia G

    2016-07-26

    The stochastic behavior of single ion channels is most often described as an aggregated continuous-time Markov process with discrete states. For ligand-gated channels each state can represent a different conformation of the channel protein or a different number of bound ligands. Single-channel recordings show only whether the channel is open or shut: states of equal conductance are aggregated, so transitions between them have to be inferred indirectly. The requirement to filter noise from the raw signal further complicates the modeling process, as it limits the time resolution of the data. The consequence of the reduced bandwidth is that openings or shuttings that are shorter than the resolution cannot be observed; these are known as missed events. Postulated models fitted using filtered data must therefore explicitly account for missed events to avoid bias in the estimation of rate parameters and to assess parameter identifiability accurately. In this article, we present the first, to our knowledge, Bayesian modeling of ion channels with exact missed events correction. Bayesian analysis represents uncertain knowledge of the true value of model parameters by considering these parameters as random variables. This allows us to gain a full appreciation of parameter identifiability and uncertainty when estimating values for model parameters. However, Bayesian inference is particularly challenging in this context as the correction for missed events increases the computational complexity of the model likelihood. Nonetheless, we successfully implemented a two-step Markov chain Monte Carlo method that we called "BICME", which performs Bayesian inference in models of realistic complexity. The method is demonstrated on synthetic and real single-channel data from muscle nicotinic acetylcholine channels. We show that parameter uncertainty can be characterized more accurately than with maximum-likelihood methods. Our code for performing inference in these ion channel
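
    The exact missed-event likelihood is well beyond a short example, but the Bayesian machinery itself can be illustrated on an idealised two-state (open/shut) channel. The sketch below (Python/NumPy; synthetic dwell times, flat priors on the log rates, and no missed-event correction) runs a random-walk Metropolis sampler over the opening and shutting rates:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic dwell times from a two-state channel (rates in 1/s).
k_open_true, k_shut_true = 50.0, 200.0
shut_dwells = rng.exponential(1.0 / k_open_true, 500)   # shut durations, ended by opening
open_dwells = rng.exponential(1.0 / k_shut_true, 500)   # open durations, ended by shutting

def log_post(theta):
    """Log posterior with flat priors on log rates (exponential dwell likelihood)."""
    k_open, k_shut = np.exp(theta)
    return (shut_dwells.size * np.log(k_open) - k_open * shut_dwells.sum()
            + open_dwells.size * np.log(k_shut) - k_shut * open_dwells.sum())

theta = np.log([10.0, 10.0])                            # initial guess
samples, lp = [], log_post(theta)
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05, 2)               # random-walk Metropolis step
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(np.exp(theta))
samples = np.array(samples[5000:])                      # discard burn-in
print("posterior means:", samples.mean(axis=0), "(true:", k_open_true, k_shut_true, ")")
```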

  9. GLOBAL APPROACH OF CHANNEL MODELING IN MOBILE AD HOC NETWORKS INCLUDING SECOND ORDER STATISTICS AND SYSTEM PERFORMANCES ANALYSIS

    Directory of Open Access Journals (Sweden)

    Basile L. AGBA

    2008-06-01

    Full Text Available Mobile ad hoc networks (MANET) are very difficult to design in terms of scenario specification and propagation modeling. All these aspects must be taken into account when designing a MANET. For cost-effective design, powerful and accurate simulation tools are needed. Our first contribution in this paper is to provide a global approach process (GAP) in channel modeling, combining scenarios and propagation in order to have a better analysis of the physical layer and, finally, to improve the performance of the whole network. The GAP is implemented in an integrated simulation tool, Ad-SMPro. Moreover, channel statistics, throughput and delay are some key points to be considered when studying mobile wireless networks. A careful analysis of mobility effects on second-order channel statistics and system performance is made based on our optimized simulation tool, Ad-SMPro. The channel is modeled by large-scale fading and small-scale fading, including the Doppler spectrum due to the double mobility of the nodes. The Level Cross Rate and the Average Duration of Fade are simulated as functions of the double mobility degree, defined as the ratio of the nodes' speeds. These results are compared to the theoretical predictions. We demonstrate that, in mobile ad hoc networks, flat fading channels and frequency-selective fading channels are affected differently. In addition, the Bit Error Rate is analysed as a function of the ratio of the average bit energy to thermal noise density. Other performance measures (such as throughput, delay and routing traffic) are analysed, and conclusions related to the proposed simulation model and the mobility effects are drawn.

  10. Statistical Metamodeling and Sequential Design of Computer Experiments to Model Glyco-Altered Gating of Sodium Channels in Cardiac Myocytes.

    Science.gov (United States)

    Du, Dongping; Yang, Hui; Ednie, Andrew R; Bennett, Eric S

    2016-09-01

    Glycan structures account for up to 35% of the mass of cardiac sodium (Nav) channels. To investigate whether and how reduced sialylation affects Nav activity and cardiac electrical signaling, we conducted a series of in vitro experiments on ventricular apex myocytes under two different glycosylation conditions, reduced protein sialylation (ST3Gal4(-/-)) and full glycosylation (control). Although aberrant electrical signaling is observed under reduced sialylation, realizing a better understanding of the mechanistic details of pathological variations in INa and AP is difficult without performing in silico studies. However, computer models of Nav channels and cardiac myocytes involve high levels of complexity, e.g., a high-dimensional parameter space and nonlinear, nonconvex equations. Traditional linear and nonlinear optimization methods have encountered many difficulties in model calibration. This paper presents a new statistical metamodeling approach for efficient computer experiments and optimization of Nav models. First, we utilize a fractional factorial design to identify control variables from the large set of model parameters, thereby reducing the dimensionality of the parametric space. Further, we develop a Gaussian process model as a surrogate of expensive and time-consuming computer models and then identify the next best design point that yields the maximal probability of improvement. This process iterates until convergence, and the performance is evaluated and validated with real-world experimental data. Experimental results show the proposed algorithm achieves superior performance in modeling the kinetics of Nav channels under a variety of glycosylation conditions. As a result, in silico models provide a better understanding of glyco-altered mechanistic details in state transitions and distributions of Nav channels. Notably, ST3Gal4(-/-) myocytes are shown to have higher probabilities accumulated in intermediate inactivation during the repolarization and yield a
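
    A minimal sketch of the surrogate-plus-acquisition loop described above, using scikit-learn's GaussianProcessRegressor and the probability-of-improvement criterion on a toy one-dimensional objective (the actual Nav-channel model, its parameters and the fractional factorial screening step are not reproduced here):

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def objective(x):
    """Toy stand-in for an expensive myocyte/Nav simulation misfit."""
    return np.sin(3 * x) + 0.5 * (x - 0.6) ** 2

rng = np.random.default_rng(4)
X = rng.uniform(0, 2, 5).reshape(-1, 1)               # initial design points
y = objective(X).ravel()
grid = np.linspace(0, 2, 400).reshape(-1, 1)

for _ in range(15):                                   # sequential design loop
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), alpha=1e-6, normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    pi = norm.cdf((best - mu) / np.maximum(sigma, 1e-9))   # probability of improvement
    x_next = grid[np.argmax(pi)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best parameter found:", X[np.argmin(y)].item(), "objective:", y.min())
```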

  11. Statistical model of natural stimuli predicts edge-like pooling of spatial frequency channels in V2

    Directory of Open Access Journals (Sweden)

    Gutmann Michael

    2005-02-01

    Full Text Available Abstract Background It has been shown that the classical receptive fields of simple and complex cells in the primary visual cortex emerge from the statistical properties of natural images by forcing the cell responses to be maximally sparse or independent. We investigate how to learn features beyond the primary visual cortex from the statistical properties of modelled complex-cell outputs. In previous work, we showed that a new model, non-negative sparse coding, led to the emergence of features which code for contours of a given spatial frequency band. Results We applied ordinary independent component analysis to modelled outputs of complex cells that span different frequency bands. The analysis led to the emergence of features which pool spatially coherent across-frequency activity in the modelled primary visual cortex. Thus, the statistically optimal way of processing complex-cell outputs abandons separate frequency channels, while preserving and even enhancing orientation tuning and spatial localization. As a technical aside, we found that the non-negativity constraint is not necessary: ordinary independent component analysis produces essentially the same results as our previous work. Conclusion We propose that the pooling that emerges allows the features to code for realistic low-level image features related to step edges. Further, the results prove the viability of statistical modelling of natural images as a framework that produces quantitative predictions of visual processing.

  12. Determinating Timing Channels in Statistically Multiplexed Clouds

    CERN Document Server

    Aviram, Amittai; Ford, Bryan; Gummadi, Ramakrishna

    2010-01-01

    Timing side-channels represent an insidious security challenge for cloud computing, because: (a) they enable one customer to steal information from another without leaving a trail or raising alarms; (b) only the cloud provider can feasibly detect and report such attacks, but the provider's incentives are not to; and (c) known general-purpose timing channel control methods undermine statistical resource sharing efficiency, and, with it, the cloud computing business model. We propose a new cloud architecture that uses provider-enforced deterministic execution to eliminate all timing channels internal to a shared cloud domain, without limiting internal resource sharing. A prototype determinism-enforcing hypervisor demonstrates that utilizing such a cloud might be both convenient and efficient. The hypervisor enables parallel guest processes and threads to interact via familiar shared memory and file system abstractions, and runs moderately coarse-grained parallel tasks as efficiently and scalably as current nond...

  13. Turbo Detection in Rayleigh flat fading channel with unknown statistics

    Directory of Open Access Journals (Sweden)

    Paul Fortier

    2010-11-01

    Full Text Available The turbo detection of turbo coded symbols over correlated Rayleigh flat fading channels generated according to Jakes' model is considered in this paper. We propose a method to estimate the channel signal-to-noise ratio (SNR) and the maximum Doppler frequency. These statistics are required by the linear minimum mean squared error (LMMSE) channel estimator. To improve the system convergence, we redefine the channel reliability factor by taking into account the channel estimation error statistics. Simulation results for a rate 1/3 turbo code and two different normalized fading rates show that the use of the new reliability factor greatly improves the performance. The improvement is more substantial when channel statistics are unknown.
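
    The record does not reproduce the estimator itself, but a standard pilot-based LMMSE estimate of a correlated Rayleigh fading channel with Jakes (Bessel-function) autocorrelation looks roughly like the sketch below (Python with NumPy/SciPy; the SNR, normalised Doppler and pilot spacing are illustrative assumptions):

```python
import numpy as np
from scipy.special import j0            # zeroth-order Bessel function (Jakes autocorrelation)

rng = np.random.default_rng(5)
n, fd_ts, snr_db = 400, 0.01, 10.0      # symbols, normalised Doppler, pilot SNR (assumed)
sigma2 = 10 ** (-snr_db / 10)

# Correlated Rayleigh channel: Gaussian process with R[k] = J0(2*pi*fd*Ts*k).
lags = np.arange(n)
R = j0(2 * np.pi * fd_ts * np.abs(lags[:, None] - lags[None, :]))
L = np.linalg.cholesky(R + 1e-6 * np.eye(n))
h = (L @ rng.normal(size=(n, 2)) @ [1, 1j]) / np.sqrt(2)

pilots = np.arange(0, n, 10)            # every 10th symbol is a pilot
y = h[pilots] + np.sqrt(sigma2 / 2) * (rng.normal(size=pilots.size)
                                       + 1j * rng.normal(size=pilots.size))
# LMMSE interpolation: h_hat = R_hp (R_pp + sigma2 I)^-1 y
R_hp = R[:, pilots]
R_pp = R[np.ix_(pilots, pilots)]
h_hat = R_hp @ np.linalg.solve(R_pp + sigma2 * np.eye(pilots.size), y)
print(f"LMMSE channel estimation MSE ~ {np.mean(np.abs(h - h_hat) ** 2):.4f}")
```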

  14. Statistical Modelling and Characterization of Experimental mm-Wave Indoor Channels for Future 5G Wireless Communication Networks.

    Science.gov (United States)

    Al-Samman, A M; Rahman, T A; Azmi, M H; Hindia, M N; Khan, I; Hanafi, E

    This paper presents an experimental characterization of millimeter-wave (mm-wave) channels in the 6.5 GHz, 10.5 GHz, 15 GHz, 19 GHz, 28 GHz and 38 GHz frequency bands in an indoor corridor environment. More than 4,000 power delay profiles were measured across the bands using an omnidirectional transmitter antenna and a highly directional horn receiver antenna for both co- and cross-polarized antenna configurations. This paper develops a new path-loss model to account for the frequency attenuation with distance, which we term the frequency attenuation (FA) path-loss model and introduce a frequency-dependent attenuation factor. The large-scale path loss was characterized based on both new and well-known path-loss models. A general and less complex method is also proposed to estimate the cross-polarization discrimination (XPD) factor of close-in reference distance with the XPD (CIX) and ABG with the XPD (ABGX) path-loss models to avoid the computational complexity of minimum mean square error (MMSE) approach. Moreover, small-scale parameters such as root mean square (RMS) delay spread, mean excess (MN-EX) delay, dispersion factors and maximum excess (MAX-EX) delay parameters were used to characterize the multipath channel dispersion. Multiple statistical distributions for RMS delay spread were also investigated. The results show that our proposed models are simpler and more physically-based than other well-known models. The path-loss exponents for all studied models are smaller than that of the free-space model by values in the range of 0.1 to 1.4 for all measured frequencies. The RMS delay spread values varied between 0.2 ns and 13.8 ns, and the dispersion factor values were less than 1 for all measured frequencies. The exponential and Weibull probability distribution models best fit the RMS delay spread empirical distribution for all of the measured frequencies in all scenarios.
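
    The small-scale dispersion metrics quoted above are moments of the power delay profile; a minimal sketch (Python/NumPy, with synthetic PDP samples in place of the measured ones) of how the mean excess delay and RMS delay spread are computed:

```python
import numpy as np

def dispersion_metrics(delays_ns, powers_db, threshold_db=30.0):
    """Mean excess delay and RMS delay spread from a measured PDP.
    Paths more than `threshold_db` below the strongest path are discarded."""
    delays_ns = np.asarray(delays_ns, dtype=float)
    powers_db = np.asarray(powers_db, dtype=float)
    keep = powers_db >= (powers_db.max() - threshold_db)
    tau, p = delays_ns[keep], 10 ** (powers_db[keep] / 10)
    mean_excess = np.sum(p * tau) / np.sum(p)                  # MN-EX delay
    rms = np.sqrt(np.sum(p * tau ** 2) / np.sum(p) - mean_excess ** 2)
    return mean_excess, rms

# Synthetic multipath components (delay in ns, power in dB) for illustration only.
delays = [0.0, 5.0, 12.0, 30.0, 55.0]
powers = [0.0, -3.0, -8.0, -15.0, -25.0]
mn_ex, rms_ds = dispersion_metrics(delays, powers)
print(f"mean excess delay = {mn_ex:.2f} ns, RMS delay spread = {rms_ds:.2f} ns")
```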

  15. Nonequilibrium statistics of the laser beam intensity profile at the output of a model channel with strong turbulence

    Science.gov (United States)

    Arsenyan, Tatiana I.; Babanin, Eugeniy A.; Komarov, Aleksandr G.; Suhareva, Natalia A.; Zotov, Aleksey M.

    2014-11-01

    The experimentally obtained space-time distortions of the signal beam profile in optical data transmission channels are presented. Interpretation and prediction of the distortion structure were carried out using methods of non-equilibrium thermodynamics and statistics, in particular the non-extensive Renyi entropy. A method for operational monitoring of the medium state using a single sampling frame is proposed.

  16. Statistical mechanical analysis of the Kronecker channel model for multiple-input multiple-output wireless communication.

    Science.gov (United States)

    Hatabu, Atsushi; Takeda, Koujin; Kabashima, Yoshiyuki

    2009-12-01

    The Kronecker channel model of wireless communication is analyzed using statistical mechanics methods. In the model, spatial proximities among transmission/reception antennas are taken into account as certain correlation matrices, which generally yield nontrivial dependence among symbols to be estimated. This prevents accurate assessment of the communication performance by naively using a previously developed analytical scheme based on a matrix integration formula. In order to resolve this difficulty, we develop a formalism that can formally handle the correlations in Kronecker models based on the known scheme. Unfortunately, direct application of the developed scheme is, in general, practically difficult. However, the formalism is still useful, indicating that the effect of the correlations generally increases after the fourth order with respect to the correlation strength. Therefore, the known analytical scheme offers a good approximation in performance evaluation when the correlation strength is sufficiently small. For a class of specific correlations, we show that the performance analysis can be mapped to the problem of one-dimensional spin systems in random fields, which can be investigated without approximation by the belief propagation algorithm.
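
    For reference, the Kronecker model itself is straightforward to simulate: spatially white Gaussian fading is coloured by transmit and receive correlation matrices. A minimal sketch (Python/NumPy; exponential correlation with illustrative coefficients) that also estimates the resulting ergodic capacity by Monte Carlo:

```python
import numpy as np

def exp_corr(n, rho):
    """Exponential correlation matrix R[i, j] = rho^|i-j|."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

rng = np.random.default_rng(6)
nt, nr, snr = 4, 4, 10.0                       # antennas and linear SNR (assumed)
Rt, Rr = exp_corr(nt, 0.7), exp_corr(nr, 0.5)  # correlation strengths are illustrative
Rt_h, Rr_h = np.linalg.cholesky(Rt), np.linalg.cholesky(Rr)

caps = []
for _ in range(5000):
    G = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
    H = Rr_h @ G @ Rt_h.conj().T               # Kronecker model: H = Rr^(1/2) G Rt^(1/2)
    M = np.eye(nr) + (snr / nt) * H @ H.conj().T
    caps.append(np.log2(np.linalg.det(M).real))
print(f"ergodic capacity ~ {np.mean(caps):.2f} bit/s/Hz")
```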

  17. Statistics of polymer extensions in turbulent channel flow

    CERN Document Server

    Bagheri, Faranggis; Perlekar, Prasad; Brandt, Luca

    2012-01-01

    We present direct numerical simulations~(DNSs) of turbulent channel flow with passive Lagrangian polymers. To understand the polymer behavior we investigate the behavior of infinitesimal line elements and calculate, for the first time, the PDF of finite-time Lyapunov exponents and from them the corresponding Cramer's function for the channel flow. We study the statistics of polymer elongation for both the Oldroyd-B model (for Weissenberg number Wi 1 (FENE model) the polymer are significantly more stretched near the wall than at the centre of the flow. Furthermore near the wall the polymers show a strong tendency to orient along the stream-wise direction of the flow but near the centerline the statistics of orientation of the polymers is consistent with analogous results obtained recently in homogeneous and isotropic flows [2].

  18. Statistical Hot Channel Analysis for the NBSR

    Energy Technology Data Exchange (ETDEWEB)

    Cuadra A.; Baek J.

    2014-05-27

    A statistical analysis of thermal limits has been carried out for the research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The objective of this analysis was to update the uncertainties of the hot channel factors with respect to previous analysis for both high-enriched uranium (HEU) and low-enriched uranium (LEU) fuels. Although uncertainties in key parameters which enter into the analysis are not yet known for the LEU core, the current analysis uses reasonable approximations instead of conservative estimates based on HEU values. Cumulative distribution functions (CDFs) were obtained for critical heat flux ratio (CHFR), and onset of flow instability ratio (OFIR). As was done previously, the Sudo-Kaminaga correlation was used for CHF and the Saha-Zuber correlation was used for OFI. Results were obtained for probability levels of 90%, 95%, and 99.9%. As an example of the analysis, the results for both the existing reactor with HEU fuel and the LEU core show that CHFR would have to be above 1.39 to assure with 95% probability that there is no CHF. For the OFIR, the results show that the ratio should be above 1.40 to assure with a 95% probability that OFI is not reached.
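
    The general approach, propagating the individual hot channel factor uncertainties through to a CDF of the thermal margin, can be sketched generically as below (Python/NumPy; the nominal ratio and all factor uncertainties are hypothetical placeholders, not the NBSR/HEU/LEU values):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
nominal_chfr = 2.0                        # nominal critical heat flux ratio (hypothetical)

# Hot channel factors as independent lognormal multipliers on the local heat flux.
# The relative uncertainties below are placeholders, not the values used for the NBSR.
uncertainties = {"power_peaking": 0.05, "flow_distribution": 0.04,
                 "fuel_tolerance": 0.03, "correlation": 0.08}
flux_multiplier = np.ones(n)
for sigma in uncertainties.values():
    flux_multiplier *= rng.lognormal(mean=0.0, sigma=sigma, size=n)

chfr = nominal_chfr / flux_multiplier     # higher local flux -> lower margin
chfr_sorted = np.sort(chfr)               # empirical CDF of the margin
p95 = chfr_sorted[int(0.05 * n)]          # value exceeded with 95% probability
print(f"CHFR exceeded with 95% probability ~ {p95:.2f}")
```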

  19. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads Fogtmann; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. ... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear ...

  20. Channel Statistics for MIMO Handsets in Data Mode

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ødum; Yanakiev, Boyan; Barrio, Samantha Caporal Del;

    2014-01-01

    The presented work is based on a large dual-band, dual-base outdoor-to-indoor multiple-input multiple-output (MIMO) channel measurement campaign, involving ten different realistic MIMO handsets, held in data mode by eight test users. Various different use cases (UCs) are measured. Statistics on the channel capacity, mean effective gain (MEG), branch power ratio (BPR), and correlation coefficients between Rx, Tx, and cross-link channels are presented.

  1. Modeling cosmic void statistics

    Science.gov (United States)

    Hamaus, Nico; Sutter, P. M.; Wandelt, Benjamin D.

    2016-10-01

    Understanding the internal structure and spatial distribution of cosmic voids is crucial when considering them as probes of cosmology. We present recent advances in modeling void density- and velocity-profiles in real space, as well as void two-point statistics in redshift space, by examining voids identified via the watershed transform in state-of-the-art ΛCDM n-body simulations and mock galaxy catalogs. The simple and universal characteristics that emerge from these statistics indicate the self-similarity of large-scale structure and suggest cosmic voids to be among the most pristine objects to consider for future studies on the nature of dark energy, dark matter and modified gravity.

  2. Statistical-mechanical analysis of multiuser channel capacity with imperfect channel state information

    Institute of Scientific and Technical Information of China (English)

    Wang Hui-Song; Zeng Gui-Hua

    2008-01-01

    In this paper, the effect of imperfect channel state information at the receiver, which is caused by noise and other interference, on the multi-access channel capacity is analysed through a statistical-mechanical approach. The replica analysis focuses on analytically studying how the minimum mean square error (MMSE) channel estimation error appears in a multiuser channel capacity formula, and the relevant mathematical expressions are derived. At the same time, numerical simulation results are presented to validate the replica analysis. The simulation results show how system parameters, such as the channel estimation error, the system load and the signal-to-noise ratio, affect the channel capacity.
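
    One common way to see how an MMSE channel-estimation error of variance sigma_e^2 enters a capacity expression is to treat the error term as additional Gaussian noise. The Monte Carlo sketch below (Python/NumPy) does this for a single-user Rayleigh channel with illustrative parameters; it is a simplification in the spirit of the paper, not its multiuser replica analysis:

```python
import numpy as np

rng = np.random.default_rng(8)
n, snr = 200_000, 10.0                    # samples and linear SNR (assumed)
sigma_e2 = 0.1                            # MMSE channel-estimation error variance (assumed)

# MMSE decomposition: true channel = estimate + orthogonal error.
h_hat = np.sqrt((1 - sigma_e2) / 2) * (rng.normal(size=n) + 1j * rng.normal(size=n))
e = np.sqrt(sigma_e2 / 2) * (rng.normal(size=n) + 1j * rng.normal(size=n))
h = h_hat + e

c_perfect = np.log2(1 + snr * np.abs(h) ** 2).mean()
# Estimation error acts as extra noise of power snr * sigma_e2.
c_imperfect = np.log2(1 + snr * np.abs(h_hat) ** 2 / (1 + snr * sigma_e2)).mean()
print(f"capacity: perfect CSI ~ {c_perfect:.2f}, imperfect CSI ~ {c_imperfect:.2f} bit/s/Hz")
```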

  3. Statistical Mechanics Analysis of LDPC Coding in MIMO Gaussian Channels

    OpenAIRE

    Alamino, Roberto C.; Saad, David

    2007-01-01

    Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under LDPC network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and the symmetric and asymm...

  4. Algebraic Statistics for Network Models

    Science.gov (United States)

    2014-02-19

    AFRL-OSR-VA-TR-2014-0070. DARPA GRAPHS Phase I final report, grant FA9550-12-1-0392: Algebraic Statistics for Network Models. PI: Sonja Petrović (petrovic@psu.edu), Department of Statistics, Pennsylvania State University, with collaborators at Carnegie Mellon University (Department of Statistics, Heinz College, Machine Learning Department, CyLab). 02/19/2014. Abstract: This project focused on the family of

  5. Statistical Model for Content Extraction

    DEFF Research Database (Denmark)

    2011-01-01

    We present a statistical model for content extraction from HTML documents. The model operates on the Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and its associated statistical features to predict the significance of the node towards the overall content...

  6. Methods of statistical model estimation

    CERN Document Server

    Hilbe, Joseph

    2013-01-01

    Methods of Statistical Model Estimation examines the most important and popular methods used to estimate parameters for statistical models and provide informative model summary statistics. Designed for R users, the book is also ideal for anyone wanting to better understand the algorithms used for statistical model fitting. The text presents algorithms for the estimation of a variety of regression procedures using maximum likelihood estimation, iteratively reweighted least squares regression, the EM algorithm, and MCMC sampling. Fully developed, working R code is constructed for each method. Th

  7. Axial electron channeling statistical method of site occupancy determination

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The multibeam dynamical theory of electron diffraction has been used to calculate the thickness-integrated probability density of fast electrons on Ti and Al sites in the γ-TiAl phase as a function of the incident electron beam orientation along the [100], [110] and [011] zone axes, with the effect of absorption considered. Both the calculations and the experiments show that there are large differences in the electron channeling effect for different zone axes, or for the same axis with different orientations, so a proper zone axis and suitable incident beam tilting angles should be chosen when using the axial electron channeling statistical method to determine the site occupancies of impurities. It is suggested to calculate the channeling effect map before the experiments.

  8. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation.

    Science.gov (United States)

    Finnerty, Justin John; Peyser, Alexander; Carloni, Paolo

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores.

  9. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Full Text Available Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and the Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily due to the fact that goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, but one has no way to delineate between the two sources of error and apportion blame. The paper argues that the error statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error-statistical underpinnings.

  10. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one has put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...

  11. Sensometrics: Thurstonian and Statistical Models

    DEFF Research Database (Denmark)

    Christensen, Rune Haubo Bojesen

    This thesis is concerned with the development and bridging of Thurstonian and statistical models for sensory discrimination testing as applied in the scientific discipline of sensometrics. In sensory discrimination testing sensory differences between products are detected and quantified by the use ... and sensory discrimination testing in particular in a series of papers by advancing Thurstonian models for a range of sensory discrimination protocols in addition to facilitating their application by providing software for fitting these models. The main focus is on identifying Thurstonian models ... for discrimination methods as versions of well-known statistical models. The Thurstonian models for a group of discrimination methods leading to binomial responses are shown to be versions of a statistical class of models known as generalized linear models. Thurstonian models for A-not A with sureness and 2...

  12. Statistical Theory of Selectivity and Conductivity in Biological Channels

    CERN Document Server

    Luchinsky, D G; Kaufman, I; Timucin, D A; Eisenberg, R S; McClintock, P V E

    2016-01-01

    We present an equilibrium statistical-mechanical theory of selectivity in biological ion channels. In doing so, we introduce a grand canonical ensemble for ions in a channel's selectivity filter coupled to internal and external bath solutions for a mixture of ions at arbitrary concentrations, we use linear response theory to find the current through the filter for small gradients of electrochemical potential, and we show that the conductivity of the filter is given by the generalized Einstein relation. We apply the theory to the permeation of ions through the potassium selectivity filter, and are thereby able to resolve the long-standing paradox of why the high selectivity of the filter brings no associated delay in permeation. We show that the Eisenman selectivity relation follows directly from the condition of diffusion-limited conductivity through the filter. We also discuss the effect of wall fluctuations on the filter conductivity.

  13. Statistical mechanics analysis of LDPC coding in MIMO Gaussian channels

    Energy Technology Data Exchange (ETDEWEB)

    Alamino, Roberto C; Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2007-10-12

    Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under low-density parity-check (LDPC) network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and symmetric and asymmetric interference. Both dynamical and thermodynamical transitions from the ferromagnetic solution of perfect decoding to a non-ferromagnetic solution are identified for the cases considered, marking the practical and theoretical limits of the system under the current coding scheme. Numerical results are provided, showing the typical level of improvement/deterioration achieved with respect to the single transmitter/receiver result, for the various cases.

  14. Statistical Hauser-Feshbach theory with width fluctuation correction including direct reaction channels for neutron induced reaction at low energies

    CERN Document Server

    Kawano, T; Hilaire, S

    2016-01-01

    A model to calculate particle-induced reaction cross sections with statistical Hauser-Feshbach theory including direct reactions is given. The energy average of the scattering matrix from the coupled-channels optical model is diagonalized by the transformation proposed by Engelbrecht and Weidenmüller. The ensemble average of S-matrix elements in the diagonalized channel space is approximated by a model of Moldauer [Phys. Rev. C 12, 744 (1975)] using a newly parametrized channel degree-of-freedom ...

  15. A Survey on Statistical Based Single Channel Speech Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Sunnydayal. V

    2014-11-01

    Full Text Available Speech enhancement is a long-standing problem with various applications such as hearing aids and the automatic recognition and coding of speech signals. Single-channel speech enhancement techniques are used for the enhancement of speech degraded by additive background noise. Background noise can have an adverse impact on our ability to converse without hindrance or smoothly in very noisy environments, such as busy streets, in a car or in the cockpit of an airplane. Such noise can affect the quality and intelligibility of speech. This is a survey paper whose objective is to provide an overview of speech enhancement algorithms that enhance noisy speech corrupted by additive noise. The algorithms are mainly based on statistical approaches, and different estimators are compared. Challenges and opportunities of speech enhancement are also discussed. This paper helps in choosing the best statistically based technique for speech enhancement.
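
    As one concrete instance of the statistical estimators surveyed, a basic Wiener-gain enhancement in the STFT domain can be sketched as below (Python with NumPy/SciPy; the noise statistics are estimated from an assumed initial noise-only segment, and a noisy tone stands in for speech):

```python
import numpy as np
from scipy.signal import stft, istft

def wiener_enhance(noisy, fs, noise_frames=10):
    """Very basic single-channel Wiener-gain enhancement.
    Assumes the first `noise_frames` STFT frames contain noise only."""
    f, t, X = stft(noisy, fs=fs, nperseg=512)
    noise_psd = np.mean(np.abs(X[:, :noise_frames]) ** 2, axis=1, keepdims=True)
    snr_post = np.abs(X) ** 2 / np.maximum(noise_psd, 1e-12)
    snr_prio = np.maximum(snr_post - 1.0, 0.0)          # crude a-priori SNR estimate
    gain = snr_prio / (snr_prio + 1.0)                  # Wiener gain
    _, enhanced = istft(gain * X, fs=fs, nperseg=512)
    return enhanced[: len(noisy)]

# Synthetic example: a tone appearing after a noise-only lead-in.
fs = 16000
t = np.arange(0, 1.0, 1 / fs)
clean = 0.5 * np.sin(2 * np.pi * 440 * t) * (t > 0.2)
noisy = clean + 0.2 * np.random.default_rng(9).normal(size=t.size)
enhanced = wiener_enhance(noisy, fs)
```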

  16. Channel capacity and digital modulation schemes in correlated Weibull fading channels with nonidentical statistics

    Institute of Scientific and Technical Information of China (English)

    Xiao Hailin; Nie Zaiping; Yang Shiwen

    2007-01-01

    Novel closed-form expressions are presented for the average channel capacity of dual selection diversity, as well as for the bit-error rate (BER) of several coherent and noncoherent digital modulation schemes, in correlated Weibull fading channels with nonidentical statistics. The results are expressed in terms of Meijer's G-function, which can be easily evaluated numerically. Simulation results are presented to validate the proposed theoretical analysis and to examine the effects of the fading severity on the quantities concerned.

  17. A Statistical Programme Assignment Model

    DEFF Research Database (Denmark)

    Rosholm, Michael; Staghøj, Jonas; Svarer, Michael

    When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present assignment mechanism, which is based on the discretionary choice of case workers. This is done in a duration model context, using the timing-of-events framework to identify causal effects. We compare different assignment mechanisms, and the results suggest that a significant reduction in the average duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers ...

  18. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  19. Improved model for statistical alignment

    Energy Technology Data Exchange (ETDEWEB)

    Miklos, I.; Toroczkai, Z. (Zoltan)

    2001-01-01

    The statistical approach to molecular sequence evolution involves the stochastic modeling of the substitution, insertion and deletion processes. Substitution has been modeled in a reliable way for more than three decades by using finite Markov processes. Insertion and deletion, however, seem to be more difficult to model, and the recent approaches cannot acceptably deal with multiple insertions and deletions. A new method based on a generating function approach is introduced to describe the multiple insertion process. The presented algorithm computes the approximate joint probability of two sequences in O(l^3) running time, where l is the geometric mean of the sequence lengths.

  20. Eigenvalue and Entropy Statistics for Products of Conjugate Random Quantum Channels

    Directory of Open Access Journals (Sweden)

    Benoît Collins

    2010-06-01

    Full Text Available Using the graphical calculus and integration techniques introduced by the authors, we study the statistical properties of outputs of products of random quantum channels for entangled inputs. In particular, we revisit and generalize models of relevance for the recent counterexamples to the minimum output entropy additivity problems. Our main result is a classification of regimes for which the von Neumann entropy is lower on average than the elementary bounds that can be obtained with linear algebra techniques.

  1. Statistical development and validation of discharge equations for natural channels

    Science.gov (United States)

    Dingman, S. Lawrence; Sharma, Keshav P.

    1997-12-01

    Although the Manning equation is widely accepted as the empirical flow law for rough turbulent open-channel flow, using the equation in practical situations such as slope-area computations is fraught with uncertainty because of the difficulty in specifying the value of the reach resistance, Manning's n. Riggs (1976, J. Res. US Geol. Surv., 4: 285-291) found that n was correlated with water-surface slope, and proposed a multiple-regression equation that obviates the need for estimating n in slope-area estimates of discharge. Because his relation was developed from a relatively small sample (N = 62), had potential flaws owing to multicollinearity, and was not thoroughly validated, we used an expanded data base (N = 520) and objective methods to develop a new relation for the same purpose: Q = 1.564 A^1.173 R^0.400 S^(-0.0543 log S), where Q is discharge (m^3 s^-1), A is cross-sectional area (m^2), R is hydraulic radius (m), and S is water-surface slope. We validated Riggs's model and our model using 100 measurements not included in model development and found that both give similar results. Riggs's model is somewhat better in terms of actual (m^3 s^-1) error, but ours is better in terms of relative (log Q) error. We conclude that either Riggs's or our model can be used in place of Manning's equation in slope-area computations, but that our model is preferable because it has less bias, minimizes multicollinearity, and performs better when applied to discharge changes in individual reaches. We also found that our model performs better than those of Jarrett (1984, J. Hydraul. Eng., 110: 1519-1539) or Riggs in the range of applicability of Jarrett's equation (0.15 m ≤ R ≤ 2.13 m; 0.002 ≤ S ≤ 0.052). Both Riggs's and our models significantly overestimate Q in flows satisfying both of the following conditions: Q < 3 m^3 s^-1 and Froude number less than 0.2. For other in-bank flows in relatively straight reaches, our model can be recommended for use in slope-area computations.
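
    A quick worked example of the fitted relation above, compared with Manning's equation for an assumed roughness (Python; the geometry values and Manning's n are hypothetical, and log is taken as base 10):

```python
import numpy as np

def discharge_regression(A, R, S):
    """Dingman-Sharma regression: Q = 1.564 * A^1.173 * R^0.400 * S^(-0.0543*log10(S)).
    A in m^2, R in m, S dimensionless; returns Q in m^3/s."""
    return 1.564 * A ** 1.173 * R ** 0.400 * S ** (-0.0543 * np.log10(S))

def manning(A, R, S, n):
    """Manning's equation (SI units) for comparison, given a resistance n."""
    return (1.0 / n) * A * R ** (2.0 / 3.0) * np.sqrt(S)

A, R, S = 40.0, 1.5, 0.002          # hypothetical cross-section area, hydraulic radius, slope
print(f"regression Q ~ {discharge_regression(A, R, S):.1f} m^3/s")
print(f"Manning    Q ~ {manning(A, R, S, n=0.035):.1f} m^3/s (assumed n = 0.035)")
```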

  2. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge on wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function (PDF) of turbulence driven short-term extreme wind shear events, conditioned on the mean wind speed, for an arbitrary recurrence period. The model is based on an asymptotic expansion, and only a few and easily accessible parameters are needed as input. The model of the extreme PDF is supplemented ... The measurements have been extracted from "Database on Wind Characteristics" (http://www.winddata.com/), and they refer to a site characterised by a flat homogeneous terrain. The comparison has been conducted for three different (high wind) mean wind speeds, and model predictions and experimental results ...

  3. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function (PDF) of turbulence driven short-term extreme wind shear events, conditioned on the mean wind speed, for an arbitrary recurrence period. The model is based on an asymptotic expansion, and only a few and easily accessible parameters are needed as input. The model of the extreme PDF is supplemented ... The measurements have been extracted from "Database on Wind Characteristics" (http://www.winddata.com/), and they refer to a site characterised by a flat homogeneous terrain. The comparison has been conducted for three different mean wind speeds in the range 15 m/s – 19 m/s, and model predictions and experimental ...

  4. A Survey of Fading Models for Mobile Radio Channel Characterization

    Directory of Open Access Journals (Sweden)

    L. D. Arya

    2010-02-01

    Full Text Available Future 3G and 4G mobile communication systems will be required to support a wide range of data rates and quality of service metrics. For the efficient design of data link and transport protocols, the system designer needs knowledge of the statistical properties of the physical layer. Studies have shown that without proper characterization of the channel, blind application of existing protocols and transmission policies may result in disastrous performance unless proper measures are taken. Channel characterization also helps in the allocation of resources and the selection of transmission policies and protocols. A feasible measure is to have an accurate and thoroughly reproducible optimum channel model which can mimic the mobile radio channel in a diversity of fading error environments. The objective of a channel model is to supply proper outputs for the design of upper layer protocols in such a fashion as if they were running on the actual physical layer. The model should fit the measured data very well and should be easy to handle analytically. Various approaches for the characterization of fading mobile channels have appeared in the literature over the last five decades. This article surveys the fading channel models for proper characterization of the radio channel and provides approaches to classify the existing channel models. The paper also presents the contribution made by these channel models with their assumptions, suitability, applications, shortcomings and further improvement issues. In the present environment, Markov models are best suited for characterization of the fading radio channel. In these models the radio channel is represented in terms of fading states and modeled as a stochastic process. A properly constructed channel model may be a valuable means to enhance the reliability and capacity of future mobile radio channels.
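
    A typical member of the Markov-model class advocated above is the two-state Gilbert-Elliott channel; the sketch below (Python/NumPy; the transition probabilities and per-state error rates are illustrative) simulates its state sequence and the resulting bursty bit errors:

```python
import numpy as np

rng = np.random.default_rng(10)
n_bits = 100_000
p_g2b, p_b2g = 0.01, 0.1          # good->bad, bad->good transition probabilities (illustrative)
ber_good, ber_bad = 1e-5, 1e-1    # per-state bit error rates (illustrative)

state = 0                          # 0 = good, 1 = bad
errors = np.zeros(n_bits, dtype=bool)
for i in range(n_bits):
    ber = ber_bad if state else ber_good
    errors[i] = rng.uniform() < ber
    if state == 0 and rng.uniform() < p_g2b:
        state = 1
    elif state == 1 and rng.uniform() < p_b2g:
        state = 0

print(f"overall BER ~ {errors.mean():.4f}, "
      f"steady-state P(bad) ~ {p_g2b / (p_g2b + p_b2g):.3f}")
```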

  5. Statistical models for trisomic phenotypes

    Energy Technology Data Exchange (ETDEWEB)

    Lamb, N.E.; Sherman, S.L.; Feingold, E. [Emory Univ., Atlanta, GA (United States)

    1996-01-01

    Certain genetic disorders are rare in the general population but more common in individuals with specific trisomies, which suggests that the genes involved in the etiology of these disorders may be located on the trisomic chromosome. As with all aneuploid syndromes, however, a considerable degree of variation exists within each phenotype so that any given trait is present only among a subset of the trisomic population. We have previously presented a simple gene-dosage model to explain this phenotypic variation and developed a strategy to map genes for such traits. The mapping strategy does not depend on the simple model but works in theory under any model that predicts that affected individuals have an increased likelihood of disomic homozygosity at the trait locus. This paper explores the robustness of our mapping method by investigating what kinds of models give an expected increase in disomic homozygosity. We describe a number of basic statistical models for trisomic phenotypes. Some of these are logical extensions of standard models for disomic phenotypes, and some are more specific to trisomy. Where possible, we discuss genetic mechanisms applicable to each model. We investigate which models and which parameter values give an expected increase in disomic homozygosity in individuals with the trait. Finally, we determine the sample sizes required to identify the increased disomic homozygosity under each model. Most of the models we explore yield detectable increases in disomic homozygosity for some reasonable range of parameter values, usually corresponding to smaller trait frequencies. It therefore appears that our mapping method should be effective for a wide variety of moderately infrequent traits, even though the exact mode of inheritance is unlikely to be known. 21 refs., 8 figs., 1 tab.

  6. Secure Communication over Fading Channels with Statistical QoS Constraints

    OpenAIRE

    Qiao, Deli; Gursoy, Mustafa Cenk; Velipasalar, Senem

    2010-01-01

    In this paper, the secure transmission of information over an ergodic fading channel is investigated in the presence of statistical quality of service (QoS) constraints. We employ effective capacity, which provides the maximum constant arrival rate that a given process can support while satisfying statistical delay constraints, to measure the secure throughput of the system, i.e., effective secure throughput. We assume that the channel side information (CSI) of the main channel is available a...

  7. Secure Communication over Fading Channels with Statistical QoS Constraints

    CERN Document Server

    Qiao, Deli; Velipasalar, Senem

    2010-01-01

    In this paper, the secure transmission of information over an ergodic fading channel is investigated in the presence of statistical quality of service (QoS) constraints. We employ effective capacity, which provides the maximum constant arrival rate that a given process can support while satisfying statistical delay constraints, to measure the secure throughput of the system, i.e., effective secure throughput. We assume that the channel side information (CSI) of the main channel is available at the transmitter side. Depending on the availability of the CSI of the eavesdropper channel, we obtain the corresponding optimal power control policies that maximize the effective secure throughput. In particular, when the CSI of the eavesdropper channel is available at the transmitter, the transmitter can no longer wait for transmission when the main channel is much better than the eavesdropper channel due to the introduction of QoS constraints. Moreover, the CSI of the eavesdropper channel becomes useless as QoS constr...
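
    Effective capacity is defined as EC(theta) = -(1/theta) * ln E[exp(-theta * R)] for the per-block service rate R. The Monte Carlo sketch below (Python/NumPy) evaluates it for the secrecy rate of Rayleigh main and eavesdropper channels under a QoS exponent theta, using constant transmit power rather than the optimal power control derived in the paper; all numeric values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
n, snr_main, snr_eve, theta = 200_000, 10.0, 3.0, 0.5   # illustrative parameters

g_main = rng.exponential(1.0, n)                 # Rayleigh fading power gains
g_eve = rng.exponential(1.0, n)
# Instantaneous secrecy rate per block (nats): positive part of the rate difference.
r_sec = np.maximum(np.log(1 + snr_main * g_main) - np.log(1 + snr_eve * g_eve), 0.0)

effective_secure_throughput = -np.log(np.mean(np.exp(-theta * r_sec))) / theta
print(f"effective secure throughput ~ {effective_secure_throughput:.3f} nats/s/Hz "
      f"(average secrecy rate ~ {r_sec.mean():.3f})")
```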

  8. Degenerate RFID Channel Modeling for Positioning Applications

    Directory of Open Access Journals (Sweden)

    A. Povalac

    2012-12-01

    Full Text Available This paper introduces the theory of channel modeling for positioning applications in UHF RFID. It explains basic parameters for channel characterization from both the narrowband and wideband point of view. More details are given about ranging and direction finding. Finally, several positioning scenarios are analyzed with developed channel models. All the described models use a degenerate channel, i.e. combined signal propagation from the transmitter to the tag and from the tag to the receiver.

  9. Visualizing statistical models and concepts

    CERN Document Server

    Farebrother, RW

    2002-01-01

    Examines classic algorithms, geometric diagrams, and mechanical principles for enhancing visualization of statistical estimation procedures and mathematical concepts in physics, engineering, and computer programming.

  10. Optimal transmission strategy for spatially correlated MIMO systems with channel statistical information

    Institute of Scientific and Technical Information of China (English)

    ZHAO Zhen-shan; XU Guo-zhi

    2007-01-01

    In real multiple-input multiple-output (MIMO) systems, perfect channel state information (CSI) may be costly or impossible to acquire. However, the channel statistical information can be considered relatively stationary during long-term transmission. The statistical information can be obtained at the receiver and fed back to the transmitter, and it does not require frequent updates. By exploiting channel mean and covariance information at the transmitter simultaneously, this paper investigates the optimal transmission strategy for spatially correlated MIMO channels. An upper bound on the ergodic capacity is derived and taken as the performance criterion. Simulation results are also given to show the performance improvement of the optimal transmission strategy.

  11. Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-03-01

    In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify-and-forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present a simple and efficient closed-form expression for the higher order moments of the channel capacity of dual-hop transmission systems with Rayleigh fading channels. In order to analyze the behavior of the higher order capacity statistics and investigate the usefulness of the mathematical analysis, some selected numerical and simulation results are presented. Our results are found to be in perfect agreement. © 2012 IEEE.

  12. Box model for channels of human migration

    CERN Document Server

    Vitanov, Nikolay K

    2016-01-01

    We discuss a mathematical model of a migration channel based on the truncated Waring distribution. The truncated Waring distribution is obtained for a more general model of motion of substance through a channel containing a finite number of boxes. The model is then applied to the case of migrants moving through a channel consisting of a finite number of countries or cities. The number of migrants in the channel strongly depends on the number of migrants that enter the channel through the country of entrance. It is shown that if the final destination country is very popular, then a large percentage of migrants may concentrate there.
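
    As a rough numerical companion to this abstract (not the authors' analytical truncated-Waring treatment), a discrete-time flow of substance through a chain of boxes can be simulated in a few lines; the inflow, transfer, and leakage rates below are invented.

```python
# Illustrative box-chain simulation (not the authors' truncated-Waring analysis).
# Migrants enter box 0 each step, a fraction moves to the next box, a fraction leaves.
import numpy as np

def simulate_channel(n_boxes=10, inflow=100.0, move_frac=0.3, leak_frac=0.05, steps=500):
    x = np.zeros(n_boxes)                  # migrants currently in each box (country)
    for _ in range(steps):
        moved = move_frac * x              # outflow toward the next box
        leaked = leak_frac * x             # outflow leaving the channel entirely
        x = x - moved - leaked
        x[1:] += moved[:-1]                # arrivals from the previous box
        x[-1] += moved[-1]                 # the final destination retains its "movers"
        x[0] += inflow                     # new entrants through the country of entrance
    return x

contents = simulate_channel()
print("steady-state box contents:", np.round(contents, 1))
print("share concentrated in the final destination:", round(contents[-1] / contents.sum(), 3))
```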

  13. Fermi breakup and the statistical multifragmentation model

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, B.V., E-mail: brett@ita.br [Departamento de Fisica, Instituto Tecnologico de Aeronautica - CTA, 12228-900 Sao Jose dos Campos (Brazil); Donangelo, R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Facultad de Ingenieria, Universidad de la Republica, Julio Herrera y Reissig 565, 11.300 Montevideo (Uruguay); Souza, S.R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Universidade Federal do Rio Grande do Sul, Av. Bento Goncalves 9500, CP 15051, 91501-970, Porto Alegre (Brazil); Lynch, W.G.; Steiner, A.W.; Tsang, M.B. [Joint Institute for Nuclear Astrophysics, National Superconducting Cyclotron Laboratory and the Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States)

    2012-02-15

    We demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical statistical multifragmentation model used to describe the disintegration of highly excited fragments of nuclear reactions. We argue that such a model better fulfills the hypothesis of statistical equilibrium than the Fermi breakup model generally used to describe statistical disintegration of light mass nuclei.

  14. Radio propagation measurement and channel modelling

    CERN Document Server

    Salous, Sana

    2013-01-01

    While there are numerous books describing modern wireless communication systems that contain overviews of radio propagation and radio channel modelling, there are none that contain detailed information on the design, implementation and calibration of radio channel measurement equipment, the planning of experiments and the in depth analysis of measured data. The book would begin with an explanation of the fundamentals of radio wave propagation and progress through a series of topics, including the measurement of radio channel characteristics, radio channel sounders, measurement strategies

  15. Group statistical channel coding dimming scheme in visible light communication system

    Science.gov (United States)

    Zhuang, Kaiyu; Huang, Zhitong; Zhang, Ruqi; Li, Jianfeng; Ji, Yuefeng

    2016-10-01

    In this paper, we propose a group statistical channel coding (GSCC) scheme, which achieves dimming by changing the ratio of 0 to 1 symbols in the original data stream through a probabilistic statistics method. Simulations under various brightness conditions show that GSCC maintains good performance compared to PWM dimming, at the cost of half the transmission rate, while providing a larger dimming intensity range. Simulations of GSCC combined with other channel coding schemes show that GSCC has good compatibility with arbitrary coded input signals.

  16. Universality of Poisson indicator and Fano factor of transport event statistics in ion channels and enzyme kinetics.

    Science.gov (United States)

    Chaudhury, Srabanti; Cao, Jianshu; Sinitsyn, Nikolai A

    2013-01-17

    We consider a generic stochastic model of ion transport through a single channel with arbitrary internal structure and kinetic rates of transitions between internal states. This model also describes the kinetics of a class of enzymes in which turnover events correspond to conversion of substrate into product by a single enzyme molecule. We show that measurement of the statistics of the single-molecule transition time through the channel contains only restricted information about the internal structure of the channel. In particular, the most accessible flux fluctuation characteristics, such as the Poisson indicator (P) and the Fano factor (F) as functions of solute concentration, depend only on three parameters in addition to the parameters of the Michaelis-Menten curve that characterizes the average current through the channel. Nevertheless, measurement of the Poisson indicator or Fano factor for such renewal processes can discriminate reactions with multiple intermediate steps as well as provide valuable information about the internal kinetic rates.
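
    To make the flux-fluctuation quantities concrete, here is a hedged numerical example that estimates the Poisson indicator and Fano factor for a renewal process whose turnover time is the sum of two exponential steps; the rate constants and counting-window length are arbitrary choices, not values from the paper.

```python
# Hedged example: Poisson indicator and Fano factor for a two-step renewal process
# (rate constants and window length are invented, not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 2.0, 5.0                      # hypothetical rates of two internal steps
n_events = 200_000

# Turnover/transport time per event: sum of two exponential dwell times.
tau = rng.exponential(1 / k1, n_events) + rng.exponential(1 / k2, n_events)

mean_tau, var_tau = tau.mean(), tau.var()
poisson_indicator = var_tau / mean_tau**2 - 1.0   # 0 for a single-step (Poisson) process

# Fano factor of event counts in windows of length T (long-T limit ~ Var(tau)/<tau>^2).
T = 50.0
arrivals = np.cumsum(tau)
counts, _ = np.histogram(arrivals, bins=np.arange(0.0, arrivals[-1], T))
fano = counts.var() / counts.mean()

print(f"Poisson indicator P ~ {poisson_indicator:+.3f} (negative => sub-Poissonian)")
print(f"Fano factor F ~ {fano:.3f}  (cf. Var(tau)/<tau>^2 = {var_tau / mean_tau**2:.3f})")
```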

  17. Analysis and Realization on MIMO Channel Model

    Directory of Open Access Journals (Sweden)

    Liu Hui

    2014-04-01

    Full Text Available In order to build a MIMO (Multiple Input Multiple Output) channel model based on IEEE 802.16, this study describes and analyzes how to construct a good MIMO channel model. By exploiting the spatial freedom of wireless channels, MIMO systems have the potential to achieve high bandwidth efficiency, which makes MIMO a key technique in next generation communication systems. As a basic research field of MIMO technology, MIMO channel modeling serves the performance evaluation of space-time coding algorithms as well as system-level calibration and simulation. Having the advantages of low inter-antenna correlation and small array size, multi-polarization tends to be a promising technique in future MIMO systems. However, polarization characteristics have not yet been modeled well in current MIMO channel models, so establishing meaningful multi-polarized MIMO channel models has become a hot topic in recent channel modeling research. In this study, further research is made on the related theories of channel models, channel estimation, and implementation algorithms, building on others' research work.

  18. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By an asymptotically tight bound we mean that the high-SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.

  19. New Approaches for Channel Prediction Based on Sinusoidal Modeling

    Directory of Open Access Journals (Sweden)

    Ekman Torbjörn

    2007-01-01

    Full Text Available Long-range channel prediction is considered to be one of the most important enabling technologies for future wireless communication systems. The prediction of Rayleigh fading channels is studied within the framework of sinusoidal modeling in this paper. A stochastic sinusoidal model to represent a Rayleigh fading channel is proposed, and three different predictors based on this statistical sinusoidal model are presented. These methods outperform the standard linear predictor (LP) in Monte Carlo simulations, but underperform with real measurement data, probably due to nonstationary model parameters. To mitigate these modeling errors, a joint moving average and sinusoidal (JMAS) prediction model and the associated joint least-squares (LS) predictor are proposed. It combines the sinusoidal model with an LP to handle unmodeled dynamics in the signal. The joint LS predictor outperforms all the other sinusoidal LMMSE predictors in suburban environments, but still performs slightly worse than the standard LP in urban environments.
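
    For orientation, the standard linear predictor (LP) baseline referred to above can be sketched in a few lines of Python. The synthetic fading trace, model order, and sampling rate below are assumptions, not the paper's measurement data or its sinusoidal predictors.

```python
# Baseline linear predictor (LP) sketch on a synthetic fading trace (assumed parameters).
import numpy as np

rng = np.random.default_rng(5)
n, order = 4000, 8
t = np.arange(n) / 1000.0                                # assumed 1 kHz channel sampling
freqs = rng.uniform(-50.0, 50.0, size=6)                 # hypothetical Doppler shifts (Hz)
phases = rng.uniform(0.0, 2.0 * np.pi, size=6)
h = np.exp(1j * (2 * np.pi * freqs[:, None] * t + phases[:, None])).sum(axis=0)
h += 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Least-squares AR coefficients: h[k] ~ sum_i a[i] * h[k-1-i]
X = np.column_stack([h[order - 1 - i:n - 1 - i] for i in range(order)])
y = h[order:]
a, *_ = np.linalg.lstsq(X, y, rcond=None)

nmse = np.mean(np.abs(y - X @ a) ** 2) / np.mean(np.abs(y) ** 2)
print(f"one-step LP prediction NMSE ~ {10 * np.log10(nmse):.1f} dB")
```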

  20. Statistical characterization of the dynamic human body communication channel at 45 MHz.

    Science.gov (United States)

    Nie, Zedong; Ma, Jingjing; Chen, Hong; Wang, Lei

    2013-01-01

    The dynamic human body communication (HBC) propagation channel at 45 MHz is statistically characterized in this paper. A large amount of measurement data was gathered in a practical environment during real activities: treadmill running at different speeds in a lab room. The received power between the two lower legs was acquired from three volunteers, with more than 60,000 snapshots of data in total. The statistical analyses confirmed that the HBC propagation channel at 45 MHz follows the Gamma and Lognormal distributions for the slower (2 km/h and 4 km/h) and faster (6 km/h and 8 km/h) running activities, respectively. The channel is insensitive to body motion: the maximum average fade duration is 0.0413 s, the longest average bad-channel duration is less than 60 ms, and the percentage of bad-channel duration time is less than 4.35%.
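
    Fade statistics of the kind reported here (average fade duration, fraction of bad-channel time) can be extracted from a sampled received-power trace as in the following sketch; the synthetic trace, sampling rate, and threshold are placeholders rather than the paper's measurements.

```python
# Level crossing rate (LCR) and average fade duration (AFD) from a power trace.
# The synthetic trace, sampling rate, and threshold below are placeholders.
import numpy as np

def lcr_afd(power_db, threshold_db, fs):
    below = power_db < threshold_db
    fades = np.count_nonzero(~below[:-1] & below[1:])   # down-crossings of the threshold
    duration = len(power_db) / fs
    lcr = fades / duration                              # crossings per second
    afd = below.mean() * duration / max(fades, 1)       # time below / number of fades
    return lcr, afd

rng = np.random.default_rng(1)
fs, n = 1000.0, 60_000                                  # assumed 1 kHz sampling, 60 s trace
g = rng.normal(size=n) + 1j * rng.normal(size=n)
h = np.convolve(g, np.ones(50) / 50, mode="same")       # crude Doppler low-pass (assumption)
p_db = 20 * np.log10(np.abs(h) + 1e-12)
lcr, afd = lcr_afd(p_db, threshold_db=p_db.mean() - 10.0, fs=fs)
print(f"LCR ~ {lcr:.2f} Hz, AFD ~ {afd * 1e3:.1f} ms")
```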

  1. On the Statistical Properties of Nakagami-Hoyt Vehicle-to-Vehicle Fading Channel under Nonisotropic Scattering

    Directory of Open Access Journals (Sweden)

    Muhammad Imran Akram

    2012-01-01

    Full Text Available This paper presents the statistical properties of the vehicle-to-vehicle Nakagami-Hoyt (Nakagami-q) channel model under nonisotropic conditions. The spatial time correlation function (STCF), the power spectral density (PSD), the squared time autocorrelation function (SQCF), the level crossing rate (LCR), and the average duration of fade (ADF) of the Nakagami-Hoyt channel have been derived under the assumption that both the transmitter and the receiver are nonstationary and have nonomnidirectional antennas. A simulator that uses the inverse-fast-Fourier-transform (IFFT)-based computation method is designed for this model. The simulator and analytical results are compared.
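
    An IFFT-based fading generator of the general kind used for such simulators can be sketched as below. This is a generic Jakes-shaped implementation with unequal in-phase/quadrature variances to mimic a Hoyt envelope, not the authors' nonisotropic-scattering simulator, and all parameter values are arbitrary.

```python
# Generic IFFT-based fading generator (not the authors' nonisotropic simulator).
# Frequency-domain Gaussian samples are shaped by a Jakes-like Doppler spectrum;
# unequal I/Q standard deviations give a Hoyt (Nakagami-q) envelope (q=1 -> Rayleigh).
import numpy as np

def ifft_fading(n=2**14, f_d=50.0, fs=1000.0, q=0.5, seed=0):
    rng = np.random.default_rng(seed)
    f = np.fft.fftfreq(n, d=1.0 / fs)
    psd = np.zeros(n)
    inband = np.abs(f) < f_d
    psd[inband] = 1.0 / np.sqrt(1.0 - (f[inband] / f_d) ** 2 + 1e-9)  # Jakes-like shape
    shaping = np.sqrt(psd)

    def branch():
        spec = shaping * (rng.normal(size=n) + 1j * rng.normal(size=n))
        x = np.fft.ifft(spec).real
        return x / x.std()

    return branch() + 1j * q * branch()     # I and Q with std 1 and q, respectively

h = ifft_fading()
print("mean envelope power (expect 1 + q^2):", round(np.mean(np.abs(h) ** 2), 3))
```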

  2. Radio channel modeling in body area networks

    NARCIS (Netherlands)

    An, L.; Bentum, M.J.; Meijerink, A.; Scanlon, W.G.

    2010-01-01

    A body area network (BAN) is a network of body-worn or implanted electronic devices, including wireless sensors which can monitor body parameters or detect movements. One of the big challenges in BANs is the propagation channel modeling. Channel models can be used to understand wave propagation in

  3. Radio channel modeling in body area networks

    NARCIS (Netherlands)

    An, L.; Bentum, M.J.; Meijerink, A.; Scanlon, W.G.

    2009-01-01

    A body area network (BAN) is a network of body-worn or implanted electronic devices, including wireless sensors which can monitor body parameters or detect movements. One of the big challenges in BANs is the propagation channel modeling. Channel models can be used to understand wave propagation

  4. Statistical modelling for ship propulsion efficiency

    DEFF Research Database (Denmark)

    Petersen, Jóan Petur; Jacobsen, Daniel J.; Winther, Ole

    2012-01-01

    This paper presents a state-of-the-art systems approach to statistical modelling of fuel efficiency in ship propulsion, and also a novel and publicly available data set of high quality sensory data. Two statistical model approaches are investigated and compared: artificial neural networks...

  5. Statistical Models and Methods for Lifetime Data

    CERN Document Server

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition"An indispensable addition to any serious collection on lifetime data analysis and . . . a valuable contribution to the statistical literature. Highly recommended . . ."-Choice"This is an important book, which will appeal to statisticians working on survival analysis problems."-Biometrics"A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data . . . this is a highly competent and agreeable statistical textbook."-Statistics in MedicineThe statistical analysis of lifetime or response time data is a key tool in engineering,

  6. On Ergodic Secrecy Capacity of Multiple Input Wiretap Channel with Statistical CSIT

    CERN Document Server

    Lin, Shih-Chun

    2012-01-01

    We consider secure transmission in ergodic fast-Rayleigh fading multiple-input single-output single-antenna-eavesdropper (MISOSE) wiretap channels. We assume that the statistics of both the legitimate and eavesdropper channels are the only channel state information available at the transmitter (CSIT). By introducing a new secrecy capacity upper bound, we prove that the secrecy capacity is achieved by Gaussian input without prefixing. To attain this, we form another MISOSE channel for upper-bounding, and tighten the bound by finding the worst correlations between the legitimate and eavesdropper channel coefficients. The resulting upper bound is tighter than others in the literature, which are based on modifying the correlation between the noises at the legitimate receiver and the eavesdropper. Next, we fully characterize the ergodic secrecy capacity by showing that the optimal channel input covariance matrix is a scaled identity matrix, with the transmit power allocated uniformly among the antennas. The key t...

  7. Pair normalized channel feature and statistics-based learning for high-performance pedestrian detection

    Science.gov (United States)

    Zeng, Bobo; Wang, Guijin; Ruan, Zhiwei; Lin, Xinggang; Meng, Long

    2012-07-01

    High-performance pedestrian detection with good accuracy and fast speed is an important yet challenging task in computer vision. We design a novel feature named pair normalized channel feature (PNCF), which simultaneously combines and normalizes two channel features in image channels, achieving a highly discriminative power and computational efficiency. PNCF applies to both gradient channels and color channels so that shape and appearance information are described and integrated in the same feature. To efficiently explore the formidably large PNCF feature space, we propose a statistics-based feature learning method to select a small number of potentially discriminative candidate features, which are fed into the boosting algorithm. In addition, channel compression and a hybrid pyramid are employed to speed up the multiscale detection. Experiments illustrate the effectiveness of PNCF and its learning method. Our proposed detector outperforms the state-of-the-art on several benchmark datasets in both detection accuracy and efficiency.

  8. What is the meaning of the statistical hadronization model?

    CERN Document Server

    Becattini, F

    2005-01-01

    The statistical model of hadronization succeeds in reproducing particle abundances and transverse momentum spectra in high energy collisions of elementary particles as well as of heavy ions. Despite its apparent success, the interpretation of these results is controversial and the validity of the approach very often questioned. In this paper, we would like to summarize the whole issue by first outlining a basic formulation of the model and then comment on the main criticisms and different kinds of interpretations, with special emphasis on the so-called "phase space dominance". While the ultimate answer to the question why the statistical model works should certainly be pursued, we stress that it is a priority to confirm or disprove the fundamental scheme of the statistical model by performing some detailed tests on the rates of exclusive channels at lower energy.

  9. Statistical modelling of fish stocks

    DEFF Research Database (Denmark)

    Kvist, Trine

    1999-01-01

    for modelling the dynamics of a fish population is suggested. A new approach is introduced to analyse the sources of variation in age composition data, which is one of the most important sources of information in the cohort-based models for estimation of stock abundances and mortalities. The approach combines...... and it is argued that an approach utilising stochastic differential equations might be advantageous in fish stock assessments.

  10. Statistical Modeling of Bivariate Data.

    Science.gov (United States)

    1982-08-01

    Keywords: joint density-quantile function, dependence-density, non-parametric bivariate density estimation, entropy, exponential model. [...] estimated by autoregressive or exponential model estimators with maximum entropy properties, is investigated in this thesis. The results provide important and useful procedures for nonparametric bivariate density estimation. The thesis discusses estimators of the entropy H(d) [...]

  11. Sensitivity Analysis and Statistical Convergence of a Saltating Particle Model

    CERN Document Server

    Maldonado, S

    2016-01-01

    Saltation models provide considerable insight into near-bed sediment transport. This paper outlines a simple, efficient numerical model of stochastic saltation, which is validated against previously published experimental data on saltation in a channel of nearly horizontal bed. Convergence tests are systematically applied to ensure the model is free from statistical errors emanating from the number of particle hops considered. Two criteria for statistical convergence are derived; according to the first criterion, at least $10^3$ hops appear to be necessary for convergent results, whereas $10^4$ saltations seem to be the minimum required in order to achieve statistical convergence in accordance with the second criterion. Two empirical formulae for lift force are considered: one dependent on the slip (relative) velocity of the particle multiplied by the vertical gradient of the horizontal flow velocity component; the other dependent on the difference between the squares of the slip velocity components at the to...
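
    A minimal version of such a statistical-convergence test, which watches when the running mean of a hop statistic settles to within a tolerance of its final value, might look like the following; the synthetic hop data and the tolerance are invented.

```python
# Running-mean convergence check in the spirit of the paper's first criterion.
# Synthetic lognormal "hop lengths" stand in for simulated saltation hops.
import numpy as np

rng = np.random.default_rng(6)
hop_lengths = rng.lognormal(mean=1.0, sigma=0.6, size=100_000)

running_mean = np.cumsum(hop_lengths) / np.arange(1, hop_lengths.size + 1)
final = running_mean[-1]
tol = 0.01                                               # 1% relative tolerance (arbitrary)
outside = np.flatnonzero(np.abs(running_mean / final - 1.0) >= tol)
hops_needed = outside[-1] + 2 if outside.size else 1     # first hop count after last excursion
print(f"running mean stays within {tol:.0%} of its final value after ~{hops_needed} hops")
```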

  12. On the computation of the higher-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-12-01

    The higher-order statistics (HOS) of the channel capacity $\mu_n = \mathbb{E}[\log^n(1+\gamma_{\mathrm{end}})]$, where $n \in \mathbb{N}$ denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.
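
    As a sanity check on closed-form results of this kind, the same higher-order statistics can be estimated by brute-force Monte Carlo. The sketch below does this for an assumed L-branch MRC receiver in i.i.d. Rayleigh fading; the branch count, SNR, and sample size are arbitrary, and this is not the letter's MGF-based formalism.

```python
# Brute-force Monte Carlo estimate of mu_n = E[log^n(1 + gamma_end)] for an MRC receiver
# with i.i.d. Rayleigh branches (illustrative cross-check, not the MGF-based analysis).
import numpy as np

def capacity_hos(n_order, snr_db, branches=2, n_samples=500_000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    gamma_end = snr * rng.gamma(shape=branches, scale=1.0, size=n_samples)  # MRC output SNR
    return np.mean(np.log1p(gamma_end) ** n_order)                          # in nats^n

for n in (1, 2, 3):
    print(f"mu_{n} ~ {capacity_hos(n, snr_db=10):.3f}")
```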

  13. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  14. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed that uses a face shape statistical model, with pose parameters represented by trigonometric functions. The face shape statistical model is first built by analyzing face shapes from different people under varying poses. The shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, a mapping function between face image and face pose is constructed by linearly relating the different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  15. Investigation of Particles Statistics in large Eddy Simulated Turbulent Channel Flow using Generalized lattice Boltzmann Method

    Directory of Open Access Journals (Sweden)

    Mandana Samari Kermani

    2016-01-01

    Full Text Available The interaction of spherical solid particles with turbulent eddies in a 3-D turbulent channel flow at a given friction Reynolds number was studied. A generalized lattice Boltzmann equation (GLBE) was used for computation of the instantaneous turbulent flow field, for which large eddy simulation (LES) was employed. The sub-grid-scale (SGS) turbulence effects were simulated through a shear-improved Smagorinsky model (SISM), which can predict the turbulent near-wall region without any wall function. Statistical properties of particle behavior, such as root mean square (RMS) velocities, were studied as a function of the dimensionless particle relaxation time by using a Lagrangian approach. The combination of SISM in GLBE with particle tracking analysis in turbulent channel flow is the novelty of the present work. Both GLBE and SISM solve the flow field equations locally, which is an advantage of this method and makes it easy to implement. Comparison of the present results with previously available data indicated that SISM in GLBE is a reliable method for simulation of turbulent flows, which is a key point to predict particle behavior correctly.

  16. Topology for statistical modeling of petascale data.

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A& M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A& M University, College Station, TX)

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  17. Axial electron channeling statistical method of site occupancy determination

    Institute of Scientific and Technical Information of China (English)

    YE; Jia

    2001-01-01


  18. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Variables

  19. Quantum Biological Channel Modeling and Capacity Calculation

    Directory of Open Access Journals (Sweden)

    Ivan B. Djordjevic

    2012-12-01

    Full Text Available Quantum mechanics has an important role in photosynthesis, magnetoreception, and evolution. There have been many attempts to explain the structure of the genetic code and the transfer of information from DNA to protein by using the concepts of quantum mechanics. The existing biological quantum channel models are not sufficiently general to incorporate all relevant contributions responsible for imperfect protein synthesis. Moreover, the determination of quantum biological channel capacity is still an open problem. To solve these problems, we construct the operator-sum representation of the biological channel based on codon basekets (basis vectors), and determine the quantum channel model suitable for study of the quantum biological channel capacity and beyond. The transcription process, DNA point mutations, insertions, deletions, and translation are interpreted as quantum noise processes. The various types of quantum errors are classified into several broad categories: (i) storage errors that occur in DNA itself, as it represents an imperfect storage of genetic information; (ii) replication errors introduced during the DNA replication process; (iii) transcription errors introduced during DNA-to-mRNA transcription; and (iv) translation errors introduced during the translation process. By using this model, we determine the biological quantum channel capacity and compare it against the corresponding classical biological channel capacity. We demonstrate that the quantum biological channel capacity is higher than the classical one for a coherent quantum channel model, suggesting that quantum effects have an important role in biological systems. The proposed model is of crucial importance towards future study of quantum DNA error correction, developing a quantum mechanical model of aging, developing quantum mechanical models for tumors/cancer, and the study of intracellular dynamics in general.

  20. Semantic Importance Sampling for Statistical Model Checking

    Science.gov (United States)

    2015-01-16

    approach called Statistical Model Checking (SMC) [16], which relies on Monte Carlo-based simulations to solve this verification task more scalably... Statistical model checking (SMC) is a prominent approach for rigorous analysis of stochastic systems using Monte Carlo simulations... for computing the bounded probability that a specific event occurs during a stochastic system's execution. Estimating the

  1. Matrix Tricks for Linear Statistical Models

    CERN Document Server

    Puntanen, Simo; Styan, George PH

    2011-01-01

    In teaching linear statistical models to first-year graduate students or to final-year undergraduate students there is no way to proceed smoothly without matrices and related concepts of linear algebra; their use is really essential. Our experience is that making some particular matrix tricks very familiar to students can substantially increase their insight into linear statistical models (and also multivariate statistical analysis). In matrix algebra, there are handy, sometimes even very simple "tricks" which simplify and clarify the treatment of a problem - both for the student and

  2. Modeling the Noise for Indoor Power Line Channel

    Directory of Open Access Journals (Sweden)

    Syed Samser Ali

    2013-07-01

    Full Text Available Electromagnetic interference, man-made noise, and multipath effects are the main causes of bit errors in power-line communication. To design an efficient powerline transmission system, the channel characteristics have to be known, and this paper deals with a statistical noise model (SNM) for the indoor powerline channel in the frequency band from 1 MHz to 30 MHz. The SNM parameters are obtained from large-scale measurements of the noise density spectrum on a real powerline channel. All measurements are between line and neutral at different locations in the same grid. The SNM is used for simulation of the noise density spectrum and offline analysis of the powerline channel.

  3. Probability and Statistics in Sensor Performance Modeling

    Science.gov (United States)

    2010-12-01

    [...] acoustic or electromagnetic waves are scattered by both objects and turbulent wind. A version of the Rice-Nakagami model (specifically with a [...] Gaussian, lognormal, exponential, gamma, and the transformed Rice-Nakagami, as well as a discrete model. (Other examples of statistical models

  4. Infinite Random Graphs as Statistical Mechanical Models

    DEFF Research Database (Denmark)

    Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria

    2011-01-01

    We discuss two examples of infinite random graphs obtained as limits of finite statistical mechanical systems: a model of two-dimensional discretized quantum gravity defined in terms of causal triangulated surfaces, and the Ising model on generic random trees. For the former model we describe

  5. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  6. Vehicular Channel Characterization and Modeling

    OpenAIRE

    Oestges, Claude; 10th European Conference on Antennas and Propagation (EuCAP)

    2016-01-01

    Vehicle-to-vehicle transmissions have emerged as a key component of future communication standards, whose design and testing critically depends upon the understanding of propagation mechanisms. An important and specific aspect of vehicular communication channels lies in the fact that these are essentially non-stationary. Hence, this communication addresses two recent contributions in the field of non-stationary vehicular propagation, based on extensive measurements conducted at 5.3 GHz in sub...

  7. UWB channel modeling for indoor line-of-sight environment

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The SV/IEEE 802.15.3a model has been the standard model for ultra-wide bandwidth (UWB) indoor non-line-of-sight (NLOS) wireless propagation, but for the line-of-sight (LOS) case it is not well defined. In this paper, a new statistical distribution model intended exclusively for LOS environments is proposed based on an investigation of experimental data. By reducing the number of visible randomly arriving clusters, the model itself and the estimation of its parameters are simplified in comparison with the SV/IEEE 802.15.3a model. The simulation results indicate that the proposed model is more accurate in modeling the small-scale LOS environment than the SV/IEEE 802.15.3a model when considering cumulative distribution functions (CDFs) for the three key channel impulse response (CIR) statistics.

  8. Distributions with given marginals and statistical modelling

    CERN Document Server

    Fortiana, Josep; Rodriguez-Lallena, José

    2002-01-01

    This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.

  9. Statistical distribution of the S-matrix in the one-channel case

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, G.; Mello, P.A.; Seligman, T.H.

    1981-10-01

    It is shown that the statistical distribution of an ensemble of one-channel S-matrices is uniquely determined by requiring that: 1) S has poles only in the lower half of the energy plane and 2) the function S(E) is ergodic in a sense to be defined. A Monte Carlo calculation was performed to illustrate numerically the above statement.

  11. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Full Text Available Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) for PLC channels with impulse noise is proposed in this paper. The alpha-stable distribution is used to model impulse noise, and FLOS is applied to construct the criterion for MIMO detection. The optimal detection solution is then obtained by a recursive least squares algorithm, and the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method achieves better PLC MIMO detection performance than existing methods in impulsive noise environments.
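
    The role of fractional lower-order statistics under alpha-stable impulse noise can be illustrated with a short sketch covering only the noise-statistics part, not the full MIMO detector; the characteristic exponent and moment order below are arbitrary choices.

```python
# Symmetric alpha-stable impulse noise and a fractional lower-order moment (FLOM),
# the basic statistic behind FLOS-type processing; all values are illustrative.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
alpha = 1.5                                    # characteristic exponent (impulsiveness)
noise = levy_stable.rvs(alpha, 0.0, scale=1.0, size=50_000, random_state=rng)

p = 0.8                                        # moment order; needs p < alpha to be finite
flom = np.mean(np.abs(noise) ** p)
print(f"FLOM E[|n|^{p}] ~ {flom:.3f}")
print(f"sample second moment (theoretically infinite): {np.mean(noise**2):.1f}")
```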

  12. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  13. Simple statistical model for branched aggregates

    DEFF Research Database (Denmark)

    Lemarchand, Claire; Hansen, Jesper Schmidt

    2015-01-01

    We propose a statistical model that can reproduce the size distribution of any branched aggregate, including amylopectin, dendrimers, molecular clusters of monoalcohols, and asphaltene nanoaggregates. It is based on the conditional probability for one molecule to form a new bond with a molecule, given that it already has bonds with others. The model is applied here to asphaltene nanoaggregates observed in molecular dynamics simulations of Cooee bitumen. The variation with temperature of the probabilities deduced from this model is discussed in terms of statistical mechanics arguments. The relevance of the statistical model in the case of asphaltene nanoaggregates is checked by comparing the predicted value of the probability for one molecule to have exactly i bonds with the same probability directly measured in the molecular dynamics simulations. The agreement is satisfactory.

  14. Statistical Modeling for Radiation Hardness Assurance

    Science.gov (United States)

    Ladbury, Raymond L.

    2014-01-01

    We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: What models are used, what errors exist in real test data, and what the model allows us to say about the DUT will be discussed. In addition, how to use other sources of data such as historical, heritage, and similar part and how to apply experience, physics, and expert opinion to the analysis will be covered. Also included will be concepts of Bayesian statistics, data fitting, and bounding rates.

  15. Performance modeling, loss networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi

    2009-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of understanding the phenomenon of statistical multiplexing. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the important ideas of Palm distributions associated with traffic models and their role in performance measures. Also presented are recent ideas of large buffer, and many sources asymptotics that play an important role in understanding statistical multiplexing. I

  16. Dielectronic recombination rate in statistical model

    OpenAIRE

    Demura A.V.; Leontyev D.S.; Lisitsa V.S.; Shurigyn V.A.

    2017-01-01

    The dielectronic recombination rate of multielectron ions was calculated by means of the statistical approach. It is based on an idea of collective excitations of atomic electrons with the local plasma frequencies. These frequencies are expressed via the Thomas-Fermi model electron density distribution. The statistical approach provides fast computation of DR rates that are compared with the modern quantum mechanical calculations. The results are important for current studies of thermonuclear...

  17. Dielectronic recombination rate in statistical model

    Directory of Open Access Journals (Sweden)

    Demura A.V.

    2017-01-01

    Full Text Available The dielectronic recombination rate of multielectron ions was calculated by means of the statistical approach. It is based on an idea of collective excitations of atomic electrons with the local plasma frequencies. These frequencies are expressed via the Thomas-Fermi model electron density distribution. The statistical approach provides fast computation of DR rates that are compared with the modern quantum mechanical calculations. The results are important for current studies of thermonuclear plasmas with the tungsten impurities.

  18. Dielectronic recombination rate in statistical model

    Science.gov (United States)

    Demura, A. V.; Leontyev, D. S.; Lisitsa, V. S.; Shurigyn, V. A.

    2016-12-01

    The dielectronic recombination rate of multielectron ions was calculated by means of the statistical approach. It is based on an idea of collective excitations of atomic electrons with the local plasma frequencies. These frequencies are expressed via the Thomas-Fermi model electron density distribution. The statistical approach provides fast computation of DR rates that are compared with the modern quantum mechanical calculations. The results are important for current studies of thermonuclear plasmas with the tungsten impurities.

  19. An analytical channel thermal noise model for deep-submicron MOSFETs with short channel effects

    Science.gov (United States)

    Jeon, Jongwook; Lee, Jong Duk; Park, Byung-Gook; Shin, Hyungcheol

    2007-07-01

    In this work, an analytical channel thermal noise model for short channel MOSFETs is derived. The transfer function of the noise was derived by following Tsividis' method. The proposed model takes into account the channel length modulation, velocity saturation, and carrier heating effects in the gradual channel region. Modeling results show good agreement with the measured noise data.

  20. Radio Channel Modelling Using Stochastic Propagation Graphs

    DEFF Research Database (Denmark)

    Pedersen, Troels; Fleury, Bernard Henri

    2007-01-01

    In this contribution the radio channel model proposed in [1] is extended to include multiple transmitters and receivers. The propagation environment is modelled using random graphs where vertices of a graph represent scatterers and edges model the wave propagation between scatterers. Furthermore...

  1. Statistical Model Checking for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Du, Dehui; Larsen, Kim Guldstrand

    2012-01-01

    This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique ap...

  2. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Bremer, P. -T. [Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, the approach of the entire team, involving all three institutions, is based on the complementary techniques of combinatorial topology and statistical modelling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modelling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. The overall technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modelling, and (3) new integrated topological and statistical methods. Roughly speaking, the division of labor between our three groups (Sandia Labs in Livermore, Texas A&M in College Station, and U Utah in Salt Lake City) is as follows: the Sandia group focuses on statistical methods and their formulation in algebraic terms, and finds the application problems (and data sets) most relevant to this project; the Texas A&M group develops new algebraic geometry algorithms, in particular with fewnomial theory; and the Utah group develops new algorithms in computational topology via Discrete Morse Theory. However, we hasten to point out that our three groups stay in tight contact via videoconference every two weeks, so there is much synergy of ideas between the groups. The remainder of this document focuses on the contributions that involved the team at the University of Utah in Salt Lake City most directly.

  3. Three Generative, Lexicalised Models for Statistical Parsing

    CERN Document Server

    Collins, M

    1997-01-01

    In this paper we first propose a new statistical parsing model, which is a generative model of lexicalised context-free grammar. We then extend the model to include a probabilistic treatment of both subcategorisation and wh-movement. Results on Wall Street Journal text show that the parser performs at 88.1/87.5% constituent precision/recall, an average improvement of 2.3% over (Collins 96).

  4. Growth curve models and statistical diagnostics

    CERN Document Server

    Pan, Jian-Xin

    2002-01-01

    Growth-curve models are generalized multivariate analysis-of-variance models. These models are especially useful for investigating growth problems on short times in economics, biology, medical research, and epidemiology. This book systematically introduces the theory of the GCM with particular emphasis on their multivariate statistical diagnostics, which are based mainly on recent developments made by the authors and their collaborators. The authors provide complete proofs of theorems as well as practical data sets and MATLAB code.

  5. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  6. An R companion to linear statistical models

    CERN Document Server

    Hay-Jahans, Christopher

    2011-01-01

    Focusing on user-developed programming, An R Companion to Linear Statistical Models serves two audiences: those who are familiar with the theory and applications of linear statistical models and wish to learn or enhance their skills in R; and those who are enrolled in an R-based course on regression and analysis of variance. For those who have never used R, the book begins with a self-contained introduction to R that lays the foundation for later chapters.This book includes extensive and carefully explained examples of how to write programs using the R programming language. These examples cove

  7. Advanced Concepts for Underwater Acoustic Channel Modeling

    Science.gov (United States)

    Etter, P. C.; Haas, C. H.; Ramani, D. V.

    2014-12-01

    This paper examines nearshore underwater-acoustic channel modeling concepts and compares channel-state information requirements against existing modeling capabilities. This process defines a subset of candidate acoustic models suitable for simulating signal propagation in underwater communications. Underwater-acoustic communications find many practical applications in coastal oceanography, and networking is the enabling technology for these applications. Such networks can be formed by establishing two-way acoustic links between autonomous underwater vehicles and moored oceanographic sensors. These networks can be connected to a surface unit for further data transfer to ships, satellites, or shore stations via a radio-frequency link. This configuration establishes an interactive environment in which researchers can extract real-time data from multiple, but distant, underwater instruments. After evaluating the obtained data, control messages can be sent back to individual instruments to adapt the networks to changing situations. Underwater networks can also be used to increase the operating ranges of autonomous underwater vehicles by hopping the control and data messages through networks that cover large areas. A model of the ocean medium between acoustic sources and receivers is called a channel model. In an oceanic channel, characteristics of the acoustic signals change as they travel from transmitters to receivers. These characteristics depend upon the acoustic frequency, the distances between sources and receivers, the paths followed by the signals, and the prevailing ocean environment in the vicinity of the paths. Properties of the received signals can be derived from those of the transmitted signals using these channel models. This study concludes that ray-theory models are best suited to the simulation of acoustic signal propagation in oceanic channels and identifies 33 such models that are eligible candidates.

  8. Secure Broadcasting over Fading Channels with Statistical QoS Constraints

    CERN Document Server

    Qiao, Deli; Velipasalar, Senem

    2010-01-01

    In this paper, the fading broadcast channel with confidential messages is studied in the presence of statistical quality of service (QoS) constraints in the form of limitations on the buffer length. We employ the effective capacity formulation to measure the throughput of the confidential and common messages. We assume that the channel side information (CSI) is available at both the transmitter and the receivers. Assuming average power constraints at the transmitter side, we first define the effective secure throughput region, and prove that the throughput region is convex. Then, we obtain the optimal power control policies that achieve the boundary points of the effective secure throughput region.

  9. Connectivity of channelized reservoirs: a modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Larue, David K. [ChevronTexaco, Bakersfield, CA (United States); Hovadik, Joseph [ChevronTexaco, San Ramon, CA (United States)

    2006-07-01

    Connectivity represents one of the fundamental properties of a reservoir that directly affects recovery. If a portion of the reservoir is not connected to a well, it cannot be drained. Geobody or sandbody connectivity is defined as the percentage of the reservoir that is connected, and reservoir connectivity is defined as the percentage of the reservoir that is connected to wells. Previous studies have mostly considered mathematical, physical and engineering aspects of connectivity. In the current study, the stratigraphy of connectivity is characterized using simple, 3D geostatistical models. Based on these modelling studies, stratigraphic connectivity is good, usually greater than 90%, if the net:gross ratio, or sand fraction, is greater than about 30%. At net:gross values less than 30%, there is a rapid diminishment of connectivity as a function of net:gross. This behaviour between net:gross and connectivity defines a characteristic 'S-curve', in which the connectivity is high for net:gross values above 30%, then diminishes rapidly and approaches 0. Well configuration factors that can influence reservoir connectivity are well density, well orientation (vertical or horizontal; horizontal parallel to channels or perpendicular), and length of completion zones. Reservoir connectivity as a function of net:gross can be improved by several factors: presence of overbank sandy facies, deposition of channels in a channel belt, deposition of channels with high width/thickness ratios, and deposition of channels during variable floodplain aggradation rates. Connectivity can be reduced substantially in two-dimensional reservoirs, in map view or in cross-section, by volume support effects and by stratigraphic heterogeneities. It is well known that in two dimensions, the cascade zone for the 'S-curve' of net:gross plotted against connectivity occurs at about 60% net:gross. Generalizing this knowledge, any time that a reservoir can be regarded as
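
    The 'S-curve' behaviour described here can be reproduced qualitatively with a few lines of code on uncorrelated random grids, which is a deliberate simplification of the geostatistical channel models used in the study; the grid size and face-connectivity rule are assumptions.

```python
# Connected sand fraction vs. net:gross on uncorrelated random 3-D grids
# (a simplification of the geostatistical models; grid size and 6-connectivity assumed).
import numpy as np
from scipy import ndimage

def connected_fraction(net_gross, shape=(40, 40, 40), seed=0):
    rng = np.random.default_rng(seed)
    sand = rng.random(shape) < net_gross          # True = sand cell, False = shale
    labels, n_bodies = ndimage.label(sand)        # face-connected (6-neighbour) geobodies
    if n_bodies == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]       # geobody sizes (skip background)
    return sizes.max() / sand.sum()               # share of sand in the largest geobody

for ng in (0.1, 0.2, 0.3, 0.4, 0.6, 0.8):
    print(f"net:gross {ng:.1f} -> connected fraction {connected_fraction(ng):.2f}")
```

    With uncorrelated cells the sharp transition sits near the 3-D site-percolation threshold (roughly 31%), which is consistent with the ~30% figure quoted in the abstract.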

  10. STATISTICAL MODELS OF REPRESENTING INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2016-07-01

    Full Text Available This article, entitled Statistical Models of Representing Intellectual Capital, approaches and analyses the concept of intellectual capital, as well as the main models which can support entrepreneurs/managers in evaluating and quantifying the advantages of intellectual capital. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we surveyed the classical static models: Sveiby, Edvinsson, Balanced Scorecard, as well as the canonical model of intellectual capital. Among the group of static models for evaluating organisational intellectual capital, the canonical model stands out. This model enables the structuring of organisational intellectual capital into human capital, structural capital and relational capital. Although the model is widely used, it is a static one and can thus create a series of errors in the process of evaluation, because the three entities mentioned above are not independent with respect to their contents, as any logic of structuring complex entities requires.

  11. Nonperturbative approach to the modified statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Magdy, M.A.; Bekmezci, A.; Sever, R. [Middle East Technical Univ., Ankara (Turkey)

    1993-12-01

    The modified form of the statistical model is used without making any perturbation. The mass spectra of the lowest S, P and D levels of the (Q{bar Q}) and the non-self-conjugate (Q{bar q}) mesons are studied with the Song-Lin potential. The authors' results are in good agreement with the experimental and theoretical findings.

  12. Statistical Modeling Efforts for Headspace Gas

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Brian Phillip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-17

    The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO2 in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
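
    The record names a Bayesian approach with an adaptive Metropolis-Hastings sampler but gives no model details. Below is a minimal, non-adaptive random-walk Metropolis-Hastings sketch for a hypothetical lognormal model of a headspace CO2 concentration; the data, likelihood, flat prior and step size are invented for illustration only.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_iter=20_000, step=0.2, rng=None):
    """Generic random-walk Metropolis-Hastings sampler for a scalar parameter."""
    rng = np.random.default_rng(rng)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

if __name__ == "__main__":
    # Hypothetical data: log CO2 concentrations (ppm) measured in one container.
    rng = np.random.default_rng(0)
    y = np.log(rng.lognormal(mean=np.log(400.0), sigma=0.3, size=25))

    def log_post(mu):                             # normal likelihood, flat prior on mu
        return -0.5 * np.sum((y - mu) ** 2) / 0.3 ** 2

    draws = metropolis_hastings(log_post, x0=np.log(300.0), rng=1)[5_000:]
    print(f"posterior mean CO2 ~ {np.exp(draws.mean()):.0f} ppm")
```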

  13. Topology for Statistical Modeling of Petascale Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Janine Camille [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pebay, Philippe Pierre [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States); Levine, Joshua [Univ. of Utah, Salt Lake City, UT (United States); Gyulassy, Attila [Univ. of Utah, Salt Lake City, UT (United States); Rojas, Maurice [Texas A & M Univ., College Station, TX (United States)

    2014-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled "Topology for Statistical Modeling of Petascale Data", funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program.

  14. Statistical Model Checking for Biological Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2014-01-01

    Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic t...

  15. Dynamic propagation channel characterization and modeling for human body communication.

    Science.gov (United States)

    Nie, Zedong; Ma, Jingjing; Li, Zhicheng; Chen, Hong; Wang, Lei

    2012-12-18

    This paper presents the first characterization and modeling of dynamic propagation channels for human body communication (HBC). In-situ experiments were performed using customized transceivers in an anechoic chamber. Three HBC propagation channels, i.e., from right leg to left leg, from right hand to left hand and from right hand to left leg, were investigated under thirty-three motion scenarios. Snapshots of data (2,800,000) were acquired from five volunteers. Various path gains caused by different locations and movements were quantified and the statistical distributions were estimated. In general, for a given reference threshold of −10 dB, the maximum average level crossing rate of the HBC was approximately 1.99 Hz, the maximum average fade time was 59.4 ms, and the percentage of bad channel duration time was less than 4.16%. The HBC exhibited a fade depth of −4 dB at 90% complementary cumulative probability. The statistical parameters were observed to be centered for each propagation channel. Subsequently a Fritchman model was implemented to estimate the burst characteristics of the on-body fading. It was concluded that the HBC is motion-insensitive, which is sufficient for a reliable communication link during motion, and therefore it has great potential for body sensor/area networks.
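
    The level crossing rate and average fade duration quoted above can be estimated directly from a sampled path-gain trace by counting threshold crossings and the time spent below the threshold. The sketch below does this on a synthetic, slowly varying gain series standing in for a measured HBC channel; the sampling rate, correlation time and gain levels are assumptions.

```python
import numpy as np

def lcr_afd(gain_db, threshold_db, sample_rate_hz):
    """Level crossing rate (Hz) and average fade duration (s) of a sampled
    path-gain series, counting entries into fades below the threshold."""
    below = gain_db < threshold_db
    crossings = np.count_nonzero(~below[:-1] & below[1:])
    duration_s = len(gain_db) / sample_rate_hz
    lcr = crossings / duration_s
    afd = below.mean() * duration_s / max(crossings, 1)
    return lcr, afd

if __name__ == "__main__":
    # Synthetic stand-in for a measured trace: correlated Gaussian gain in dB.
    rng = np.random.default_rng(0)
    fs = 1000                                                  # samples per second
    x = rng.standard_normal(60 * fs)
    kernel = np.exp(-np.arange(0.0, 0.5, 1.0 / fs) / 0.05)     # ~50 ms memory
    gain_db = -5.0 + 4.0 * np.convolve(x, kernel / np.linalg.norm(kernel), mode="same")
    lcr, afd = lcr_afd(gain_db, threshold_db=-10.0, sample_rate_hz=fs)
    print(f"LCR ~ {lcr:.2f} Hz, AFD ~ {afd * 1e3:.1f} ms")
```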

  16. Dynamic Propagation Channel Characterization and Modeling for Human Body Communication

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2012-12-01

    This paper presents the first characterization and modeling of dynamic propagation channels for human body communication (HBC). In-situ experiments were performed using customized transceivers in an anechoic chamber. Three HBC propagation channels, i.e., from right leg to left leg, from right hand to left hand and from right hand to left leg, were investigated under thirty-three motion scenarios. Snapshots of data (2,800,000) were acquired from five volunteers. Various path gains caused by different locations and movements were quantified and the statistical distributions were estimated. In general, for a given reference threshold of −10 dB, the maximum average level crossing rate of the HBC was approximately 1.99 Hz, the maximum average fade time was 59.4 ms, and the percentage of bad channel duration time was less than 4.16%. The HBC exhibited a fade depth of −4 dB at 90% complementary cumulative probability. The statistical parameters were observed to be centered for each propagation channel. Subsequently a Fritchman model was implemented to estimate the burst characteristics of the on-body fading. It was concluded that the HBC is motion-insensitive, which is sufficient for a reliable communication link during motion, and therefore it has great potential for body sensor/area networks.

  17. Blind channel identification of nonlinear folding mixing model

    Institute of Scientific and Technical Information of China (English)

    Su Yong; Xu Shangzhi; Ye Zhongfu

    2006-01-01

    Signals from multi-sensor systems are often mixtures of (statistically) independent sources combined by an unknown mixing process. Blind source separation (BSS) and independent component analysis (ICA) are methods to identify/recover the channels and the sources. BSS/ICA of nonlinear mixing models is a difficult problem. For instance, the post-nonlinear model has been studied by several authors. It is noticeable that in most cases the proposed models assume an invertible mixing. This raises an interesting question: what happens with a non-invertible nonlinear mixing in BSS or ICA? A new simple nonlinear mixing model with a non-invertible mixing, the folding mixing, is proposed, together with a method to identify its channel blindly.

  18. Statistical modeling of space shuttle environmental data

    Science.gov (United States)

    Tubbs, J. D.; Brewer, D. W.

    1983-01-01

    Statistical models which use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine if unequal shape parameters are necessary in a bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.

  19. Statistical physical models of cellular motility

    Science.gov (United States)

    Banigan, Edward J.

    Cellular motility is required for a wide range of biological behaviors and functions, and the topic poses a number of interesting physical questions. In this work, we construct and analyze models of various aspects of cellular motility using tools and ideas from statistical physics. We begin with a Brownian dynamics model for actin-polymerization-driven motility, which is responsible for cell crawling and "rocketing" motility of pathogens. Within this model, we explore the robustness of self-diffusiophoresis, which is a general mechanism of motility. Using this mechanism, an object such as a cell catalyzes a reaction that generates a steady-state concentration gradient that propels the object in a particular direction. We then apply these ideas to a model for depolymerization-driven motility during bacterial chromosome segregation. We find that depolymerization and protein-protein binding interactions alone are sufficient to robustly pull a chromosome, even against large loads. Next, we investigate how forces and kinetics interact during eukaryotic mitosis with a many-microtubule model. Microtubules exert forces on chromosomes, but since individual microtubules grow and shrink in a force-dependent way, these forces lead to bistable collective microtubule dynamics, which provides a mechanism for chromosome oscillations and microtubule-based tension sensing. Finally, we explore kinematic aspects of cell motility in the context of the immune system. We develop quantitative methods for analyzing cell migration statistics collected during imaging experiments. We find that during chronic infection in the brain, T cells run and pause stochastically, following the statistics of a generalized Levy walk. These statistics may contribute to immune function by mimicking an evolutionarily conserved efficient search strategy. Additionally, we find that naive T cells migrating in lymph nodes also obey non-Gaussian statistics. Altogether, our work demonstrates how physical

  20. Transmission Strategies in Multiple Access Fading Channels with Statistical QoS Constraints

    CERN Document Server

    Qiao, Deli; Velipasalar, Senem

    2010-01-01

    Effective capacity, which provides the maximum constant arrival rate that a given service process can support while satisfying statistical delay constraints, is analyzed in a multiuser scenario. In particular, the effective capacity region of fading multiple access channels (MAC) in the presence of quality of service (QoS) constraints is studied. Perfect channel side information (CSI) is assumed to be available at both the transmitters and the receiver. It is initially assumed the transmitters send the information at a fixed power level and hence do not employ power control policies. Under this assumption, the performance achieved by superposition coding with successive decoding techniques is investigated. It is shown that varying the decoding order with respect to the channel states can significantly increase the achievable throughput region. In the two-user case, the optimal decoding strategy is determined for the scenario in which the users have the same QoS constraints. The performance of orthogonal trans...

  1. Performance modeling, stochastic networks, and statistical multiplexing

    CERN Document Server

    Mazumdar, Ravi R

    2013-01-01

    This monograph presents a concise mathematical approach for modeling and analyzing the performance of communication networks with the aim of introducing an appropriate mathematical framework for modeling and analysis as well as understanding the phenomenon of statistical multiplexing. The models, techniques, and results presented form the core of traffic engineering methods used to design, control and allocate resources in communication networks. The novelty of the monograph is the fresh approach and insights provided by a sample-path methodology for queueing models that highlights the importan…

  2. Information theory and statistical nuclear reactions. I. General theory and applications to few-channel problems

    Energy Technology Data Exchange (ETDEWEB)

    Mello, P.A.; Pereyra, P.; Seligman, T.H.

    1985-05-01

    Ensembles of scattering S-matrices have been used in the past to describe the statistical fluctuations exhibited by many nuclear-reaction cross sections as a function of energy. In recent years, there have been attempts to construct these ensembles explicitly in terms of S, by directly proposing a statistical law for S. In the present paper, it is shown that, for an arbitrary number of channels, one can incorporate, in the ensemble of S-matrices, the conditions of flux conservation, time-reversal invariance, causality, ergodicity, and the requirement that the ensemble average coincide with the optical scattering matrix. Since these conditions do not specify the ensemble uniquely, the ensemble that has maximum information-entropy is dealt with among those that satisfy the above requirements. Some applications to few-channel problems and comparisons to Monte-Carlo calculations are presented.

  3. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.

  4. Pitfalls in statistical landslide susceptibility modelling

    Science.gov (United States)

    Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut

    2010-05-01

    The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well-established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest to apply a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" as an inherent part of machine learning methods in order to achieve further explanatory insights into preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding processes that lead to slope failure taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate model response curve interpretation. Model quality assesses how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples or a possible…

  5. Brownsville Ship Channel Hydrodynamic Modeling

    Science.gov (United States)

    2012-01-01

    Figure caption fragment: Laguna Madre analysis locations. The record describes hydrodynamic model simulations used to assess navigation impacts and the wave resuspension and circulation of sediment in Laguna Madre, with refinement to better resolve the shallow-water habitats, including South Bay, Bahia Grande, and South Laguna Madre.

  6. Statistical Compressed Sensing of Gaussian Mixture Models

    CERN Document Server

    Yu, Guoshen

    2011-01-01

    A novel framework of compressed sensing, namely statistical compressed sensing (SCS), that aims at efficiently sampling a collection of signals that follow a statistical distribution, and achieving accurate reconstruction on average, is introduced. SCS based on Gaussian models is investigated in depth. For signals that follow a single Gaussian model, with Gaussian or Bernoulli sensing matrices of O(k) measurements, considerably smaller than the O(k log(N/k)) required by conventional CS based on sparse models, where N is the signal dimension, and with an optimal decoder implemented via linear filtering, significantly faster than the pursuit decoders applied in conventional CS, the error of SCS is shown tightly upper bounded by a constant times the best k-term approximation error, with overwhelming probability. The failure probability is also significantly smaller than that of conventional sparsity-oriented CS. Stronger yet simpler results further show that for any sensing matrix, the error of Gaussian SCS is u...
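
    For a single Gaussian signal model, the record states that the optimal SCS decoder is implemented via linear filtering. The sketch below illustrates that setting under assumed ingredients: a covariance with rapidly decaying eigenvalues, an i.i.d. Gaussian sensing matrix, and the linear MMSE (Wiener) decoder x_hat = Sigma A^T (A Sigma A^T)^{-1} y; the dimensions are arbitrary.

```python
import numpy as np

def gaussian_scs_demo(n=128, m=24, rng=None):
    """Statistical CS of a signal drawn from one Gaussian model: random Gaussian
    sensing followed by the linear MMSE (Wiener) decoder."""
    rng = np.random.default_rng(rng)
    eigvals = 1.0 / np.arange(1, n + 1) ** 2                # fast-decaying spectrum
    U = np.linalg.qr(rng.standard_normal((n, n)))[0]
    Sigma = U @ np.diag(eigvals) @ U.T                      # signal covariance
    x = U @ (np.sqrt(eigvals) * rng.standard_normal(n))     # draw x ~ N(0, Sigma)
    A = rng.standard_normal((m, n)) / np.sqrt(m)            # sensing matrix
    y = A @ x                                               # m << n measurements
    x_hat = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, y)   # linear decoder
    return np.linalg.norm(x - x_hat) / np.linalg.norm(x)

if __name__ == "__main__":
    print(f"relative reconstruction error ~ {gaussian_scs_demo(rng=0):.3f}")
```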

  7. Equilibrium statistical mechanics of lattice models

    CERN Document Server

    Lavis, David A

    2015-01-01

    Most interesting and difficult problems in equilibrium statistical mechanics concern models which exhibit phase transitions. For graduate students and more experienced researchers this book provides an invaluable reference source of approximate and exact solutions for a comprehensive range of such models. Part I contains background material on classical thermodynamics and statistical mechanics, together with a classification and survey of lattice models. The geometry of phase transitions is described and scaling theory is used to introduce critical exponents and scaling laws. An introduction is given to finite-size scaling, conformal invariance and Schramm—Loewner evolution. Part II contains accounts of classical mean-field methods. The parallels between Landau expansions and catastrophe theory are discussed and Ginzburg—Landau theory is introduced. The extension of mean-field theory to higher-orders is explored using the Kikuchi—Hijmans—De Boer hierarchy of approximations. In Part III the use of alge...

  8. Statistical shape and appearance models of bones.

    Science.gov (United States)

    Sarkalkan, Nazli; Weinans, Harrie; Zadpoor, Amir A

    2014-03-01

    When applied to bones, statistical shape models (SSM) and statistical appearance models (SAM) respectively describe the mean shape and mean density distribution of bones within a certain population as well as the main modes of variations of shape and density distribution from their mean values. The availability of this quantitative information regarding the detailed anatomy of bones provides new opportunities for diagnosis, evaluation, and treatment of skeletal diseases. The potential of SSM and SAM has been recently recognized within the bone research community. For example, these models have been applied for studying the effects of bone shape on the etiology of osteoarthritis, improving the accuracy of clinical osteoporotic fracture prediction techniques, design of orthopedic implants, and surgery planning. This paper reviews the main concepts, methods, and applications of SSM and SAM as applied to bone.

  9. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  10. NUMERICAL MODELING OF COMPOUND CHANNEL FLOWS

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A numerical model capable of predicting flow characteristics in a compound channel was established with the 3-D steady continuity and momentum equations along with the transport equations for turbulence kinetic energy and dissipation rate. Closure was achieved with the aid of algebraic relations for turbulent shear stresses. The above equations were discretized with an implicit difference approach and solved with a step method along the flow direction. The computational results, showing the lateral distribution of vertical average velocities and the ratio of total flow in the compound channel, agree well with the available experimental data.

  11. Statistics, Computation, and Modeling in Cosmology

    Science.gov (United States)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground and space based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams. The group will use subsampling approaches and fractional factorial designs to statistically and…

  12. Does the choice of the forcing term affect flow statistics in DNS of turbulent channel flow?

    CERN Document Server

    Quadrio, Maurizio; Hasegawa, Yosuke

    2015-01-01

    We seek possible statistical consequences of the way a forcing term is added to the Navier-Stokes equations in the Direct Numerical Simulation (DNS) of incompressible channel flow. Simulations driven by constant flow rate, constant pressure gradient and constant power input are used to build large databases, and in particular to store the complete temporal trace of the wall-shear stress for later analysis. As these approaches correspond to different dynamical systems, it can in principle be envisaged that these differences are reflected in certain statistics of the turbulent flow field. The instantaneous realizations of the flow in the various simulations are obviously different, but, as expected, the usual one-point, one-time statistics do not show any appreciable difference. However, the PDF for the fluctuations of the streamwise component of wall friction reveals that the simulation with constant flow rate presents lower probabilities for extreme events of large positive friction. The low probability value …

  13. Statistical Language Model for Chinese Text Proofreading

    Institute of Scientific and Technical Information of China (English)

    张仰森; 曹元大

    2003-01-01

    Statistical language modeling techniques are investigated so as to construct a language model for Chinese text proofreading. After the defects of the n-gram model are analyzed, a novel statistical language model for Chinese text proofreading is proposed. This model takes full account of the information located before and after the target word wi, and of the relationship between un-neighboring words wi and wj in the linguistic environment (LE). First, the word association degree between wi and wj is defined using a distance-weighted factor, where wj is l words apart from wi in the LE; then the Bayes formula is used to calculate the LE-related degree of word wi; and lastly, the LE-related degree is taken as the criterion to predict the reasonableness of word wi appearing in its context. Comparing the proposed model with the traditional n-gram in a Chinese text automatic error detection system, the experimental results show that the error detection recall rate and precision rate of the system have been improved.

  14. Modeling the ion channel structure of cecropin.

    OpenAIRE

    Durell, S R; Raghunathan, G.; Guy, H R

    1992-01-01

    Atomic-scale computer models were developed for how cecropin peptides may assemble in membranes to form two types of ion channels. The models are based on experimental data and physiochemical principles. Initially, cecropin peptides, in a helix-bend-helix motif, were arranged as antiparallel dimers to position conserved residues of adjacent monomers in contact. The dimers were postulated to bind to the membrane with the NH2-terminal helices sunken into the head-group layer and the COOH-termin...

  15. A Noisy-Channel Model for Document Compression

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present a document compression system that uses a hierarchical noisy-channel model of text production. Our compression system first automatically derives the syntactic structure of each sentence and the overall discourse structure of the text given as input. The system then uses a statistical hierarchical model of text production in order to drop non-important syntactic and discourse constituents so as to generate coherent, grammatical document compressions of arbitrary length. The system outperforms both a baseline and a sentence-based compression system that operates by simplifying sequentially all sentences in a text. Our results support the claim that discourse knowledge plays an important role in document summarization.

  16. Statistical mechanics of Monod-Wyman-Changeux (MWC) models.

    Science.gov (United States)

    Marzen, Sarah; Garcia, Hernan G; Phillips, Rob

    2013-05-13

    The 50th anniversary of the classic Monod-Wyman-Changeux (MWC) model provides an opportunity to survey the broader conceptual and quantitative implications of this quintessential biophysical model. With the use of statistical mechanics, the mathematical implementation of the MWC concept links problems that seem otherwise to have no ostensible biological connection including ligand-receptor binding, ligand-gated ion channels, chemotaxis, chromatin structure and gene regulation. Hence, a thorough mathematical analysis of the MWC model can illuminate the performance limits of a number of unrelated biological systems in one stroke. The goal of our review is twofold. First, we describe in detail the general physical principles that are used to derive the activity of MWC molecules as a function of their regulatory ligands. Second, we illustrate the power of ideas from information theory and dynamical systems for quantifying how well the output of MWC molecules tracks their sensory input, giving a sense of the "design" constraints faced by these receptors.
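
    A standard way to write the MWC activity discussed above is as the statistical weight of the active conformation relative to both conformations, p_active(c) = (1 + c/K_A)^n / [(1 + c/K_A)^n + L(1 + c/K_I)^n]. The sketch below evaluates this expression; the number of binding sites n, the allosteric constant L and the dissociation constants K_A, K_I are illustrative values, not parameters from the review.

```python
import numpy as np

def mwc_activity(c, n=2, L=1000.0, K_A=1.0, K_I=100.0):
    """Probability that an MWC molecule with n identical binding sites is in its
    active conformation at ligand concentration c (same units as K_A and K_I).
    L is the inactive/active equilibrium weight with no ligand bound."""
    active = (1.0 + c / K_A) ** n
    inactive = L * (1.0 + c / K_I) ** n
    return active / (active + inactive)

if __name__ == "__main__":
    # Tighter binding in the active state (K_A < K_I) makes the ligand an activator.
    for c in (0.0, 0.1, 1.0, 10.0, 100.0, 1000.0):
        print(f"c = {c:7.1f}  p_active = {mwc_activity(c):.3f}")
```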

  17. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

    The interannual variability of the sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is widely discussed by the scientific community, with results that fail to be satisfactory owing to the difficulty dynamical models have in reproducing the behavior of the Inter Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim of complementing and enhancing the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond the application of the model to the prediction of rainfall in West Africa, its use extends to a range of oceanic, atmospheric and health-related parameters influenced by the sea surface temperature as a defining factor of variability.

  18. A survey of statistical network models

    CERN Document Server

    Goldenberg, Anna; Fienberg, Stephen E; Airoldi, Edoardo M

    2009-01-01

    Networks are ubiquitous in science and have become a focal point for discussion in everyday life. Formal statistical models for the analysis of network data have emerged as a major topic of interest in diverse areas of study, and most of these involve a form of graphical representation. Probability models on graphs date back to 1959. Along with empirical studies in social psychology and sociology from the 1960s, these early works generated an active network community and a substantial literature in the 1970s. This effort moved into the statistical literature in the late 1970s and 1980s, and the past decade has seen a burgeoning network literature in statistical physics and computer science. The growth of the World Wide Web and the emergence of online networking communities such as Facebook, MySpace, and LinkedIn, and a host of more specialized professional network communities has intensified interest in the study of networks and network data. Our goal in this review is to provide the reader with an entry poin...

  19. A unified framework for the statistical characterization of the SNR of amplify-and-forward multihop channels

    KAUST Repository

    Yilmaz, Ferkan

    2010-01-01

    In this paper, we present a unified approach to analyze the exact statistical characteristics of the harmonic mean of N ≥ 2 statistically independent and non-identically distributed random variables (RVs), which we term the N-normalized harmonic distribution (i.e., NHD distribution), for the purpose of modeling amplify-and-forward multihop relay channels. We present exact statistical metrics for the moment-generating function (MGF), moments (Mellin moment-generating function), probability density function (PDF) and cumulative distribution function (CDF) of the NHD distribution. Aside from unifying past results based on the geometric-mean approximation of the harmonic mean, our approach relies on the algebraic combination of Mellin and Laplace transforms to obtain exact single-integral expressions which can be easily computed using the Gauss-Laguerre quadrature rule or can be readily expressed in terms of the multivariable Meijer's G or Fox's H functions. Numerical and simulation results, performed to verify the correctness of the proposed formulation, are in perfect agreement. The proposed formulation can be used to analyze performance measures of amplify-and-forward multihop relay channels such as outage probability, outage capacity, average capacity and average bit error probabilities. © 2009 IEEE.
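
    The exact NHD expressions in the record require Meijer G or Fox H machinery, but the quantity they characterize can be checked by brute force: the end-to-end SNR of a CSI-assisted amplify-and-forward multihop link is commonly bounded by the harmonic-type combination gamma_end = 1 / sum_i(1/gamma_i). The Monte Carlo sketch below uses that bound with i.i.d. Rayleigh-faded hops; the hop count, mean SNR and outage threshold are assumptions.

```python
import numpy as np

def multihop_af_outage(n_hops=3, mean_snr_db=15.0, threshold_db=5.0,
                       n_trials=200_000, rng=None):
    """Monte Carlo outage probability of an N-hop amplify-and-forward link using
    the end-to-end SNR bound gamma_end = 1 / sum(1/gamma_i) with Rayleigh hops."""
    rng = np.random.default_rng(rng)
    mean_snr = 10.0 ** (mean_snr_db / 10.0)
    gamma_th = 10.0 ** (threshold_db / 10.0)
    gammas = rng.exponential(mean_snr, size=(n_trials, n_hops))  # per-hop SNRs
    gamma_end = 1.0 / np.sum(1.0 / gammas, axis=1)
    return float(np.mean(gamma_end < gamma_th))

if __name__ == "__main__":
    for hops in (2, 3, 4):
        print(f"{hops} hops: outage ~ {multihop_af_outage(n_hops=hops, rng=0):.3f}")
```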

  20. Statistical modelling for falls count data.

    Science.gov (United States)

    Ullah, Shahid; Finch, Caroline F; Day, Lesley

    2010-03-01

    Falls and their injury outcomes have count distributions that are highly skewed toward the right with clumping at zero, posing analytical challenges. Different modelling approaches have been used in the published literature to describe falls count distributions, often without consideration of the underlying statistical and modelling assumptions. This paper compares the use of modified Poisson and negative binomial (NB) models as alternatives to Poisson (P) regression, for the analysis of fall outcome counts. Four different count-based regression models (P, NB, zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB)) were each individually fitted to four separate fall count datasets from Australia, New Zealand and United States. The finite mixtures of P and NB regression models were also compared to the standard NB model. Both analytical (F, Vuong and bootstrap tests) and graphical approaches were used to select and compare models. Simulation studies assessed the size and power of each model fit. This study confirms that falls count distributions are over-dispersed, but not dispersed due to excess zero counts or heterogeneous population. Accordingly, the P model generally provided the poorest fit to all datasets. The fit improved significantly with NB and both zero-inflated models. The fit was also improved with the NB model, compared to finite mixtures of both P and NB regression models. Although there was little difference in fit between NB and ZINB models, in the interests of parsimony it is recommended that future studies involving modelling of falls count data routinely use the NB models in preference to the P or ZINB or finite mixture distribution. The fact that these conclusions apply across four separate datasets from four different samples of older people participating in studies of different methodology, adds strength to this general guiding principle.
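
    The Poisson versus negative binomial comparison above can be reproduced on simulated over-dispersed counts with statsmodels, as sketched below; the covariate, sample size and dispersion are invented, and the zero-inflated variants (also shipped by recent statsmodels releases) are omitted for brevity.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical stand-in for falls counts: over-dispersed, zero-heavy data.
rng = np.random.default_rng(0)
n_obs = 500
age = rng.uniform(65, 95, n_obs)
mu = np.exp(-4.0 + 0.05 * age)            # mean falls count increasing with age
alpha = 1.5                               # overdispersion parameter (NB2)
y = rng.negative_binomial(n=1.0 / alpha, p=1.0 / (1.0 + alpha * mu))

X = sm.add_constant(age)
poisson_fit = sm.Poisson(y, X).fit(disp=False)
negbin_fit = sm.NegativeBinomial(y, X).fit(disp=False)

print(f"Poisson AIC = {poisson_fit.aic:8.1f}")
print(f"NegBin  AIC = {negbin_fit.aic:8.1f}   (lower AIC indicates the better fit)")
```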

  1. Fluctuations of offshore wind generation: Statistical modelling

    DEFF Research Database (Denmark)

    Pinson, Pierre; Christensen, Lasse E.A.; Madsen, Henrik

    2007-01-01

    The magnitude of power fluctuations at large offshore wind farms has a significant impact on the control and management strategies of their power output. If focusing on the minute scale, one observes successive periods with smaller and larger power fluctuations. It seems that different regimes yield different behaviours of the wind power output. This paper concentrates on the statistical modelling of offshore power fluctuations, with particular emphasis on regime-switching models. More precisely, Self-Exciting Threshold AutoRegressive (SETAR), Smooth Transition AutoRegressive (STAR) and Markov-Switching AutoRegressive (MSAR) models are considered. The particularities of these models are presented, as well as methods for the estimation of their parameters. Simulation results are given for the case of the Horns Rev and Nysted offshore wind farms in Denmark, for time-series of power…

  2. Discrete channel modelling based on genetic algorithm and simulated annealing for training hidden Markov model

    Institute of Scientific and Technical Information of China (English)

    Zhao Zhi-Jin; Zheng Shi-Lian; Xu Chun-Yun; Kong Xian-Zheng

    2007-01-01

    Hidden Markov models (HMMs) have been used to model burst error sources of wireless channels. This paper proposes a hybrid method of using genetic algorithm (GA) and simulated annealing (SA) to train HMM for discrete channel modelling. The proposed method is compared with pure GA, and experimental results show that the HMMs trained by the hybrid method can better describe the error sequences due to SA's ability of facilitating hill-climbing at the later stage of the search. The burst error statistics of the HMMs trained by the proposed method and the corresponding error sequences are also presented to validate the proposed method.
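
    The hidden Markov channel models being trained in this record describe bursty bit errors; the classic two-state (Gilbert-Elliott) special case is easy to simulate and makes the roles of the state-transition and per-state error probabilities concrete. The sketch below only generates such a burst-error sequence; all probabilities are illustrative, and the GA/SA training itself is not shown.

```python
import numpy as np

def gilbert_elliott_errors(n_bits, p_gb=0.002, p_bg=0.05,
                           e_good=1e-4, e_bad=0.2, rng=None):
    """Simulate a burst-error sequence from a two-state hidden Markov channel
    with 'good' and 'bad' states having different bit-error probabilities."""
    rng = np.random.default_rng(rng)
    errors = np.empty(n_bits, dtype=np.uint8)
    bad = False
    for i in range(n_bits):
        bad = rng.random() < (1.0 - p_bg if bad else p_gb)   # state transition
        errors[i] = rng.random() < (e_bad if bad else e_good)
    return errors

if __name__ == "__main__":
    e = gilbert_elliott_errors(200_000, rng=0)
    print(f"overall bit error rate ~ {e.mean():.4f}")
```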

  3. Statistical Modelling of the Soil Dielectric Constant

    Science.gov (United States)

    Usowicz, Boguslaw; Marczewski, Wojciech; Bogdan Usowicz, Jerzy; Lipiec, Jerzy

    2010-05-01

    The dielectric constant of soil is a physical property that is highly sensitive to water content. It underlies several electrical measurement techniques for determining water content, both direct (TDR, FDR, and others based on effects of electrical conductance and/or capacitance) and indirect RS (Remote Sensing) methods. This work is devoted to a statistical approach to modelling the dielectric constant as a property that accounts for a wide range of soil compositions, porosities and mass densities over the unsaturated range of water content. Similar models are usually fitted to a few particular soil types, so that changing the soil type requires switching to another model or adjusting it by re-parametrizing the soil constituents; this makes it difficult to compare and transfer results between models. The presented model was developed for a generic representation of soil as a hypothetical mixture of spheres, each representing a soil fraction in its proper phase state. The model generates a serial-parallel mesh of conductive and capacitive paths, which is analysed for its total conductive or capacitive property. The model was first developed to determine thermal conductivity, and is now extended to the dielectric constant by analysing the capacitive mesh. The analysis proceeds by statistical means obeying the physical laws of serial-parallel branching of the representative electrical mesh. Physical relevance of the analysis is established electrically, but the definition of the electrical mesh is controlled statistically by the parametrization of compound fractions, by the number of representative spheres per unit volume per fraction, and by the number of fractions. In this way the model can cover the properties of nearly all soil types and phase states, within recognition of the Lorenz and Knudsen conditions. In effect, the model allows generating a hypothetical representative of…

  4. Modelling debris flows down general channels

    Directory of Open Access Journals (Sweden)

    S. P. Pudasaini

    2005-01-01

    This paper is an extension of the single-phase cohesionless dry granular avalanche model over curved and twisted channels proposed by Pudasaini and Hutter (2003). It is a generalisation of the Savage and Hutter (1989, 1991) equations based on simple channel topography to a two-phase fluid-solid mixture of debris material. Important terms emerging from the correct treatment of the kinematic and dynamic boundary condition, and the variable basal topography are systematically taken into account. For vanishing fluid contribution and torsion-free channel topography our new model equations exactly degenerate to the previous Savage-Hutter model equations, while such a degeneration was not possible with the Iverson and Denlinger (2001) model, which, in fact, also aimed to extend the Savage and Hutter model. The model equations of this paper have been rigorously derived; they include the effects of the curvature and torsion of the topography, generally for arbitrarily curved and twisted channels of variable channel width. The equations are put into a standard conservative form of partial differential equations. From these one can easily infer the importance and influence of the pore-fluid-pressure distribution in debris flow dynamics. The solid phase is modelled by applying a Coulomb dry friction law, whereas the fluid phase is assumed to be an incompressible Newtonian fluid. Input parameters of the equations are the internal and bed friction angles of the solid particles, the viscosity and volume fraction of the fluid, the total mixture density and the pore pressure distribution of the fluid at the bed. Given the bed topography and initial geometry and the initial velocity profile of the debris mixture, the model equations are able to describe the dynamics of the depth profile and bed-parallel depth-averaged velocity distribution from the initial position to the final deposit. A shock-capturing, total variation diminishing numerical scheme is implemented to…

  5. A Wideband Channel Model for Intravehicular Nomadic Systems

    Directory of Open Access Journals (Sweden)

    François Bellens

    2011-01-01

    The increase in electronic entertainment equipment within vehicles has given rise to the idea of replacing the wired links with intra-vehicle personal area networks. Ultra-wideband (UWB) seems an appropriate candidate technology to meet the data rates required for interconnecting such devices. In particular, multiband OFDM (MB-OFDM) is able to provide very high transfer rates (up to 480 Mbps) over relatively short distances and at low transmit power. In order to evaluate the performance of UWB systems within vehicles, a reliable channel model is needed. In this paper, a nomadic system where a base station placed in the center of the dashboard communicates with fixed devices placed at the rear seat is investigated. A single-input single-output (SISO) channel model for intra-vehicular communication (IVC) systems is proposed, based on reverberation chamber theory. The model is based on measurements conducted in real traffic conditions, with a varying number of passengers in the car. Temporal variations of the wireless channels are also characterized and parametrized. The proposed model is validated by comparing model-independent statistics with the measurements.

  6. A Dynamic Wideband Directional Channel Model for Vehicle-to-Vehicle Communications

    OpenAIRE

    He, Ruisi; Renaudin, Olivier; Kolmonen, Veli-Matti; Haneda, Katsuyuki; Zhong, Zhangdui; Ai, Bo; Oestges, Claude

    2015-01-01

    Vehicle-to-vehicle (V2V) communications have received a lot of attention due to their numerous applications in traffic safety. The design, testing, and improvement of the V2V system hinge critically on the understanding of the propagation channels. An important feature of the V2V channel is the time variance. To statistically model the time-variant V2V channels, a dynamic wideband directional channel model is proposed in this paper, based on measurements conducted at 5.3 GHz in suburban, urba...

  7. Wireless Fading Channel Models: From Classical to Stochastic Differential Equations

    Energy Technology Data Exchange (ETDEWEB)

    Olama, Mohammed M [ORNL; Djouadi, Seddik M [ORNL; Charalambous, Prof. Charalambos [University of Cyprus

    2010-01-01

    The wireless communications channel constitutes the basic physical link between the transmitter and the receiver antennas. Its modeling has been and continues to be a tantalizing issue, while being one of the most fundamental components based on which transmitters and receivers are designed and optimized. The ultimate performance limits of any communication system are determined by the channel it operates in. Realistic channel models are thus of utmost importance for system design and testing. In addition to exponential power path-loss, wireless channels suffer from stochastic short term fading (STF) due to multipath, and stochastic long term fading (LTF) due to shadowing depending on the geographical area. STF corresponds to severe signal envelope fluctuations, and occurs in densely built-up areas filled with lots of objects like buildings, vehicles, etc. On the other hand, LTF corresponds to less severe mean signal envelope fluctuations, and occurs in sparsely populated or suburban areas. In general, LTF and STF are considered as superimposed and may be treated separately. Ossanna was the pioneer to characterize the statistical properties of the signal received by a mobile user, in terms of interference of incident and reflected waves. His model was better suited for describing fading occurring mainly in suburban areas (LTF environments). It is described by the average power loss due to distance and power loss due to reflection of signals from surfaces, which when measured in dB's give rise to normal distributions, and this implies that the channel attenuation coefficient is log-normally distributed. Furthermore, in mobile communications, the LTF channel models are also characterized by their special correlation characteristics which have been reported. Clarke introduced the first comprehensive scattering model describing STF occurring mainly in urban areas. An easy way to simulate Clarke's model using a computer simulation is described. This model was
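
    Clarke's isotropic scattering model mentioned above is usually simulated as a sum of sinusoids with random angles of arrival and phases, which yields a complex gain whose envelope is approximately Rayleigh distributed. A minimal sketch of such a simulation is given below; the Doppler frequency, sampling rate and number of paths are assumptions.

```python
import numpy as np

def clarke_fading(n_samples, f_doppler, f_sample, n_paths=64, rng=None):
    """Complex flat-fading gain following Clarke's 2-D isotropic scattering model,
    simulated as a sum of sinusoids with random arrival angles and phases."""
    rng = np.random.default_rng(rng)
    t = np.arange(n_samples) / f_sample
    theta = rng.uniform(0.0, 2.0 * np.pi, n_paths)     # angles of arrival
    phi = rng.uniform(0.0, 2.0 * np.pi, n_paths)       # initial phases
    doppler = 2.0 * np.pi * f_doppler * np.cos(theta)  # per-path Doppler shifts
    g = np.exp(1j * (np.outer(t, doppler) + phi)).sum(axis=1) / np.sqrt(n_paths)
    return g                                           # |g| is approximately Rayleigh

if __name__ == "__main__":
    g = clarke_fading(n_samples=50_000, f_doppler=30.0, f_sample=1000.0, rng=0)
    print(f"mean power ~ {np.mean(np.abs(g) ** 2):.2f} (should be close to 1)")
```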

  8. Statistical Decision-Tree Models for Parsing

    CERN Document Server

    Magerman, D M

    1995-01-01

    Syntactic natural language parsers have shown themselves to be inadequate for processing highly-ambiguous large-vocabulary text, as is evidenced by their poor performance on domains like the Wall Street Journal, and by the movement away from parsing-based approaches to text-processing in general. In this paper, I describe SPATTER, a statistical parser based on decision-tree learning techniques which constructs a complete parse for every sentence and achieves accuracy rates far better than any published result. This work is based on the following premises: (1) grammars are too complex and detailed to develop manually for most interesting domains; (2) parsing models must rely heavily on lexical and contextual information to analyze sentences accurately; and (3) existing {$n$}-gram modeling techniques are inadequate for parsing models. In experiments comparing SPATTER with IBM's computer manuals parser, SPATTER significantly outperforms the grammar-based parser. Evaluating SPATTER against the Penn Treebank Wall ...

  9. Statistical Model Checking for Product Lines

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2016-01-01

    We report on the suitability of statistical model checking for the analysis of quantitative properties of product line models by an extended treatment of earlier work by the authors. The type of analysis that can be performed includes the likelihood of specific product behaviour, the expected average cost of products (in terms of the attributes of the products’ features) and the probability of features to be (un)installed at runtime. The product lines must be modelled in QFLan, which extends the probabilistic feature-oriented language PFLan with novel quantitative constraints among features and on behaviour and with advanced feature installation options. QFLan is a rich process-algebraic specification language whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and probabilistic…

  10. Electronic noise modeling in statistical iterative reconstruction.

    Science.gov (United States)

    Xu, Jingyan; Tsui, Benjamin M W

    2009-06-01

    We consider electronic noise modeling in tomographic image reconstruction when the measured signal is the sum of a Gaussian distributed electronic noise component and another random variable whose log-likelihood function satisfies a certain linearity condition. Examples of such likelihood functions include the Poisson distribution and an exponential dispersion (ED) model that can approximate the signal statistics in integration mode X-ray detectors. We formulate the image reconstruction problem as a maximum-likelihood estimation problem. Using an expectation-maximization approach, we demonstrate that a reconstruction algorithm can be obtained following a simple substitution rule from the one previously derived without electronic noise considerations. To illustrate the applicability of the substitution rule, we present examples of a fully iterative reconstruction algorithm and a sinogram smoothing algorithm both in transmission CT reconstruction when the measured signal contains additive electronic noise. Our simulation studies show the potential usefulness of accurate electronic noise modeling in low-dose CT applications.

  11. Statistical model with a standard Γ distribution

    Science.gov (United States)

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-07-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ . We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ . Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ) , where particles exchange energy in a space with an effective dimension D(λ) .
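
    The exchange process described above can be simulated with a simple pairwise rule: two agents pool the non-saved part of their holdings and split it randomly, each keeping the fraction λ of what they had. The stationary distribution is then well fitted by a Gamma distribution whose shape parameter is commonly quoted as n(λ) = 1 + 3λ/(1 − λ), i.e. an effective dimension D(λ) = 2n(λ). The sketch below checks this with a method-of-moments estimate; the population size and number of exchanges are assumptions.

```python
import numpy as np

def kinetic_exchange(n_agents=1000, n_steps=500_000, lam=0.5, rng=None):
    """Pairwise random exchange with saving propensity lam; returns the final
    holdings. The stationary distribution is close to a Gamma distribution with
    shape 1 + 3*lam/(1 - lam)."""
    rng = np.random.default_rng(rng)
    x = np.ones(n_agents)
    for _ in range(n_steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        eps = rng.random()
        pool = (1.0 - lam) * (x[i] + x[j])             # part put up for exchange
        x[i], x[j] = lam * x[i] + eps * pool, lam * x[j] + (1.0 - eps) * pool
    return x

if __name__ == "__main__":
    lam = 0.5
    x = kinetic_exchange(lam=lam, rng=0)
    shape_mom = np.mean(x) ** 2 / np.var(x)            # method-of-moments shape
    print(f"empirical shape ~ {shape_mom:.2f},  1 + 3*lam/(1-lam) = {1 + 3 * lam / (1 - lam):.2f}")
```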

  12. Statistical model with a standard Gamma distribution

    Science.gov (United States)

    Chakraborti, Anirban; Patriarca, Marco

    2005-03-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T (λ), where particles exchange energy in a space with an effective dimension D (λ).

  13. Challenges in Dental Statistics: Data and Modelling

    Directory of Open Access Journals (Sweden)

    Domenica Matranga

    2013-03-01

    The aim of this work is to present the reflections and proposals derived from the first Workshop of the SISMEC STATDENT working group on statistical methods and applications in dentistry, held in Ancona (Italy) on 28th September 2011. STATDENT began as a forum of comparison and discussion for statisticians working in the field of dental research in order to suggest new and improve existing biostatistical and clinical epidemiological methods. During the meeting, we dealt with very important topics of statistical methodology for the analysis of dental data, covering the analysis of hierarchically structured and over-dispersed data, the issue of calibration and reproducibility, as well as some problems related to survey methodology, such as the design and construction of unbiased statistical indicators and of well conducted clinical trials. This paper gathers some of the methodological topics discussed during the meeting, concerning multilevel and zero-inflated models for the analysis of caries data and methods for the training and calibration of raters in dental epidemiology.

  14. Statistical Analysis of Multipath Fading Channels Using Generalizations of Shot Noise

    Directory of Open Access Journals (Sweden)

    Djouadi Seddik M.

    2008-01-01

    This paper provides a connection between the shot-noise analysis of Rice and the statistical analysis of multipath fading wireless channels when the received signals are a low-pass signal and a bandpass signal. Under certain conditions, explicit expressions are obtained for autocorrelation functions, power spectral densities, and moment-generating functions. In addition, a central limit theorem is derived identifying the mean and covariance of the received signals, which is a generalization of Campbell's theorem. The results are easily applicable to transmitted signals which are random and to CDMA signals.

  15. Effects of roughness on density-weighted particle statistics in turbulent channel flows

    Energy Technology Data Exchange (ETDEWEB)

    Milici, Barbara [Faculty of Engineering and Architecture, Cittadella Universitaria - 94100 - Enna (Italy)

    2015-12-31

    The distribution of inertial particles in turbulent flows is strongly influenced by the characteristics of the coherent turbulent structures which develop in the carrier flow field. In wall-bounded flows, these turbulent structures, which control the turbulent regeneration cycles, are strongly affected by the roughness of the wall; nevertheless, its effects on particle transport in two-phase turbulent flows have still been poorly investigated. The issue is addressed here by means of DNS combined with Lagrangian particle tracking (LPT) to obtain statistics of velocity and preferential accumulation of a dilute dispersion of heavy particles in a turbulent channel flow bounded by irregular two-dimensional rough surfaces, in the one-way coupling regime.

  16. A Statistical Model of Skewed Associativity

    OpenAIRE

    Michaud, Pierre

    2002-01-01

    This paper presents a statistical model of set-associativity, victim caching and skewed-associativity, with an emphasis on skewed-associativity. We show that set-associativity is not efficient when the working-set size is close to the cache size. We refer to this as the unit working-set problem. We show that victim-caching is not a practical solution to the unit working-set problem either, although victim caching emulates full associativity for working-sets much larger than the victim buffe…

  17. Projecting Policy Effects with Statistical Models

    Directory of Open Access Journals (Sweden)

    Christopher Sims

    1988-03-01

    This paper attempts to briefly discuss the current frontiers in quantitative modeling for forecasting and policy analysis. It does so by summarizing some recent developments in three areas: reduced-form forecasting models; theoretical models including elements of stochastic optimization; and identification. In the process, the paper tries to provide some remarks on the direction we seem to be headed.

  18. Modelling of meander migration in an incised channel

    Institute of Scientific and Technical Information of China (English)

    Jianchun HUANG; Blair P GREIMANN; Timothy J RANDLE

    2014-01-01

    An updated linear computer model for meandering rivers with incision has been developed. The model simulates the bed topography, flow field, and bank erosion rate in an incised meandering channel. In a scenario where the upstream sediment load decreases (e.g., after dam closure or soil conservation), an alluvial river experiences cross-section deepening and slope flattening. The channel migration rate might then be affected in two ways: by the decreased channel slope and by the steepened bank height. The proposed numerical model combines the traditional one-dimensional (1D) sediment transport model for simulating channel erosion with the linear model for channel meandering. A non-equilibrium sediment transport model is used to update the channel bed elevation and gradations. A linear meandering model is used to calculate the channel alignment and bank erosion/accretion, which in turn are used by the 1D sediment transport model. In the 1D sediment transport model, the channel bed elevation and gradations are represented at each channel cross section. In the meandering model, the bed elevation and gradations are stored in two-dimensional (2D) cells to represent the channel and terrain properties (elevation and gradation). A new method is proposed to exchange information regarding bed elevations and bed material fractions between the 1D river geometry and the 2D channel and terrain. The ability of the model is demonstrated by simulating the laboratory channel migration experiments of Friedkin, in which channel incision occurs at the upstream end.

  19. Statistical pairwise interaction model of stock market

    Science.gov (United States)

    Bury, Thomas

    2013-03-01

    Financial markets are a classical example of complex systems as they are composed of many interacting stocks. As such, we can obtain a surprisingly good description of their structure by making the rough simplification of binary daily returns. Spin glass models have been applied and gave some valuable results, but at the price of restrictive assumptions on the market dynamics, or else agent-based models were used with rules designed to recover some empirical behaviors. Here we show that the pairwise model is actually a statistically consistent model with respect to the observed first and second moments of the stock orientations, without making such restrictive assumptions. This is done with an approach based solely on empirical data of price returns. Our data analysis of six major indices suggests that the actual interaction structure may be thought of as an Ising model on a complex network with interaction strengths scaling as the inverse of the system size. This has potentially important implications since many properties of such a model are already known and some techniques of spin glass theory can be straightforwardly applied. Typical behaviors, such as multiple equilibria or metastable states, different characteristic time scales, spatial patterns, or order-disorder, could find an explanation in this picture.
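
    A quick way to see how a pairwise (Ising-type) model can be matched to the first and second moments of binarized returns is the naive mean-field inversion, in which the couplings are approximated by minus the inverse covariance matrix and the fields follow from the magnetizations. The sketch below applies this to synthetic correlated returns; it is only a rough stand-in for the inference used on the six indices, and the data are invented.

```python
import numpy as np

def fit_pairwise_mean_field(spins):
    """Naive mean-field fit of a pairwise (Ising) model to +/-1 data of shape
    (n_samples, n_stocks); returns couplings J (zero diagonal) and fields h."""
    m = spins.mean(axis=0)
    C = np.cov(spins, rowvar=False)
    J = -np.linalg.inv(C)                      # mean-field coupling estimate
    np.fill_diagonal(J, 0.0)
    h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m
    return J, h

if __name__ == "__main__":
    # Hypothetical daily binary returns of 5 stocks driven by one market mode.
    rng = np.random.default_rng(0)
    common = rng.standard_normal((2500, 1))
    returns = 0.6 * common + rng.standard_normal((2500, 5))
    spins = np.where(returns > 0, 1.0, -1.0)
    J, h = fit_pairwise_mean_field(spins)
    print("estimated couplings J:")
    print(np.round(J, 3))
```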

  20. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
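
    A minimal sketch of the statistical machinery named in the abstract: a two-sample Kolmogorov-Smirnov test comparing an observed quantity across 15 faults with the predictions of one candidate viscoelastic model. The synthetic "observed" and "predicted" samples, and what they stand for, are assumptions; only the testing procedure itself mirrors the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative stand-ins: a slip-rate-related quantity observed on 15 faults,
# and the same quantity predicted by one candidate earthquake cycle model.
observed = rng.normal(loc=1.0, scale=0.15, size=15)
predicted = rng.normal(loc=1.2, scale=0.10, size=15)

# Two-sample Kolmogorov-Smirnov test for similarity of distributions.
statistic, p_value = stats.ks_2samp(observed, predicted)
reject = p_value < 0.05   # reject the model at significance level alpha = 0.05
print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}, reject model: {reject}")
```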

  1. Statistical Mechanical Models of Integer Factorization Problem

    Science.gov (United States)

    Nakajima, Chihiro H.; Ohzeki, Masayuki

    2017-01-01

    We formulate the integer factorization problem via a formulation of the searching problem for the ground state of a statistical mechanical Hamiltonian. The first passage time required to find a correct divisor of a composite number signifies the exponential computational hardness. The analysis of the density of states of two macroscopic quantities, i.e., the energy and the Hamming distance from the correct solutions, leads to the conclusion that the ground state (correct solution) is completely isolated from the other low-energy states, with the distance being proportional to the system size. In addition, the profile of the microcanonical entropy of the model has two peculiar features that are each related to two marked changes in the energy region sampled via Monte Carlo simulation or simulated annealing. Hence, we find a peculiar first-order phase transition in our model.

  2. Statistical model semiquantitatively approximates arabinoxylooligosaccharides' structural diversity

    DEFF Research Database (Denmark)

    Dotsenko, Gleb; Nielsen, Michael Krogsgaard; Lange, Lene

    2016-01-01

    A statistical model describing the random distribution of substituted xylopyranosyl residues in arabinoxylooligosaccharides is suggested and compared with existing experimental data. Structural diversity of arabinoxylooligosaccharides of various length, originating from different arabinoxylans...... (wheat flour arabinoxylan (arabinose/xylose, A/X = 0.47); grass arabinoxylan (A/X = 0.24); wheat straw arabinoxylan (A/X = 0.15); and hydrothermally pretreated wheat straw arabinoxylan (A/X = 0.05)), is semiquantitatively approximated using the proposed model. The suggested approach can be applied...... not only for prediction and quantification of arabinoxylooligosaccharides' structural diversity, but also for estimating the yield and selecting the optimal source of arabinoxylan for production of arabinoxylooligosaccharides with desired structural features....

  3. An Adaptive Channel Estimation Algorithm Using Time-Frequency Polynomial Model for OFDM with Fading Multipath Channels

    Directory of Open Access Journals (Sweden)

    Liu KJ Ray

    2002-01-01

    Full Text Available Orthogonal frequency division multiplexing (OFDM) is an effective technique for future 3G communications because of its great immunity to impulse noise and intersymbol interference. Channel estimation is a crucial aspect of the design of OFDM systems. In this work, we propose a channel estimation algorithm based on a time-frequency polynomial model of the fading multipath channels. The algorithm exploits the correlation of the channel responses in both the time and frequency domains and hence reduces more noise than methods using only a time or frequency polynomial model. The estimator is also more robust compared to existing methods based on the Fourier transform. The simulation shows an improvement in terms of mean-squared estimation error under some practical channel conditions. The algorithm needs little prior knowledge about the delay and fading properties of the channel. It can be implemented recursively and can adjust itself to follow the variation of the channel statistics.
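
    The sketch below illustrates the general idea of a time-frequency polynomial channel estimator: noisy per-subcarrier, per-symbol channel estimates are smoothed by a least-squares fit to a low-order 2-D polynomial. The grid sizes, polynomial orders, and noise level are illustrative assumptions, not the algorithm's actual design parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative grid: 8 OFDM symbols (time) x 32 subcarriers (frequency).
n_sym, n_sub = 8, 32
t = np.linspace(-1, 1, n_sym)[:, None]     # normalized time index
f = np.linspace(-1, 1, n_sub)[None, :]     # normalized subcarrier index

# "True" smoothly varying channel plus noisy least-squares estimates.
H_true = (1.0 + 0.4 * t + 0.2 * f - 0.3 * t * f + 0.1 * f**2).astype(complex)
H_ls = H_true + 0.2 * (rng.standard_normal(H_true.shape)
                       + 1j * rng.standard_normal(H_true.shape))

# Low-order time-frequency polynomial basis (orders chosen as an assumption).
T, F = np.broadcast_arrays(t, f)
basis = np.stack([np.ones_like(T), T, F, T * F, T**2, F**2], axis=-1)
A = basis.reshape(-1, basis.shape[-1])

# Least-squares fit of the polynomial model to the noisy estimates,
# then reconstruction of the smoothed channel over the whole grid.
coeffs, *_ = np.linalg.lstsq(A, H_ls.ravel(), rcond=None)
H_poly = (A @ coeffs).reshape(n_sym, n_sub)

mse = lambda H: np.mean(np.abs(H - H_true) ** 2)
print(f"raw LS MSE: {mse(H_ls):.4f}, polynomial-smoothed MSE: {mse(H_poly):.4f}")
```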

  4. Multi-channels statistical and morphological features based mitosis detection in breast cancer histopathology.

    Science.gov (United States)

    Irshad, Humayun; Roux, Ludovic; Racoceanu, Daniel

    2013-01-01

    Accurate counting of mitosis in breast cancer histopathology plays a critical role in the grading process. Manual counting of mitosis is tedious and subject to considerable inter- and intra-reader variations. This work aims at improving the accuracy of mitosis detection by selecting the color channels that better capture the statistical and morphological features that discriminate mitosis from other objects. The proposed framework includes a comprehensive analysis of first- and second-order statistical features together with morphological features in selected color channels, and a study on balancing the skewed dataset using the SMOTE method to increase the predictive accuracy of mitosis classification. The proposed framework was evaluated on the MITOS data set during the ICPR 2012 contest and ranked second among 17 finalists, achieving a 74% detection rate, 70% precision and 72% F-measure. In future work, we plan to apply our mitosis detection tool to images produced by different types of slide scanners, including multi-spectral and multi-focal microscopy.
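
    A minimal sketch of the dataset-balancing step mentioned in the abstract: a hand-rolled SMOTE-style oversampler that interpolates synthetic minority (mitosis) samples between nearest minority neighbours before training a classifier. The synthetic features, the neighbour count, and the random-forest classifier are assumptions used only to make the example self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Illustrative skewed dataset: few "mitosis" samples (label 1) described by
# statistical/morphological features, many non-mitosis samples (label 0).
X_maj = rng.normal(0.0, 1.0, size=(500, 8))
X_min = rng.normal(1.5, 1.0, size=(30, 8))
X = np.vstack([X_maj, X_min])
y = np.array([0] * len(X_maj) + [1] * len(X_min))

def smote_like(X_minority, n_new, k=5, rng=rng):
    """Minimal SMOTE-style oversampling: each synthetic sample is interpolated
    between a minority point and one of its k nearest minority neighbours."""
    d = np.linalg.norm(X_minority[:, None] - X_minority[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_minority))
        j = rng.choice(neighbours[i])
        lam = rng.random()
        synthetic.append(X_minority[i] + lam * (X_minority[j] - X_minority[i]))
    return np.array(synthetic)

# Balance the classes before training the mitosis/non-mitosis classifier.
X_new = smote_like(X_min, n_new=len(X_maj) - len(X_min))
X_bal = np.vstack([X, X_new])
y_bal = np.concatenate([y, np.ones(len(X_new), dtype=int)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_bal, y_bal)
print("training class balance after oversampling:", np.bincount(y_bal))
```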

  5. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
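
    A minimal sketch of the combination rule in step (v), assuming illustrative probability rasters; the values and the constant zonal probability are placeholders, not data from the Taiwanese study area.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative per-pixel probability rasters for a small catchment.
shape = (100, 100)
p_release = rng.uniform(0.0, 0.2, size=shape)   # pixel-based release probability
p_impact = rng.uniform(0.0, 1.0, size=shape)    # probability of being impacted
p_zonal = 0.6                                   # zonal release probability of the
                                                # relevant zone (assumed constant here)

# Integrated spatial landslide probability, following the combination rule in
# the abstract: the maximum of the release probability and the product of the
# impact probability and the zonal release probability.
p_integrated = np.maximum(p_release, p_impact * p_zonal)

print("mean integrated probability:", round(float(p_integrated.mean()), 3))
```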

  6. Physics-based statistical model and simulation method of RF propagation in urban environments

    Science.gov (United States)

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  7. ZERODUR strength modeling with Weibull statistical distributions

    Science.gov (United States)

    Hartmann, Peter

    2016-07-01

    The decisive influence on the breakage strength of brittle materials such as the low expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process; here only the depths of the micro cracks are relevant. In any case, the presence and depths of micro cracks are statistical by nature. The Weibull distribution is the model used traditionally for the representation of such data sets. It is based on the weakest link ansatz. The use of the two- or three-parameter Weibull distribution for data representation and reliability prediction depends on the underlying crack generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking from their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. It is not to be expected that these scratches follow the same statistical distribution as the grinding process. Hence, their description with the same distribution parameters is not adequate. Before including them, a dedicated discussion should be performed. If there is additional information available influencing the selection of the model, for example the existence of a maximum crack depth, this should be taken into account also. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained with such surfaces the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution. A differentiation based on the data set alone without preexisting information is possible but requires a
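
    A minimal sketch of fitting two- and three-parameter Weibull distributions to breakage-stress data with scipy, where the location parameter of the three-parameter fit plays the role of the threshold stress discussed above. The synthetic stresses and all parameter values are assumptions.

```python
from scipy import stats

# Illustrative breakage-stress data (MPa) for ground specimens; a threshold
# stress of 40 MPa is built into the synthetic sample.
stresses = 40.0 + stats.weibull_min.rvs(c=2.5, scale=60.0, size=200,
                                        random_state=5)

# Two-parameter Weibull: location fixed at zero (no threshold stress).
c2, loc2, scale2 = stats.weibull_min.fit(stresses, floc=0.0)

# Three-parameter Weibull: the fitted location parameter plays the role of
# the threshold breakage stress discussed in the abstract.
c3, loc3, scale3 = stats.weibull_min.fit(stresses)

print(f"2-parameter fit: shape={c2:.2f}, scale={scale2:.1f}")
print(f"3-parameter fit: shape={c3:.2f}, threshold={loc3:.1f}, scale={scale3:.1f}")
```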

  8. Attacking and Defending Covert Channels and Behavioral Models

    CERN Document Server

    Crespi, Valentino; Giani, Annarita

    2011-01-01

    In this paper we present methods for attacking and defending $k$-gram statistical analysis techniques that are used, for example, in network traffic analysis and covert channel detection. The main new result is our demonstration of how to use a behavior's or process' $k$-order statistics to build a stochastic process that has those same $k$-order stationary statistics but possesses different, deliberately designed, $(k+1)$-order statistics if desired. Such a model realizes a "complexification" of the process or behavior which a defender can use to monitor whether an attacker is shaping the behavior. By deliberately introducing designed $(k+1)$-order behaviors, the defender can check to see if those behaviors are present in the data. We also develop constructs for source codes that respect the $k$-order statistics of a process while encoding covert information. One fundamental consequence of these results is that certain types of behavior analysis techniques come down to an arms race in the sense that th...

  9. Physical and Statistical Modeling of Saturn's Troposphere

    Science.gov (United States)

    Yanamandra-Fisher, Padmavati A.; Braverman, Amy J.; Orton, Glenn S.

    2002-12-01

    The 5.2-μm atmospheric window on Saturn is dominated by thermal radiation and weak gaseous absorption, with a 20% contribution from sunlight reflected from clouds. The striking variability displayed by Saturn's clouds at 5.2 μm and the detection of PH3 (an atmospheric tracer) variability near or below the 2-bar level and possibly at lower pressures provide salient constraints on the dynamical organization of Saturn's atmosphere by constraining the strength of vertical motions at two levels across the disk. We analyse the 5.2-μm spectra of Saturn by utilising two independent methods: (a) physical models based on the relevant atmospheric parameters and (b) statistical analysis, based on principal components analysis (PCA), to determine the influence of the variation of phosphine and the opacity of clouds deep within Saturn's atmosphere to understand the dynamics in its atmosphere.

  10. Wireless multi-antenna channels modeling and simulation

    CERN Document Server

    Primak, Serguei

    2011-01-01

    This book offers a practical guide on how to use and apply channel models for system evaluation In this book, the authors focus on modeling and simulation of multiple antennas channels, including multiple input multiple output (MIMO) communication channels, and the impact of such models on channel estimation and system performance. Both narrowband and wideband models are addressed. Furthermore, the book covers topics related to modeling of MIMO channel, their numerical simulation, estimation and prediction, as well as applications to receive diversity, capacity and space-time c

  11. New advances in statistical modeling and applications

    CERN Document Server

    Santos, Rui; Oliveira, Maria; Paulino, Carlos

    2014-01-01

    This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

  12. Global precedence, spatial frequency channels, and the statistics of natural images.

    Science.gov (United States)

    Hughes, H C; Nozawa, G; Kitterle, F

    1996-01-01

    that assume the channels are independent. In view of previous work showing that global precedence depends upon the low frequency content of the stimuli, we suggest that low spatial frequencies represent the sine qua non for the dominance of configurational cues in human pattern perception, and that this configurational dominance reflects the microgenesis of visual pattern perception. This general view of the temporal dynamics of visual pattern recognition is discussed, is considered from an evolutionary perspective, and is related to certain statistical regularities in natural scenes. Potential adaptive advantages of an interactive parallel architecture that confers an initial processing advantage to low resolution information are explored.

  13. Description of interacting channel gating using a stochastic Markovian model.

    Science.gov (United States)

    Manivannan, K; Mathias, R T; Gudowska-Nowak, E

    1996-01-01

    Single-channel recordings from membrane patches frequently exhibit multiple conductance levels. In some preparations, the steady-state probabilities of observing these levels do not follow a binomial distribution. This behavior has been reported in sodium channels, potassium channels, acetylcholine receptor channels and gap junction channels. A non-binomial distribution suggests interaction of the channels or the presence of channels with different open probabilities. However, the current trace sometimes exhibits single transitions spanning several levels. Since the probability of simultaneous transitions of independent channels is infinitesimally small, such observations strongly suggest a cooperative gating behavior. We present a Markov model to describe the cooperative gating of channels using only the all-points current amplitude histograms for the probability of observing the various conductance levels. We investigate the steady-state (or equilibrium) properties of a system of N channels and provide a scheme to express all the probabilities in terms of just two parameters. The main feature of our model is that lateral interaction of channels gives rise to cooperative gating. Another useful feature is the introduction of the language of graph theory, which can potentially provide a different avenue to study ion channel kinetics. We write down explicit expressions for systems of two, three and four channels and provide a procedure to describe the system of N channels.
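
    A minimal sketch of the kind of Markov description the abstract refers to, here for just two identical two-state channels whose opening rate is scaled by a coupling factor once one channel is open. The rate values and this particular coupling scheme are assumptions; the point is how a stationary distribution over conductance levels follows from a generator matrix.

```python
import numpy as np

# Two interacting two-state channels, aggregated by the number of open
# channels (0, 1 or 2). A coupling factor gamma > 1 makes a second opening
# more likely once one channel is already open, which is one simple way to
# encode cooperative gating (not the specific parameterization of the paper).
alpha, beta, gamma = 2.0, 1.0, 5.0    # opening rate, closing rate, coupling

# Generator (rate) matrix Q over the aggregated states {0, 1, 2 open}.
Q = np.array([
    [-2 * alpha,               2 * alpha,            0.0],
    [      beta, -(beta + gamma * alpha),  gamma * alpha],
    [       0.0,                2 * beta,       -2 * beta],
])

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("P(0 open), P(1 open), P(2 open):", np.round(pi, 3))
# With gamma = 1 the occupancies are binomial; gamma != 1 reproduces the
# non-binomial level statistics discussed in the abstract.
```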

  14. Ion channel stability and hydrogen bonding. Molecular modelling of channels formed by synthetic alamethicin analogues.

    Science.gov (United States)

    Breed, J; Kerr, I D; Molle, G; Duclohier, H; Sansom, M S

    1997-12-04

    Several analogues of the channel-forming peptaibol alamethicin have been demonstrated to exhibit faster switching between channel substates than does unmodified alamethicin. Molecular modelling studies are used to explore the possible molecular basis of these differences. Models of channels formed by alamethicin analogues were generated by restrained molecular dynamics in vacuo and refined by short molecular dynamics simulations with water molecules within and at either mouth of the channel. A decrease in backbone solvation was found to correlate with a decrease in open channel stability between alamethicin and an analogue in which all alpha-amino-isobutyric acid residues of alamethicin were replaced by leucine. A decrease in the extent of hydrogen-bonding at residue 7 correlates with lower open channel stabilities of analogues in which the glutamine at position 7 was replaced by smaller polar sidechains. These two observations indicate the importance of alamethicin/water H-bonds in stabilizing the open channel.

  15. Statistically steady measurements of Rayleigh-Taylor mixing in a gas channel

    Science.gov (United States)

    Banerjee, Arindam

    A novel gas channel experiment was constructed to study the development of high Atwood number Rayleigh-Taylor mixing. Two gas streams, one containing air and the other containing a helium-air mixture, flow parallel to each other separated by a thin splitter plate. The streams meet at the end of the splitter plate, leading to the formation of an unstable interface and of buoyancy driven mixing. This buoyancy driven mixing experiment allows for long data collection times and short transients, and is statistically steady. The facility was designed to be capable of large Atwood number studies of At ~ 0.75. We describe work to measure the self-similar evolution of mixing at density differences corresponding to Atwood numbers starting at 0.035, using hot-wire anemometry and high resolution digital image analysis. The hot-wire probe gives velocity, density and velocity-density statistics of the mixing layer. Two different multi-position single-wire techniques were used to measure the velocity fluctuations in three mutually perpendicular directions. Analysis of the measured data was used to explain the mixing as it develops to a self-similar regime in this flow. These measurements are, to our knowledge, the first use of hot-wire anemometry in the Rayleigh-Taylor community. Since the measurement involved extensive calibration of the probes in a binary gas mixture of air and helium, a new convective heat transfer correlation was formulated to account for variable-density low Reynolds number flows past a heated cylinder. In addition to the hot-wire measurements, a digital image analysis procedure was used to characterize various properties of the flow and also to validate the hot-wire measurements. A test of statistical convergence was performed, and the study revealed that the statistical convergence was a direct consequence of the number of different large three-dimensional structures that were averaged over the duration of the run.

  16. Modelling earthquake interaction and seismicity statistics

    Science.gov (United States)

    Steacy, S.; Hetherington, A.

    2009-04-01

    The effects of earthquake interaction and fault complexity on seismicity statistics are investigated in a 3D model composed of a number of cellular automata (each representing an individual fault) distributed in a volume. Each automaton is assigned a fractal distribution of strength. Failure occurs when the 3D Coulomb stress on any cell exceeds its strength and stress transfer during simulated earthquake rupture is via nearest-neighbor rules formulated to give realistic stress concentrations. An event continues until all neighboring cells whose stresses exceed their strengths have ruptured and the size of the event is determined from its area and stress drop. Long-range stress interactions are computed following the termination of simulated ruptures using a boundary element code. In practice, these stress perturbations are only computed for events above a certain size (e.g. a threshold length of 10 km) and stresses are updated on nearby structures. Events which occur as a result of these stress interactions are considered to be "triggered" earthquakes and they, in turn, can trigger further seismic activity. The threshold length for computing interaction stresses is a free parameter and hence interaction can be "turned off" by setting this to an unrealistically high value. We consider 3 synthetic fault networks of increasing degrees of complexity - modelled on the North Anatolian fault system, the structures in the San Francisco Bay Area, and the Southern California fault network. We find that the effect of interaction is dramatically different in networks of differing complexity. In the North Anatolian analogue, for example, interaction leads to a decreased number of events, increased b-values, and an increase in recurrence intervals. In the Bay Area model, by contrast, we observe that interaction increases the number of events, decreases the b-values, and has little effect on recurrence intervals. For all networks, we find that interaction can activate mis

  17. Pathway Model and Nonextensive Statistical Mechanics

    Science.gov (United States)

    Mathai, A. M.; Haubold, H. J.; Tsallis, C.

    2015-12-01

    The established technique of eliminating upper or lower parameters in a general hypergeometric series is profitably exploited to create pathways among confluent hypergeometric functions, binomial functions, Bessel functions, and exponential series. One such pathway, from the mathematical statistics point of view, results in distributions which naturally emerge within nonextensive statistical mechanics and Beck-Cohen superstatistics, as pursued in generalizations of Boltzmann-Gibbs statistics.

  18. Statistical Ensemble Theory of Gompertz Growth Model

    Directory of Open Access Journals (Sweden)

    Takuya Yamano

    2009-11-01

    Full Text Available An ensemble formulation for the Gompertz growth function within the framework of statistical mechanics is presented, where the two growth parameters are assumed to be statistically distributed. The growth can be viewed as a self-referential process, which enables us to use the Bose-Einstein statistics picture. The analytical entropy expression pertaining to the law can be obtained in terms of the growth velocity distribution as well as the Gompertz function itself for the whole process.

  19. Mathematical Modeling on Open Limestone Channel

    CERN Document Server

    Bandstra, Joel; Wu, Naiyi

    2014-01-01

    Acid mine drainage (AMD) is the outflow of acidic water from metal mines or coal mines. When exposed to air and water, metal sulfides from the deposits of the mines are oxidized and produce acid, metal ions and sulfate, which lower the pH value of the water. An open limestone channel (OLC) is a passive and low cost way to neutralize AMD. The dissolution of calcium into the water increases the pH value of the solution. A differential equation model is numerically solved to predict the variation of the concentration of each species in the OLC solution. The diffusion of calcium due to iron precipitates is modeled by a linear equation. The results give the variation of the pH value and the concentration of calcium.
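
    A minimal one-species sketch in the spirit of the model described above: calcium concentration relaxing toward saturation along the channel, with a linear attenuation of the dissolution rate standing in for the effect of iron precipitates. All rate constants, the saturation value, and the functional form are assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed constants for the illustration only.
k_diss = 0.8        # 1/h, bare-limestone dissolution rate constant
c_sat = 2.0e-3      # mol/L, calcium saturation concentration
coating = 0.05      # 1/h, linear attenuation of the rate by Fe precipitates

def rhs(t, y):
    """dCa/dt: first-order approach to saturation with a rate that decays
    linearly in time as the precipitate coating grows."""
    ca = y[0]
    k_eff = k_diss * max(0.0, 1.0 - coating * t)
    return [k_eff * (c_sat - ca)]

sol = solve_ivp(rhs, t_span=(0.0, 10.0), y0=[0.0], dense_output=True)
t = np.linspace(0.0, 10.0, 6)
print("Ca (mol/L) at t =", t, ":", np.round(sol.sol(t)[0], 5))
```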

  20. Rician Channel Modeling for Multiprobe Anechoic Chamber Setups

    DEFF Research Database (Denmark)

    Fan, Wei; Kyösti, Pekka; Hentilä, Lassi;

    2014-01-01

    This paper discusses over the air (OTA) testing for multiple input multiple output (MIMO) capable terminals, with emphasis on modeling Rician channels in multi-probe anechoic chamber setups. A technique to model Rician channels is proposed. The line-of-sight (LOS) component, with an arb...

  1. Propagation channel characterization, parameter estimation, and modeling for wireless communications

    CERN Document Server

    Yin, Xuefeng

    2016-01-01

    Thoroughly covering channel characteristics and parameters, this book provides the knowledge needed to design various wireless systems, such as cellular communication systems, RFID and ad hoc wireless communication systems. It gives a detailed introduction to aspects of channels before presenting the novel estimation and modelling techniques which can be used to achieve accurate models. To systematically guide readers through the topic, the book is organised in three distinct parts. The first part covers the fundamentals of the characterization of propagation channels, including the conventional single-input single-output (SISO) propagation channel characterization as well as its extension to multiple-input multiple-output (MIMO) cases. Part two focuses on channel measurements and channel data post-processing. Wideband channel measurements are introduced, including the equipment, technology and advantages and disadvantages of different data acquisition schemes. The channel parameter estimation methods are ...

  2. Statistical Model Checking of Rich Models and Properties

    DEFF Research Database (Denmark)

    Poulsen, Danny Bøgsted

    Software is in increasing fashion embedded within safety- and business critical processes of society. Errors in these embedded systems can lead to human casualties or severe monetary loss. Model checking technology has proven formal methods capable of finding and correcting errors in software. However, software is approaching the boundary in terms of the complexity and size that model checking can handle. Furthermore, software systems are nowadays more frequently interacting with their environment, hence accurately modelling such systems requires modelling the environment as well - resulting in undecidability issues for the traditional model checking approaches. Statistical model checking has proven itself a valuable supplement to model checking and this thesis is concerned with extending this software validation technique to stochastic hybrid systems. The thesis consists of two parts: the first part...

  3. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  4. Channel Modelling for Multiprobe Over-the-Air MIMO Testing

    Directory of Open Access Journals (Sweden)

    Pekka Kyösti

    2012-01-01

    a fading emulator, an anechoic chamber, and multiple probes. Creation of a propagation environment inside an anechoic chamber requires unconventional radio channel modelling, namely, a specific mapping of the original models onto the probe antennas. We introduce two novel methods to generate fading emulator channel coefficients; the prefaded signals synthesis and the plane wave synthesis. To verify both methods we present a set of simulation results. We also show that the geometric description is a prerequisite for the original channel model.

  5. A model for the distribution channels planning process

    NARCIS (Netherlands)

    Neves, M.F.; Zuurbier, P.; Campomar, M.C.

    2001-01-01

    Research of existing literature reveals some models (sequence of steps) for companies that want to plan distribution channels. None of these models uses strong contributions from transaction cost economics, bringing a possibility to elaborate on a "distribution channels planning model", with these c

  6. Dividing Streamline Formation Channel Confluences by Physical Modeling

    Directory of Open Access Journals (Sweden)

    Minarni Nur Trilita

    2010-02-01

    Full Text Available Confluence channels are often found in open channel network systems and are their most important element. The flow entering from a branch channel into the main channel causes various flow patterns and vortex flow. This phenomenon can cause erosion of the channel side walls, scour of the channel bed, and sedimentation in the downstream confluence channel. Controlling these problems requires research into the width of the flow entering from the branch channel. The flow entering from the branch channel into the main channel is bounded by a dividing streamline. In this paper, the width of the dividing streamline was observed in the laboratory using a physical model of two open channels of square cross section forming an angle of 30º. Observations were made for a variety of flows coming from each channel. The laboratory observations show that the width of the dividing streamline is influenced by the discharge ratio between the branch channel and the main channel, while comparison with previous studies shows that the laboratory observations are smaller than the results of previous research.

  7. Integer Set Compression and Statistical Modeling

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where...... enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, explore the effects of permuting the enumeration order based on element probabilities...

  8. Cooperative Transmission for Relay Networks Based on Second-Order Statistics of Channel State Information

    Science.gov (United States)

    Li, Jiangyuan; Petropulu, Athina P.; Poor, H. Vincent

    2011-03-01

    Cooperative beamforming in relay networks is considered, in which a source transmits to its destination with the help of a set of cooperating nodes. The source first transmits locally. The cooperating nodes that receive the source signal retransmit a weighted version of it in an amplify-and-forward (AF) fashion. Assuming knowledge of the second-order statistics of the channel state information, beamforming weights are determined so that the signal-to-noise ratio (SNR) at the destination is maximized subject to two different power constraints, i.e., a total (source and relay) power constraint, and individual relay power constraints. For the former constraint, the original problem is transformed into a problem of one variable, which can be solved via Newton's method. For the latter constraint, the original problem is transformed into a homogeneous quadratically constrained quadratic programming (QCQP) problem. In this case, it is shown that when the number of relays does not exceed three the global solution can always be constructed via semidefinite programming (SDP) relaxation and the matrix rank-one decomposition technique. For the cases in which the SDP relaxation does not generate a rank one solution, two methods are proposed to solve the problem: the first one is based on the coordinate descent method, and the second one transforms the QCQP problem into an infinity norm maximization problem in which a smooth finite norm approximation can lead to the solution using the augmented Lagrangian method.
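
    A minimal sketch of the SDP-relaxation step for the individual-relay-power case, using cvxpy: the rank-one constraint of the homogeneous QCQP is dropped, the relaxed SDP is solved, and the weights are recovered from the dominant eigenvector when the solution is (numerically) rank one. The matrices standing in for the second-order channel statistics are random placeholders, and the objective here maximizes a signal-power quadratic form under per-relay power constraints as a simplified surrogate for the paper's SNR objective.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(6)
n = 3   # number of relays (the abstract notes the relaxation is tight for <= 3)

# Illustrative stand-ins for second-order statistics: R is a Hermitian PSD
# "signal" matrix, D_i select each relay's transmit power.
A_ = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
R = A_ @ A_.conj().T
D = [np.diag(e) for e in np.eye(n)]
p_max = np.ones(n)                     # per-relay power budgets

# SDP relaxation of the homogeneous QCQP: replace w w^H by X >= 0, drop rank(X) = 1.
X = cp.Variable((n, n), hermitian=True)
constraints = [X >> 0] + [cp.real(cp.trace(D[i] @ X)) <= p_max[i] for i in range(n)]
problem = cp.Problem(cp.Maximize(cp.real(cp.trace(R @ X))), constraints)
problem.solve()

# If X is (numerically) rank one, the beamforming weights are its dominant
# eigenvector scaled by the square root of the dominant eigenvalue.
eigval, eigvec = np.linalg.eigh(X.value)
w = np.sqrt(eigval[-1]) * eigvec[:, -1]
print("eigenvalues of X:", np.round(eigval, 4))
print("recovered relay powers |w_i|^2:", np.round(np.abs(w) ** 2, 3))
```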

  9. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar

  10. Enhanced surrogate models for statistical design exploiting space mapping technology

    DEFF Research Database (Denmark)

    Koziel, Slawek; Bandler, John W.; Mohamed, Achmed S.;

    2005-01-01

    We present advances in microwave and RF device modeling exploiting Space Mapping (SM) technology. We propose new SM modeling formulations utilizing input mappings, output mappings, frequency scaling and quadratic approximations. Our aim is to enhance circuit models for statistical analysis...

  11. Comparison of Statistical Models for Regional Crop Trial Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Qun-yuan; KONG Fan-ling

    2002-01-01

    Based on a review and comparison of the main statistical analysis models for estimating variety-environment cell means in regional crop trials, a new statistical model, the LR-PCA composite model, was proposed, and the predictive precision of these models was compared by cross validation on example data. Results showed that the order of model precision was LR-PCA model > AMMI model > PCA model > Treatment Means (TM) model > Linear Regression (LR) model > Additive Main Effects ANOVA model. The precision gain factor of the LR-PCA model was 1.55, an increase of 8.4% compared with AMMI.

  12. Dynamical Properties of Potassium Ion Channels with a Hierarchical Model

    Institute of Scientific and Technical Information of China (English)

    ZHAN Yong; AN Hai-Long; YU Hui; ZHANG Su-Hua; HAN Ying-Rong

    2006-01-01

    It is well known that potassium ion channels have high permeability to K ions, and the permeation rate of a single K ion channel is about 10^8 ions per second. We develop a hierarchical model of potassium ion channel permeation involving ab initio quantum calculations and Brownian dynamics simulations, which can consistently explain a range of channel dynamics. The results show that the average velocity of K ions, the mean permeation time of K ions and the permeation rate of a single channel are about 0.92 nm/ns, 4.35 ns and 2.30×10^8 ions/s, respectively.

  13. Seismic imaging and evaluation of channels modeled by boolean approach

    Energy Technology Data Exchange (ETDEWEB)

    Spinola, M.; Aggio, A. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas

    1999-07-01

    The seismic method attempts to image the subsurface architecture and can contribute significantly to detecting areal and vertical changes in rock properties. This work presents a seismic imaging study of channel objects generated using the boolean technique. Three channels having different thicknesses were simulated, using the same width, sinuosity and direction. A velocity model was constructed in order to allow seismic contrasts between the interior of the channels and the embedding rock. To examine the seismic response for different channel thicknesses, a 3D ray tracing with a normal incident point survey was performed. The three channels were resolved and the way the seismic could image them was studied. (author)

  14. Statistical Compressive Sensing of Gaussian Mixture Models

    CERN Document Server

    Yu, Guoshen

    2010-01-01

    A new framework of compressive sensing (CS), namely statistical compressive sensing (SCS), that aims at efficiently sampling a collection of signals that follow a statistical distribution and achieving accurate reconstruction on average, is introduced. For signals following a Gaussian distribution, with Gaussian or Bernoulli sensing matrices of O(k) measurements, considerably smaller than the O(k log(N/k)) required by conventional CS, where N is the signal dimension, and with an optimal decoder implemented with linear filtering, significantly faster than the pursuit decoders applied in conventional CS, the error of SCS is shown tightly upper bounded by a constant times the k-best term approximation error, with overwhelming probability. The failure probability is also significantly smaller than that of conventional CS. Stronger yet simpler results further show that for any sensing matrix, the error of Gaussian SCS is upper bounded by a constant times the k-best term approximation with probability one, and the ...
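
    A minimal sketch of the linear decoder at the heart of Gaussian SCS: a Wiener/MMSE filter reconstructing a Gaussian signal from a small number of random measurements. The covariance construction, dimensions, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative setup: signals drawn from a known Gaussian distribution, sensed
# with a random Gaussian matrix of only M rows, and decoded with the linear
# filter that SCS uses in place of pursuit algorithms.
N, M = 128, 24
U = np.linalg.qr(rng.standard_normal((N, N)))[0]
eig = 1.0 / (np.arange(N) + 1.0) ** 2          # fast-decaying spectrum
Sigma = U @ np.diag(eig) @ U.T                 # signal covariance
mu = np.zeros(N)
sigma_noise = 1e-3

x = rng.multivariate_normal(mu, Sigma)         # signal to recover
A = rng.standard_normal((M, N)) / np.sqrt(M)   # Gaussian sensing matrix
y = A @ x + sigma_noise * rng.standard_normal(M)

# Linear MMSE decoder: x_hat = mu + Sigma A^T (A Sigma A^T + s^2 I)^{-1} (y - A mu)
G = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T + sigma_noise**2 * np.eye(M))
x_hat = mu + G @ (y - A @ mu)

print("relative reconstruction error:",
      np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```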

  15. Statistics-based investigation on typhoon transition modeling

    DEFF Research Database (Denmark)

    Zhang, Shuoyun; Nishijima, Kazuyoshi

    The present study revisits the statistical modeling of typhoon transition. The objective of the study is to provide insights on plausible statistical typhoon transition models based on extensive statistical analysis. First, the correlation structures of the typhoon transition are estimated in terms...... and the seasonality are taken into account by developing the models for different spatial grids and seasons separately. An appropriate size of spatial grids is investigated. The statistical characteristics of the random residual terms in the models are also examined. Finally, Monte Carlo simulations are performed......

  16. Modeling the morphogenesis of brine channels in sea ice

    CERN Document Server

    Kutschan, B; Gemming, S

    2009-01-01

    Brine channels are formed in sea ice under certain constraints and represent a habitat of different microorganisms. The complex system depends on a number of quantities such as salinity, density, pH-value or temperature, each of which governs the process of brine channel formation. There exists a strong link between bulk salinity and the presence of brine drainage channels in growing ice with respect to both the horizontal and vertical planes. We develop a suitable phenomenological model for the formation of brine channels, referring both to the Ginzburg-Landau theory of phase transitions and to the chemical basis of morphogenesis according to Turing. It is possible to infer the size of the structure and the critical parameters from the critical wavenumber. The theoretically deduced transition rates have the same magnitude as the experimental values. The model creates channels of similar size as observed experimentally. An extension of the model towards channels with different sizes is possible...
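
    A minimal sketch of the linear-stability calculation that yields a critical wavenumber, and hence a structure size, for a generic two-component Turing-type system; the Jacobian entries and diffusivities are illustrative numbers, not the parameters of the brine-channel model itself.

```python
import numpy as np

# Generic activator-inhibitor system u_t = f(u,v) + D_u u_xx, v_t = g(u,v) + D_v v_xx,
# linearized around a fixed point. The numbers below satisfy the Turing
# conditions but are assumptions chosen only for illustration.
J = np.array([[0.5, -1.0],
              [1.0, -1.2]])        # Jacobian of the reaction terms
D = np.diag([0.01, 0.25])          # diffusion constants (inhibitor diffuses faster)

# Dispersion relation: growth rate = largest real eigenvalue of J - k^2 D.
k = np.linspace(0.0, 20.0, 2000)
growth = np.array([np.max(np.linalg.eigvals(J - kk**2 * D).real) for kk in k])

k_c = k[np.argmax(growth)]         # fastest-growing wavenumber
print(f"critical wavenumber ~ {k_c:.2f}, "
      f"characteristic structure size ~ {2 * np.pi / k_c:.2f} (model units)")
```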

  17. A 3D Geometry-based Stochastic Model for 5G Massive MIMO Channels

    Directory of Open Access Journals (Sweden)

    Yi Xie

    2015-09-01

    Full Text Available Massive MIMO is one of the most promising technologies for the fifth generation (5G) mobile communication systems. In order to better assess the system performance, it is essential to build an accurate corresponding channel model. In this paper, a three-dimensional (3D) two-cylinder regular-shaped geometry-based stochastic model (GBSM) for non-isotropic scattering massive MIMO channels is proposed. Based on a geometric method, all the scatterers are distributed on the surface of a cylinder as equivalent scatterers. The non-stationary property is captured by giving each antenna its own visible area of scatterers, defined using a virtual sphere. The proposed channel model is evaluated by comparing it with the 3GPP 3D channel model [1]. The statistical properties are investigated. Simulation results show that close agreement is achieved between the characteristics of the proposed channel model and those of the 3GPP channel model, which justifies the correctness of the proposed model. The model has advantages such as good applicability.

  18. Markov modeling of ion channels: implications for understanding disease.

    Science.gov (United States)

    Lampert, Angelika; Korngreen, Alon

    2014-01-01

    Ion channels are the bridge between the biochemical and electrical domains of our life. These membrane crossing proteins use the electric energy stored in transmembrane ion gradients, which are produced by biochemical activity to generate ionic currents. Each ion channel can be imagined as a small power plant similar to a hydroelectric power station, in which potential energy is converted into electric current. This current drives basically all physiological mechanisms of our body. It is clear that a functional blueprint of these amazing cellular power plants is essential for understanding the principle of all aspects of physiology, particularly neurophysiology. The golden path toward this blueprint starts with the biophysical investigation of ion channel activity and continues through detailed numerical modeling of these channels that will eventually lead to a full system-level description of cellular and organ physiology. Here, we discuss the first two stages of this process focusing on voltage-gated channels, particularly the voltage-gated sodium channel which is neurologically and pathologically important. We first detail the correlations between the known structure of the channel and its activity and describe some pathologies. We then provide a hands-on description of Markov modeling for voltage-gated channels. These two sections of the chapter highlight the dichotomy between the vast amounts of electrophysiological data available on voltage-gated channels and the relatively meager number of physiologically relevant models for these channels.

  19. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  20. Effective Relaying in Two-user Interference Channel with Different Models of Channel Output Feedback

    CERN Document Server

    Sahai, Achaleshwar; Yuksel, Melda; Sabharwal, Ashutosh

    2011-01-01

    In this paper, we study the impact of channel output feedback architectures on the capacity of two-user interference channel. For a two-user interference channel, a feedback link can exist between receivers and transmitters in 9 canonical architectures, ranging from only one feedback link to four-feedback links. We derive exact capacity region for the deterministic interference channel and constant-gap capacity region for the Gaussian interference channel for all but two of the 9 architectures (or models). We find that the sum-capacity in deterministic interference channel with only one feedback link, from any one receiver to its own transmitter, is identical to the interference channel with four feedback links; for the Gaussian model, the gap is bounded for all channel gains. However, one feedback link is not sufficient to achieve the whole capacity region of four feedback links. To achieve the full capacity region requires at least two feedback links. To prove the results, we derive several new outer bounds...

  1. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  2. 12th Workshop on Stochastic Models, Statistics and Their Applications

    CERN Document Server

    Rafajłowicz, Ewaryst; Szajowski, Krzysztof

    2015-01-01

    This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.

  3. Statistical Tests for Mixed Linear Models

    CERN Document Server

    Khuri, André I; Sinha, Bimal K

    2011-01-01

    An advanced discussion of linear models with mixed or random effects. In recent years a breakthrough has occurred in our ability to draw inferences from exact and optimum tests of variance component models, generating much research activity that relies on linear models with mixed and random effects. This volume covers the most important research of the past decade as well as the latest developments in hypothesis testing. It compiles all currently available results in the area of exact and optimum tests for variance component models and offers the only comprehensive treatment for these models a

  4. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...

  5. Shannon information capacity of time reversal wideband multiple-input multiple-output system based on correlated statistical channels

    Science.gov (United States)

    Yu, Yang; Bing-Zhong, Wang; Shuai, Ding

    2016-05-01

    Utilizing channel reciprocity, the time reversal (TR) technique increases the signal-to-noise ratio (SNR) at the receiver with very low transmitter complexity in complex multipath environments. Existing research on TR multiple-input multiple-output (MIMO) communication focuses mainly on system implementation and network building. The aim of this work is to analyze the influence of antenna coupling on the capacity of a wideband TR MIMO system, which is a realistic question in designing a practical communication system. It turns out that antenna coupling stabilizes the capacity within a small variation range for statistical wideband channel responses. Meanwhile, antenna coupling causes only a slight detriment to the channel capacity in a wideband TR MIMO system. Comparatively, uncorrelated stochastic channels without coupling exhibit a wider range of random capacity distribution, which greatly depends on the statistical channel. The conclusions, drawn from information difference entropy theory, provide a guideline for designing better high-performance wideband TR MIMO communication systems. Project supported by the National Natural Science Foundation of China (Grant Nos. 61331007, 61361166008, and 61401065) and the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20120185130001).

  6. Channel modelling and performance analysis of V2I communication systems in blind bend scattering environments

    KAUST Repository

    Chelli, Ali

    2014-01-01

    In this paper, we derive a new geometrical blind bend scattering model for vehicle-to-infrastructure (V2I) communications. The proposed model takes into account single-bounce and double-bounce scattering stemming from fixed scatterers located on both sides of a curved street. Starting from the geometrical blind bend model, the exact expression of the angle of departure (AOD) is derived. Based on this expression, the probability density function (PDF) of the AOD and the Doppler power spectrum are determined. Analytical expressions for the channel gain and the temporal autocorrelation function (ACF) are provided under non-line-of-sight (NLOS) conditions. Additionally, we investigate the impact of the position of the transmitting vehicle relative to the receiving road-side unit on the channel statistics. Moreover, we study the performance of different digital modulations over a sum of singly and doubly scattered (SSDS) channel. Note that the proposed V2I channel model falls under the umbrella of SSDS channels since the transmitted signal undergoes a combination of single-bounce and double-bounce scattering. We study some characteristic quantities of SSDS channels and derive expressions for the average symbol error probability of several modulation schemes over SSDS channels with and without diversity combining. The validity of these analytical expressions is confirmed by computer-based simulations.

  7. Book review: Statistical Analysis and Modelling of Spatial Point Patterns

    DEFF Research Database (Denmark)

    Møller, Jesper

    2009-01-01

    Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912......Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912...

  8. Ultrawideband MIMO Channel Measurements and Modeling in a Warehouse Environment

    OpenAIRE

    Sangodoyin, Seun; He, Ruisi; Molisch, Andreas; Kristem, Vinod; Tufvesson, Fredrik

    2015-01-01

    This paper presents a detailed description of a propagation channel measurement campaign performed in a warehouse environment and provides a comprehensive channel model for this environment. Using a vector network analyzer (VNA), we explored both Line-of-sight (LOS) and Non-Line-of-sight (NLOS) scenarios over a 2-8 GHz frequency range. We extracted both small-scale and large-scale channel parameters such as the distance-dependent pathloss exponent (n), frequency-dependent pathloss exponent (k), sha...
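
    A minimal sketch of how the distance-dependent pathloss exponent n and the shadowing standard deviation can be extracted from such measurements via a log-distance regression; the synthetic distances, the "true" exponent, and the shadowing level are assumptions used only to generate example data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative measurement set: pathloss (dB) versus Tx-Rx distance with
# log-normal shadowing, generated from assumed "true" parameters.
d0, n_true, pl0_true, sigma_sh = 1.0, 2.3, 45.0, 3.0
d = rng.uniform(2.0, 40.0, size=200)                # distances in metres
pl = (pl0_true + 10 * n_true * np.log10(d / d0)
      + rng.normal(0.0, sigma_sh, size=d.size))     # "measured" pathloss

# Classical log-distance fit: regress PL on 10*log10(d/d0) to estimate the
# pathloss exponent n and the reference pathloss PL(d0).
X = np.column_stack([np.ones_like(d), 10 * np.log10(d / d0)])
(pl0_hat, n_hat), *_ = np.linalg.lstsq(X, pl, rcond=None)
shadowing_std = np.std(pl - X @ [pl0_hat, n_hat])

print(f"n = {n_hat:.2f}, PL(d0) = {pl0_hat:.1f} dB, "
      f"shadowing std = {shadowing_std:.1f} dB")
```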

  9. Optimal Power Allocation for CC-HARQ-based Cognitive Radio with Statistical CSI in Nakagami Slow Fading Channels

    Science.gov (United States)

    Xu, Ding; Li, Qun

    2017-01-01

    This paper addresses the power allocation problem for cognitive radio (CR) based on hybrid-automatic-repeat-request (HARQ) with chase combining (CC) in Nakagami-m slow fading channels. We assume that, instead of the perfect instantaneous channel state information (CSI), only the statistical CSI is available at the secondary user (SU) transmitter. The aim is to minimize the SU outage probability under the primary user (PU) interference outage constraint. Using the Lagrange multiplier method, an iterative and recursive algorithm is derived to obtain the optimal power allocation for each transmission round. Extensive numerical results are presented to illustrate the performance of the proposed algorithm.

  10. Statistical analysis of turbulent super-streamwise vortices based on observations of streaky structures near the free surface in the smooth open channel flow

    Science.gov (United States)

    Zhong, Qiang; Chen, Qigang; Wang, Hao; Li, Danxun; Wang, Xingkui

    2016-05-01

    Long streamwise-elongated high- and low-speed streaks are repeatedly observed near the free surface in open channel flows in natural rivers and lab experiments. Super-streamwise vortex model has been proposed to explain this widespread phenomenon for quite some time. However, statistical evidence of the existence of the super-streamwise vortices as one type of coherent structures is still insufficient. Correlation and proper orthogonal decomposition (POD) analysis based on PIV experimental data in the streamwise-spanwise plane near the free surface in a smooth open channel flow are employed to investigate this topic. Correlation analysis revealed that the streaky structures appear frequently near the free surface and their occurrence probability at any spanwise position is equal. The spanwise velocity fluctuation usually flows from low-speed streaks toward high-speed streaks. The average spanwise width and spacing between neighboring low (or high) speed streaks are approximately h and 2h respectively. POD analysis reveals that there are streaks with different spanwise width in the instantaneous flow fields. Typical streamwise rotational movement can be sketched out directly based on the results from statistical analyses. Point-by-point analysis indicates that this pattern is consistent everywhere in the measurement window and is without any inhomogeneity in the spanwise direction, which reveals the essential difference between coherent structures and secondary flow cells. The pattern found by statistical analysis is consistent with the notion that the super-streamwise vortices exist universally as one type of coherent structure in open channel flows.
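
    A minimal sketch of snapshot POD as used in this kind of analysis: the mean-subtracted snapshot matrix is decomposed with a thin SVD, giving spatial modes and their energy fractions. The planted streak pattern, grid, and noise level are illustrative assumptions, not the PIV data.

```python
import numpy as np

rng = np.random.default_rng(9)

# Illustrative snapshot matrix: streamwise velocity fluctuations on an
# n_points spanwise grid for n_snap PIV snapshots, containing a planted
# streaky pattern plus noise. Real data would replace this synthetic block.
n_points, n_snap = 400, 150
z = np.linspace(0.0, 4.0, n_points)                 # spanwise coordinate (in h)
streak = np.sin(np.pi * z)                          # streak spacing of about 2h
snapshots = (np.outer(streak, rng.standard_normal(n_snap))
             + 0.5 * rng.standard_normal((n_points, n_snap)))

# Snapshot POD via the thin SVD of the mean-subtracted data matrix:
# columns of U are the spatial POD modes, singular values give modal energy.
fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(fluct, full_matrices=False)
energy_fraction = s**2 / np.sum(s**2)

print("energy captured by first 3 POD modes:",
      np.round(energy_fraction[:3], 3))
```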

  11. Statistical Model of the 3-D Braided Composites Strength

    Institute of Scientific and Technical Information of China (English)

    XIAO Laiyuan; ZUO Weiwei; CAI Ganwei; LIAO Daoxun

    2007-01-01

    Based on the statistical model for the tensile statistical strength of unidirectional composite materials and the stress analysis of 3-D braided composites, a new method is proposed to calculate the tensile statistical strength of 3-D braided composites. With this method, the strength of 3-D braided composites can be calculated with high accuracy, and the statistical parameters of 3-D braided composites can be determined. The numerical results show that the tensile statistical strength of 3-D braided composites can be predicted using this method.

  12. Statistical Modeling of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., the explosion of a star). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer size of the generated data sets makes efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatiotemporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called the univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second statistical modeling technique (called the univariate goodness-of-fit modeler) uses the Anderson-Darling goodness-of-fit method on systematic partitions of the data. Finally, AQSim's third statistical modeling technique (called the multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.

  13. Exponential order statistic models of software reliability growth

    Science.gov (United States)

    Miller, D. R.

    1986-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, non-identically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
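
    To make the exponential order statistic construction concrete, here is a minimal simulation sketch: failure times are generated as the order statistics of independent exponentials with non-identical rates. The rate values are invented; using a common rate for all faults would correspond to the Jelinski-Moranda special case.

    ```python
    # Sketch: failure times as order statistics of independent, non-identically
    # distributed exponential variables (the rates below are made up).
    import numpy as np

    rng = np.random.default_rng(42)
    rates = np.linspace(2.0, 0.1, 25)          # one hypothetical rate per latent fault
    raw_times = rng.exponential(scale=1.0 / rates)
    failure_times = np.sort(raw_times)         # the observed failure process
    print(failure_times[:5])
    # Setting all rates equal would recover a Jelinski-Moranda-type special case.
    ```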

  14. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.

  15. Statistical modelling of fine red wine production

    OpenAIRE

    María Rosa Castro; Marcelo Eduardo Echegaray; Rosa Ana Rodríguez; Stella Maris Udaquiola

    2010-01-01

    Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production regarding the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such model will be used for predicting precise future values and confidence intervals for determined quantities of crushed grapes. Data from a vineyard in the province of San Juan was ...

  16. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  17. Statistics of Certain Models of Evolution

    CERN Document Server

    Standish, R K

    1999-01-01

    In a recent paper, Newman surveys the literature on power law spectra in evolution and self-organised criticality, and presents a model of his own, arriving at the conclusion that self-organised criticality is not necessary for evolution. Not only did he miss a key model (Ecolab) that has a clear self-organised critical mechanism, but Newman's model also exhibits the same mechanism that gives rise to power law behaviour as Ecolab does. I would argue that this mechanism should be described as self-organised critical. In this paper, I have also implemented Newman's model using the Ecolab software, removing the restriction that the number of species remains constant. It turns out that the requirement of constant species number is non-trivial, leading to a global coupling between species that is similar in effect to the species interactions seen in Ecolab. In fact, the model must self-organise to a state where the long time average of speciations balances that of the extinctions, otherwise the system either collapses o...

  18. A statistical model of facial attractiveness.

    Science.gov (United States)

    Said, Christopher P; Todorov, Alexander

    2011-09-01

    Previous research has identified facial averageness and sexual dimorphism as important factors in facial attractiveness. The averageness and sexual dimorphism accounts provide important first steps in understanding what makes faces attractive, and should be valued for their parsimony. However, we show that they explain relatively little of the variance in facial attractiveness, particularly for male faces. As an alternative to these accounts, we built a regression model that defines attractiveness as a function of a face's position in a multidimensional face space. The model provides much more predictive power than the averageness and sexual dimorphism accounts and reveals previously unreported components of attractiveness. The model shows that averageness is attractive in some dimensions but not in others and resolves previous contradictory reports about the effects of sexual dimorphism on the attractiveness of male faces.

  19. Reo: A Channel-based Coordination Model for Component Composition

    NARCIS (Netherlands)

    Arbab, F.

    2004-01-01

    In this paper, we present Reo, which forms a paradigm for composition of software components based on the notion of mobile channels. Reo is a channel-based exogenous coordination model in which complex coordinators, called connectors, are compositionally built out of simpler ones. The simplest conne

  20. Molecular modeling of mechanosensory ion channel structural and functional features.

    Science.gov (United States)

    Gessmann, Renate; Kourtis, Nikos; Petratos, Kyriacos; Tavernarakis, Nektarios

    2010-09-16

    The DEG/ENaC (Degenerin/Epithelial Sodium Channel) protein family comprises related ion channel subunits from all metazoans, including humans. Members of this protein family play roles in several important biological processes such as transduction of mechanical stimuli, sodium re-absorption and blood pressure regulation. Several blocks of amino acid sequence are conserved in DEG/ENaC proteins, but structure/function relations in this channel class are poorly understood. Given the considerable experimental limitations associated with the crystallization of integral membrane proteins, knowledge-based modeling is often the only route towards obtaining reliable structural information. To gain insight into the structural characteristics of DEG/ENaC ion channels, we derived three-dimensional models of MEC-4 and UNC-8, based on the available crystal structures of ASIC1 (Acid Sensing Ion Channel 1). MEC-4 and UNC-8 are two DEG/ENaC family members involved in mechanosensation and proprioception respectively, in the nematode Caenorhabditis elegans. We used these models to examine the structural effects of specific mutations that alter channel function in vivo. The trimeric MEC-4 model provides insight into the mechanism by which gain-of-function mutations cause structural alterations that result in increased channel permeability, which trigger cell degeneration. Our analysis provides an introductory framework to further investigate the multimeric organization of the DEG/ENaC ion channel complex.

  1. Molecular modeling of mechanosensory ion channel structural and functional features.

    Directory of Open Access Journals (Sweden)

    Renate Gessmann

    Full Text Available The DEG/ENaC (Degenerin/Epithelial Sodium Channel) protein family comprises related ion channel subunits from all metazoans, including humans. Members of this protein family play roles in several important biological processes such as transduction of mechanical stimuli, sodium re-absorption and blood pressure regulation. Several blocks of amino acid sequence are conserved in DEG/ENaC proteins, but structure/function relations in this channel class are poorly understood. Given the considerable experimental limitations associated with the crystallization of integral membrane proteins, knowledge-based modeling is often the only route towards obtaining reliable structural information. To gain insight into the structural characteristics of DEG/ENaC ion channels, we derived three-dimensional models of MEC-4 and UNC-8, based on the available crystal structures of ASIC1 (Acid Sensing Ion Channel 1). MEC-4 and UNC-8 are two DEG/ENaC family members involved in mechanosensation and proprioception respectively, in the nematode Caenorhabditis elegans. We used these models to examine the structural effects of specific mutations that alter channel function in vivo. The trimeric MEC-4 model provides insight into the mechanism by which gain-of-function mutations cause structural alterations that result in increased channel permeability, which trigger cell degeneration. Our analysis provides an introductory framework to further investigate the multimeric organization of the DEG/ENaC ion channel complex.

  2. A channel-based coordination model for component composition

    NARCIS (Netherlands)

    Arbab, F.

    2002-01-01

    In this paper, we present Ρεω (Reo), a paradigm for composition of software components based on the notion of mobile channels. Ρεω is a channel-based exogenous coordination model wherein complex coordinators, called connectors, are compositionally built out of simpler ones.

  3. Structured Statistical Models of Inductive Reasoning

    Science.gov (United States)

    Kemp, Charles; Tenenbaum, Joshua B.

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet…

  4. Statistical modelling of fine red wine production

    Directory of Open Access Journals (Sweden)

    María Rosa Castro

    2010-05-01

    Full Text Available Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production regarding the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such a model will be used for predicting precise future values and confidence intervals for determined quantities of crushed grapes. Data from a vineyard in the province of San Juan was thus used in this work. The sample correlation coefficient was calculated and a scatter diagram was then constructed; this indicated a linear relationship between the litres of wine obtained and the kilograms of crushed grape. Two linear models were then adopted and analysis of variance was carried out because the data came from normal populations having the same variance. The most appropriate model was obtained from this analysis; it was validated with experimental values, a good approximation being obtained.
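
    The kind of fit described above can be sketched with a one-variable linear regression; the grape and wine figures below are invented and serve only to show the mechanics.

    ```python
    # Minimal sketch of a linear fit of litres of wine against kilograms of crushed
    # grape; the data points are invented for illustration.
    import numpy as np

    kg_grape = np.array([1000.0, 2500.0, 4000.0, 5500.0, 7000.0])
    litres_wine = np.array([640.0, 1610.0, 2590.0, 3540.0, 4490.0])

    slope, intercept = np.polyfit(kg_grape, litres_wine, deg=1)
    predicted = intercept + slope * 6000.0      # point prediction for 6000 kg of grape
    print(f"litres ~ {intercept:.1f} + {slope:.3f} * kg; prediction for 6000 kg: {predicted:.0f} L")
    ```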

  5. Statistical modelling in biostatistics and bioinformatics selected papers

    CERN Document Server

    Peng, Defen

    2014-01-01

    This book presents selected papers on statistical model development related mainly to the fields of Biostatistics and Bioinformatics. The coverage of the material falls squarely into the following categories: (a) Survival analysis and multivariate survival analysis, (b) Time series and longitudinal data analysis, (c) Statistical model development and (d) Applied statistical modelling. Innovations in statistical modelling are presented throughout each of the four areas, with some intriguing new ideas on hierarchical generalized non-linear models and on frailty models with structural dispersion, just to mention two examples. The contributors include distinguished international statisticians such as Philip Hougaard, John Hinde, Il Do Ha, Roger Payne and Alessandra Durio, among others, as well as promising newcomers. Some of the contributions have come from researchers working in the BIO-SI research programme on Biostatistics and Bioinformatics, centred on the Universities of Limerick and Galway in Ireland and fu...

  6. Functional summary statistics for the Johnson-Mehl model

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    The Johnson-Mehl germination-growth model is a spatio-temporal point process model which, among other things, has been used for the description of neurotransmitter datasets. However, for such datasets, parametric Johnson-Mehl models fitted by maximum likelihood have not yet been evaluated by means of functional summary statistics. This paper therefore invents four functional summary statistics adapted to the Johnson-Mehl model, with two of them based on the second-order properties and the other two on the nuclei-boundary distances for the associated Johnson-Mehl tessellation. The theoretical properties of the functional summary statistics are investigated, non-parametric estimators are suggested, and their usefulness for model checking is examined in a simulation study. The functional summary statistics are also used for checking fitted parametric Johnson-Mehl models for a neurotransmitter dataset.

  7. On the computation of the higher order statistics of the channel capacity for amplify-and-forward multihop transmission

    KAUST Repository

    Yilmaz, Ferkan

    2014-01-01

    Higher order statistics (HOS) of the channel capacity provide useful information regarding the level of reliability of signal transmission at a particular rate. In this paper, we propose a novel and unified analysis, which is based on the moment-generating function (MGF) approach, to efficiently and accurately compute the HOS of the channel capacity for amplify-and-forward (AF) multihop transmission over generalized fading channels. More precisely, our easy-to-use and tractable mathematical formalism requires only the reciprocal MGFs of the transmission hop signal-to-noise ratio (SNR). Numerical and simulation results, which are performed to exemplify the usefulness of the proposed MGF-based analysis, are shown to be in perfect agreement. © 2013 IEEE.

  8. Modeling and Simulation of MIMO Mobile-to-Mobile Wireless Fading Channels

    Directory of Open Access Journals (Sweden)

    Gholamreza Bakhshi

    2012-01-01

    Full Text Available Analysis and design of multielement antenna systems in mobile fading channels require a model for the space-time cross-correlation among the links of the underlying multiple-input multiple-output (MIMO) Mobile-to-Mobile (M-to-M) communication channels. In this paper, we propose the modified geometrical two-ring model, a MIMO channel reference model for M-to-M communication systems. This model is based on the extension of the single-bounce two-ring scattering model for flat fading channels under the assumption that both the transmitter and the receiver are moving. Assuming a single-bounce scattering model in both isotropic and nonisotropic environments, a closed-form expression for the space-time cross-correlation function (CCF) between any two subchannels is derived. The proposed model provides an important framework for M-to-M system design and includes many existing correlation models as special cases. Also, two realizable statistical simulation models are proposed for simulating both the isotropic and nonisotropic reference models. The realizable simulation models are based on the Sum-of-Sinusoids (SoS) simulation approach. Finally, the correctness of the proposed simulation models is shown via different simulation scenarios.
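
    As background for the simulation part, the snippet below sketches a generic sum-of-sinusoids (SoS) Rayleigh fading generator of the Clarke/Jakes type; the paper's isotropic and nonisotropic M-to-M two-ring simulators are more elaborate, so this is only the underlying idea, with all parameter values assumed.

    ```python
    # Generic sum-of-sinusoids (SoS) fading generator (Clarke/Jakes type), shown only
    # to illustrate the SoS principle; parameter values are assumptions.
    import numpy as np

    def sos_fading(t, f_d, n_sin=20, seed=0):
        """Complex fading gain at times t for maximum Doppler f_d using n_sin sinusoids."""
        rng = np.random.default_rng(seed)
        alpha = rng.uniform(0, 2 * np.pi, n_sin)   # random angles of arrival
        phi = rng.uniform(0, 2 * np.pi, n_sin)     # random initial phases
        phases = 2 * np.pi * np.outer(t, f_d * np.cos(alpha)) + phi
        return np.sqrt(1.0 / n_sin) * np.exp(1j * phases).sum(axis=1)

    t = np.arange(0, 0.1, 1e-4)                    # 0.1 s sampled at 10 kHz
    h = sos_fading(t, f_d=100.0)                   # 100 Hz maximum Doppler shift
    print("mean power ~", np.mean(np.abs(h) ** 2)) # should be close to 1
    ```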

  9. Model validation of channel zapping quality

    NARCIS (Netherlands)

    Kooij, R.E; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed, that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec.G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call

  10. A vehicle-to-infrastructure channel model for blind corner scattering environments

    KAUST Repository

    Chelli, Ali

    2013-09-01

    In this paper, we derive a new geometrical blind corner scattering model for vehicle-to-infrastructure (V2I) communications. The proposed model takes into account single-bounce and double-bounce scattering stemming from fixed scatterers located on both sides of the curved street. Starting from the geometrical blind corner model, the exact expression of the angle of departure (AOD) is derived. Based on this expression, the probability density function (PDF) of the AOD and the Doppler power spectrum are determined. Analytical expressions for the channel gain and the temporal autocorrelation function (ACF) are provided under non-line-of-sight (NLOS) conditions. Moreover, we investigate the impact of the position of the transmitting vehicle relative to the receiving road-side unit on the channel statistics. The proposed channel model is useful for the design and analysis of future V2I communication systems. Copyright © 2013 by the Institute of Electrical and Electronic Engineers, Inc.

  11. Network Data: Statistical Theory and New Models

    Science.gov (United States)

    2016-02-17

    The major goals of this project are to develop and implement algorithms based on high-dimensional statistics theory, especially ... invariance to local deformation. These techniques have been adapted to modeling higher-order visual areas such as area MT on two experimental datasets provided

  12. Daisy Models Semi-Poisson statistics and beyond

    CERN Document Server

    Hernández-Saldaña, H; Seligman, T H

    1999-01-01

    Semi-Poisson statistics are shown to be obtained by removing every other number from a random sequence. Retaining every (r+1)th level, we obtain a family of sequences which we call daisy models. Their statistical properties coincide with those of Bogomolny's nearest-neighbour interaction Coulomb gas if the inverse temperature coincides with the integer r. In particular, the case r=2 closely reproduces the statistics of quasi-optimal solutions of the traveling salesman problem.
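
    The daisy-model construction described above is easy to reproduce numerically: thin a Poisson (uniformly random) spectrum by keeping every (r+1)-th level and inspect the nearest-neighbour spacing distribution; for r = 1 the spacings should approach the semi-Poisson form 4 s exp(-2 s). The sample sizes in the sketch below are invented.

    ```python
    # Sketch of the daisy-model construction: keep every (r+1)-th level of a Poisson
    # (uniformly random) spectrum and look at the nearest-neighbour spacings.
    import numpy as np

    def daisy_spacings(n_levels=200000, r=1, seed=1):
        rng = np.random.default_rng(seed)
        levels = np.sort(rng.uniform(0, n_levels, n_levels))   # roughly unit mean spacing
        thinned = levels[::r + 1]                               # retain every (r+1)-th level
        s = np.diff(thinned)
        return s / s.mean()                                     # rescale to unit mean spacing

    s = daisy_spacings(r=1)
    # For r = 1 the spacing density should approach the semi-Poisson form 4*s*exp(-2*s).
    hist, edges = np.histogram(s, bins=50, range=(0, 4), density=True)
    print(hist[:5])
    ```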

  13. Process Model Construction and Optimization Using Statistical Experimental Design,

    Science.gov (United States)

    1988-04-01

    Memo No. 88-442, March 1988. Process Model Construction and Optimization Using Statistical Experimental Design, by Emanuel Sachs and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic...

  14. Post hoc pattern matching: assigning significance to statistically defined expression patterns in single channel microarray data

    Directory of Open Access Journals (Sweden)

    Blalock Eric M

    2007-07-01

    Full Text Available Abstract Background Researchers using RNA expression microarrays in experimental designs with more than two treatment groups often identify statistically significant genes with ANOVA approaches. However, the ANOVA test does not discriminate which of the multiple treatment groups differ from one another. Thus, post hoc tests, such as linear contrasts, template correlations, and pairwise comparisons are used. Linear contrasts and template correlations work extremely well, especially when the researcher has a priori information pointing to a particular pattern/template among the different treatment groups. Further, all pairwise comparisons can be used to identify particular, treatment group-dependent patterns of gene expression. However, these approaches are biased by the researcher's assumptions, and some treatment-based patterns may fail to be detected using these approaches. Finally, different patterns may have different probabilities of occurring by chance, importantly influencing researchers' conclusions about a pattern and its constituent genes. Results We developed a four-step, post hoc pattern matching (PPM) algorithm to automate single channel gene expression pattern identification/significance. First, 1-Way Analysis of Variance (ANOVA), coupled with post hoc 'all pairwise' comparisons, is calculated for all genes. Second, for each ANOVA-significant gene, all pairwise contrast results are encoded to create unique pattern ID numbers. The number of genes found in each pattern in the data is identified as that pattern's 'actual' frequency. Third, using Monte Carlo simulations, those patterns' frequencies are estimated in random data (the 'random' gene pattern frequency). Fourth, a Z-score for overrepresentation of the pattern is calculated ('actual' against 'random' gene pattern frequencies). We wrote a Visual Basic program (StatiGen) that automates the PPM procedure, constructs an Excel workbook with standardized graphs of overrepresented patterns, and lists of

  15. Behavioral and Statistical Models of Educational Inequality

    DEFF Research Database (Denmark)

    Holm, Anders; Breen, Richard

    2016-01-01

    This paper addresses the question of how students and their families make educational decisions. We describe three types of behavioral model that might underlie decision-making and we show that they have consequences for what decisions are made. Our study thus has policy implications if we wish to encourage students and their families to make better educational choices. We also establish the conditions under which empirical analysis can distinguish between the three sorts of decision-making and we illustrate our arguments using data from the National Educational Longitudinal Study.

  16. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  17. Mixed deterministic statistical modelling of regional ozone air pollution

    KAUST Repository

    Kalenderski, Stoitchko Dimitrov

    2011-03-17

    We develop a physically motivated statistical model for regional ozone air pollution by separating the ground-level pollutant concentration field into three components, namely: transport, local production and large-scale mean trend mostly dominated by emission rates. The model is novel in the field of environmental spatial statistics in that it is a combined deterministic-statistical model, which gives a new perspective to the modelling of air pollution. The model is presented in a Bayesian hierarchical formalism, and explicitly accounts for advection of pollutants, using the advection equation. We apply the model to a specific case of regional ozone pollution-the Lower Fraser valley of British Columbia, Canada. As a predictive tool, we demonstrate that the model vastly outperforms existing, simpler modelling approaches. Our study highlights the importance of simultaneously considering different aspects of an air pollution problem as well as taking into account the physical bases that govern the processes of interest. © 2011 John Wiley & Sons, Ltd..

  18. Isoscaling in Statistical Sequential Decay Model

    Institute of Scientific and Technical Information of China (English)

    TIAN Wen-Dong; SU Qian-Min; WANG Hong-Wei; WANG Kun; YAN Ting-ZHi; MA Yu-Gang; CAI Xiang-Zhou; FANG De-Qing; GUO Wei; MA Chun-Wang; LIU Gui-Hua; SHEN Wen-Qing; SHI Yu

    2007-01-01

    A sequential decay model is used to study isoscaling, i.e. the factorization of the isotope ratios from sources of different isospins and sizes, over a broad range of excitation energies, into fugacity terms of proton and neutron number, R21(N, Z) = Y2(N, Z)/Y1(N, Z) = C exp(αN + βZ). It is found that the isoscaling parameters α and β have a strong dependence on the isospin difference of the equilibrated source and on the excitation energy, while no significant influence of the source size on α and β has been observed. It is found that α and β decrease with the excitation energy and are linear functions of 1/T and Δ(Z/A)² or Δ(N/A)² of the sources. The symmetry energy coefficient Csym is constrained from the relationship between α and the source Δ(Z/A)², and between β and the source Δ(N/A)².
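
    The isoscaling relation quoted above is linear in ln R21, so the parameters can be extracted with a least-squares fit; the sketch below does this on toy yield ratios generated with invented values of C, alpha and beta.

    ```python
    # Sketch of extracting isoscaling parameters: fit ln R21(N, Z) = ln C + alpha*N + beta*Z
    # by linear least squares; the yield ratios below are synthetic.
    import numpy as np

    N = np.array([2, 3, 4, 2, 3, 4, 2, 3, 4], dtype=float)
    Z = np.array([2, 2, 2, 3, 3, 3, 4, 4, 4], dtype=float)
    alpha_true, beta_true, C_true = 0.5, -0.4, 1.2
    R21 = C_true * np.exp(alpha_true * N + beta_true * Z)   # toy isotope yield ratios

    A = np.column_stack([np.ones_like(N), N, Z])
    lnC, alpha, beta = np.linalg.lstsq(A, np.log(R21), rcond=None)[0]
    print(f"alpha = {alpha:.3f}, beta = {beta:.3f}, C = {np.exp(lnC):.3f}")
    ```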

  19. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects, which are more stable over time, than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  20. Development of statistical models for data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Downham, D.Y.

    2000-07-01

    Incidents that cause, or could cause, injury to personnel, and that satisfy specific criteria, are reported to the Offshore Safety Division (OSD) of the Health and Safety Executive (HSE). The underlying purpose of this report is to improve ways of quantifying risk, a recommendation in Lord Cullen's report into the Piper Alpha disaster. Records of injuries and hydrocarbon releases from 1 January 1991 to 31 March 1996 are analysed, because the reporting of incidents was standardised after 1990. Models are identified for risk assessment and some are applied. The appropriate analyses of one or two factors (or variables) are tests of uniformity or of independence. Radar graphs are used to represent some temporal variables. Cusums are applied for the analysis of incident frequencies over time, and could be applied for regular monitoring. Log-linear models for Poisson-distributed data are identified as being suitable for identifying 'non-random' combinations of more than two factors. Some questions cannot be addressed with the available data: for example, more data are needed to assess the risk of injury per employee in a given time interval. If the questions are considered sufficiently important, resources could be assigned to obtain the data. Some of the main results from the analyses are as follows: the cusum analyses identified a change-point at the end of July 1993, when the reported number of injuries fell by 40%. Injuries were more likely to occur between 8 am and noon or between 2 pm and 5 pm than at other times: between 2 pm and 3 pm the number of injuries was almost twice the average and more than threefold the smallest. No seasonal effects in the numbers of injuries were identified. Three-day injuries occurred more frequently on the 5th, 6th and 7th days into a tour of duty than on other days. Three-day injuries occurred less frequently on the 13th and 14th days of a tour of duty. An injury classified as 'lifting or craning' was
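
    As an illustration of the cusum monitoring mentioned above, the sketch below accumulates deviations of periodic incident counts from a reference value; a sustained change in the slope of the cusum points to a change-point. The counts are invented.

    ```python
    # Minimal cusum sketch for monitoring incident counts over time (counts invented).
    import numpy as np

    counts = np.array([30, 28, 33, 31, 29, 32, 30, 27, 18, 19, 17, 20, 18, 16, 19])
    reference = counts.mean()                      # simple choice of reference value
    cusum = np.cumsum(counts - reference)          # cumulative sum of deviations
    change_point = int(np.argmax(np.abs(cusum)))   # crude change-point estimate
    print("suspected change after period index", change_point)
    ```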

  1. ENHANCING HYDROLOGICAL SIMULATION PROGRAM - FORTRAN MODEL CHANNEL HYDRAULIC REPRESENTATION

    Science.gov (United States)

    The Hydrological Simulation Program - FORTRAN (HSPF) is a comprehensive watershed model that employs depth-area-volume-flow relationships known as the hydraulic function table (FTABLE) to represent the hydraulic characteristics of stream channel cross-sections and reservoirs. ...

  2. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three-dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods, are proposed and investigated in detail. The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster has a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail and the other two methods should be considered.

  3. Improved Ray-Tracing for advanced radio propagation channel modeling

    OpenAIRE

    2012-01-01

    The characterization of the wireless propagation channel has always been an important issue in radio communications. However, in recent years, given the dramatic increase of demand in terms of capabilities of wireless systems, e.g. data rate, quality of service etc., the study of propagation has become of crucial importance. As measurements are generally costly and time consuming, channel models are widely used for this purpose. The modeling of propagation may rely on different types of mo...

  4. Theory of Stochastic Local Area Channel Modeling for Wireless Communications

    OpenAIRE

    Durgin, Gregory David

    2000-01-01

    This dissertation outlines work accomplished in the pursuit of this degree. This report is also designed to be a general introduction to the concepts and techniques of small-scale radio channel modeling. At the present time, there does not exist a comprehensive introduction and overview of basic concepts in this field. Furthermore, as the wireless industry continues to mature and develop technology, the need is now greater than ever for more sophisticated channel modeling research. Eac...

  5. Isospin dependence of nuclear multifragmentation in statistical model

    Institute of Scientific and Technical Information of China (English)

    张蕾; 谢东珠; 张艳萍; 高远

    2011-01-01

    The evolution of nuclear disintegration mechanisms with increasing excitation energy, from compound nucleus to multifragmentation, has been studied by using the Statistical Multifragmentation Model (SMM) within a micro-canonical ensemble. We discuss the o

  6. Statistical modeling of a considering work-piece

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2008-10-01

    Full Text Available This article presents stochastic predictive models for properly controlling the independent variables of the drilling operation, using a combined approach of statistical experimental design and Response Surface Methodology (RSM).

  7. Channel Modeling for Air-to-Ground Wireless Communication

    Institute of Scientific and Technical Information of China (English)

    Yingcheng Shi; Di He; Bin Li; Jianwu Dou

    2015-01-01

    In this paper, we discuss several large-scale fading models for different environments. The COST231-Hata model is adapted for air-to-ground modeling. We propose two criteria for air-to-ground channel modelling based on test data derived from field testing in Beijing. We develop a new propagation model that is more suitable for air-to-ground communication than previous models. We focus on improving this propagation model using the field test data.
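
    For reference, the standard terrestrial COST231-Hata path-loss formula is sketched below (frequency in MHz, distance in km, antenna heights in m); the paper's air-to-ground adaptation of this model is not reproduced here, and the example parameter values are arbitrary.

    ```python
    # Standard COST231-Hata urban path-loss formula (terrestrial macrocell); the
    # air-to-ground adaptation discussed in the paper is not reproduced here.
    import math

    def cost231_hata(f_mhz, d_km, h_base_m, h_mobile_m, metropolitan=False):
        a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m - (1.56 * math.log10(f_mhz) - 0.8)
        c_m = 3.0 if metropolitan else 0.0
        return (46.3 + 33.9 * math.log10(f_mhz) - 13.82 * math.log10(h_base_m) - a_hm
                + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km) + c_m)

    print(f"{cost231_hata(1800.0, 2.0, 50.0, 1.5):.1f} dB")   # e.g. an 1800 MHz link over 2 km
    ```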

  8. A nonextensive statistical model for the nucleon structure function

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos

    2013-03-01

    We studied an application of nonextensive thermodynamics to describe the structure function of the nucleon, in a model where the usual Fermi-Dirac and Bose-Einstein energy distributions were replaced by the equivalent functions of q-statistics. The parameters of the model are given by an effective temperature T, the q parameter (from Tsallis statistics), and two chemical potentials given by the corresponding up (u) and down (d) quark normalization in the nucleon.
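
    A minimal sketch of the kind of q-deformed occupation number the abstract refers to is given below. Conventions for the Tsallis q-exponential differ between papers; this sketch uses exp_q(x) = [1 + (1 - q)x]^(1/(1-q)) and invented parameter values, so it should not be read as the authors' exact parameterization.

    ```python
    # Sketch of a Tsallis q-deformed Fermi-Dirac occupation number; conventions and
    # parameter values are assumptions, not those of the paper.
    import numpy as np

    def q_exp(x, q):
        """Tsallis q-exponential, valid while 1 + (1 - q)*x > 0."""
        if np.isclose(q, 1.0):
            return np.exp(x)                        # ordinary exponential in the q -> 1 limit
        return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

    def q_fermi_dirac(E, mu, T, q):
        """q-deformed Fermi-Dirac occupation number."""
        return 1.0 / (q_exp((E - mu) / T, q) + 1.0)

    E = np.linspace(0.0, 1.0, 5)
    print(q_fermi_dirac(E, mu=0.3, T=0.1, q=1.05))  # approaches the usual FD form as q -> 1
    ```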

  9. Model of risk assessment under ballistic statistical tests

    Science.gov (United States)

    Gabrovski, Ivan; Karakaneva, Juliana

    This paper presents the application of a mathematical method for risk assessment in the statistical determination of the ballistic limits of protection equipment. The authors have implemented a mathematical model based on Pierson's criteria. The software implementation of the model allows the V50 indicator to be evaluated and the reliability of the statistical hypothesis to be assessed. The results supply the specialists with information about the interval estimates of the probability determined during the testing process.

  10. Modified Spatial Channel Model for MIMO Wireless Systems

    Directory of Open Access Journals (Sweden)

    Pekka Kyösti

    2007-12-01

    Full Text Available The Third Generation Partnership Project's (3GPP) spatial channel model (SCM) is a stochastic channel model for MIMO systems. Due to fixed subpath power levels and angular directions, the SCM does not show the degree of variation encountered in real channels. In this paper, we propose a modified SCM which has random subpath powers and directions and still produces a Laplacian-shaped angular power spectrum. Simulation results on outage MIMO capacity with the basic and modified SCM models show that the modified SCM gives consistently smaller capacity values. Accordingly, it seems that the basic SCM gives too low a correlation between MIMO antennas. Moreover, the variance in capacity values is larger using the proposed SCM. Simulation results were supported by the outage capacity results from a measurement campaign conducted in the city centre of Oulu, Finland.

  11. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    Science.gov (United States)

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274, 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105, 2745-2750; Thiessen & Yee 2010 Child Development 81, 1287-1303; Saffran 2002 Journal of Memory and Language 47, 172-196; Misyak & Christiansen 2012 Language Learning 62, 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39, 246-263; Thiessen et al. 2013 Psychological Bulletin 139, 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik

  12. Statistical Characterization of River and Channel Network Formation in Intermittently Flowing Vortex Systems.

    Science.gov (United States)

    Olson, C. J.; Reichhardt, C.; Nori, F.

    1997-03-01

    Vortices moving in dirty superconductors can form intricate flow patterns, resembling fluid rivers, as they interact with the pinning landscape (F. Nori, Science 271, 1373 (1996)). Weaker pinning produces relatively straight vortex channels, while stronger pinning results in the formation of one or more winding channels that carry all flow. This corresponds to a crossover from elastic flow to plastic flow as the pinning strength is increased. For several pinning parameters, we find the fractal dimension of the channels that form, the vortex trail density, the distance travelled by vortices as they pass through the sample, the branching ratio, the sinuosity, and the size distribution of the rivers, and we compare our rivers with physical rivers that follow Horton's laws.

  13. 3D Massive MIMO Systems: Channel Modeling and Performance Analysis

    KAUST Repository

    Nadeem, Qurrat-Ul-Ain

    2015-03-01

    Multiple-input multiple-output (MIMO) systems of current LTE releases are capable of adaptation in the azimuth only. More recently, the trend is to enhance system performance by exploiting the channel's degrees of freedom in the elevation through dynamic adaptation of the vertical antenna beam pattern. This necessitates the derivation and characterization of three-dimensional (3D) channels. Over the years, channel models have evolved to address the challenges of wireless communication technologies. In parallel to theoretical studies on channel modeling, many standardized channel models like COST-based models, 3GPP SCM, WINNER and ITU have emerged that act as references for industries and telecommunication companies to assess system-level and link-level performances of advanced signal processing techniques over realistic channels. Given that existing channel models are only two-dimensional (2D) in nature, a large effort in channel modeling is needed to study the impact of the channel component in the elevation direction. The first part of this work sheds light on the current 3GPP activity around 3D channel modeling and beamforming, an aspect that to our knowledge has not been extensively covered by a research publication. The standardized MIMO channel model is presented, which incorporates both the propagation effects of the environment and the radio effects of the antennas. In order to facilitate future studies on the use of 3D beamforming, the main features of the proposed 3D channel model are discussed. A brief overview of the future 3GPP 3D channel model being outlined for the next generation of wireless networks is also provided. In the subsequent part of this work, we present an information-theoretic channel model for MIMO systems that supports the elevation dimension. The model is based on the principle of maximum entropy, which enables us to determine the distribution of the channel matrix consistent with the prior information on the angles of departure and

  14. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.

  15. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  16. Evaluation of stochastic differential equation approximation of ion channel gating models.

    Science.gov (United States)

    Bruce, Ian C

    2009-04-01

    Fox and Lu derived an algorithm based on stochastic differential equations for approximating the kinetics of ion channel gating that is simpler and faster than "exact" algorithms for simulating Markov process models of channel gating. However, the approximation may not be sufficiently accurate to predict statistics of action potential generation in some cases. The objective of this study was to develop a framework for analyzing the inaccuracies and determining their origin. Simulations of a patch of membrane with voltage-gated sodium and potassium channels were performed using an exact algorithm for the kinetics of channel gating and the approximate algorithm of Fox & Lu. The Fox & Lu algorithm assumes that channel gating particle dynamics have a stochastic term that is uncorrelated, zero-mean Gaussian noise, whereas the results of this study demonstrate that in many cases the stochastic term in the Fox & Lu algorithm should be correlated and non-Gaussian noise with a non-zero mean. The results indicate that: (i) the source of the inaccuracy is that the Fox & Lu algorithm does not adequately describe the combined behavior of the multiple activation particles in each sodium and potassium channel, and (ii) the accuracy does not improve with increasing numbers of channels.
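
    For orientation, the snippet below is a minimal Euler-Maruyama sketch of a Fox & Lu-style Langevin approximation for a single potassium-type gating variable at a fixed voltage; the rate constants, channel count and time step are placeholders rather than values from the study, and the correlated, non-Gaussian corrections discussed above are not included.

    ```python
    # Euler-Maruyama sketch of a Fox & Lu-style Langevin approximation for one
    # potassium-type gating variable n at fixed voltage (parameter values are placeholders).
    import numpy as np

    def simulate_gate(alpha, beta, n_channels, dt=1e-5, steps=20000, seed=0):
        rng = np.random.default_rng(seed)
        n = alpha / (alpha + beta)                   # start at the deterministic steady state
        trace = np.empty(steps)
        for i in range(steps):
            drift = alpha * (1.0 - n) - beta * n
            noise_var = (alpha * (1.0 - n) + beta * n) / n_channels
            n += drift * dt + np.sqrt(noise_var * dt) * rng.standard_normal()
            n = min(max(n, 0.0), 1.0)                # keep the open fraction in [0, 1]
            trace[i] = n
        return trace

    trace = simulate_gate(alpha=500.0, beta=100.0, n_channels=1000)   # rates in 1/s
    print("mean open fraction ~", trace.mean())
    ```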

  17. Electron impact ionization of tungsten ions in a statistical model

    Science.gov (United States)

    Demura, A. V.; Kadomtsev, M. B.; Lisitsa, V. S.; Shurygin, V. A.

    2015-01-01

    A statistical model for calculating the electron impact ionization cross sections of multielectron ions is developed for the first time. The model is based on the idea of collective excitations of atomic electrons with the local plasma frequency, while the Thomas-Fermi model is used for the atomic electron density distribution. The electron impact ionization cross sections and related ionization rates of tungsten ions from W+ up to W63+ are calculated and then compared with a vast collection of modern experimental and modeling results. The reasonable correspondence between experimental and theoretical data demonstrates the universal nature of the statistical approach to the description of atomic processes in multielectron systems.

  18. An Order Statistics Approach to the Halo Model for Galaxies

    Science.gov (United States)

    Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.

    2017-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the `central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-predicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the Halo Model for galaxies with more physically motivated galaxy formation models.
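
    The core order-statistics idea can be illustrated in a few lines: draw N galaxy luminosities per group from a universal p(L) and identify the central with the brightest draw; its mean luminosity then rises with N, a crude proxy for halo mass. The lognormal p(L) and the group sizes below are purely illustrative, not the paper's luminosity function.

    ```python
    # Sketch of the order-statistics idea: the central galaxy as the brightest of N
    # draws from a universal luminosity function (a lognormal here, for illustration).
    import numpy as np

    rng = np.random.default_rng(3)
    for n_gal in (2, 5, 20, 100):
        draws = rng.lognormal(mean=0.0, sigma=1.0, size=(5000, n_gal))
        centrals = draws.max(axis=1)     # brightest draw plays the role of the central
        print(n_gal, round(centrals.mean(), 2))
    ```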

  19. FLOOD ROUTING MODELS IN CONFLUENT AND DIVIDING CHANNELS

    Institute of Scientific and Technical Information of China (English)

    范平; 李家春; 刘青泉

    2004-01-01

    By introducing a water-depth connecting formula, the hydraulic equations in the dividing channel system were coupled, and the discharge distribution between the branches of the dividing channels can be obtained. In this manner, a numerical model for the confluent channels was established to study the variation of backwater effects with the parameters of the channel junction. The meeting of flood peaks in the mainstream and the tributary can be analyzed with this model. The flood peak meeting is found to be a major factor behind the extremely high water level in the mainstream during the 1998 Yangtze River flood. Subsequently, the variations of discharge distribution and water level with channel parameters between the branches in this system were studied as well. As a result, flood evolution caused by the Jingjiang River shortcut and by sediment deposition at the entrance of the dividing channels of the Yangtze River may be qualitatively elucidated. Enhancing the regulation capability of the reservoirs available upstream of the tributaries and harnessing the branch entrance channels is suggested as an effective measure for flood mitigation.

  20. Modeling Channelization in Coastal Wetlands with Ecological Feedbacks

    Science.gov (United States)

    Hughes, Z. J.; Mahadevan, A.; Pennings, S.; FitzGerald, D.

    2014-12-01

    In coastal wetlands in Georgia and South Carolina, dendritic channel networks are actively incising headward at the rate of nearly 2 m/yr. The future geomorphic evolution of these marshes remains in question as rates of relative sea-level rise increase. Our objective is to understand the mechanisms that lead to the evolution of these channel networks through field observations and modeling. We model the geomorphological evolution of tidal creeks by viewing the wetland as a permeable medium. The porosity of the medium affects its hydraulic conductivity, which in turn is altered by erosion. Our multiphase model spontaneously generates channelization and branching networks through flow and erosion. In our field studies, we find that crabs play an active role in grazing vegetation and in the bioturbation of sediments. These effects are incorporated in our model based on field and laboratory observations of crab behavior and its effects on the marsh. We find the erosional patterns and channelization are significantly altered by the faunal feedback. Crabs enhance the growth of channels, inducing the headward erosion of creeks where flow-induced stresses are weakest. They are instrumental in generating high rates of creek extension, which channelize the marsh more effectively in response to sea-level rise. This indicates that the evolution of coastal wetlands is responding to interactions between physics and ecology and highlights the importance of the faunal contribution to these feedbacks.

  1. An empirical conceptual gully evolution model for channelled sea cliffs

    Science.gov (United States)

    Leyland, Julian; Darby, Stephen E.

    2008-12-01

    Incised coastal channels are a specific form of incised channel that are found in locations where stream channels flowing to cliffed coasts have the excess energy required to cut down through the cliff to reach the outlet water body. The southern coast of the Isle of Wight, southern England, comprises soft cliffs that vary in height between 15 and 100 m and which are retreating at rates ≤ 1.5 m a^-1, due to a combination of wave erosion and landslides. In several locations, river channels have cut through the cliffs to create deeply (≤ 45 m) incised gullies, known locally as 'Chines'. The Chines are unusual in that their formation is associated with dynamic shoreline encroachment during a period of rising sea-level, whereas existing models of incised channel evolution emphasise the significance of base level lowering. This paper develops a conceptual model of Chine evolution by applying space for time substitution methods using empirical data gathered from Chine channel surveys and remotely sensed data. The model identifies a sequence of evolutionary stages, which are classified based on a suite of morphometric indices and associated processes. The extent to which individual Chines are in a state of growth or decay is estimated by determining the relative rates of shoreline retreat and knickpoint recession, the former via analysis of historical aerial images and the latter through the use of a stream power erosion model.

  2. Numerical modelling of channel migration with application to laboratory rivers

    Institute of Scientific and Technical Information of China (English)

    Jian SUN; Bin-liang LIN; Hong-wei KUANG

    2015-01-01

    The paper presents the development of a morphological model and its application to experimental model rivers. The model takes into account the key processes of channel migration, including bed deformation, bank failure and wetting and drying. Secondary flows in bends play an important role in lateral sediment transport, which further affects channel migration. A new formula has been derived to predict the near-bed secondary flow speed, in which the magnitude of the speed is linked to the lateral water level gradient. Since only non-cohesive sediment is considered in the current study, the bank failure is modelled based on the concept of submerged angle of repose. The wetting and drying process is modelled using an existing method. Comparisons between the numerical model predictions and experimental observations for various discharges have been made. It is found that the model predicted channel planform and cross-sectional shapes agree generally well with the laboratory observations. A scenario analysis is also carried out to investigate the impact of secondary flow on the channel migration process. It shows that if the effect of secondary flow is ignored, the channel size in the lateral direction will be seriously underestimated.

  3. A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics

    Directory of Open Access Journals (Sweden)

    Guangkuo Lu

    2015-01-01

    Full Text Available Methods utilizing independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, which is assumed to be a piecewise higher-order stationary time series, is introduced and divided into a series of higher-order stationary segments by applying a modified segmentation algorithm. Then the state space is reconstructed and the single-channel signal is transformed into a pseudo multiple-input multiple-output (MIMO) mode using a method of nonlinear analysis based on higher-order statistics (HOS). In the last step, ICA is performed on the pseudo-MIMO data to decompose the single-channel recording into its underlying independent components (ICs) and the ICs of interest are then extracted. Finally, the effectiveness and merit of the higher-order single-channel ICA (SCICA) method are validated with measured data through experiments. The proposed method is also shown to be more robust under different SNRs and/or embedding dimensions, via explicit formulae and simulations.
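
    The embedding and ICA steps can be sketched as follows: a time-delay embedding turns the single-channel record into a pseudo-MIMO matrix, which is then unmixed with FastICA. The segmentation step is omitted, scikit-learn is assumed to be available, and the toy mixture is not from the paper.

    ```python
    # Sketch of the embedding + ICA steps: time-delay embedding of a single channel
    # into a pseudo-MIMO matrix, followed by FastICA (scikit-learn assumed available).
    import numpy as np
    from sklearn.decomposition import FastICA

    def delay_embed(x, dim, tau=1):
        """Columns are delayed copies of x; result has shape (n_samples, dim)."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    mixture = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 23 * t))

    pseudo_mimo = delay_embed(mixture, dim=8)
    ica = FastICA(n_components=2, random_state=0)
    components = ica.fit_transform(pseudo_mimo)    # columns are the estimated ICs
    print(components.shape)
    ```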

  4. Equilibrium Statistical-Thermal Models in High-Energy Physics

    CERN Document Server

    Tawfik, Abdel Nasser

    2014-01-01

    We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path to constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe formulated in 1948 an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach, in which he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process that likely reflect many features of the high-energy reactions dominated by density in the phase space of final states. In 1964, Hagedorn systematically analysed the high-energy phenomena using all tools of statistical physics and introduced the concept of a limiting temperature based on the statistical bootstrap model. It turns out quite often that many-par...

  5. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    Recent years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statistici

  6. In all likelihood statistical modelling and inference using likelihood

    CERN Document Server

    Pawitan, Yudi

    2001-01-01

    Based on a course in the theory of statistics, this text concentrates on what can be achieved using the likelihood/Fisherian method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates, to complex studies that require generalised linear or semiparametric mode

  7. Speech emotion recognition based on statistical pitch model

    Institute of Scientific and Technical Information of China (English)

    WANG Zhiping; ZHAO Li; ZOU Cairong

    2006-01-01

    A modified Parzen-window method, which keeps high resolution at low frequencies and smoothness at high frequencies, is proposed to obtain the statistical model. Then, a gender classification method utilizing the statistical model is proposed, which achieves 98% accuracy in gender classification when long sentences are processed. After separating the male and female voices, the means and standard deviations of the speech training samples with different emotions are used to create the corresponding emotion models. The Bhattacharyya distances between the test sample and the statistical models of pitch are then utilized for emotion recognition in speech. The normalization of pitch for the male and female voices is also considered, in order to map them into a uniform space. Finally, the speech emotion recognition experiment based on K Nearest Neighbor shows that a correct rate of 81% is achieved, whereas it is only 73.85% if the traditional parameters are utilized.
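
    As an illustration of the matching step, a small sketch computing the Bhattacharyya distance between univariate Gaussian pitch statistics (mean and standard deviation) of a test utterance and of several emotion models; the emotion labels and pitch values are invented for the example and do not come from the paper.

```python
import numpy as np

def bhattacharyya_gauss(mu1, sig1, mu2, sig2):
    """Bhattacharyya distance between two univariate Gaussians N(mu1, sig1^2), N(mu2, sig2^2)."""
    s1, s2 = sig1 ** 2, sig2 ** 2
    return 0.25 * np.log(0.25 * (s1 / s2 + s2 / s1 + 2.0)) + 0.25 * (mu1 - mu2) ** 2 / (s1 + s2)

# Hypothetical emotion models: pitch mean/std (Hz) estimated from training samples
emotion_models = {
    "neutral": (120.0, 15.0),
    "anger":   (180.0, 40.0),
    "sadness": (105.0, 10.0),
}

# Pitch statistics of a test utterance (made-up numbers)
test_mu, test_sig = 170.0, 35.0

distances = {emo: bhattacharyya_gauss(test_mu, test_sig, mu, sig)
             for emo, (mu, sig) in emotion_models.items()}
print(min(distances, key=distances.get), distances)
```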

  8. Probabilistic Quantitative Precipitation Forecasting Using Ensemble Model Output Statistics

    CERN Document Server

    Scheuerer, Michael

    2013-01-01

    Statistical post-processing of dynamical forecast ensembles is an essential component of weather forecasting. In this article, we present a post-processing method that generates full predictive probability distributions for precipitation accumulations based on ensemble model output statistics (EMOS). We model precipitation amounts by a generalized extreme value distribution that is left-censored at zero. This distribution permits modelling precipitation on the original scale without prior transformation of the data. A closed form expression for its continuous rank probability score can be derived and permits computationally efficient model fitting. We discuss an extension of our approach that incorporates further statistics characterizing the spatial variability of precipitation amounts in the vicinity of the location of interest. The proposed EMOS method is applied to daily 18-h forecasts of 6-h accumulated precipitation over Germany in 2011 using the COSMO-DE ensemble prediction system operated by the Germa...
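
    A brief sketch of the central modelling idea, a generalized extreme value distribution left-censored at zero as the predictive distribution for precipitation accumulations, using scipy's genextreme (note scipy's shape convention c = -xi); the parameter values stand in for the EMOS link to ensemble statistics, which is not reproduced here.

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative EMOS-type parameters (in practice these are linked to ensemble statistics)
mu, sigma, xi = 1.5, 2.0, 0.2          # location, scale, GEV shape
c = -xi                                 # scipy's shape-parameter convention

def censored_gev_cdf(y, mu, sigma, c):
    """Predictive CDF of precipitation: GEV left-censored at zero."""
    y = np.asarray(y, dtype=float)
    cdf = genextreme.cdf(y, c, loc=mu, scale=sigma)
    return np.where(y < 0.0, 0.0, cdf)   # all mass below zero is assigned to exactly zero

# Probability of no precipitation (the point mass at zero) and of exceeding 5 mm
p_dry = censored_gev_cdf(0.0, mu, sigma, c)
p_exceed_5 = 1.0 - censored_gev_cdf(5.0, mu, sigma, c)
print(f"P(zero precipitation) = {p_dry:.3f}, P(> 5 mm) = {p_exceed_5:.3f}")
```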

  9. Multiple commodities in statistical microeconomics: Model and market

    Science.gov (United States)

    Baaquie, Belal E.; Yu, Miao; Du, Xin

    2016-11-01

    A statistical generalization of microeconomics has been made in Baaquie (2013). In Baaquie et al. (2015), the market behavior of single commodities was analyzed and it was shown that market data provide strong support for the statistical microeconomic description of commodity prices. Here the case of multiple commodities is studied and a parsimonious generalization of the single-commodity model is made. Market data show that the generalization can accurately model the simultaneous correlation functions of up to four commodities. To accurately model five or more commodities, further terms have to be included in the model. This study shows that the statistical microeconomics approach is a comprehensive and complete formulation of microeconomics, which is independent of the mainstream formulation of microeconomics.

  10. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  11. A statistical model for the excitation of cavities through apertures

    CERN Document Server

    Gradoni, Gabriele; Anlage, Steven M; Ott, Edward

    2015-01-01

    In this paper, a statistical model for the coupling of electromagnetic radiation into enclosures through apertures is presented. The model gives a unified picture bridging deterministic theories of aperture radiation, and statistical models necessary for capturing the properties of irregular shaped enclosures. A Monte Carlo technique based on random matrix theory is used to predict and study the power transmitted through the aperture into the enclosure. Universal behavior of the net power entering the aperture is found. Results are of interest for predicting the coupling of external radiation through openings in irregular enclosures and reverberation chambers.

  12. General Linear Models: An Integrated Approach to Statistics

    OpenAIRE

    Andrew Faulkner; Sylvain Chartier

    2008-01-01

    Generally, in psychology, the various statistical analyses are taught independently from each other. As a consequence, students struggle to learn new statistical analyses in contexts that differ from their textbooks. This paper gives a short introduction to the general linear model (GLM), in which it is shown that ANOVA (one-way, factorial, repeated measure and analysis of covariance) is simply a multiple correlation/regression analysis (MCRA). Generalizations to other cases, such as multiv...

  13. Statistical model of the classification of shale in a hydrocyclone

    Energy Technology Data Exchange (ETDEWEB)

    Lopachenok, L.V.; Punin, A.E.; Belyanin, Yu.I.; Proskuryakov, V.A.

    1977-10-01

    The mathematical model obtained by experimental and statistical methods for the classification of shale in a hydrocyclone is adequate for a real industrial-scale process, as indicated by the statistical analysis carried out for it. Together with the material-balance relationships, it permits the calculation of the engineering parameters for any classification conditions within the region of the factor space investigated, as well as the search for the optimum conditions for the industrial realization of the process.

  14. Ferroelectric active models of ion channels in biomembranes.

    Science.gov (United States)

    Bystrov, V S; Lakhno, V D; Molchanov, M

    1994-06-21

    Ferroactive models of ion channels in the theory of biological membranes are presented. The main equations are derived and their possible solutions are shown. The estimates of some experimentally measured parameters are given. Possible physical consequences of the suggested models are listed and the possibility of their experimental finding is discussed. The functioning of the biomembrane's ion channel is qualitatively described on the basis of the suggested ferroactive models. The main directions and prospects for development of the ferroactive approach to the theory of biological membranes and their structures are indicated.

  15. Statistical Design Model (SDM) of satellite thermal control subsystem

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the satellite subsystem whose main task is to keep the satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying the satellite's operational requirements, and designing this subsystem is a part of satellite design. On the other hand, due to the lack of information provided by companies, designers still do not have a specific design process for it, although it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem by using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, and then we extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. Input parameters of the method are the mass, mission, and lifetime of the satellite. For this purpose, the thermal control subsystem is first introduced and the hardware used in this subsystem and its variants is investigated. In the next part, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The process of testing the accuracy and verifying the method uses a case study: by comparing the specifications of the thermal control subsystem of a fabricated satellite with the analysis results, the methodology in this paper was shown to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  16. Wireless Channel Characterization: Modeling the 5 GHz Microwave Landing System Extension Band for Future Airport Surface Communications

    Science.gov (United States)

    Matolak, D. W.; Apaza, Rafael; Foore, Lawrence R.

    2006-01-01

    We describe a recently completed wideband wireless channel characterization project for the 5 GHz Microwave Landing System (MLS) extension band, for airport surface areas. This work included mobile measurements at large and small airports, and fixed point-to-point measurements. Mobile measurements were made via transmission from the air traffic control tower (ATCT), or from an airport field site (AFS), to a receiving ground vehicle on the airport surface. The point-to-point measurements were between ATCT and AFSs. Detailed statistical channel models were developed from all these measurements. Measured quantities include propagation path loss and power delay profiles, from which we obtain delay spreads, frequency domain correlation (coherence bandwidths), fading amplitude statistics, and channel parameter correlations. In this paper we review the project motivation, measurement coordination, and illustrate measurement results. Example channel modeling results for several propagation conditions are also provided, highlighting new findings.

  17. Analysis and Approximation of Statistical Distribution of Eigenvalues in i. i. d. MIMO Channels under Rayleigh Fading

    Science.gov (United States)

    Taniguchi, Tetsuki; Sha, Shen; Karasawa, Yoshio

    In multiple input multiple output (MIMO) communication systems, the eigenvalues of channel correlation matrices play an essential role in performance analysis, and the investigation of their behavior in time-variant environments governed by a given statistics is an important problem. This paper first gives theoretical expressions for the marginal distributions of all the ordered eigenvalues of MIMO correlation matrices under an i. i. d. (independent and identically distributed) Rayleigh fading environment. Then, an approximation method for those marginal distributions is presented: we show that the theory of SIMO space diversity using maximal ratio combining (MRC) is applicable to the approximation of the statistical distributions of all eigenvalues in MIMO systems with the same number of diversity branches. The derived approximation has a monomial form suitable for the calculation of various performance measures utilized in MIMO systems. Through computer simulations, the effectiveness of the proposed method is demonstrated.
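
    A short Monte Carlo sketch of the quantity being analysed, the ordered eigenvalues of the channel correlation (Wishart) matrix H^H H under i.i.d. Rayleigh fading; it only produces empirical marginal statistics, not the theoretical or MRC-based approximate distributions of the paper, and the antenna counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n_tx, n_rx, n_trials = 4, 4, 20000

eigs = np.empty((n_trials, min(n_tx, n_rx)))
for k in range(n_trials):
    # i.i.d. Rayleigh fading: circularly-symmetric complex Gaussian entries, unit variance
    H = (rng.standard_normal((n_rx, n_tx)) + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
    eigs[k] = np.sort(np.linalg.eigvalsh(H.conj().T @ H))[::-1]  # ordered eigenvalues

# Empirical marginal statistics of each ordered eigenvalue
for i, (m, s) in enumerate(zip(eigs.mean(axis=0), eigs.std(axis=0)), start=1):
    print(f"lambda_{i}: mean = {m:.2f}, std = {s:.2f}")
```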

  18. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of the buildings situated in the area under study. A detailed description of the model is given together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach shows the main difference from previous models.

  19. Monte Carlo Modeling of Crystal Channeling at High Energies

    CERN Document Server

    Schoofs, Philippe; Cerutti, Francesco

    Charged particles entering a crystal close to some preferred direction can be trapped in the electromagnetic potential well existing between consecutive planes or strings of atoms. This channeling effect can be used to extract beam particles if the crystal is bent beforehand. Crystal channeling is becoming a reliable and efficient technique for collimating beams and removing halo particles. At CERN, the installation of silicon crystals in the LHC is under scrutiny by the UA9 collaboration with the goal of investigating if they are a viable option for the collimation system upgrade. This thesis describes a new Monte Carlo model of planar channeling which has been developed from scratch in order to be implemented in the FLUKA code simulating particle transport and interactions. Crystal channels are described through the concept of continuous potential taking into account thermal motion of the lattice atoms and using Moliere screening function. The energy of the particle transverse motion determines whether or n...

  20. Statistical Modeling for Wind-Temperature Meteorological Elements in Troposphere

    CERN Document Server

    Virtser, A; Golbraikh, E

    2010-01-01

    A comprehensive statistical model for vertical profiles of the horizontal wind and temperature throughout the troposphere is presented. The model is based on radiosonde measurements of wind and temperature during several years. The profiles measured under quite different atmospheric conditions exhibit qualitative similarity, and a proper choice of the reference scales for the wind, temperature and altitude levels allows the measurement data to be considered as realizations of a random process with universal characteristics: means, the basic functions and the parameters of standard distributions for the transform coefficients of the Principal Component Analysis. The features of the atmospheric conditions are described by statistical characteristics of the wind-temperature ensemble of dimensional reference scales. The high effectiveness of the proposed approach is provided by the similarity of wind-temperature vertical profiles, which allows the statistical modeling to be carried out in the low-dimension space of the dimensional ...
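
    A compact sketch of the modelling recipe, Principal Component Analysis of normalized vertical profiles whose leading transform coefficients can then be fitted with standard distributions; the profiles below are synthetic stand-ins for radiosonde data, not the measurements used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for normalized wind (or temperature) profiles: n_profiles x n_levels
n_profiles, n_levels = 500, 40
z = np.linspace(0.0, 1.0, n_levels)
profiles = (np.outer(rng.normal(10, 2, n_profiles), np.sin(np.pi * z))
            + np.outer(rng.normal(0, 1, n_profiles), z)
            + 0.3 * rng.standard_normal((n_profiles, n_levels)))

# Principal Component Analysis via SVD of the anomaly matrix
mean_profile = profiles.mean(axis=0)
anomalies = profiles - mean_profile
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
coeffs = U * s                      # PCA transform coefficients (scores), one row per profile
explained = s**2 / np.sum(s**2)

print("variance explained by first 3 modes:", np.round(explained[:3], 3))
# New random profiles can then be generated by sampling the leading coefficients
# from distributions fitted to 'coeffs' and adding the mean profile back.
```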

  1. Computationally efficient statistical differential equation modeling using homogenization

    Science.gov (United States)

    Hooten, Mevin B.; Garlick, Martha J.; Powell, James A.

    2013-01-01

    Statistical models using partial differential equations (PDEs) to describe dynamically evolving natural systems are appearing in the scientific literature with some regularity in recent years. Often such studies seek to characterize the dynamics of temporal or spatio-temporal phenomena such as invasive species, consumer-resource interactions, community evolution, and resource selection. Specifically, in the spatial setting, data are often available at varying spatial and temporal scales. Additionally, the necessary numerical integration of a PDE may be computationally infeasible over the spatial support of interest. We present an approach to impose computationally advantageous changes of support in statistical implementations of PDE models and demonstrate its utility through simulation using a form of PDE known as “ecological diffusion.” We also apply a statistical ecological diffusion model to a data set involving the spread of mountain pine beetle (Dendroctonus ponderosae) in Idaho, USA.

  2. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
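
    A minimal sketch of the validation idea on synthetic data: an L1-penalized logistic regression (scikit-learn's LogisticRegression standing in for LASSO) is assessed by cross-validated AUC, and a label-permutation test provides the null distribution; the patient numbers, features and the use of AUC as the permuted statistic are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic dose-volume features and binary complication outcomes (invented data)
n_patients, n_features = 120, 20
X = rng.standard_normal((n_patients, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(n_patients) > 0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)

# Cross-validated AUC of the NTCP-style model on the real labels
auc_real = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()

# Permutation test: refit on shuffled labels to build the null distribution of AUC
n_perm = 200
auc_null = np.array([cross_val_score(model, X, rng.permutation(y), cv=5,
                                     scoring="roc_auc").mean() for _ in range(n_perm)])
p_value = np.mean(auc_null >= auc_real)
print(f"AUC = {auc_real:.3f}, permutation p-value = {p_value:.3f}")
```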

  3. Analyzing sickness absence with statistical models for survival data

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars;

    2007-01-01

    absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study...... involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random effects proportional hazards models. Data from a study of the relation...... between the psychosocial work environment and sickness absence were used to illustrate the results. RESULTS: Standard methods were found to underestimate true effect sizes by approximately one-tenth [method i] and one-third [method ii] and to have lower statistical power than frailty models. CONCLUSIONS...

  4. Modern statistical models for forensic fingerprint examinations: a critical review.

    Science.gov (United States)

    Abraham, Joshua; Champod, Christophe; Lennard, Chris; Roux, Claude

    2013-10-10

    Over the last decade, the development of statistical models in support of forensic fingerprint identification has been the subject of increasing research attention, spurred on recently by commentators who claim that the scientific basis for fingerprint identification has not been adequately demonstrated. Such models are increasingly seen as useful tools in support of the fingerprint identification process within or in addition to the ACE-V framework. This paper provides a critical review of recent statistical models from both a practical and theoretical perspective. This includes analysis of models of two different methodologies: Probability of Random Correspondence (PRC) models that focus on calculating probabilities of the occurrence of fingerprint configurations for a given population, and Likelihood Ratio (LR) models which use analysis of corresponding features of fingerprints to derive a likelihood value representing the evidential weighting for a potential source.

  5. LETTER: Statistical physics of the Schelling model of segregation

    Science.gov (United States)

    Dall'Asta, L.; Castellano, C.; Marsili, M.

    2008-07-01

    We investigate the static and dynamic properties of a celebrated model of social segregation, providing a complete explanation of the mechanisms leading to segregation both in one- and two-dimensional systems. Standard statistical physics methods shed light on the rich phenomenology of this simple model, exhibiting static phase transitions typical of kinetic constrained models, non-trivial coarsening like in driven-particle systems and percolation-related phenomena.

  6. Understanding and forecasting polar stratospheric variability with statistical models

    Directory of Open Access Journals (Sweden)

    C. Blume

    2012-02-01

    Full Text Available The variability of the north-polar stratospheric vortex is a prominent aspect of the middle atmosphere. This work investigates a wide class of statistical models with respect to their ability to model geopotential and temperature anomalies, representing variability in the polar stratosphere. Four partly nonstationary, nonlinear models are assessed: linear discriminant analysis (LDA); a cluster method based on finite elements (FEM-VARX); a neural network, namely a multi-layer perceptron (MLP); and support vector regression (SVR). These methods model time series by incorporating all significant external factors simultaneously, including ENSO, QBO, the solar cycle, volcanoes, etc., to then quantify their statistical importance. We show that variability in reanalysis data from 1980 to 2005 is successfully modeled. FEM-VARX and MLP even satisfactorily forecast the period from 2005 to 2011. However, internal variability remains that cannot be statistically forecasted, such as the unexpected major warming in January 2009. Finally, the statistical model with the best generalization performance is used to predict a vortex breakdown in late January, early February 2012.

  7. Blind equalization of underwater acoustic channels using implicit higher-order statistics

    NARCIS (Netherlands)

    Blom, Koen C.H.; Dol, Henry S.; Kokkeler, André B.J.; Smit, Gerard J.M.

    2016-01-01

    In order to reduce the length of transmission time slots and energy consumption of underwater modems, this work focuses on equalization without the need for training sequences. This type of equalization is known as blind equalization. A blind equalizer cascade based on higher-order statistics is pre

  8. The Statistical Modeling of the Trends Concerning the Romanian Population

    Directory of Open Access Journals (Sweden)

    Gabriela OPAIT

    2014-11-01

    Full Text Available This paper presents the statistical modeling of the resident population in Romania, that is, the total Romanian population, by means of the „Least Squares Method". Any country develops through an increase of its population, and hence of its workforce, which is a factor influencing the growth of the Gross Domestic Product (G.D.P.). The „Least Squares Method" is a statistical technique for determining the best-fit trend line of a model.
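
    As a worked illustration of the Least Squares Method used for the trend line, a short sketch fitting a linear trend to a yearly population series; the population figures below are placeholders, not the Romanian statistics analysed in the paper.

```python
import numpy as np

# Placeholder yearly resident-population figures (millions), not the actual data
years = np.arange(2005, 2014)
population = np.array([21.3, 21.2, 21.1, 20.9, 20.8, 20.2, 20.1, 20.0, 19.9])

# Least Squares fit of a linear trend y = a*t + b
a, b = np.polyfit(years, population, deg=1)
trend = a * years + b
residual_ss = np.sum((population - trend) ** 2)

print(f"slope = {a:.3f} million/year, intercept = {b:.1f}, SSE = {residual_ss:.4f}")
print("forecast 2014:", round(a * 2014 + b, 2), "million")
```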

  9. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    Science.gov (United States)

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models, conditioned on pose, for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.

  10. Spectrum Sensing Based on Censored Observations in Time-Varying Channels using AR-1 Model

    Directory of Open Access Journals (Sweden)

    Dhaval K Patel

    2015-01-01

    Full Text Available Non-parametric sensing algorithms are preferred in cognitive radio. In this paper, a spectrum sensing method based on censored observations is proposed. We evaluate the performance of the Censored Anderson-Darling (CAD) sensing method in a time-varying, flat-fading channel using Monte Carlo simulations, and show its performance in terms of the receiver operating characteristic (ROC). The considered channel is modeled by Gaussian variables and characterized by a first-order autoregressive process ($AR1$). It is shown that the proposed method outperforms prevailing techniques such as Energy detection (ED) sensing and Order-statistic (OS) based sensing in a time-varying channel at low signal-to-noise ratio.
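
    A small sketch of the channel assumption underlying the comparison: a time-varying flat-fading gain modelled as a first-order autoregressive (AR-1) complex Gaussian process, together with a plain energy statistic over the received samples; it does not implement the censored Anderson-Darling test itself, and the correlation coefficient and SNR are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def ar1_rayleigh_channel(n, rho):
    """Complex Gaussian channel gain h[k] = rho*h[k-1] + sqrt(1-rho^2)*w[k] (AR-1 model)."""
    w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    h = np.empty(n, dtype=complex)
    h[0] = w[0]
    for k in range(1, n):
        h[k] = rho * h[k - 1] + np.sqrt(1.0 - rho**2) * w[k]
    return h

n, rho, snr_db = 1000, 0.99, -5.0
s = np.sign(rng.standard_normal(n))                    # BPSK primary-user signal
h = ar1_rayleigh_channel(n, rho)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = 10 ** (snr_db / 20.0) * h * s + noise              # received samples under H1

energy_statistic = np.mean(np.abs(x) ** 2)             # classical energy-detection statistic
print(f"energy statistic = {energy_statistic:.3f} (compare against a noise-only threshold)")
```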

  11. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  12. Schedulability of Herschel revisited using statistical model checking

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2015-01-01

    Schedulability analysis is a main concern for several embedded applications due to their safety-critical nature. The classical method of response time analysis provides an efficient technique used in industrial practice. However, the method is based on conservative assumptions related to execution...... to obtain some guarantee on the (un)schedulability of the model even in the presence of undecidability. Two methods are considered: symbolic model checking and statistical model checking. Since the model uses stop-watches, the reachability problem becomes undecidable so we are using an over......-approximation technique. We can safely conclude that the system is schedulable for varying values of BCET. For the cases where deadlines are violated, we use polyhedra to try to confirm the witnesses. Our alternative method to confirm non-schedulability uses statistical model-checking (SMC) to generate counter...

  13. Application of Bayesian Hierarchical Prior Modeling to Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Shutin, Dmitriy

    2012-01-01

    Existing methods for sparse channel estimation typically provide an estimate computed as the solution maximizing an objective function defined as the sum of the log-likelihood function and a penalization term proportional to the l1-norm of the parameter of interest. However, other penalization... The estimators result as an application of the variational message-passing algorithm on the factor graph representing the signal model extended with the hierarchical prior models. Numerical results demonstrate the superior performance of our channel estimators as compared to traditional and state-of-the-art sparse methods.
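
    For contrast, a compact sketch of the l1-penalized baseline mentioned above (not the hierarchical-prior, message-passing estimator of the paper), posing sparse channel estimation as a LASSO problem on a synthetic real-valued sparse channel; pilot length, sparsity and regularization weight are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)

# Synthetic sparse channel: few non-zero taps out of many
n_taps, n_pilots, n_active = 100, 60, 5
h = np.zeros(n_taps)
h[rng.choice(n_taps, n_active, replace=False)] = rng.standard_normal(n_active)

# Linear observation model y = A h + noise (A plays the role of the pilot/dictionary matrix)
A = rng.standard_normal((n_pilots, n_taps)) / np.sqrt(n_pilots)
y = A @ h + 0.01 * rng.standard_normal(n_pilots)

# l1-penalized estimate (objective = least squares + alpha * ||h||_1)
lasso = Lasso(alpha=0.005, max_iter=10000)
h_hat = lasso.fit(A, y).coef_

print("true support:     ", np.flatnonzero(h))
print("estimated support:", np.flatnonzero(np.abs(h_hat) > 1e-3))
```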

  14. Modeling and equalization of nonlinear bandlimited satellite channels

    Science.gov (United States)

    Konstantinides, K.; Yao, K.

    1986-01-01

    The problem of modeling and equalization of a nonlinear satellite channel is considered. The channel is assumed to be bandlimited and exhibits both amplitude and phase nonlinearities. A discrete time satellite link is modeled under both uplink and downlink white Gaussian noise. Under conditions of practical interest, a simple and computationally efficient design technique for the minimum mean square error linear equalizer is presented. The bit error probability and some numerical results for a binary phase shift keyed (BPSK) system demonstrate that the proposed equalization technique outperforms standard linear receiver structures.
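
    A brief sketch of a minimum mean-square-error linear equalizer for a known discrete-time channel with additive noise, in the spirit of the design problem described above; the channel taps, equalizer length and noise variance are illustrative, and the satellite amplitude/phase nonlinearity is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(6)

# Discrete-time channel impulse response (illustrative ISI channel) and noise variance
h = np.array([0.8, 0.45, -0.2])
noise_var = 0.01
N = 11                      # equalizer length
L = len(h)
d = (N + L - 1) // 2        # decision delay

# Channel convolution matrix: r_k = H @ s_k + n_k, with r_k the equalizer input window
H = np.zeros((N, N + L - 1))
for i in range(N):
    H[i, i:i + L] = h

# MMSE solution (unit-power BPSK symbols): w = (H H^T + sigma^2 I)^(-1) H e_d
w = np.linalg.solve(H @ H.T + noise_var * np.eye(N), H[:, d])

# Quick check on a random BPSK burst
s = np.sign(rng.standard_normal(2000))
r = np.convolve(s, h)[:len(s)] + np.sqrt(noise_var) * rng.standard_normal(len(s))
s_hat = np.sign(np.convolve(r, w)[d:d + len(s)])
print("bit errors:", np.sum(s_hat != s))
```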

  15. Space-time clutter model for airborne bistatic radar with non-Gaussian statistics

    Institute of Scientific and Technical Information of China (English)

    Duan Rui; Wang Xuegang; Yiang Chaoshu; Chen Zhuming

    2009-01-01

    To validate the potential space-time adaptive processing (STAP) algorithms for airborne bistatic radar clutter suppression under nonstationary and non-Gaussian clutter environments, a statistically non-Gaussian, space-time clutter model for varying bistatic geometrical scenarios is presented. The model captures the range dependency of the bistatic clutter spectrum and the clutter power variation across range-angle cells. To do so, a new approach to coordinate-system conversion is introduced to formulate the bistatic geometrical model, and a bistatic non-Gaussian amplitude clutter representation method based on a compound model is presented. The veracity of the geometrical model is validated using the bistatic configuration parameters of the multi-channel airborne radar measurement (MCARM) experiment. Simulation results show that the proposed model can accurately shape the space-time clutter spectrum associated with a specific airborne bistatic radar scenario and can characterize the heterogeneity of the clutter amplitude distribution in practical clutter environments.

  16. A Review of Modeling Bioelectrochemical Systems: Engineering and Statistical Aspects

    Directory of Open Access Journals (Sweden)

    Shuai Luo

    2016-02-01

    Full Text Available Bioelectrochemical systems (BES) are promising technologies to convert organic compounds in wastewater to electrical energy through a series of complex physical-chemical, biological and electrochemical processes. Representative BES such as microbial fuel cells (MFCs) have been studied and advanced for energy recovery. Substantial experimental and modeling efforts have been made to investigate the processes involved in electricity generation toward the improvement of BES performance for practical applications. However, there are many parameters that can potentially affect these processes, thereby making the optimization of system performance hard to achieve. Mathematical models, including engineering models and statistical models, are powerful tools to help understand the interactions among the parameters in BES and to optimize BES configuration/operation. This review paper aims to introduce and discuss recent developments in BES modeling from engineering and statistical aspects, including analysis of the model structures, description of application cases and sensitivity analysis of various parameters. It is expected to serve as a compass for integrating the engineering and statistical modeling strategies to improve model accuracy for BES development.

  17. A simple statistical signal loss model for deep underground garage

    DEFF Research Database (Denmark)

    Nguyen, Huan Cong; Gimenez, Lucas Chavarria; Kovacs, Istvan;

    2016-01-01

    In this paper we address the channel modeling aspects of a deep-indoor scenario with extreme coverage conditions in terms of signal losses, namely underground garage areas. We provide an in-depth analysis in terms of path loss (gain) and large-scale signal shadowing, and propose a simple...... propagation model which can be used to predict cellular signal levels in similar deep-indoor scenarios. The proposed frequency-independent floor attenuation factor (FAF) is shown to be in the range of 5.2 dB per meter of depth.
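
    A tiny sketch of how such a depth-dependent attenuation factor would enter a link-budget style prediction; the transmit power, reference path loss and depths below are invented, and only the 5.2 dB-per-meter figure comes from the abstract.

```python
# Simple deep-indoor signal prediction with a depth-dependent attenuation factor.
# Reference outdoor path loss and depths are invented; 5.2 dB/m is the reported FAF.

FAF_DB_PER_METER = 5.2

def garage_path_loss(outdoor_path_loss_db: float, depth_m: float) -> float:
    """Path loss at a given depth below ground, adding the floor attenuation factor."""
    return outdoor_path_loss_db + FAF_DB_PER_METER * depth_m

tx_power_dbm = 43.0            # hypothetical macro-cell transmit power
outdoor_pl_db = 110.0          # hypothetical path loss at the garage entrance

for depth in (0.0, 3.0, 6.0, 9.0):
    rx_dbm = tx_power_dbm - garage_path_loss(outdoor_pl_db, depth)
    print(f"depth {depth:4.1f} m: predicted RX level {rx_dbm:6.1f} dBm")
```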

  18. Estimation of turbulent channel flow based on the wall measurement with a statistical approach

    Science.gov (United States)

    Hasegawa, Yosuke; Suzuki, Takao

    2016-11-01

    A turbulent channel flow at Re_tau = 100 with periodic boundary conditions is estimated with linear stochastic estimation based only on wall measurements, i.e. the shear stress in the streamwise and spanwise directions as well as the pressure over the entire wavenumbers. The results reveal that instantaneous measurement on the wall governs the success of the estimation only in the near-wall region. We feed the velocity components from the linear stochastic estimation via the body-force term into the Navier-Stokes system; however, the estimation slightly improves in the log layer, indicating some benefit of involving a dynamical system but over-suppression of turbulent kinetic energy beyond the viscous sublayer by the linear stochastic estimation. Motions inaccurately estimated in the buffer layer prevent further reconstruction toward the centerline even if we relax the feedback forcing and let the flow evolve nonlinearly through the estimator. We also discuss the inherent limitation of turbulent flow estimation based on wall measurements.
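
    A toy sketch of the linear-stochastic-estimation step itself: a linear least-squares map from wall observables (two shear-stress components and pressure) to the velocity at one wall distance, trained on synthetic data; it illustrates only the regression machinery, not the DNS or the body-force feedback described above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in data: wall observables (streamwise/spanwise shear, pressure)
# and the "true" streamwise velocity at some wall-normal location.
n_snapshots = 5000
wall_obs = rng.standard_normal((n_snapshots, 3))
true_velocity = (1.2 * wall_obs[:, 0] - 0.3 * wall_obs[:, 1]
                 + 0.1 * wall_obs[:, 2] + 0.2 * rng.standard_normal(n_snapshots))

# Linear stochastic estimation: least-squares estimate of the linear map
coeffs, *_ = np.linalg.lstsq(wall_obs, true_velocity, rcond=None)
estimate = wall_obs @ coeffs

corr = np.corrcoef(estimate, true_velocity)[0, 1]
print("LSE coefficients:", np.round(coeffs, 3), " correlation:", round(corr, 3))
```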

  19. A Mathematical Model of Membrane Gas Separation with Energy Transfer by Molecules of Gas Flowing in a Channel to Molecules Penetrating this Channel from the Adjacent Channel

    Directory of Open Access Journals (Sweden)

    Szwast Maciej

    2015-06-01

    Full Text Available The paper presents the mathematical modelling of selected isothermal separation processes of gaseous mixtures taking place in plants using membranes, in particular nonporous polymer membranes. The modelling concerns membrane modules consisting of two channels - the feeding and the permeate channels. Different shapes of the channel cross-sections were taken into account. Consideration was given to co-current and counter-current flows of the feeding and permeate streams, with the permeate flowing together with an inert sweep gas. In the proposed mathematical model, the change of gas pressure along the length of the flow channels is the result of both the pressure drop connected with flow resistance and the energy transfer by molecules of gas flowing in a given channel to molecules which penetrate this channel from the adjacent channel. The literature on membrane technology takes into account only the pressure drop connected with flow resistance. Consideration of the energy transfer by molecules of gas flowing in a given channel to molecules which penetrate this channel from the adjacent channel constitutes the essential novelty of the current study. The paper also presents results of calculations obtained by means of a computer program which implements the equations of the derived model. Physicochemical data concerning separation of the CO2/CH4 mixture with He as the sweep gas and data concerning the properties of a PDMS membrane were assumed for the calculations.

  20. Modeling magnitude statistics of multilook SAR interferograms by generalizing G distributions

    Science.gov (United States)

    Gao, Gui; Shi, Gongtao

    2015-06-01

    Statistical analysis of multilook interferograms is a foundational issue in sensor signal processing for multiple-channel synthetic aperture radar (SAR), such as slow ground moving target indication (GMTI) in along-track interferometric (ATI) SAR. By an approximate derivation of the product of two modified Bessel functions, we propose in this paper a distribution (denoted simply as Γ_In) to model the interferometric magnitude of homogeneous clutter, and analyze the capability of approximation using Γ_In according to numerical calculations. Following this, under the frame of the product model and by utilizing Γ_In, we analytically provide two distributions, K_In and G_In^0, corresponding to heterogeneous and extremely heterogeneous terrain clutter, respectively. We show that the proposed Γ_In, K_In and G_In^0 are the multi-channel generalizations of the well-known Γ, K and G^0 distributions, respectively, which are special cases of the G distribution for single-channel SAR images. Finally, the estimators of the proposed models are obtained by applying the Method of Log Cumulants (MoLC), which can accurately calculate the contained parameters. Experiments performed on the National Aeronautics and Space Administration Jet Propulsion Laboratory's (NASA/JPL) AirSAR images, using the Kullback-Leibler (KL) divergence as a similarity measurement, verified the performance of the proposed models and estimators.
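
    To make the parameter-estimation step concrete, a small sketch of the Method of Log Cumulants for the single-channel Gamma case (the multi-channel estimators of the paper follow the same recipe with additional log-cumulants): the second log-cumulant of a Gamma intensity equals the trigamma function of the number of looks, which is inverted numerically.

```python
import numpy as np
from scipy.special import digamma, polygamma
from scipy.optimize import brentq

rng = np.random.default_rng(8)

# Synthetic multilook intensity data: Gamma-distributed with L looks and mean mu
L_true, mu_true, n = 4.0, 2.5, 200000
x = rng.gamma(shape=L_true, scale=mu_true / L_true, size=n)

# Empirical log-cumulants (first two): mean and variance of the log-data
log_x = np.log(x)
k1, k2 = log_x.mean(), log_x.var()

# MoLC for the Gamma model: k2 = trigamma(L), k1 = digamma(L) + ln(mu/L)
L_hat = brentq(lambda L: polygamma(1, L) - k2, 1e-3, 1e3)
mu_hat = L_hat * np.exp(k1 - digamma(L_hat))

print(f"estimated looks L = {L_hat:.2f} (true {L_true}), mean = {mu_hat:.2f} (true {mu_true})")
```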

  1. Nowcasting GDP Growth: statistical models versus professional analysts

    NARCIS (Netherlands)

    J.M. de Winter (Jasper)

    2016-01-01

    This thesis contains four chapters that cast new light on the ability of professional analysts and statistical models to assess economic growth in the current quarter (nowcast) and its development in the near future. This is not a trivial issue. An accurate assessment of the current

  2. Statistical Modeling for Radiation Hardness Assurance: Toward Bigger Data

    Science.gov (United States)

    Ladbury, R.; Campola, M. J.

    2015-01-01

    New approaches to statistical modeling in radiation hardness assurance are discussed. These approaches yield quantitative bounds on flight-part radiation performance even in the absence of conventional data sources. This allows the analyst to bound radiation risk at all stages and for all decisions in the RHA process. It also allows optimization of RHA procedures for the project's risk tolerance.

  3. Hypersonic Vehicle Tracking Based on Improved Current Statistical Model

    Directory of Open Access Journals (Sweden)

    He Guangjun

    2013-11-01

    Full Text Available A new method for tracking near space hypersonic vehicles is put forward. According to the characteristics of hypersonic vehicles, we improve the current statistical model through online identification of the maneuvering frequency. A Monte Carlo simulation is used to analyze the performance of the method. The results show that the improved method exhibits very good tracking performance in comparison with the old method.

  4. Environmental Concern and Sociodemographic Variables: A Study of Statistical Models

    Science.gov (United States)

    Xiao, Chenyang; McCright, Aaron M.

    2007-01-01

    Studies of the social bases of environmental concern over the past 30 years have produced somewhat inconsistent results regarding the effects of sociodemographic variables, such as gender, income, and place of residence. The authors argue that model specification errors resulting from violation of two statistical assumptions (interval-level…

  5. Modelling geographical graduate job search using circular statistics

    NARCIS (Netherlands)

    Faggian, Alessandra; Corcoran, Jonathan; McCann, Philip

    2013-01-01

    Theory suggests that the spatial patterns of migration flows are contingent both on individual human capital and underlying geographical structures. Here we demonstrate these features by using circular statistics in an econometric modelling framework applied to the flows of UK university graduates.

  6. Octet magnetic Moments and their sum rules in statistical model

    CERN Document Server

    Batra, M

    2013-01-01

    The statistical model is implemented to find the magnetic moments of all octet baryons. Well-known sum rules such as the GMO and CG sum rules have been checked in order to verify the consistency of our approach. The small discrepancy between the results suggests the importance of SU(3) symmetry breaking.

  7. Statistical sampling and modelling for cork oak and eucalyptus stands

    NARCIS (Netherlands)

    Paulo, M.J.

    2002-01-01

    This thesis focuses on the use of modern statistical methods to solve problems on sampling, optimal cutting time and agricultural modelling in Portuguese cork oak and eucalyptus stands. The results are contained in five chapters that have been submitted for publication as scientific manuscripts. The

  8. Complex Wall Boundary Conditions for Modeling Combustion in Catalytic Channels

    Science.gov (United States)

    Zhu, Huayang; Jackson, Gregory

    2000-11-01

    Monolith catalytic reactors for exothermic oxidation are being used in automobile exhaust clean-up and ultra-low emissions combustion systems. The reactors present a unique coupling between mass, heat, and momentum transport in a channel flow configuration. The use of porous catalytic coatings along the channel wall presents a complex boundary condition when modeled with the two-dimensional channel flow. This current work presents a 2-D transient model for predicting the performance of catalytic combustion systems for methane oxidation on Pd catalysts. The model solves the 2-D compressible transport equations for momentum, species, and energy, which are solved with a porous washcoat model for the wall boundary conditions. A time-splitting algorithm is used to separate the stiff chemical reactions from the convective/diffusive equations for the channel flow. A detailed surface chemistry mechanism is incorporated for the catalytic wall model and is used to predict transient ignition and steady-state conversion of CH4-air flows in the catalytic reactor.

  9. Applying the luminosity function statistics in the fireshell model

    Science.gov (United States)

    Rangel Lemos, L. J.; Bianco, C. L.; Ruffini, R.

    2015-12-01

    The luminosity function (LF) statistics applied to the data of BATSE, GBM/Fermi and BAT/Swift is the theme of this work. The LF is a strong statistical tool to extract useful information from astrophysical samples, and the key point of this statistical analysis is the detector sensitivity, for which we have performed a careful analysis. We applied the LF statistics to the three GRB classes predicted by the Fireshell model, producing predicted distributions of peak flux N(Fph pk), redshift N(z) and peak luminosity N(Lpk) for the three classes; we also used three GRB rates. We looked for differences among the distributions, and in fact we found them. We performed a comparison between the predicted and observed distributions (with and without redshifts), for which we had to build a list of 217 GRBs with known redshifts. Our goal is to turn GRBs into standard candles, and one alternative is to find a correlation between the isotropic luminosity and the Band peak spectral energy (Liso - Epk).

  10. Statistical mechanics models for motion and force planning

    Science.gov (United States)

    Rodriguez, G.

    1990-01-01

    The models of statistical mechanics provide an alternative to the methods of classical mechanics more traditionally used in robotics. They have the potential to: improve the analysis of object collisions; handle kinematic and dynamic contact interactions within the same framework; and reduce the need for perfect deterministic world model information. The statistical mechanics models characterize the state of the system as a probability density function (p.d.f.) whose time evolution is governed by a partial differential equation subject to boundary and initial conditions. The boundary conditions when rigid objects collide reflect the conservation of momentum. The models are being developed for embedding in remote semi-autonomous systems with a need to reason about and interact with a multi-object environment.

  11. Workshop on Model Uncertainty and its Statistical Implications

    CERN Document Server

    1988-01-01

    In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.

  12. Statistical models describing the energy signature of buildings

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Thavlov, Anders

    2010-01-01

    Approximately one third of the primary energy production in Denmark is used for heating in buildings. Therefore efforts to accurately describe and improve energy performance of the building mass are very important. For this purpose statistical models describing the energy signature of a building, i.......e. the heat dynamics of the building, have been developed. The models can be used to obtain rather detailed knowledge of the energy performance of the building and to optimize the control of the energy consumption for heating, which will be vital in conditions with increasing fluctuation of the energy supply...... or varying energy prices. The paper will give an overview of statistical methods and applied models based on experiments carried out in FlexHouse, which is an experimental building in SYSLAB, Risø DTU. The models are of different complexity and can provide estimates of physical quantities such as UA...

  13. An Order Statistics Approach to the Halo Model for Galaxies

    CERN Document Server

    Paul, Niladri; Sheth, Ravi K

    2016-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models -- one in which this luminosity function $p(L)$ is universal -- naturally produces a number of features associated with previous analyses based on the `central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function $p(L|m)$, is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-pre...
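
    A compact Monte Carlo sketch of the simplest order-statistics idea described above: draw N galaxy luminosities per group from a universal p(L) and identify the central with the brightest draw; a Schechter-like p(L) is sampled here as a Gamma distribution (valid for faint-end slope alpha > -1), and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Universal Schechter-like luminosity function p(L) ~ (L/Lstar)^alpha * exp(-L/Lstar),
# sampled as a Gamma distribution (valid for alpha > -1); values are illustrative.
alpha, Lstar = -0.5, 1.0

def sample_group(n_gal):
    """Draw n_gal luminosities from p(L); the central is the most luminous one."""
    L = rng.gamma(shape=alpha + 1.0, scale=Lstar, size=n_gal)
    L_sorted = np.sort(L)[::-1]
    return L_sorted[0], L_sorted[1:]          # central, satellites

# Mean central luminosity as a function of richness N (a proxy for halo mass)
for n_gal in (2, 5, 20, 100):
    centrals = [sample_group(n_gal)[0] for _ in range(20000)]
    print(f"N = {n_gal:4d}: <L_central> = {np.mean(centrals):.2f} Lstar")
```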

  14. Monte-Carlo simulation-based statistical modeling

    CERN Document Server

    Chen, John

    2017-01-01

    This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.

  15. Development of 3D statistical mandible models for cephalometric measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Goo; Yi, Won Jin; Hwang, Soon Jung; Choi, Soon Chul; Lee, Sam Sun; Heo, Min Suk; Huh, Kyung Hoe; Kim, Tae Il [School of Dentistry, Seoul National University, Seoul (Korea, Republic of); Hong, Helen; Yoo, Ji Hyun [Division of Multimedia Engineering, Seoul Women' s University, Seoul (Korea, Republic of)

    2012-09-15

    The aim of this study was to provide sex-matched three-dimensional (3D) statistical shape models of the mandible, which would provide cephalometric parameters for 3D treatment planning and cephalometric measurements in orthognathic surgery. The subjects used to create the 3D shape models of the mandible included 23 males and 23 females. The mandibles were segmented semi-automatically from 3D facial CT images. Each individual mandible shape was reconstructed as a 3D surface model, which was parameterized to establish correspondence between different individual surfaces. The principal component analysis (PCA) applied to all mandible shapes produced a mean model and characteristic models of variation. The cephalometric parameters were measured directly from the mean models to evaluate the 3D shape models. The means of the measured parameters were compared with those from other conventional studies. The male and female 3D statistical mean models were developed from 23 individual mandibles, respectively. The male and female characteristic shapes of variation produced by PCA showed a large variability included in the individual mandibles. The cephalometric measurements from the developed models were very close to those from some conventional studies. We described the construction of 3D mandibular shape models and presented the application of the 3D mandibular template in cephalometric measurements. Optimal reference models determined from variations produced by PCA could be used for craniofacial patients with various types of skeletal shape.

  16. Multiscale modeling of turbulent channel flow over porous walls

    Science.gov (United States)

    Yogaraj, Sudhakar; Lacis, Ugis; Bagheri, Shervin

    2016-11-01

    We perform direct numerical simulations of fully developed turbulent flow through a channel coated with a porous material. The Navier-Stokes equations governing the fluid domain and the Darcy equations of the porous medium are coupled using an iterative partitioned scheme. At the interface between the two media, boundary conditions derived using a multiscale homogenization approach are enforced. The main feature of this approach is that the anisotropic micro-structural pore features are directly taken into consideration to derive the constitutive coefficients of the porous media as well as of the interface. The focus of the present work is to study the influence of micro-structure pore geometry on the dynamics of turbulent flows. Detailed turbulence statistics and instantaneous flow fields are presented. For comparison, results for an impermeable channel flow are included. Supported by the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie Grant agreement No 708281.

  17. Development of 3D statistical mandible models for cephalometric measurements

    OpenAIRE

    2012-01-01

    Purpose The aim of this study was to provide sex-matched three-dimensional (3D) statistical shape models of the mandible, which would provide cephalometric parameters for 3D treatment planning and cephalometric measurements in orthognathic surgery. Materials and Methods The subjects used to create the 3D shape models of the mandible included 23 males and 23 females. The mandibles were segmented semi-automatically from 3D facial CT images. Each individual mandible shape was reconstructed as a ...

  18. Statistical multiscale image segmentation via Alpha-stable modeling

    OpenAIRE

    Wan, Tao; Canagarajah, CN; Achim, AM

    2007-01-01

    This paper presents a new statistical image segmentation algorithm, in which the texture features are modeled by symmetric alpha-stable (SαS) distributions. These features are efficiently combined with the dominant color feature to perform automatic segmentation. First, the image is roughly segmented into textured and nontextured regions using the dual-tree complex wavelet transform (DT-CWT) with the sub-band coefficients modeled as SαS random variables. A multiscale segmentation is ...

  19. Generalized statistical model for multicomponent adsorption equilibria on zeolites

    Energy Technology Data Exchange (ETDEWEB)

    Rota, R.; Gamba, G.; Paludetto, R.; Carra, S.; Morbidelli, M. (Dipartimento di Chimica Fisica Applicata, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (IT))

    1988-05-01

    The statistical thermodynamic approach to multicomponent adsorption equilibria on zeolites has been extended to nonideal systems, through the correction of cross coefficients characterizing the interaction between unlike molecules. Estimation of the model parameters requires experimental binary equilibrium data. Comparisons with the classical model based on adsorbed solution theory are reported for three nonideal ternary systems. The two approaches provide comparable results in the simulation of binary and ternary adsorption equilibrium data at constant temperature and pressure.

  20. Bregman divergence as general framework to estimate unnormalized statistical models

    CERN Document Server

    Gutmann, Michael

    2012-01-01

    We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to one, respectively. We prove that recent estimation methods such as noise-contrastive estimation, ratio matching, and score matching belong to the proposed framework, and explain their interconnection based on supervised learning. Further, we discuss the role of boosting in unsupervised learning.

  1. Statistical modelling of transcript profiles of differentially regulated genes

    Directory of Open Access Journals (Sweden)

    Sergeant Martin J

    2008-07-01

    Full Text Available Abstract Background The vast quantities of gene expression profiling data produced in microarray studies, and the more precise quantitative PCR, are often not statistically analysed to their full potential. Previous studies have summarised gene expression profiles using simple descriptive statistics, basic analysis of variance (ANOVA) and the clustering of genes based on simple models fitted to their expression profiles over time. We report the novel application of statistical non-linear regression modelling techniques to describe the shapes of expression profiles for the fungus Agaricus bisporus, quantified by PCR, and for E. coli and Rattus norvegicus, using microarray technology. The use of parametric non-linear regression models provides a more precise description of expression profiles, reducing the "noise" of the raw data to produce a clear "signal" given by the fitted curve, and describing each profile with a small number of biologically interpretable parameters. This approach then allows the direct comparison and clustering of the shapes of response patterns between genes and potentially enables a greater exploration and interpretation of the biological processes driving gene expression. Results Quantitative reverse transcriptase PCR-derived time-course data of genes were modelled. "Split-line" or "broken-stick" regression identified the initial time of gene up-regulation, enabling the classification of genes into those with primary and secondary responses. Five-day profiles were modelled using the biologically-oriented, critical exponential curve, y(t) = A + (B + Ct)R^t + ε. This non-linear regression approach allowed the expression patterns for different genes to be compared in terms of curve shape, time of maximal transcript level and the decline and asymptotic response levels. Three distinct regulatory patterns were identified for the five genes studied. Applying the regression modelling approach to microarray-derived time course data
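
    A minimal sketch of fitting the critical exponential curve y(t) = A + (B + Ct)R^t described above to a time-course profile, assuming scipy is available; the data, starting values and parameter values below are synthetic placeholders, not the paper's fits.

```python
# Sketch: fitting the critical exponential curve y(t) = A + (B + C*t) * R**t
# to a gene-expression time course with scipy. The data are synthetic
# placeholders; parameter names follow the abstract, everything else is assumed.
import numpy as np
from scipy.optimize import curve_fit

def critical_exponential(t, A, B, C, R):
    """Critical exponential curve: asymptote A, shape set by B and C, rate R (0 < R < 1)."""
    return A + (B + C * t) * R**t

# Synthetic five-day profile sampled twice a day (illustrative only)
t = np.linspace(0, 5, 11)
rng = np.random.default_rng(0)
y = critical_exponential(t, A=2.0, B=-1.5, C=3.0, R=0.5) + rng.normal(0, 0.05, t.size)

# Fit and report the biologically interpretable parameters
popt, pcov = curve_fit(critical_exponential, t, y, p0=[1.0, 0.0, 1.0, 0.6])
A, B, C, R = popt
t_max = t[np.argmax(critical_exponential(t, *popt))]   # time of maximal transcript level
print(f"A={A:.2f}, B={B:.2f}, C={C:.2f}, R={R:.2f}, peak near t={t_max:.1f} days")
```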

  2. A channel distortion model for video over lossy packet networks

    Institute of Scientific and Technical Information of China (English)

    CHENG Jian-xin; GAO Zhen-ming; ZHANG Zhi-chao

    2006-01-01

    Error-resilient video communication over lossy packet networks is often designed and operated based on models for the effect of losses on the reconstructed video quality. This paper analyzes the channel distortion for video over lossy packet networks and proposes a new model that, compared to previous models, more accurately estimates the expected mean-squared error distortion for different packet loss patterns by accounting for inter-frame error propagation and the correlation between error frames. The accuracy of the proposed model is validated with JVT/H.264 encoded standard test sequences and previous frame concealment, where the proposed model provides an obvious accuracy gain over previous models.

  3. Advances on statistical/thermodynamical models for unpolarized structure functions

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos; Tomio, Lauro

    2013-03-01

    During the eighties and nineties, many statistical/thermodynamical models were proposed to describe the nucleons' structure functions and the distribution of the quarks in the hadrons. Most of these models describe the quarks and gluons inside the nucleon as a Fermi gas and a Bose gas, respectively, confined in an MIT bag [1] with continuous energy levels. Other models consider a discrete spectrum. Some interesting features of the nucleons are obtained by these models, like the sea asymmetries ¯d/¯u and ¯d-¯u.

  4. STATISTICAL MODELS FOR SEMI-RIGID NEMATIC POLYMERS

    Institute of Scientific and Technical Information of China (English)

    WANG Xinjiu

    1995-01-01

    Semi-rigid liquid crystal polymers are a class of liquid crystal polymers distinct from the long rigid-rod liquid crystal polymers to which the well-known Onsager and Flory theories apply. In this paper, three statistical models for the semi-rigid nematic polymer are addressed: the elastically jointed rod model, the worm-like chain model, and the non-homogeneous chain model. The nematic-isotropic transition temperature was examined, and the pseudo-second transition temperature is expressed analytically. Comparisons with experiments were made and good agreement was found.

  5. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the method for calculating information gain in stochastic processes presented by Vere-Jones, the relation between information gain and probability gain, which is commonly used in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, a prediction method for the stress release model is obtained, based on inverse-function simulation of random variables.

  6. General Linear Models: An Integrated Approach to Statistics

    Directory of Open Access Journals (Sweden)

    Andrew Faulkner

    2008-09-01

    Full Text Available Generally, in psychology, the various statistical analyses are taught independently from each other. As a consequence, students struggle to learn new statistical analyses in contexts that differ from their textbooks. This paper gives a short introduction to the general linear model (GLM), in which it is shown that ANOVA (one-way, factorial, repeated measures) and analysis of covariance are simply special cases of multiple correlation/regression analysis (MCRA). Generalizations to other cases, such as multivariate and nonlinear analysis, are also discussed. It can easily be shown that every popular linear analysis can be derived from understanding MCRA.
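
    A small sketch of the GLM view summarised above: a one-way ANOVA expressed as a dummy-coded multiple regression, with the usual F statistic obtained by comparing the full model to the intercept-only model. The group means and sample sizes are illustrative assumptions.

```python
# Sketch: one-way ANOVA as a multiple regression (the GLM perspective).
# All data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
groups = np.repeat([0, 1, 2], 20)                      # three groups, n = 20 each
y = np.concatenate([rng.normal(m, 1.0, 20) for m in (0.0, 0.5, 1.2)])

# Design matrix: intercept + two dummy variables (group 0 is the reference)
X = np.column_stack([np.ones_like(y), groups == 1, groups == 2]).astype(float)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
ss_res_full = np.sum((y - X @ beta) ** 2)              # residual SS, full model
ss_res_null = np.sum((y - y.mean()) ** 2)              # residual SS, intercept only

df_model, df_resid = X.shape[1] - 1, y.size - X.shape[1]
F = ((ss_res_null - ss_res_full) / df_model) / (ss_res_full / df_resid)
print(f"F({df_model}, {df_resid}) = {F:.2f}")          # identical to the one-way ANOVA F
```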

  7. Level statistics of a pseudo-Hermitian Dicke model.

    Science.gov (United States)

    Deguchi, Tetsuo; Ghosh, Pijush K; Kudo, Kazue

    2009-08-01

    A non-Hermitian operator that is related to its adjoint through a similarity transformation is defined as a pseudo-Hermitian operator. We study the level statistics of a pseudo-Hermitian Dicke Hamiltonian that undergoes quantum phase transition (QPT). We find that the level-spacing distribution of this Hamiltonian near the integrable limit is close to Poisson distribution, while it is Wigner distribution for the ranges of the parameters for which the Hamiltonian is nonintegrable. We show that the assertion in the context of the standard Dicke model that QPT is a precursor to a change in the level statistics is not valid in general.

  8. Stochastic geometry, spatial statistics and random fields models and algorithms

    CERN Document Server

    2015-01-01

    Providing a graduate level introduction to various aspects of stochastic geometry, spatial statistics and random fields, this volume places a special emphasis on fundamental classes of models and algorithms as well as on their applications, for example in materials science, biology and genetics. This book has a strong focus on simulations and includes extensive codes in Matlab and R, which are widely used in the mathematical community. It can be regarded as a continuation of the recent volume 2068 of Lecture Notes in Mathematics, where other issues of stochastic geometry, spatial statistics and random fields were considered, with a focus on asymptotic methods.

  9. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
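
    A minimal sketch of the Monte Carlo step described above: propagating assumed random and systematic uncertainties into comparison errors, estimating their covariance matrix, and sizing an approximate 95% constant-probability ellipse. The error magnitudes and the two-quantity setup are illustrative assumptions, not the paper's example.

```python
# Sketch: Monte Carlo estimate of the comparison-error covariance matrix and
# the corresponding 95% confidence ellipse for two quantities of interest.
import numpy as np

rng = np.random.default_rng(42)
n_mc = 20000

# Shared (systematic) and independent (random) error contributions (assumed sizes)
systematic = rng.normal(0.0, 0.5, n_mc)                      # common bias draw
e1 = systematic + rng.normal(0.0, 0.3, n_mc)                 # comparison error, quantity 1
e2 = 0.8 * systematic + rng.normal(0.0, 0.4, n_mc)           # comparison error, quantity 2
E = np.column_stack([e1, e2])

cov = np.cov(E, rowvar=False)                                # Monte Carlo covariance estimate

# 95% contour of a bivariate normal: ellipse with squared Mahalanobis radius 5.991
chi2_95 = 5.991                                              # chi-square, 2 dof, 95%
eigval, eigvec = np.linalg.eigh(cov)
semi_axes = np.sqrt(eigval * chi2_95)
print("covariance matrix:\n", cov)
print("95% ellipse semi-axes:", semi_axes)
```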

  10. Modeling of Reverberant Radio Channels Using Propagation Graphs

    DEFF Research Database (Denmark)

    Pedersen, Troels; Steinböck, Gerhard; Fleury, Bernard Henri

    2012-01-01

    decaying power. We model the channel as a propagation graph in which vertices represent transmitters, receivers, and scatterers, while edges represent propagation conditions between vertices. The recursive structure of the graph accounts for the exponential power decay and the avalanche effect. We derive...

  11. Synaptic channel model including effects of spike width variation

    OpenAIRE

    2015-01-01

    Synaptic Channel Model Including Effects of Spike Width Variation. Hamideh Ramezani and Ozgur B. Akan, Next-generation and Wireless Communications Laboratory (NWCL), Department of Electrical and Electronics Engineering, Koc University, Istanbul, Turkey. ABSTRACT: An accu...

  12. Statistical Characterization for L and X-Band Meteorological Satellite Channel

    Institute of Scientific and Technical Information of China (English)

    张秀再; 郭业才; 陈金立; 杨昌军

    2012-01-01

    In cloudy weather, covering clouds form a partial shadow blockage over a certain range above the ground station. In this case, the signals received by the ground station fall into two situations. In the first, the received signal consists of the line-of-sight component plus multipath scattering components of a certain intensity produced by diffraction, refraction and scattering, and its PDF obeys the Rice statistics. In the second, the line-of-sight component is obscured by clouds, and the PDF obeys the Lognormal statistics. In clear sky with only a few thin clouds and good visibility in the upper atmosphere, the received signal consists of the line-of-sight component plus very weak multipath scattering, and its PDF obeys the Gauss statistics. According to this theoretical analysis, simulation models of the Rayleigh, Rice, Lognormal and Gauss probability distributions are established. The computed results of the simulation models show that received signals with different compositions lead to different statistical characterizations, because the meteorological satellite signal passes through different physical states of the atmosphere. Multipath scattering components exist in both the Rice and the Rayleigh channels, whereas the line-of-sight component exists only in the Rice channel. The Gauss channel model and the Rice channel model have the same structure, but the received signals in the two channels contain multipath scattering components of different intensity. This explains why variations in the received signal envelope produce different statistical characterizations in different channels. The probability density curves of the simulation models and the theoretical models match quite well, verifying the correctness and validity of the theoretical analysis and providing theoretical guidance for calculating the data error
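
    A rough sketch of generating received-signal envelopes for the four channel types discussed above (Rayleigh, Rice, Lognormal, Gauss) so their empirical statistics can be compared; the Rice K-factor, shadowing spread and line-of-sight levels are assumed values, not those of the paper.

```python
# Sketch: envelope samples for Rayleigh, Rice, Lognormal and Gauss channel models.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
sigma = 1.0                      # scatter-component spread (assumption)
K = 6.0                          # Rice K-factor, linear scale (assumption)

# Rayleigh: scattered components only
rayleigh = np.abs(rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n))

# Rice: line-of-sight plus scattered components
los = np.sqrt(2 * K) * sigma
rice = np.abs(los + rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n))

# Lognormal: line of sight obscured by clouds, multiplicative shadowing
lognormal = np.exp(rng.normal(0.0, 0.3, n))

# Gauss: strong line of sight with very weak additive scatter
gauss = 3.0 + rng.normal(0.0, 0.1, n)

for name, env in [("Rayleigh", rayleigh), ("Rice", rice),
                  ("Lognormal", lognormal), ("Gauss", gauss)]:
    print(f"{name:9s} mean={env.mean():.2f}  std={env.std():.2f}")
```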

  13. On Wiener filtering and the physics behind statistical modeling.

    Science.gov (United States)

    Marbach, Ralf

    2002-01-01

    The closed-form solution of the so-called statistical multivariate calibration model is given in terms of the pure component spectral signal, the spectral noise, and the signal and noise of the reference method. The "statistical" calibration model is shown to be as much grounded on the physics of the pure component spectra as any of the "physical" models. There are no fundamental differences between the two approaches since both are merely different attempts to realize the same basic idea, viz., the spectrometric Wiener filter. The concept of the application-specific signal-to-noise ratio (SNR) is introduced, which is a combination of the two SNRs from the reference and the spectral data. Both are defined and the central importance of the latter for the assessment and development of spectroscopic instruments and methods is explained. Other statistics like the correlation coefficient, prediction error, slope deficiency, etc., are functions of the SNR. Spurious correlations and other practically important issues are discussed in quantitative terms. Most important, it is shown how to use a priori information about the pure component spectra and the spectral noise in an optimal way, thereby making the distinction between statistical and physical calibrations obsolete and combining the best of both worlds. Companies and research groups can use this article to realize significant savings in cost and time for development efforts.

  14. Statistical skull models from 3D X-ray images

    CERN Document Server

    Berar, M; Bailly, G; Payan, Y; Berar, Maxime; Desvignes, Michel; Payan, Yohan

    2006-01-01

    We present two statistical models of the skull and mandible built upon an elastic registration method of 3D meshes. The aim of this work is to relate degrees of freedom of skull anatomy, as static relations are of main interest for anthropology and legal medicine. Statistical models can effectively provide reconstructions together with statistical precision. In our applications, patient-specific meshes of the skull and the mandible are high-density meshes, extracted from 3D CT scans. All our patient-specific meshes are registered in a subject-shared reference system using our 3D-to-3D elastic matching algorithm. Registration is based upon the minimization of a distance between the high-density mesh and a shared low-density mesh, defined on the vertices, in a multi-resolution approach. A principal component analysis is performed on the normalised registered data to build a statistical linear model of the skull and mandible shape variation. The accuracy of the reconstruction is under the millimetre in the shape...

  15. Statistical 3D damage accumulation model for ion implant simulators

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez-Mangas, J.M. E-mail: jesman@ele.uva.es; Lazaro, J.; Enriquez, L.; Bailon, L.; Barbolla, J.; Jaraiz, M

    2003-04-01

    A statistical 3D damage accumulation model, based on the modified Kinchin-Pease formula, for ion implant simulation has been included in our physically based ion implantation code. It has only one fitting parameter for electronic stopping and uses 3D electron density distributions for different types of targets including compound semiconductors. Also, a statistical noise reduction mechanism based on the dose division is used. The model has been adapted to be run under parallel execution in order to speed up the calculation in 3D structures. Sequential ion implantation has been modelled including previous damage profiles. It can also simulate the implantation of molecular and cluster projectiles. Comparisons of simulated doping profiles with experimental SIMS profiles are presented. Also comparisons between simulated amorphization and experimental RBS profiles are shown. An analysis of sequential versus parallel processing is provided.

  16. Statistical traffic modeling of MPEG frame size: Experiments and Analysis

    Directory of Open Access Journals (Sweden)

    Haniph A. Latchman

    2009-12-01

    Full Text Available For guaranteed quality of service (QoS) and sufficient bandwidth in a communication network which provides an integrated multimedia service, it is important to obtain an analytical and tractable model of the compressed MPEG data. This paper presents a statistical approach to a group of pictures (GOP) MPEG frame size model to increase network traffic performance in a communication network. We extract MPEG frame data from commercial DVD movies and make probability histograms to analyze the statistical characteristics of the MPEG frame data. Six candidate probability distributions are considered, and their parameters are obtained from the empirical data using maximum likelihood estimation (MLE). This paper shows that the lognormal distribution is the best-fitting model of MPEG-2 total frame data.
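
    A minimal sketch of the MLE fitting step described above for the lognormal candidate, assuming scipy is available; the frame-size sample is a synthetic placeholder rather than data parsed from DVD streams.

```python
# Sketch: maximum-likelihood fit of a lognormal distribution to MPEG frame sizes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
frame_sizes = rng.lognormal(mean=10.0, sigma=0.6, size=5000)   # bytes, synthetic

# MLE for a lognormal with location fixed at zero
shape, loc, scale = stats.lognorm.fit(frame_sizes, floc=0)
mu_hat, sigma_hat = np.log(scale), shape
print(f"lognormal MLE: mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")

# Goodness of fit against the fitted model (Kolmogorov-Smirnov)
ks = stats.kstest(frame_sizes, "lognorm", args=(shape, loc, scale))
print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3f}")
```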

  17. A statistical model for characterization of histopathology images

    Science.gov (United States)

    Álvarez, Pablo; Castro, Guatizalema; Corredor, Germán.; Romero, Eduardo

    2015-01-01

    Accessing information of interest in collections of histopathology images is a challenging task. To address this issue, previous works have designed search strategies based on the use of keywords and low-level features. However, those methods have proven insufficient or impractical for this purpose. Alternative low-level features such as cell area, distance among cells and cell density are directly associated with simple histological concepts and could serve as good descriptors. In this paper, a statistical model is adapted to represent the distribution of the areas occupied by cells, for use in whole histopathology image characterization. This novel descriptor facilitates the design of metrics based on distribution parameters and also provides new elements for a better image understanding. The proposed model was validated using image processing and statistical techniques. Results showed low error rates, demonstrating the accuracy of the model.

  18. Statistical 3D damage accumulation model for ion implant simulators

    CERN Document Server

    Hernandez-Mangas, J M; Enriquez, L E; Bailon, L; Barbolla, J; Jaraiz, M

    2003-01-01

    A statistical 3D damage accumulation model, based on the modified Kinchin-Pease formula, for ion implant simulation has been included in our physically based ion implantation code. It has only one fitting parameter for electronic stopping and uses 3D electron density distributions for different types of targets including compound semiconductors. Also, a statistical noise reduction mechanism based on the dose division is used. The model has been adapted to be run under parallel execution in order to speed up the calculation in 3D structures. Sequential ion implantation has been modelled including previous damage profiles. It can also simulate the implantation of molecular and cluster projectiles. Comparisons of simulated doping profiles with experimental SIMS profiles are presented. Also comparisons between simulated amorphization and experimental RBS profiles are shown. An analysis of sequential versus parallel processing is provided.

  19. Statistical model selection with “Big Data”

    Directory of Open Access Journals (Sweden)

    Jurgen A. Doornik

    2015-12-01

    Full Text Available Big Data offer potential benefits for statistical modelling, but confront problems including an excess of false positives, mistaking correlations for causes, ignoring sampling biases and selecting by inappropriate methods. We consider the many important requirements when searching for a data-based relationship using Big Data, and the possible role of Autometrics in that context. Paramount considerations include embedding relationships in general initial models, possibly restricting the number of variables to be selected over by non-statistical criteria (the formulation problem), using good quality data on all variables, analyzed with tight significance levels by a powerful selection procedure, retaining available theory insights (the selection problem), while testing for relationships being well specified and invariant to shifts in explanatory variables (the evaluation problem), using a viable approach that resolves the computational problem of immense numbers of possible models.

  20. Improved head-driven statistical models for natural language parsing

    Institute of Scientific and Technical Information of China (English)

    袁里驰

    2013-01-01

    Head-driven statistical models for natural language parsing are the most representative lexicalized syntactic parsing models, but they only utilize semantic dependency between words and do not incorporate other semantic information such as semantic collocation and semantic category. Some improvements on this distinctive parser are presented. Firstly, "valency" is an essential semantic feature of words. Once the valency of a word is determined, the collocation of the word is clear, and the sentence structure can be directly derived. Thus, a syntactic parsing model combining valence structure with semantic dependency is proposed on the basis of head-driven statistical syntactic parsing models. Secondly, semantic role labeling (SRL) is very necessary for deep natural language processing. An integrated parsing approach is proposed to integrate semantic parsing into the syntactic parsing process. Experiments are conducted for the refined statistical parser. The results show that 87.12% precision and 85.04% recall are obtained, and the F measure is improved by 5.68% compared with the head-driven parsing model introduced by Collins.

  1. SoS contract verification using statistical model checking

    Directory of Open Access Journals (Sweden)

    Alessandro Mignogna

    2013-11-01

    Full Text Available Exhaustive formal verification for systems of systems (SoS) is impractical and cannot be applied on a large scale. In this paper we propose to use statistical model checking for efficient verification of SoS. We address three relevant aspects for systems of systems: (1) the model of the SoS, which includes stochastic aspects; (2) the formalization of the SoS requirements in the form of contracts; (3) the tool-chain to support statistical model checking for SoS. We adapt the SMC technique for application to heterogeneous SoS. We extend the UPDM/SysML specification language to express the SoS requirements that the implemented strategies over the SoS must satisfy. The requirements are specified with a new contract language specifically designed for SoS, targeting a high-level English-pattern language but relying on an accurate semantics given by the standard temporal logics. The contracts are verified against the UPDM/SysML specification using the Statistical Model Checker (SMC) PLASMA combined with the simulation engine DESYRE, which integrates heterogeneous behavioral models through the functional mock-up interface (FMI) standard. The tool-chain allows computing an estimation of the satisfiability of the contracts by the SoS. The results help the system architect to trade off different solutions to guide the evolution of the SoS.

  2. A statistical model to predict streamwise turbulent dispersion from the wall at small times

    Science.gov (United States)

    Nguyen, Quoc; Papavassiliou, Dimitrios V.

    2016-12-01

    Data from simulations are used to develop a statistical model that can provide the streamwise dispersion distribution of passive particles released from the wall of a turbulent flow channel. It is found that a three-point gamma probability density function is the statistical distribution that can describe the dispersion of particles with Schmidt numbers ranging from 6 to 2400 at relatively short times after the release of the particles. Scaling arguments are used to physically justify and predict the parameters of the gamma three-point distribution. The model is used to predict particle separation that can occur in turbulent flow under special conditions. Close to the channel wall, turbulent convection is not the dominant transport mechanism, but molecular diffusion can dominate transport depending on the Schmidt number of the particles. This leads to turbulence-induced separation rather than mixing, and the currently proposed model can be used to predict the level of separation. Practically, these results can be applied for separating very small particles or even macromolecules in dilute suspensions.
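
    A small sketch of using a three-parameter (shifted) gamma density, as proposed above, to describe streamwise particle displacements; the shape, location and scale values are placeholder assumptions, not the paper's fitted parameters.

```python
# Sketch: three-parameter gamma model for streamwise dispersion from the wall.
import numpy as np
from scipy import stats

shape, loc, scale = 2.5, 0.0, 1.8          # assumed parameters of the fit
gamma3 = stats.gamma(a=shape, loc=loc, scale=scale)

# Fraction of particles dispersed beyond a given streamwise distance
x_cut = 5.0
print(f"P(X > {x_cut}) = {gamma3.sf(x_cut):.3f}")

# Fitting the same family to (synthetic) particle displacement data
rng = np.random.default_rng(11)
sample = gamma3.rvs(size=10_000, random_state=rng)
a_hat, loc_hat, scale_hat = stats.gamma.fit(sample)
print(f"refit: shape={a_hat:.2f}, loc={loc_hat:.2f}, scale={scale_hat:.2f}")
```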

  3. Statistical procedures for evaluating daily and monthly hydrologic model predictions

    Science.gov (United States)

    Coffey, M.E.; Workman, S.R.; Taraba, J.L.; Fogle, A.W.

    2004-01-01

    The overall study objective was to evaluate the applicability of different qualitative and quantitative methods for comparing daily and monthly SWAT computer model hydrologic streamflow predictions to observed data, and to recommend statistical methods for use in future model evaluations. Statistical methods were tested using daily streamflows and monthly equivalent runoff depths. The statistical techniques included linear regression, Nash-Sutcliffe efficiency, nonparametric tests, t-test, objective functions, autocorrelation, and cross-correlation. None of the methods specifically applied to the non-normal distribution and dependence between data points for the daily predicted and observed data. Of the tested methods, median objective functions, sign test, autocorrelation, and cross-correlation were most applicable for the daily data. The robust coefficient of determination (CD*) and robust modeling efficiency (EF*) objective functions were the preferred methods for daily model results due to the ease of comparing these values with a fixed ideal reference value of one. Predicted and observed monthly totals were more normally distributed, and there was less dependence between individual monthly totals than was observed for the corresponding predicted and observed daily values. More statistical methods were available for comparing SWAT model-predicted and observed monthly totals. The 1995 monthly SWAT model predictions and observed data had a regression R² of 0.70, a Nash-Sutcliffe efficiency of 0.41, and the t-test failed to reject the equal data means hypothesis. The Nash-Sutcliffe coefficient and the R² coefficient were the preferred methods for monthly results due to the ability to compare these coefficients to a set ideal value of one.
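
    A minimal sketch of the two statistics preferred above for monthly results, computed for paired observed and predicted runoff totals; the twelve monthly values below are synthetic placeholders.

```python
# Sketch: Nash-Sutcliffe efficiency and R² for observed vs. predicted monthly totals.
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Squared Pearson correlation between observed and simulated values."""
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

observed  = np.array([12.1, 30.4, 25.2, 8.7, 4.1, 2.9, 1.5, 3.3, 9.8, 15.6, 22.0, 18.4])
predicted = np.array([10.5, 27.9, 28.1, 9.9, 5.0, 2.1, 1.9, 2.8, 8.4, 17.2, 19.5, 20.1])

print(f"NSE = {nash_sutcliffe(observed, predicted):.2f}")
print(f"R^2 = {r_squared(observed, predicted):.2f}")
```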

  4. Measurement-Based LoS/NLoS Channel Modeling for Hot-Spot Urban Scenarios in UMTS Networks

    Directory of Open Access Journals (Sweden)

    Jiajing Chen

    2014-01-01

    Full Text Available A measurement campaign is introduced for modeling radio channels with either line-of-sight (LoS) or non-line-of-sight (NLoS) connection between user equipment (UE) and NodeB (NB) in an operating universal mobile telecommunications system. A space-alternating generalized expectation-maximization (SAGE) algorithm is applied to estimate the delays and the complex attenuations of multipath components from the obtained channel impulse responses. Based on a novel LoS detection method of multipath parameter estimates, channels are classified into LoS and NLoS categories. Deterministic models which are named "channel maps" and fading statistical models have been constructed for LoS and NLoS, respectively. In addition, statistics of new parameters, such as the distance between the NB and the UE in LoS/NLoS scenarios, the life-distance of the LoS channel, the LoS existence probability per location and per NB, the power variation at the LoS to NLoS transition and vice versa, and the transition duration, are extracted. These models are applicable for designing and performance evaluation of transmission techniques or systems used by distinguishing the LoS and NLoS channels.

  5. Testing bedrock incision models: Holocene channel evolution, High Cascades, Oregon

    Science.gov (United States)

    Sweeney, K. E.; Roering, J. J.; Fonstad, M. A.

    2013-12-01

    There is abundant field evidence that sediment supply controls the incision of bedrock channels by both protecting the bed from incision and providing tools to incise the bed. Despite several theoretical models for sediment-dependent bedrock abrasion, many investigations of natural channel response to climatic, lithologic, or tectonic forcing rely on the stream power model, which does not consider the role of sediment. Here, we use a well-constrained fluvial channel cut into a Holocene lava flow in the High Cascades, Oregon to compare incision predictions of the stream power model and of the full physics of theoretical models for saltation-abrasion incision by bedload and suspended load. The blocky andesite of Collier lava flow erupted from Collier Cone ~1500 years ago, paving over the existing landscape and erasing fine-scale landscape dissection. Since the eruption, a 6 km stream channel has been incised into the lava flow. The channel is comprised of three alluvial reaches with sediment deposits up to 2 m thick and two bedrock gorges with incision of up to 8 m, with larger magnitude incision in the upstream gorge. Abraded forms such as flutes are present in both gorges. Given the low magnitude and duration of modern snowmelt flow in the channel, it is likely that much of the incision was driven by sediment-laden outburst floods from the terminus of Collier Glacier, which is situated just upstream of the lava flow and has produced two outburst floods in the past 100 years. This site is well suited for comparing incision models because of the relatively uniform lithology of the lava flow and our ability to constrain the timing and depth of incision using the undissected lava surface above the channel as an initial condition. Using a simple finite difference scheme with airborne-Lidar-derived pre-incision topography as an initial condition, we predict incision in the two gorges through time with both stream power and sediment-dependent models. Field observations

  6. Measurement and Modeling of Narrowband Channels for Ultrasonic Underwater Communications

    Directory of Open Access Journals (Sweden)

    Francisco J. Cañete

    2016-02-01

    Full Text Available Underwater acoustic sensor networks are a promising technology that allows real-time data collection in seas and oceans for a wide variety of applications. Smaller size and weight sensors can be achieved with working frequencies shifted from audio to the ultrasonic band. At these frequencies, the fading phenomenon has a significant presence in the channel behavior, and the design of a reliable communication link between the network sensors will require a precise characterization of it. Fading in underwater channels has been previously measured and modeled in the audio band. However, there have been few attempts to study it at ultrasonic frequencies. In this paper, a campaign of measurements of ultrasonic underwater acoustic channels in Mediterranean shallow waters conducted by the authors is presented. These measurements are used to determine the parameters of the so-called κ-μ shadowed distribution, a fading model with a direct connection to the underlying physical mechanisms. The model is then used to evaluate the capacity of the measured channels with a closed-form expression.

  7. Measurement and Modeling of Narrowband Channels for Ultrasonic Underwater Communications.

    Science.gov (United States)

    Cañete, Francisco J; López-Fernández, Jesús; García-Corrales, Celia; Sánchez, Antonio; Robles, Encarnación; Rodrigo, Francisco J; Paris, José F

    2016-01-01

    Underwater acoustic sensor networks are a promising technology that allows real-time data collection in seas and oceans for a wide variety of applications. Smaller size and weight sensors can be achieved with working frequencies shifted from audio to the ultrasonic band. At these frequencies, the fading phenomenon has a significant presence in the channel behavior, and the design of a reliable communication link between the network sensors will require a precise characterization of it. Fading in underwater channels has been previously measured and modeled in the audio band. However, there have been few attempts to study it at ultrasonic frequencies. In this paper, a campaign of measurements of ultrasonic underwater acoustic channels in Mediterranean shallow waters conducted by the authors is presented. These measurements are used to determine the parameters of the so-called κ-μ shadowed distribution, a fading model with a direct connection to the underlying physical mechanisms. The model is then used to evaluate the capacity of the measured channels with a closed-form expression.

  8. Information Models of Acupuncture Analgesia and Meridian Channels

    Directory of Open Access Journals (Sweden)

    Chang Hua Zou

    2010-12-01

    Full Text Available Acupuncture and meridian channels have been major components of Chinese and Eastern Asian medicine, especially for analgesia, for over 2000 years. In recent decades, electroacupuncture (EA) analgesia has been applied clinically and experimentally. However, there were controversial results between different treatment frequencies, or between the active and the placebo treatments; and the mechanisms of the treatments and the related meridian channels are still unknown. In this study, we propose the new term of infophysics therapy and develop information models of acupuncture (or EA) analgesia and meridian channels, to understand the mechanisms and to explain the controversial results, based on Western theories of information, trigonometry and Fourier series, and physics, as well as published biomedical data. We are trying to build a bridge between Chinese medicine and Western medicine by investigating Eastern acupuncture analgesia and meridian channels with Western sciences; we model the meridians as a physiological system that is mostly constructed from interstices in or between other physiological systems; we consider frequencies, amplitudes and wave numbers of electric field intensity (EFI) as information data. Our modeling results demonstrate that information regulated with acupuncture (or EA) is different from pain information. We provide answers to explain the controversial published results, and suggest that the mechanisms of acupuncture (or EA) analgesia could mostly involve information regulation of frequencies and amplitudes of EFI as well as of neuronal transmitters such as endorphins.

  9. The FPGA Implementation of Short—Wave Channel Model

    Institute of Scientific and Technical Information of China (English)

    GANLiangcai; LIYuanyuan

    2003-01-01

    Based on its time-varying characteristic, the short-wave channel can be modeled as a real-time filter in the frequency domain; the model can simulate short-wave channel effects exactly, such as delay spread, Doppler shift and Doppler spread. In the design, the bandwidth of the short-wave channel model is 768 kHz and the frequency interval is 3 kHz. An Overlap-Discard algorithm based on the fast Fourier transform (FFT) is utilized to design the real-time FIR filter, and an architecture based on a Field Programmable Gate Array (FPGA) chip is adopted to implement the 512-point FFT. The channel transfer function and the noise and interference function, stored in ROM in advance, are periodically updated in real time. The simulation results show that the hardware implementation is simple and feasible for wideband short-wave systems such as frequency-hopping and direct-sequence spread-spectrum systems.
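
    A rough software sketch of the Overlap-Discard (overlap-save) FFT block convolution mentioned above, filtering a long signal with an FIR channel response; the block length, filter length and test signal are assumed values and the code is a reference model, not the FPGA design.

```python
# Sketch: overlap-save (overlap-discard) block convolution with block FFTs.
import numpy as np

def overlap_save(x, h, nfft=512):
    """Filter x with FIR h using overlap-save block convolution (linear convolution)."""
    M = len(h)
    L = nfft - M + 1                      # new output samples produced per block
    H = np.fft.fft(h, nfft)
    x_padded = np.concatenate([np.zeros(M - 1), x, np.zeros(L)])
    y = np.zeros(len(x) + M - 1, dtype=complex)
    for start in range(0, len(y), L):
        block = x_padded[start:start + nfft]
        if len(block) < nfft:
            block = np.pad(block, (0, nfft - len(block)))
        yblock = np.fft.ifft(np.fft.fft(block) * H)
        keep = min(L, len(y) - start)
        y[start:start + keep] = yblock[M - 1:M - 1 + keep]   # discard the first M-1 samples
    return y.real if np.isrealobj(x) and np.isrealobj(h) else y

rng = np.random.default_rng(5)
x = rng.normal(size=4096)                 # input samples
h = rng.normal(size=64) * np.hanning(64)  # illustrative channel impulse response
assert np.allclose(overlap_save(x, h), np.convolve(x, h))
```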

  10. Measurement and Modeling of Narrowband Channels for Ultrasonic Underwater Communications

    Science.gov (United States)

    Cañete, Francisco J.; López-Fernández, Jesús; García-Corrales, Celia; Sánchez, Antonio; Robles, Encarnación; Rodrigo, Francisco J.; Paris, José F.

    2016-01-01

    Underwater acoustic sensor networks are a promising technology that allows real-time data collection in seas and oceans for a wide variety of applications. Smaller size and weight sensors can be achieved with working frequencies shifted from audio to the ultrasonic band. At these frequencies, the fading phenomenon has a significant presence in the channel behavior, and the design of a reliable communication link between the network sensors will require a precise characterization of it. Fading in underwater channels has been previously measured and modeled in the audio band. However, there have been few attempts to study it at ultrasonic frequencies. In this paper, a campaign of measurements of ultrasonic underwater acoustic channels in Mediterranean shallow waters conducted by the authors is presented. These measurements are used to determine the parameters of the so-called κ-μ shadowed distribution, a fading model with a direct connection to the underlying physical mechanisms. The model is then used to evaluate the capacity of the measured channels with a closed-form expression. PMID:26907281

  11. Physics-based statistical learning approach to mesoscopic model selection

    Science.gov (United States)

    Taverniers, Søren; Haut, Terry S.; Barros, Kipton; Alexander, Francis J.; Lookman, Turab

    2015-11-01

    In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the data used to train the model on. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
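
    A toy sketch of the cross-validation idea described above, applied to selecting polynomial complexity for a fitted model; the data and model family are placeholders for the GD/sGLE setting of the paper.

```python
# Sketch: k-fold cross-validation to pick an optimally predictive model complexity.
import numpy as np

rng = np.random.default_rng(8)
x = np.linspace(-1, 1, 200)
y = np.sin(2.5 * x) + rng.normal(0, 0.15, x.size)      # "training" data (synthetic)

def cv_error(degree, k=5):
    """Mean squared validation error of a degree-`degree` polynomial, k-fold CV."""
    idx = rng.permutation(x.size)
    folds = np.array_split(idx, k)
    errs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[f]) - y[f]) ** 2))
    return np.mean(errs)

scores = {d: cv_error(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
print("best complexity by cross-validation:", best)
```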

  12. Nuclear EMC effect in non-extensive statistical model

    Science.gov (United States)

    Trevisan, Luis A.; Mirez, Carlos

    2013-05-01

    In the present work, we attempt to describe the nuclear EMC effect by using the proton structure functions obtained from the non-extensive statistical quark model. We recall that this model has three fundamental variables: the temperature T, the radius, and the Tsallis parameter q. By combining different small changes, a good agreement with the experimental data may be obtained. Another interesting feature of the model is that it allows phenomenological interpretation, for instance keeping q constant while changing the radius and the temperature, or changing the radius and q while keeping the temperature fixed.

  13. Think continuous: Markovian Gaussian models in spatial statistics

    CERN Document Server

    Simpson, Daniel; Rue, Håvard

    2011-01-01

    Gaussian Markov random fields (GMRFs) are frequently used as computationally efficient models in spatial statistics. Unfortunately, it has traditionally been difficult to link GMRFs with the more traditional Gaussian random field models, as the Markov property is difficult to deploy in continuous space. Following the pioneering work of Lindgren et al. (2011), we expound on the link between Markovian Gaussian random fields and GMRFs. In particular, we discuss the theoretical and practical aspects of fast computation with continuously specified Markovian Gaussian random fields, as well as the clear advantages they offer in terms of parsimonious and interpretable models of anisotropy and non-stationarity.

  14. Statistics of a neuron model driven by asymmetric colored noise.

    Science.gov (United States)

    Müller-Hansen, Finn; Droste, Felix; Lindner, Benjamin

    2015-02-01

    Irregular firing of neurons can be modeled as a stochastic process. Here we study the perfect integrate-and-fire neuron driven by dichotomous noise, a Markovian process that jumps between two states (i.e., possesses a non-Gaussian statistics) and exhibits nonvanishing temporal correlations (i.e., represents a colored noise). Specifically, we consider asymmetric dichotomous noise with two different transition rates. Using a first-passage-time formulation, we derive exact expressions for the probability density and the serial correlation coefficient of the interspike interval (time interval between two subsequent neural action potentials) and the power spectrum of the spike train. Furthermore, we extend the model by including additional Gaussian white noise, and we give approximations for the interspike interval (ISI) statistics in this case. Numerical simulations are used to validate the exact analytical results for pure dichotomous noise, and to test the approximations of the ISI statistics when Gaussian white noise is included. The results may help to understand how correlations and asymmetry of noise and signals in nerve cells shape neuronal firing statistics.
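
    A minimal simulation sketch of the model described above: a perfect integrate-and-fire neuron driven by asymmetric dichotomous (telegraph) noise, with simple interspike-interval statistics collected at the end. The transition rates, noise levels, drive and threshold are assumed values.

```python
# Sketch: perfect IF neuron with asymmetric dichotomous noise and ISI statistics.
import numpy as np

rng = np.random.default_rng(4)
dt, T = 1e-4, 200.0                 # time step and total simulation time (assumed)
k_plus, k_minus = 2.0, 5.0          # transition rates out of the + and - states
s_plus, s_minus = 1.5, -0.5         # the two noise levels (asymmetric)
mu, threshold = 1.0, 1.0            # constant drive and firing threshold

v, state = 0.0, +1
isis, last_spike, t = [], 0.0, 0.0
while t < T:
    rate = k_plus if state > 0 else k_minus
    if rng.random() < rate * dt:     # telegraph-noise state switch
        state = -state
    v += (mu + (s_plus if state > 0 else s_minus)) * dt
    if v >= threshold:               # fire and reset (perfect integrator)
        isis.append(t - last_spike)
        last_spike, v = t, 0.0
    t += dt

isis = np.array(isis)
rho1 = np.corrcoef(isis[:-1], isis[1:])[0, 1]           # serial correlation, lag 1
print(f"mean ISI={isis.mean():.3f}, CV={isis.std()/isis.mean():.3f}, rho_1={rho1:.3f}")
```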

  15. Spatio-temporal statistical models with applications to atmospheric processes

    Energy Technology Data Exchange (ETDEWEB)

    Wikle, C.K.

    1996-12-31

    This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues considered have not been considered in the context of these methods and several open questions are posed. The first paper attempts to determine a means of characterizing the semiannual oscillation (SAO) spatial variation in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500 hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies. It was concluded that the mechanism for the SAO in the northern hemisphere is a result of land-sea contrasts. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in the lower stratosphere over the equatorial Pacific. Advanced cyclostationary time series techniques were used for the analysis. It was found that there are significant twice-yearly peaks in MRGW activity. Analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that attempts to consider the influence of both temporal and spatial variability. This method is mainly concerned with prediction in space and time, and provides a spatially descriptive and temporally dynamic model.

  16. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  17. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores, and the statistical characteristics of pore distance, quantity and size, together with their probability density functions, were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distributions of pore position, distance and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on pore position and distribution defined by the series of random numbers. On the basis of this model, Brazil split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, the failure mode of material elements and the inosculation of failed elements.
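
    A minimal sketch in the spirit of the statistical pore model above: pore centroids are drawn at random positions in a cube and pore sizes from a prescribed distribution, after which simple summary statistics are computed. The uniform-position and lognormal-size assumptions, the box size and the parameter values are all placeholders.

```python
# Sketch: random pore structure with prescribed statistics (Monte Carlo generation).
import numpy as np

rng = np.random.default_rng(13)
n_pores, box = 1000, 10.0                         # pore count and cube edge (mm), assumed

centroids = rng.uniform(0.0, box, size=(n_pores, 3))
radii = rng.lognormal(mean=np.log(0.25), sigma=0.4, size=n_pores)   # pore radii (mm)

porosity = np.sum(4.0 / 3.0 * np.pi * radii**3) / box**3            # ignores overlaps
pairwise = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
nearest = np.sort(pairwise, axis=1)[:, 1]                           # nearest-pore distances
print(f"porosity ≈ {porosity:.3f}, mean nearest-pore distance ≈ {nearest.mean():.2f} mm")
```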

  18. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success probably lie in the theoretical foundations of the discipline, which make these techniques particularly appealing for decision analysis, together with modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.

  19. RANDOM SYSTEMS OF HARD PARTICLES: MODELS AND STATISTICS

    Institute of Scientific and Technical Information of China (English)

    Dietrich Stoyan

    2002-01-01

    This paper surveys models and statistical properties of random systems of hard particles. Such systems appear frequently in materials science, biology and elsewhere. In mathematical-statistical investigations, simulations of such structures play an important role. In these simulations various methods and models are applied, namely the RSA model, sedimentation and collective rearrangement algorithms, molecular dynamics, and Monte Carlo methods such as the Metropolis-Hastings algorithm. The statistical description of real and simulated particle systems uses ideas of the mathematical theories of random sets and point processes. This leads to characteristics such as volume fraction or porosity, covariance, contact distribution functions, specific connectivity number from the random set approach and intensity, pair correlation function and mark correlation functions from the point process approach. Some of them can be determined stereologically using planar sections, while others can only be obtained using three-dimensional data and 3D image analysis. They are valuable tools for fitting models to empirical data and, consequently, for understanding various materials, biological structures, porous media and other practically important spatial structures.
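
    A toy sketch of the RSA (random sequential adsorption) model named above, for hard disks in a unit square: candidate centres are accepted only if they do not overlap any previously placed disk. The disk radius and attempt budget are assumed values.

```python
# Sketch: random sequential adsorption of hard disks in the unit square.
import numpy as np

rng = np.random.default_rng(10)
radius, n_attempts = 0.03, 20_000
centres = np.empty((0, 2))

for _ in range(n_attempts):
    p = rng.random(2)                                   # candidate centre
    if centres.size == 0 or np.min(np.sum((centres - p) ** 2, axis=1)) >= (2 * radius) ** 2:
        centres = np.vstack([centres, p])               # accept: no overlap with placed disks

area_fraction = len(centres) * np.pi * radius**2        # ignores boundary effects
print(f"placed {len(centres)} hard disks, area fraction ≈ {area_fraction:.3f}")
```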

  20. Real-Time Statistical Modeling of Blood Sugar.

    Science.gov (United States)

    Otoom, Mwaffaq; Alshraideh, Hussam; Almasaeid, Hisham M; López-de-Ipiña, Diego; Bravo, José

    2015-10-01

    Diabetes is a chronic disease that imposes various types of cost worldwide. One major challenge in the control of diabetes is the real-time determination of the proper insulin dose. In this paper, we develop a prototype for real-time blood sugar control, integrated with the cloud. Our system controls blood sugar by observing the blood sugar level and accordingly determining the appropriate insulin dose based on the patient's historical data, all in real time and automatically. To determine the appropriate insulin dose, we propose two statistical models for modeling blood sugar profiles, namely ARIMA and a Markov-based model. Our experiments evaluating the performance of the two models show that the ARIMA model outperforms the Markov-based model in terms of prediction accuracy.
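
    A minimal sketch of an ARIMA-based forecaster in the spirit of the first model above, assuming statsmodels is available; the glucose series is synthetic and the (p, d, q) order is an assumption, not the paper's choice.

```python
# Sketch: ARIMA forecast of the next blood glucose readings.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
# Synthetic 5-minute glucose readings (mg/dL): slow daily cycle plus drifting noise
t = np.arange(288)
glucose = 110 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size).cumsum() * 0.2

model = ARIMA(glucose, order=(2, 1, 1))      # assumed (p, d, q) order
fit = model.fit()
forecast = fit.forecast(steps=6)             # next 30 minutes
print("forecast (mg/dL):", np.round(forecast, 1))
```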

  1. Maritime Channel Modeling and Simulation for Efficient Wideband Communications between Autonomous Unmanned Surface Vehicles

    Science.gov (United States)

    2013-09-01

    (Report excerpt, Section 4.1: Nakagami-m fading channel model.) The multipath fading is modeled using a Nakagami-m distribution; a special instance of the Nakagami-m multipath fading channel is the Rayleigh fading channel. The total propagation loss is decomposed into several contributions and losses (Eq. 7 of the report), where LAPM is the propagation loss calculated by APM.
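
    A small sketch of drawing Nakagami-m fading envelopes for such a channel model, assuming scipy is available; m = 1 reduces to the Rayleigh special case mentioned in the report, and the m values and average power here are assumptions.

```python
# Sketch: Nakagami-m fading envelope samples (m = 1 is the Rayleigh case).
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
omega = 1.0                                    # average envelope power (assumption)

for m in (0.6, 1.0, 3.0):
    # scipy's nakagami distribution uses shape nu = m and scale = sqrt(omega)
    env = stats.nakagami.rvs(m, scale=np.sqrt(omega), size=50_000, random_state=rng)
    print(f"m={m:3.1f}: mean={env.mean():.3f}, E[r^2]={np.mean(env**2):.3f}")
```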

  2. Statistical mechanics models for multimode lasers and random lasers

    CERN Document Server

    Antenucci, F; Berganza, M Ibáñez; Marruzzo, A; Leuzzi, L

    2015-01-01

    We review recent statistical mechanical approaches to multimode laser theory. The theory has proved very effective in describing standard lasers. We refer to the mean-field theory for passive mode locking and to developments based on Monte Carlo simulations and the cavity method to study the role of the frequency matching condition. The status of a complete theory of multimode lasing in open and disordered cavities is discussed and the derivation of the general statistical models in this framework is presented. When light is propagating in a disordered medium, the system can be analyzed via the replica method. For high degrees of disorder and nonlinearity, a glassy behavior is expected at the lasing threshold, providing a suggestive link between glasses and photonics. We describe in detail the results for the general Hamiltonian model in the mean-field approximation and mention an available test for replica symmetry breaking from intensity spectra measurements. Finally, we summarize some perspectives still open for such...

  3. Passive Target Tracking Based on Current Statistical Model

    Institute of Scientific and Technical Information of China (English)

    DENG Xiao-long; XIE Jian-ying; YANG Yu-pu

    2005-01-01

    Bearing-only passive tracking is regarded as a hard nonlinear tracking problem for which no completely satisfactory solution exists to date. Based on the current statistical model, a novel solution to this problem utilizing the particle filter (PF) and the unscented Kalman filter (UKF) is proposed. The new solution adopts data fusion from two observers to increase the observability of passive tracking. It applies a residual resampling step to reduce the degeneracy of the PF, and it introduces Markov chain Monte Carlo (MCMC) methods to reduce the effect of sample impoverishment. Based on the current statistical model, the EKF, the UKF and particle filters with various proposal distributions are compared in passive tracking experiments with two observers. The simulation results demonstrate the good performance of the proposed filtering methods with the novel techniques.
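
    A minimal sketch of the residual resampling step mentioned above, which keeps the integer part of each particle's expected copy count deterministically and resamples only the remainder, reducing resampling variance; the example weights are illustrative.

```python
# Sketch: residual resampling for a particle filter.
import numpy as np

def residual_resample(weights, rng):
    """Return indices of resampled particles using residual resampling."""
    n = len(weights)
    w = np.asarray(weights, float)
    w = w / w.sum()
    counts = np.floor(n * w).astype(int)           # deterministic integer copies
    residual = n * w - counts
    n_rest = n - counts.sum()
    if n_rest > 0:                                 # multinomial draw on the residuals
        residual /= residual.sum()
        extra = rng.choice(n, size=n_rest, p=residual)
        counts += np.bincount(extra, minlength=n)
    return np.repeat(np.arange(n), counts)

rng = np.random.default_rng(2)
weights = rng.random(10)
idx = residual_resample(weights, rng)
print("resampled particle indices:", idx)
```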

  4. Statistical detection of structural damage based on model reduction

    Institute of Scientific and Technical Information of China (English)

    Tao YIN; Heung-fai LAM; Hong-ping ZHU

    2009-01-01

    This paper proposes a statistical method for damage detection based on the finite element (FE) model reduction technique that utilizes measured modal data with a limited number of sensors. A deterministic damage detection process is formulated based on the model reduction technique. The probabilistic process is integrated into the deterministic damage detection process using a perturbation technique, resulting in a statistical structural damage detection method. This is achieved by deriving the first- and second-order partial derivatives of uncertain parameters, such as the elasticity of the damaged member, with respect to the measurement noise, which allows the expectation and covariance matrix of the uncertain parameters to be calculated. Besides the theoretical development, this paper reports numerical verification of the proposed method using a portal frame example and Monte Carlo simulation.

  5. Statistical inference to advance network models in epidemiology.

    Science.gov (United States)

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  6. An improved dynamic subgrid-scale model and its application to large eddy simulation of stratified channel flows

    Institute of Scientific and Technical Information of China (English)

    ZHONG; Fengquan(仲峰泉); LIU; Nansheng(刘难生); LU; Xiyun(陆夕云); ZHUANG; Lixian(庄礼贤)

    2002-01-01

    In the present paper, a new dynamic subgrid-scale (SGS) model of turbulent stress and heat flux for stratified shear flow is proposed. Based on our calculated results of stratified channel flow, the dynamic subgrid-scale model developed in this paper is shown to be effective for large eddy simulation (LES) of stratified turbulent shear flows. The new SGS model is then applied to the LES of the stratified turbulent channel flow to investigate the coupled shear and buoyancy effects on the behavior of turbulent statistics, turbulent heat transfer and flow structures at different Richardson numbers.

  7. Statistical Theory of Breakup Reactions

    CERN Document Server

    Bertulani, Carlos A; Hussein, Mahir S

    2014-01-01

    We propose alternatives to coupled-channels calculations with loosely-bound exotic nuclei (CDCC), based on the random matrix (RMT) and the optical background (OPM) models for the statistical theory of nuclear reactions. The coupled channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a Statistical CDCC (CDCC$_S$), able in principle to take into account many pseudo channels.

  8. Statistical Theory of Breakup Reactions

    Science.gov (United States)

    Bertulani, Carlos A.; Descouvemont, Pierre; Hussein, Mahir S.

    2014-04-01

    We propose an alternative for coupled-channels calculations with loosely-bound exotic nuclei (CDCC), based on the Random Matrix Model of the statistical theory of nuclear reactions. The coupled channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a Statistical CDCC (CDCCs), able in principle to take into account many pseudo channels.

  9. Statistical Theory of Breakup Reactions

    Directory of Open Access Journals (Sweden)

    Bertulani Carlos A.

    2014-04-01

    Full Text Available We propose an alternative for coupled-channels calculations with loosely-bound exotic nuclei (CDCC), based on the Random Matrix Model of the statistical theory of nuclear reactions. The coupled channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a Statistical CDCC (CDCCs), able in principle to take into account many pseudo channels.

  10. Statistical theory of breakup reactions

    Energy Technology Data Exchange (ETDEWEB)

    Bertulani, Carlos A., E-mail: carlos.bertulani@tamuc.edu [Department of Physics and Astronomy, Texas A and M University-Commerce, Commerce, TX (United States); Descouvemont, Pierre, E-mail: pdesc@ulb.ac.be [Physique Nucleaire Theorique et Physique Mathematique, Universite Libre de Bruxelles (ULB), Brussels (Belgium); Hussein, Mahir S., E-mail: hussein@if.usp.br [Universidade de Sao Paulo (USP), Sao Paulo, SP (Brazil). Instituto de Estudos Avancados

    2014-07-01

    We propose an alternative for coupled-channels calculations with loosely bound exotic nuclei (CDCC), based on the Random Matrix Model of the statistical theory of nuclear reactions. The coupled channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a Statistical CDCC (CDCC{sub s}), able in principle to take into account many pseudo channels. (author)

  11. Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.

    Directory of Open Access Journals (Sweden)

    John K Hillier

    Full Text Available Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.

  12. Statistical Quark Model for the Nucleon Structure Function

    Science.gov (United States)

    Mirez, Carlos; Tomio, Lauro; Trevisan, Luis A.; Frederico, Tobias

    2009-06-01

    A statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d¯/ū, and the ratio d/u, inside the nucleon. After adjusting a temperature parameter by the Gottfried sum rule violation, and the chemical potentials by the valence up and down quark normalizations, the results are compared with the available experimental data.

  13. A statistical mechanics model of carbon nanotube macro-films

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Carbon nanotube macro-films are two-dimensional films with micrometer thickness and centimeter by centimeter in-plane dimension. These carbon nanotube macroscopic assemblies have attracted significant attention from the material and mechanics communities recently because they can be easily handled and tailored to meet specific engineering needs. This paper reports the experimental methods on the preparation and characterization of single-walled carbon nanotube macro-films, and a statistical mechanics model on ...

  14. Explicit Numerical Modeling of Heat Transfer in Glacial Channels

    Science.gov (United States)

    Jarosch, A. H.; Zwinger, T.

    2015-12-01

    Turbulent flow and heat transfer of water in englacial channels is explicitly modelled and the numerical results are compared to the most commonly used heat transfer parameterization in glaciology, i.e. the Dittus-Boelter equation. The three-dimensional flow is simulated by solving the incompressible Navier-Stokes equations utilizing a variational multiscale method (VMS) turbulence model and the finite-element method (i.e. Elmer-FEM software), which also solves the heat equation. By studying a wide range of key parameters of the system, e.g. channel diameter, Reynolds number, water flux, water temperature and Darcy-Weisbach wall roughness (which is explicitly represented on the wall geometry), it is found that the Dittus-Boelter equation is inadequate for glaciological applications, and a new, highly suitable heat transfer parameterization for englacial/subglacial channels will be presented. This new parameterization utilizes a standard combination of dimensionless numbers describing the flow and channel (i.e. Reynolds number, Prandtl number and Darcy-Weisbach roughness) to predict a suitable Nusselt number describing the effective heat transfer and thus can be readily used in existing englacial/subglacial hydrology models.
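
    For reference, the Dittus-Boelter baseline used in this comparison has a simple closed form, Nu = 0.023 Re^0.8 Pr^n. The sketch below evaluates it and converts the Nusselt number to a convective coefficient; the numerical values in the example are illustrative assumptions, not parameters from the study.

      def dittus_boelter_nusselt(re, pr, heating=True):
          """Dittus-Boelter correlation: Nu = 0.023 Re^0.8 Pr^n,
          with n = 0.4 when the fluid is heated and n = 0.3 when cooled."""
          n = 0.4 if heating else 0.3
          return 0.023 * re**0.8 * pr**n

      def heat_transfer_coefficient(re, pr, k_fluid, diameter):
          """Convective coefficient h = Nu * k / D for a circular channel."""
          return dittus_boelter_nusselt(re, pr) * k_fluid / diameter

      # illustrative values only: cold water in a half-metre englacial channel
      print(heat_transfer_coefficient(re=1.0e5, pr=13.5, k_fluid=0.56, diameter=0.5))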

  15. The use of kernel local Fisher discriminant analysis for the channelization of the Hotelling model observer

    Science.gov (United States)

    Wen, Gezheng; Markey, Mia K.

    2015-03-01

    It is resource-intensive to conduct human studies for task-based assessment of medical image quality and system optimization. Thus, numerical model observers have been developed as a surrogate for human observers. The Hotelling observer (HO) is the optimal linear observer for signal-detection tasks, but the high dimensionality of imaging data results in a heavy computational burden. Channelization is often used to approximate the HO through a dimensionality reduction step, but how to produce channelized images without losing significant image information remains a key challenge. Kernel local Fisher discriminant analysis (KLFDA) uses kernel techniques to perform supervised dimensionality reduction, which finds an embedding transformation that maximizes between-class separability and preserves within-class local structure in the low-dimensional manifold. It is powerful for classification tasks, especially when the distribution of a class is multimodal. Such multimodality could be observed in many practical clinical tasks. For example, primary and metastatic lesions may both appear in medical imaging studies, but the distributions of their typical characteristics (e.g., size) may be very different. In this study, we propose to use KLFDA as a novel channelization method. The dimension of the embedded manifold (i.e., the result of KLFDA) is a counterpart to the number of channels in the state-of-the-art linear channelization. We present a simulation study to demonstrate the potential usefulness of KLFDA for building the channelized HOs (CHOs) and generating reliable decision statistics for clinical tasks. We show that the performance of the CHO with KLFDA channels is comparable to that of the benchmark CHOs.
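
    Once a channel matrix is chosen, a channelized Hotelling observer of this kind can be written compactly. The sketch below is a generic illustration, not the paper's implementation: `channels` could hold a conventional channel set (e.g., Laguerre-Gauss) or a learned embedding such as a KLFDA transform, and all names are hypothetical.

      import numpy as np

      def cho_decision_statistics(signal_imgs, noise_imgs, channels):
          """Channelized Hotelling observer on flattened images.
          signal_imgs, noise_imgs: (n_images, n_pixels); channels: (n_pixels, n_channels)."""
          v1 = signal_imgs @ channels                       # channelized signal-present data
          v0 = noise_imgs @ channels                        # channelized signal-absent data
          s = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
          w = np.linalg.solve(s, v1.mean(axis=0) - v0.mean(axis=0))   # Hotelling template
          return v1 @ w, v0 @ w                             # decision statistics per class

      def detectability_index(t1, t0):
          """SNR of the decision statistics, a common figure of merit."""
          return (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))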

  16. Statistical Modeling of Large-Scale Signal Path Loss in Underwater Acoustic Networks

    Directory of Open Access Journals (Sweden)

    Manuel Perez Malumbres

    2013-02-01

    Full Text Available In an underwater acoustic channel, the propagation conditions are known to vary in time, causing the deviation of the received signal strength from the nominal value predicted by a deterministic propagation model. To facilitate a large-scale system design in such conditions (e.g., power allocation), we have developed a statistical propagation model in which the transmission loss is treated as a random variable. By applying repetitive computation to the acoustic field, using ray tracing for a set of varying environmental conditions (surface height, wave activity, small node displacements around nominal locations, etc.), an ensemble of transmission losses is compiled and later used to infer the statistical model parameters. A reasonable agreement is found with a log-normal distribution, whose mean increases with the logarithm of distance and whose variance appears to be constant for a certain range of inter-node distances in a given deployment location. The statistical model is deemed useful for higher-level system planning, where simulation is needed to assess the performance of candidate network protocols under various resource allocation policies, i.e., to determine the transmit power and bandwidth allocation necessary to achieve a desired level of performance (connectivity, throughput, reliability, etc.).
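
    A minimal sketch of a path-loss model of this form, assuming the log-normal (Gaussian-in-dB) shadowing described above; the reference distance, spreading factor and shadowing deviation below are placeholder assumptions, not values inferred in the paper.

      import numpy as np

      def transmission_loss_db(d, d0=100.0, tl0=60.0, k=15.0, sigma_db=3.0,
                               rng=np.random.default_rng()):
          """Large-scale transmission loss in dB: a log-distance mean,
          TL(d) = TL(d0) + 10*k*log10(d/d0), plus zero-mean Gaussian shadowing."""
          mean = tl0 + 10.0 * k * np.log10(np.asarray(d, dtype=float) / d0)
          return mean + rng.normal(0.0, sigma_db, size=np.shape(d))

      # illustrative: sample the loss at 500 m over many environmental realizations
      samples = transmission_loss_db(np.full(10000, 500.0))
      print(samples.mean(), samples.std())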

  17. Spatial-Temporal Correlation Properties of the 3GPP Spatial Channel Model and the Kronecker MIMO Channel Model

    Directory of Open Access Journals (Sweden)

    Wu Hanguang

    2007-01-01

    Full Text Available The performance of multiple-input multiple-output (MIMO) systems is greatly influenced by the spatial-temporal correlation properties of the underlying MIMO channels. This paper investigates the spatial-temporal correlation characteristics of the spatial channel model (SCM) in the Third Generation Partnership Project (3GPP) and the Kronecker-based stochastic model (KBSM) at three levels, namely, the cluster level, link level, and system level. The KBSM has both the spatial separability and spatial-temporal separability at all the three levels. The spatial-temporal separability is observed for the SCM only at the system level, but not at the cluster and link levels. The SCM shows the spatial separability at the link and system levels, but not at the cluster level since its spatial correlation is related to the joint distribution of the angle of arrival (AoA) and angle of departure (AoD). The KBSM with the Gaussian-shaped power azimuth spectrum (PAS) is found to fit best the 3GPP SCM in terms of the spatial correlations. Despite its simplicity and analytical tractability, the KBSM is restricted to model only the average spatial-temporal behavior of MIMO channels. The SCM provides more insights of the variations of different MIMO channel realizations, but the implementation complexity is relatively high.
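
    For illustration, one narrowband realization of the Kronecker model can be generated from transmit and receive correlation matrices as sketched below; the exponential correlation matrices in the usage example are an assumption for demonstration only, not the paper's fitted correlations.

      import numpy as np

      def kronecker_channel(r_rx, r_tx, rng=np.random.default_rng()):
          """One KBSM realization: H = R_rx^{1/2} G R_tx^{H/2},
          where G has i.i.d. zero-mean unit-variance complex Gaussian entries."""
          nr, nt = r_rx.shape[0], r_tx.shape[0]
          g = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
          l_rx, l_tx = np.linalg.cholesky(r_rx), np.linalg.cholesky(r_tx)
          return l_rx @ g @ l_tx.conj().T

      # usage with 2x2 exponential correlation, rho = 0.7 (illustrative)
      rho = 0.7
      r = np.array([[1.0, rho], [rho, 1.0]], dtype=complex)
      h = kronecker_channel(r, r)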

  18. Spatial-Temporal Correlation Properties of the 3GPP Spatial Channel Model and the Kronecker MIMO Channel Model

    Directory of Open Access Journals (Sweden)

    Cheng-Xiang Wang

    2007-02-01

    Full Text Available The performance of multiple-input multiple-output (MIMO) systems is greatly influenced by the spatial-temporal correlation properties of the underlying MIMO channels. This paper investigates the spatial-temporal correlation characteristics of the spatial channel model (SCM) in the Third Generation Partnership Project (3GPP) and the Kronecker-based stochastic model (KBSM) at three levels, namely, the cluster level, link level, and system level. The KBSM has both the spatial separability and spatial-temporal separability at all the three levels. The spatial-temporal separability is observed for the SCM only at the system level, but not at the cluster and link levels. The SCM shows the spatial separability at the link and system levels, but not at the cluster level since its spatial correlation is related to the joint distribution of the angle of arrival (AoA) and angle of departure (AoD). The KBSM with the Gaussian-shaped power azimuth spectrum (PAS) is found to fit best the 3GPP SCM in terms of the spatial correlations. Despite its simplicity and analytical tractability, the KBSM is restricted to model only the average spatial-temporal behavior of MIMO channels. The SCM provides more insights of the variations of different MIMO channel realizations, but the implementation complexity is relatively high.

  19. The Ising model in physics and statistical genetics.

    Science.gov (United States)

    Majewski, J; Li, H; Ott, J

    2001-10-01

    Interdisciplinary communication is becoming a crucial component of the present scientific environment. Theoretical models developed in diverse disciplines often may be successfully employed in solving seemingly unrelated problems that can be reduced to similar mathematical formulation. The Ising model has been proposed in statistical physics as a simplified model for analysis of magnetic interactions and structures of ferromagnetic substances. Here, we present an application of the one-dimensional, linear Ising model to affected-sib-pair (ASP) analysis in genetics. By analyzing simulated genetics data, we show that the simplified Ising model with only nearest-neighbor interactions between genetic markers has statistical properties comparable to much more complex algorithms from genetics analysis, such as those implemented in the Allegro and Mapmaker-Sibs programs. We also adapt the model to include epistatic interactions and to demonstrate its usefulness in detecting modifier loci with weak individual genetic contributions. A reanalysis of data on type 1 diabetes detects several susceptibility loci not previously found by other methods of analysis.
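
    As a reminder of the underlying physics, the sketch below implements a plain one-dimensional nearest-neighbour Ising chain with a Metropolis sweep; the mapping from spins to marker-sharing states used in the genetic application is not shown, and all names are illustrative.

      import numpy as np

      def ising_energy(spins, j=1.0, h=0.0):
          """Energy of a 1-D Ising chain (free ends) with coupling j and field h."""
          return -j * np.sum(spins[:-1] * spins[1:]) - h * np.sum(spins)

      def metropolis_sweep(spins, beta, j=1.0, h=0.0, rng=np.random.default_rng()):
          """One Metropolis sweep: propose single-spin flips and accept with
          probability min(1, exp(-beta * dE))."""
          n = len(spins)
          for i in rng.permutation(n):
              neighbours = (spins[i - 1] if i > 0 else 0) + (spins[i + 1] if i < n - 1 else 0)
              d_e = 2 * spins[i] * (j * neighbours + h)     # energy change of flipping spin i
              if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
                  spins[i] = -spins[i]
          return spins

      # illustrative usage: a 50-spin chain equilibrated over 100 sweeps
      chain = np.random.default_rng(0).choice([-1, 1], size=50)
      for _ in range(100):
          metropolis_sweep(chain, beta=1.0)
      print(ising_energy(chain))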

  20. Physical-Statistical Model of Thermal Conductivity of Nanofluids

    Directory of Open Access Journals (Sweden)

    B. Usowicz

    2014-01-01

    Full Text Available A physical-statistical model for predicting the effective thermal conductivity of nanofluids is proposed. The volumetric unit of nanofluids in the model consists of solid, liquid, and gas particles and is treated as a system made up of regular geometric figures, spheres, filling the volumetric unit by layers. The model assumes that connections between layers of the spheres and between neighbouring spheres in the layer are represented by serial and parallel connections of thermal resistors, respectively. This model is expressed in terms of the thermal resistance of nanoparticles and fluids and the multinomial distribution of particles in the nanofluids. The results for predicted and measured effective thermal conductivity of several nanofluids (Al2O3/ethylene glycol-based and Al2O3/water-based; CuO/ethylene glycol-based and CuO/water-based; and TiO2/ethylene glycol-based) are presented. The physical-statistical model shows a reasonably good agreement with the experimental results and gives more accurate predictions for the effective thermal conductivity of nanofluids compared to existing classical models.
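
    The serial/parallel thermal-resistor construction at the core of the model reduces to two elementary combination rules, sketched below for generic thermal resistances; the layer-by-layer bookkeeping and the multinomial particle statistics of the actual model are omitted.

      def series_resistance(resistances):
          """Total thermal resistance of elements connected in series (between layers)."""
          return sum(resistances)

      def parallel_resistance(resistances):
          """Total thermal resistance of elements connected in parallel (within a layer)."""
          return 1.0 / sum(1.0 / r for r in resistances)

      def effective_conductivity(r_total, length, area):
          """Effective conductivity of a unit cell of thickness `length` and cross-section `area`."""
          return length / (r_total * area)

      # illustrative: two layers in series, each made of three parallel resistances
      layer = parallel_resistance([2.0, 3.0, 6.0])          # = 1.0
      print(effective_conductivity(series_resistance([layer, layer]), length=1e-3, area=1e-6))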

  1. Statistical mechanics of the Huxley-Simmons model

    Science.gov (United States)

    Caruel, M.; Truskinovsky, L.

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971), 10.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  2. A generalized statistical model for the size distribution of wealth

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find an excellent agreement with the data, superior to that of any other model known in the literature.

  3. Statistical mechanics of the Huxley-Simmons model.

    Science.gov (United States)

    Caruel, M; Truskinovsky, L

    2016-06-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971)NATUAS0028-083610.1038/233533a0] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  4. Statistical mechanics of the Huxley-Simmons model

    CERN Document Server

    Caruel, M

    2016-01-01

    The chemomechanical model of Huxley and Simmons (HS) [A. F. Huxley and R. M. Simmons, Nature 233, 533 (1971)] provides a paradigmatic description of mechanically induced collective conformational changes relevant in a variety of biological contexts, from muscles power-stroke and hair cell gating to integrin binding and hairpin unzipping. We develop a statistical mechanical perspective on the HS model by exploiting a formal analogy with a paramagnetic Ising model. We first study the equilibrium HS model with a finite number of elements and compute explicitly its mechanical and thermal properties. To model kinetics, we derive a master equation and solve it for several loading protocols. The developed formalism is applicable to a broad range of allosteric systems with mean-field interactions.

  5. Spatial Statistical Procedures to Validate Input Data in Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, G.; Stewart, J.; Barr, C.; Brady Sabeff, L.; George, R.; Heimiller, D.; Milbrandt, A.

    2006-01-01

    Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the abovementioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.

  6. Spatial Statistical Procedures to Validate Input Data in Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence Livermore National Laboratory

    2006-01-27

    Energy modeling and analysis often relies on data collected for other purposes such as census counts, atmospheric and air quality observations, economic trends, and other primarily non-energy-related uses. Systematic collection of empirical data solely for regional, national, and global energy modeling has not been established as in the above-mentioned fields. Empirical and modeled data relevant to energy modeling is reported and available at various spatial and temporal scales that might or might not be those needed and used by the energy modeling community. The incorrect representation of spatial and temporal components of these data sets can result in energy models producing misleading conclusions, especially in cases of newly evolving technologies with spatial and temporal operating characteristics different from the dominant fossil and nuclear technologies that powered the energy economy over the last two hundred years. Increased private and government research and development and public interest in alternative technologies that have a benign effect on the climate and the environment have spurred interest in wind, solar, hydrogen, and other alternative energy sources and energy carriers. Many of these technologies require much finer spatial and temporal detail to determine optimal engineering designs, resource availability, and market potential. This paper presents exploratory and modeling techniques in spatial statistics that can improve the usefulness of empirical and modeled data sets that do not initially meet the spatial and/or temporal requirements of energy models. In particular, we focus on (1) aggregation and disaggregation of spatial data, (2) predicting missing data, and (3) merging spatial data sets. In addition, we introduce relevant statistical software models commonly used in the field for various sizes and types of data sets.

  7. A Statistical Model for In Vivo Neuronal Dynamics.

    Directory of Open Access Journals (Sweden)

    Simone Carlo Surace

    Full Text Available Single neuron models have a long tradition in computational neuroscience. Detailed biophysical models such as the Hodgkin-Huxley model as well as simplified neuron models such as the class of integrate-and-fire models relate the input current to the membrane potential of the neuron. These types of models have been extensively fitted to in vitro data where the input current is controlled. They are, however, of little use when it comes to characterizing intracellular in vivo recordings, since the input to the neuron is not known. Here we propose a novel single neuron model that characterizes the statistical properties of in vivo recordings. More specifically, we propose a stochastic process where the subthreshold membrane potential follows a Gaussian process and the spike emission intensity depends nonlinearly on the membrane potential as well as the spiking history. We first show that the model has a rich dynamical repertoire since it can capture arbitrary subthreshold autocovariance functions, firing-rate adaptations as well as arbitrary shapes of the action potential. We then show that this model can be efficiently fitted to data without overfitting. We finally show that this model can be used to characterize and therefore precisely compare various intracellular in vivo recordings from different animals and experimental conditions.

  8. Joint statistics of partial sums of ordered exponential variates and performance of GSC RAKE receivers over rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2011-08-01

    Spread spectrum receivers with generalized selection combining (GSC) RAKE reception were proposed and have been studied as alternatives to the two classical fundamental schemes, maximal ratio combining and selection combining, because the number of diversity paths increases with the transmission bandwidth. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers still remain to be solved, such as the exact performance analysis of the capture probability and an exact assessment of the impact of self-interference on GSC RAKE receivers. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability and outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels, and compare them to those of partial RAKE receivers. © 2011 IEEE.
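
    The order statistics at the heart of this analysis are easy to explore by Monte Carlo. The sketch below draws i.i.d. exponential per-path SNRs (Rayleigh fading) and sums the strongest few, giving the GSC output SNR before self-interference is accounted for; all parameter values are illustrative assumptions.

      import numpy as np

      def gsc_output_snr(n_paths, n_combined, mean_snr, n_trials=100000,
                         rng=np.random.default_rng()):
          """GSC output SNR samples: sum of the n_combined largest of n_paths
          i.i.d. exponential per-path SNRs (ordered exponential variates)."""
          snr = rng.exponential(mean_snr, size=(n_trials, n_paths))
          strongest = np.sort(snr, axis=1)[:, -n_combined:]     # order statistics
          return strongest.sum(axis=1)

      # illustrative outage probability at a threshold of 0.5 for a 6-path, 3-finger receiver
      out = np.mean(gsc_output_snr(n_paths=6, n_combined=3, mean_snr=1.0) < 0.5)
      print(out)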

  9. Statistical epistasis and functional brain imaging support a role of voltage-gated potassium channels in human memory.

    Directory of Open Access Journals (Sweden)

    Angela Heck

    Full Text Available Despite the current progress in high-throughput, dense genome scans, a major portion of complex traits' heritability still remains unexplained, a phenomenon commonly termed "missing heritability." The neglect of analytical approaches accounting for gene-gene interaction effects, such as statistical epistasis, is probably central to this phenomenon. Here we performed a comprehensive two-way SNP interaction analysis of human episodic memory, which is a heritable complex trait, and focused on 120 genes known to show differential, memory-related expression patterns in rat hippocampus. Functional magnetic resonance imaging was also used to capture genotype-dependent differences in memory-related brain activity. A significant, episodic memory-related interaction between two markers located in potassium channel genes (KCNB2 and KCNH5) was observed (P(nominal) combined = 0.000001). The epistatic interaction was robust, as it was significant in a screening sample (P(nominal) = 0.0000012) and in a replication sample (P(nominal) = 0.01). Finally, we found genotype-dependent activity differences in the parahippocampal gyrus (P(nominal) = 0.001), supporting the behavioral genetics finding. Our results demonstrate the importance of analytical approaches that go beyond single marker statistics of complex traits.

  10. Calculation of statistical entropic measures in a model of solids

    CERN Document Server

    Sanudo, Jaime

    2012-01-01

    In this work, a one-dimensional model of crystalline solids based on the Dirac comb limit of the Kronig-Penney model is considered. From the wave functions of the valence electrons, we calculate a statistical measure of complexity and the Fisher-Shannon information for the lower energy electronic bands appearing in the system. All these magnitudes present an extremal value for the case of solids having half-filled bands, a configuration where in general a high conductivity is attained in real solids, as happens with the monovalent metals.

  11. Sequence-Based Pronunciation Variation Modeling for Spontaneous ASR Using a Noisy Channel Approach

    Science.gov (United States)

    Hofmann, Hansjörg; Sakti, Sakriani; Hori, Chiori; Kashioka, Hideki; Nakamura, Satoshi; Minker, Wolfgang

    The performance of English automatic speech recognition systems decreases when recognizing spontaneous speech, mainly due to multiple pronunciation variants in the utterances. Previous approaches address this problem by modeling the alteration of the pronunciation on a phoneme-to-phoneme level. However, the phonetic transformation effects induced by the pronunciation of the whole sentence have not yet been considered. In this article, the sequence-based pronunciation variation is modeled using a noisy channel approach where the spontaneous phoneme sequence is considered as a "noisy" string and the goal is to recover the "clean" string of the word sequence. Hereby, the whole word sequence and its effect on the alteration of the phonemes are taken into consideration. Moreover, the system not only learns the phoneme transformation but also the mapping from the phoneme to the word directly. In this study, the phonemes are first recognized with the present recognition system, and afterwards the pronunciation variation model based on the noisy channel approach maps from the phoneme to the word level. Two well-known natural language processing approaches are adopted and derived from the noisy channel model theory: joint-sequence models and statistical machine translation. Both of them are applied and various experiments are conducted using microphone and telephone recordings of spontaneous speech.

  12. Non-gaussianity and Statistical Anisotropy in Cosmological Inflationary Models

    CERN Document Server

    Valenzuela-Toledo, Cesar A

    2010-01-01

    We study the statistical descriptors for some cosmological inflationary models that allow us to get large levels of non-gaussianity and violations of statistical isotropy. Basically, we study two different classes of models: a model that includes only scalar field perturbations, specifically a subclass of small-field slow-roll models of inflation with canonical kinetic terms, and models that admit both vector and scalar field perturbations. We study the former to show that it is possible to attain very high, including observable, values for the levels of non-gaussianity f_{NL} and \tau_{NL} in the bispectrum B_\zeta and trispectrum T_\zeta of the primordial curvature perturbation \zeta, respectively. Such a result is obtained by taking care of loop corrections in the spectrum P_\zeta, the bispectrum B_\zeta and the trispectrum T_\zeta. Sizeable values for f_{NL} and \tau_{NL} arise even if \zeta is generated during inflation. For the latter we study the spectrum P_\zeta, bispectrum B_\zeta and trispectrum $T_\ze...

  13. Efficient Parallel Statistical Model Checking of Biochemical Networks

    Directory of Open Access Journals (Sweden)

    Paolo Ballarini

    2009-12-01

    Full Text Available We consider the problem of verifying stochastic models of biochemical networks against behavioral properties expressed in temporal logic terms. Exact probabilistic verification approaches such as, for example, CSL/PCTL model checking, are undermined by a huge computational demand which rules them out for most real case studies. Less demanding approaches, such as statistical model checking, estimate the likelihood that a property is satisfied by sampling executions out of the stochastic model. We propose a methodology for efficiently estimating the likelihood that an LTL property P holds of a stochastic model of a biochemical network. As with other statistical verification techniques, the methodology we propose uses a stochastic simulation algorithm for generating execution samples; however, there are three key aspects that improve the efficiency: first, the sample generation is driven by on-the-fly verification of P, which results in optimal overall simulation time. Second, the confidence interval estimation for the probability of P to hold is based on an efficient variant of the Wilson method which ensures a faster convergence. Third, the whole methodology is designed according to a parallel fashion and a prototype software tool has been implemented that performs the sampling/verification process in parallel over an HPC architecture.
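
    The Wilson confidence interval mentioned as the second ingredient has a simple closed form. A minimal sketch, assuming a 95% level (z = 1.96) and an illustrative, made-up sample of satisfied runs:

      from math import sqrt

      def wilson_interval(successes, n, z=1.96):
          """Wilson score interval for a binomial proportion."""
          p = successes / n
          denom = 1.0 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      # e.g. 37 of 200 sampled trajectories satisfy the LTL property
      print(wilson_interval(37, 200))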

  14. Anyonic behavior of an intermediate-statistics fermion gas model.

    Science.gov (United States)

    Algin, Abdullah; Irk, Dursun; Topcu, Gozde

    2015-06-01

    We study the high-temperature behavior of an intermediate-statistics fermionic gas model whose quantum statistical properties enable us to effectively deduce the details about both the interaction among deformed (quasi)particles and their anyonic behavior. Starting with a deformed fermionic grand partition function, we calculate, in the thermodynamical limit, several thermostatistical functions of the model such as the internal energy and the entropy by means of a formalism of the fermionic q calculus. For high temperatures, a virial expansion of the equation of state for the system is obtained in two and three dimensions and the first five virial coefficients are derived in terms of the model deformation parameter q. From the results obtained by the effect of fermionic deformation, it is found that the model parameter q interpolates completely between bosonlike and fermionic systems via the behaviors of the third and fifth virial coefficients in both two and three spatial dimensions and in addition it characterizes effectively the interaction among quasifermions. Our results reveal that the present deformed (quasi)fermion model could be very efficient and effective in accounting for the nonlinear behaviors in interacting composite particle systems.

  15. Cascaded Network Body Channel Model for Intrabody Communication.

    Science.gov (United States)

    Wang, Hao; Tang, Xian; Choy, Chiu Sing; Sobelman, Gerald E

    2016-07-01

    Intrabody communication has been of great research interest in recent years. This paper proposes a novel, compact but accurate body transmission channel model based on RC distribution networks and transmission line theory. The comparison between simulation and measurement results indicates that the proposed approach accurately models the body channel characteristics. In addition, the impedance-matching networks at the transmitter output and the receiver input further maximize the power transferred to the receiver, relax the receiver complexity, and increase the transmission performance. Based on the simulation results, the power gain can be increased by up to 16 dB after matching. A binary phase-shift keying modulation scheme is also used to evaluate the bit-error-rate improvement.
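
    The cascading of RC segments via transmission-line (two-port ABCD) algebra that underlies such a model can be sketched as follows; the segment values and the load in the usage line are assumptions for illustration, not the paper's fitted body-channel parameters.

      import numpy as np

      def rc_segment_abcd(r_series, c_shunt, omega):
          """ABCD matrix of one series-R / shunt-C segment at angular frequency omega."""
          z, y = r_series, 1j * omega * c_shunt
          return np.array([[1.0 + z * y, z],
                           [y, 1.0]])

      def cascade(segments):
          """Overall ABCD matrix of cascaded two-port segments (ordered matrix product)."""
          m = np.eye(2, dtype=complex)
          for seg in segments:
              m = m @ seg
          return m

      def voltage_gain(abcd, z_load):
          """V_out / V_in of the cascade terminated in z_load."""
          a, b = abcd[0, 0], abcd[0, 1]
          return z_load / (a * z_load + b)

      # illustrative: ten identical segments at 10 MHz into a 1 kilo-ohm load
      omega = 2 * np.pi * 10e6
      chain = cascade([rc_segment_abcd(100.0, 10e-12, omega) for _ in range(10)])
      print(abs(voltage_gain(chain, 1e3)))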

  16. Contributions in Radio Channel Sounding, Modeling, and Estimation

    DEFF Research Database (Denmark)

    Pedersen, Troels

    2009-01-01

    the necessary and sufficient conditions for spatio-temporal apertures to minimize the Cramer-Rao lower bound on the joint bi-direction and Doppler frequency estimation. The spatio-temporal aperture also impacts on the accuracy of MIMO-capacity estimation from measurements impaired by colored phase noise. We ..., than corresponding results from the literature. These findings indicate that the per-path directional spreads (or cluster spreads) assumed in standard models are set too large. Finally, we propose a model of the specular-to-diffuse transition observed in measurements of reverberant channels. The model...

  17. A statistical permafrost distribution model for the European Alps

    Directory of Open Access Journals (Sweden)

    L. Boeckli

    2011-05-01

    Full Text Available Permafrost distribution modeling in densely populated mountain regions is an important task to support the construction of infrastructure and for the assessment of climate change effects on permafrost and related natural systems. In order to analyze permafrost distribution and evolution on an Alpine-wide scale, one consistent model for the entire domain is needed.

    We present a statistical permafrost model for the entire Alps based on rock glacier inventories and rock surface temperatures. Starting from an integrated model framework, two different sub-models were developed, one for debris-covered areas (debris model) and one for steep rock faces (rock model). For the debris model a generalized linear mixed-effect model (GLMM) was used to predict the probability of a rock glacier being intact as opposed to relict. The model is based on the explanatory variables mean annual air temperature (MAAT), potential incoming solar radiation (PISR) and the mean annual sum of precipitation (PRECIP), and achieves an excellent discrimination (area under the receiver-operating characteristic, AUROC = 0.91). Surprisingly, the probability of a rock glacier being intact is positively associated with increasing PRECIP for given MAAT and PISR conditions. The rock model was calibrated with mean annual rock surface temperatures (MARST) and is based on MAAT and PISR. The linear regression achieves a root mean square error (RMSE) of 1.6 °C. The final model combines the two sub-models and accounts for the different scales used for model calibration. Further steps to transfer this model into a map-based product are outlined.

  18. The Impact of Statistical Leakage Models on Design Yield Estimation

    Directory of Open Access Journals (Sweden)

    Rouwaida Kanj

    2011-01-01

    Full Text Available Device mismatch and process variation models play a key role in determining the functionality and yield of sub-100 nm designs. Average characteristics are often of interest, such as the average leakage current or the average read delay. However, detecting rare functional fails is critical for memory design, and designers often seek techniques that enable such events to be modeled accurately. Extremely leaky devices can inflict functionality fails. The plurality of leaky devices on a bitline increases the dimensionality of the yield estimation problem. Simplified models are possible by adopting approximations to the underlying sum of lognormals. The implications of such approximations on tail probabilities may in turn bias the yield estimate. We review different closed-form approximations and compare them against the CDF matching method, which is shown to be the most effective method for accurate statistical leakage modeling.
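
    One of the simplest closed-form approximations to a sum of lognormals is moment matching (Fenton-Wilkinson), shown below as a point of comparison with the CDF-matching method the paper favours; the parameterization (a natural-log mean and deviation per device) and the usage values are generic assumptions.

      import numpy as np

      def fenton_wilkinson(mu, sigma):
          """Lognormal (mu_s, sigma_s) matching the first two moments of a sum of
          independent lognormals with natural-log parameters mu[i], sigma[i]."""
          mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
          m1 = np.sum(np.exp(mu + sigma**2 / 2))                                  # mean of the sum
          m2 = np.sum(np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1.0))       # variance of the sum
          sigma_s2 = np.log(1.0 + m2 / m1**2)
          mu_s = np.log(m1) - sigma_s2 / 2.0
          return mu_s, np.sqrt(sigma_s2)

      # illustrative: total leakage of 64 identical devices on a bitline
      print(fenton_wilkinson(np.full(64, -2.0), np.full(64, 0.8)))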

  19. Liver recognition based on statistical shape model in CT images

    Science.gov (United States)

    Xiang, Dehui; Jiang, Xueqing; Shi, Fei; Zhu, Weifang; Chen, Xinjian

    2016-03-01

    In this paper, an automatic method is proposed to recognize the liver in clinical 3D CT images. The proposed method makes effective use of a statistical shape model of the liver. Our approach consists of three main parts: (1) model training, in which shape variability is extracted from manual annotations using principal component analysis; (2) model localization, in which a fast method based on the Euclidean distance transformation localizes the liver in CT images; (3) liver recognition, in which the initial mesh is locally and iteratively adapted to the liver boundary, constrained by the trained shape model. We validate our algorithm on a dataset consisting of 20 3D CT images obtained from different patients. The average ARVD was 8.99%, the average ASSD was 2.69 mm, the average RMSD was 4.92 mm, the average MSD was 28.841 mm, and the average MSD was 13.31%.
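
    The training and shape-constraint steps of such a statistical shape model reduce to a PCA over aligned landmark vectors. The sketch below is a generic illustration, not the paper's pipeline; the retained-variance threshold and all names are assumptions.

      import numpy as np

      def train_shape_model(shapes, variance_kept=0.98):
          """shapes: (n_samples, 3*n_landmarks) aligned coordinates.
          Returns the mean shape, principal modes and per-mode variances."""
          mean = shapes.mean(axis=0)
          u, s, vt = np.linalg.svd(shapes - mean, full_matrices=False)
          var = s**2 / (len(shapes) - 1)
          k = int(np.searchsorted(np.cumsum(var) / var.sum(), variance_kept)) + 1
          return mean, vt[:k], var[:k]

      def constrain_to_model(shape, mean, modes, var, n_std=3.0):
          """Project a candidate shape onto the model and clamp each mode
          coefficient to +/- n_std standard deviations (plausibility constraint)."""
          b = modes @ (shape - mean)
          b = np.clip(b, -n_std * np.sqrt(var), n_std * np.sqrt(var))
          return mean + modes.T @ b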

  20. Statistical models of video structure for content analysis and characterization.

    Science.gov (United States)

    Vasconcelos, N; Lippman, A

    2000-01-01

    Content structure plays an important role in the understanding of video. In this paper, we argue that knowledge about structure can be used both as a means to improve the performance of content analysis and to extract features that convey semantic information about the content. We introduce statistical models for two important components of this structure, shot duration and activity, and demonstrate the usefulness of these models with two practical applications. First, we develop a Bayesian formulation for the shot segmentation problem that is shown to extend the standard thresholding model in an adaptive and intuitive way, leading to improved segmentation accuracy. Second, by applying the transformation into the shot duration/activity feature space to a database of movie clips, we also illustrate how the Bayesian model captures semantic properties of the content. We suggest ways in which these properties can be used as a basis for intuitive content-based access to movie libraries.

  1. Eikonal solutions to optical model coupled-channel equations

    Science.gov (United States)

    Cucinotta, Francis A.; Khandelwal, Govind S.; Maung, Khin M.; Townsend, Lawrence W.; Wilson, John W.

    1988-01-01

    Methods of solution are presented for the Eikonal form of the nucleus-nucleus coupled-channel scattering amplitudes. Analytic solutions are obtained for the second-order optical potential for elastic scattering. A numerical comparison is made between the first and second order optical model solutions for elastic and inelastic scattering of H-1 and He-4 on C-12. The effects of bound-state excitations on total and reaction cross sections are also estimated.

  2. Molecular dynamics simulations of water within models of ion channels.

    Science.gov (United States)

    Breed, J; Sankararamakrishnan, R; Kerr, I D; Sansom, M S

    1996-04-01

    The transbilayer pores formed by ion channel proteins contain extended columns of water molecules. The dynamic properties of such waters have been suggested to differ from those of water in its bulk state. Molecular dynamics simulations of ion channel models solvated within and at the mouths of their pores are used to investigate the dynamics and structure of intra-pore water. Three classes of channel model are investigated: a) parallel bundles of hydrophobic (Ala20) alpha-helices; b) eight-stranded hydrophobic (Ala10) antiparallel beta-barrels; and c) parallel bundles of amphipathic alpha-helices (namely, delta-toxin, alamethicin, and nicotinic acetylcholine receptor M2 helix). The self-diffusion coefficients of water molecules within the pores are reduced significantly relative to bulk water in all of the models. Water rotational reorientation rates are also reduced within the pores, particularly in those pores formed by alpha-helix bundles. In the narrowest pore (that of the Ala20 pentameric helix bundle) self-diffusion coefficients and reorientation rates of intra-pore waters are reduced by approximately an order of magnitude relative to bulk solvent. In Ala20 helix bundles the water dipoles orient antiparallel to the helix dipoles. Such dipole/dipole interaction between water and pore may explain how water-filled ion channels may be formed by hydrophobic helices. In the bundles of amphipathic helices the orientation of water dipoles is modulated by the presence of charged side chains. No preferential orientation of water dipoles relative to the pore axis is observed in the hydrophobic beta-barrel models.

  3. Basic equations of channel model for underground coal gasification

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Underground coal gasification has the advantages of producing no solid waste, causing no pollution, low cost and high safety. According to the characteristics of the gasification process, the channel model of chemical fluid mechanics is used in this paper to set up the fluid equations and chemical equations under some reasonable assumptions, which lays a theoretical foundation for describing the rules of fluid movement in the process of underground coal gasification.

  4. Statistical Inference for Partially Linear Regression Models with Measurement Errors

    Institute of Scientific and Technical Information of China (English)

    Jinhong YOU; Qinfeng XU; Bin ZHOU

    2008-01-01

    In this paper, the authors investigate three aspects of statistical inference for partially linear regression models where some covariates are measured with errors. Firstly, a bandwidth selection procedure is proposed, which is a combination of the difference-based technique and the GCV method. Secondly, a goodness-of-fit test procedure is proposed, which is an extension of the generalized likelihood technique. Thirdly, a variable selection procedure for the parametric part is provided based on nonconcave penalization and corrected profile least squares. As in "Variable selection via nonconcave penalized likelihood and its oracle properties" (J. Amer. Statist. Assoc., 96, 2001, 1348-1360), it is shown that the resulting estimator has an oracle property with a proper choice of regularization parameters and penalty function. Simulation studies are conducted to illustrate the finite sample performances of the proposed procedures.

  5. WE-A-201-02: Modern Statistical Modeling.

    Science.gov (United States)

    Niemierko, A

    2016-06-01

    Chris Marshall: Memorial Introduction Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the "big tent" vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that "Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]". Don developed an interest in chemistry at school by "reading a book" - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University) where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit working in part on a sickle-cell disease. After retirement he remained active as Professor Emeritus. Don served for several years as a consultant to the Nuclear Regulatory

  6. Modeling phenotypic plasticity in growth trajectories: a statistical framework.

    Science.gov (United States)

    Wang, Zhong; Pang, Xiaoming; Wu, Weimiao; Wang, Jianxin; Wang, Zuoheng; Wu, Rongling

    2014-01-01

    Phenotypic plasticity, that is multiple phenotypes produced by a single genotype in response to environmental change, has been thought to play an important role in evolution and speciation. Historically, knowledge about phenotypic plasticity has resulted from the analysis of static traits measured at a single time point. New insight into the adaptive nature of plasticity can be gained by an understanding of how organisms alter their developmental processes in a range of environments. Recent advances in statistical modeling of functional data and developmental genetics allow us to construct a dynamic framework of plastic response in developmental form and pattern. Under this framework, development, genetics, and evolution can be synthesized through statistical bridges to better address how evolution results from phenotypic variation in the process of development via genetic alterations.

  7. The Statistical Multifragmentation Model with Skyrme Effective Interactions

    CERN Document Server

    Souza, S R; Donangelo, R; Lynch, W G; Steiner, A W; Tsang, M B

    2009-01-01

    The Statistical Multifragmentation Model is modified to incorporate the Helmholtz free energies calculated in the finite temperature Thomas-Fermi approximation using Skyrme effective interactions. In this formulation, the density of the fragments at the freeze-out configuration corresponds to the equilibrium value obtained in the Thomas-Fermi approximation at the given temperature. The behavior of the nuclear caloric curve at constant volume is investigated in the micro-canonical ensemble and a plateau is observed for excitation energies between 8 and 10 MeV per nucleon. A kink in the caloric curve is found at the onset of this gas transition, indicating the existence of a small excitation energy region with negative heat capacity. In contrast to previous statistical calculations, this situation takes place even in this case in which the system is constrained to fixed volume. The observed phase transition takes place at approximately constant entropy. The charge distribution and other observables also turn ou...

  8. Estimating Predictive Variance for Statistical Gas Distribution Modelling

    Science.gov (United States)

    Lilienthal, Achim J.; Asadi, Sahar; Reggente, Matteo

    2009-05-01

    Recent publications in statistical gas distribution modelling have proposed algorithms that model the mean and variance of a distribution. This paper argues that estimating the predictive concentration variance entails not only a gradual improvement but is rather a significant step to advance the field. This is, first, because the models much better fit the particular structure of gas distributions, which exhibit strong fluctuations with considerable spatial variations as a result of the intermittent character of gas dispersal. Second, because estimating the predictive variance makes it possible to evaluate the model quality in terms of the data likelihood. This offers a solution to the problem of ground truth evaluation, which has always been a critical issue for gas distribution modelling. It also enables solid comparisons of different modelling approaches, and provides the means to learn meta parameters of the model, to determine when the model should be updated or re-initialised, or to suggest new measurement locations based on the current model. We also point out directions of related ongoing or potential future research work.

  9. Statistical analysis on the optical fading in free space optical channel for RoFSO link design

    Science.gov (United States)

    Kim, Kyung-Hwan; Higashino, Takeshi; Tsukamoto, Katsutoshi; Komaki, Shozo; Kazaura, Kamugisha; Matsumoto, Mitsuji

    2010-01-01

    This paper presents empirical probability density functions (p.d.f.s) of the variance and fluctuation speed of scintillation, obtained by analyzing a number of experimental data sets measured in Japan with a statistical model. The model enables us to treat the scintillation speed through a single parameter, the cut-off frequency of the power spectral density (PSD). Using the model and the two p.d.f.s, we also present simulation results on the level crossing rate (LCR) and average fade duration (AFD). Combining the two results, outage probabilities corresponding to a threshold optical intensity can be derived.

  10. Modeling magnetosensitive ion channels in viscoelastic environment of living cells

    CERN Document Server

    Goychuk, Igor

    2015-01-01

    We propose and study a model of hypothetical magnetosensitive ionic channels which have long been thought to be a possible candidate to explain the influence of weak magnetic fields on living organisms ranging from magnetotactic bacteria to fishes, birds, rats, bats and other mammals, including humans. The core of the model is provided by a short chain of magnetosomes serving as a sensor which is coupled by elastic linkers to the gating elements of ion channels forming a small cluster in the cell membrane. The magnetic sensor is fixed by one end on cytoskeleton elements attached to the membrane and is exposed to viscoelastic cytosol. Its free end can reorient stochastically and subdiffusively in the viscoelastic cytosol, responding to external magnetic field changes and opening the gates of the coupled ion channels. The sensor dynamics is generally bistable due to the bistability of the gates, which can be in two states with probabilities that depend on the sensor orientation. For realistic parameters, it is shown that this model c...

  11. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
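
    The additive Holt-Winters method used as the prediction component has a compact recursive form. The sketch below is a generic textbook version with simple initialization and arbitrarily chosen smoothing constants, not the authors' tuned implementation; the synthetic series in the usage is an assumption.

      import numpy as np

      def additive_holt_winters(y, period, alpha=0.3, beta=0.1, gamma=0.1):
          """One-step-ahead in-sample predictions from additive Holt-Winters
          smoothing: a level, a trend and `period` seasonal terms updated recursively."""
          y = np.asarray(y, dtype=float)
          level, trend = y[0], y[1] - y[0]
          season = list(y[:period] - y[:period].mean())       # crude seasonal initialization
          preds = []
          for t, obs in enumerate(y):
              s = season[t % period]
              preds.append(level + trend + s)                 # forecast before seeing obs
              new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
              trend = beta * (new_level - level) + (1 - beta) * trend
              season[t % period] = gamma * (obs - new_level) + (1 - gamma) * s
              level = new_level
          return np.array(preds)

      # illustrative: residuals with a trend and a periodic signature plus noise
      t = np.arange(200)
      series = 0.01 * t + np.sin(2 * np.pi * t / 20) + 0.1 * np.random.default_rng(1).standard_normal(200)
      print(np.abs(series - additive_holt_winters(series, period=20)).mean())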

  12. Statistical process control of a Kalman filter model.

    Science.gov (United States)

    Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A

    2014-09-26

    For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of the residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. The statistical tests include checking the convergence of the standard deviations of the system state components and the normal distribution of the residuals, in addition to the standard tests. In particular, computing the controllability and observability matrices and checking the normal distribution of the residuals are not standard procedures in the implementation of the KF. Practical implementation is done on geodetic kinematic observations.
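
    One measurement-domain check of this kind, a chi-square test on the normalized innovation squared, fits naturally into a single predict/update step. The sketch below is a generic linear KF step with that check; the model matrices and the threshold (e.g., a chi-square quantile for the measurement dimension) are supplied by the caller and are assumptions here.

      import numpy as np

      def kf_step_with_test(x, p, z, f, h, q, r, nis_threshold):
          """Linear Kalman predict/update with a consistency check on the
          normalized innovation squared (NIS)."""
          # predict
          x_pred = f @ x
          p_pred = f @ p @ f.T + q
          # innovation and its covariance
          v = z - h @ x_pred
          s = h @ p_pred @ h.T + r
          nis = float(v @ np.linalg.solve(s, v))            # approx. chi-square if consistent
          passed = nis < nis_threshold                      # e.g. 5.99 for 2 measurements at 95%
          # update
          k = p_pred @ h.T @ np.linalg.inv(s)
          x_new = x_pred + k @ v
          p_new = (np.eye(len(x)) - k @ h) @ p_pred
          return x_new, p_new, nis, passed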

  13. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
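
    To illustrate the adaptive MOS idea, the sketch below corrects synthetic NWP wind-speed forecasts with recursive least squares and a forgetting factor. The regression obs ~ a*nwp + b, the forgetting factor and all numbers are assumptions for the example, not the Risoe/DTU implementation.

```python
# Minimal MOS sketch with synthetic data: adaptively correct NWP wind-speed
# forecasts with recursive least squares and a forgetting factor, so the
# correction can track slow, time-dependent changes in the NWP bias.
import numpy as np

rng = np.random.default_rng(1)
n = 500
nwp = rng.uniform(2.0, 14.0, n)                      # NWP wind speed [m/s]
obs = 0.8 * nwp + 1.5 + rng.normal(0.0, 0.8, n)      # "measured" wind speed

lam = 0.995                      # forgetting factor (< 1 gives adaptivity)
theta = np.zeros(2)              # parameters of obs ~ a * nwp + b
P = 1e3 * np.eye(2)              # parameter covariance

for t in range(n):
    phi = np.array([nwp[t], 1.0])                    # regressor
    err = obs[t] - phi @ theta                       # one-step prediction error
    gain = P @ phi / (lam + phi @ P @ phi)           # RLS gain
    theta = theta + gain * err
    P = (P - np.outer(gain, phi @ P)) / lam

print("estimated correction: obs ~ %.2f * nwp + %.2f" % (theta[0], theta[1]))
```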

  14. A Statistical Quality Model for Data-Driven Speech Animation.

    Science.gov (United States)

    Ma, Xiaohan; Deng, Zhigang

    2012-11-01

    In recent years, data-driven speech animation approaches have achieved significant successes in terms of animation quality. However, how to automatically evaluate the realism of novel synthesized speech animations has been an important yet unsolved research problem. In this paper, we propose a novel statistical model (called SAQP) to automatically predict the quality of on-the-fly synthesized speech animations by various data-driven techniques. Its essential idea is to construct a phoneme-based Speech Animation Trajectory Fitting (SATF) metric to describe speech animation synthesis errors and then build a statistical regression model to learn the association between the obtained SATF metric and the objective speech animation synthesis quality. Through carefully designed user studies, we evaluate the effectiveness and robustness of the proposed SAQP model. To the best of our knowledge, this work is the first-of-its-kind quantitative quality model for data-driven speech animation. We believe it is an important first step towards removing a critical technical barrier to applying data-driven speech animation techniques to numerous online or interactive talking avatar applications.

  15. Statistical Process Control of a Kalman Filter Model

    Science.gov (United States)

    Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A.

    2014-01-01

    For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. The statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals, in addition to the standard tests. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations. PMID:25264959

  16. Statistical Process Control of a Kalman Filter Model

    Directory of Open Access Journals (Sweden)

    Sonja Gamse

    2014-09-01

    Full Text Available For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. The statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals, in addition to the standard tests. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations.

  17. A statistical model for interpreting computerized dynamic posturography data

    Science.gov (United States)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
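
    The data-generating idea described above can be illustrated with a small simulation; the logistic fall probability and all parameter values below are invented for illustration and are not the model fitted in the paper.

```python
# Illustrative sketch of the latent-score mechanism: a latent equilibrium
# score (ES) always exists, a fall occurs with a probability that decreases
# with the latent ES, and fallen trials are recorded as ES = 0, giving a
# mixed discrete-continuous observed distribution. All numbers are assumed.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 10_000

latent_es = np.clip(rng.normal(70.0, 12.0, n_trials), 1.0, 100.0)

# conditional probability of losing balance, given the realized latent ES
p_fall = 1.0 / (1.0 + np.exp(0.15 * (latent_es - 40.0)))    # logistic link
fall = rng.random(n_trials) < p_fall

observed_es = np.where(fall, 0.0, latent_es)

print(f"fraction of trials scored zero (falls): {fall.mean():.3f}")
print(f"mean observed ES, all trials:           {observed_es.mean():.1f}")
print(f"mean observed ES, non-fall trials:      {observed_es[~fall].mean():.1f}")
```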

  18. A data-driven model of a modal gated ion channel: the inositol 1,4,5-trisphosphate receptor in insect Sf9 cells.

    Science.gov (United States)

    Ullah, Ghanim; Mak, Don-On Daniel; Pearson, John E

    2012-08-01

    The inositol 1,4,5-trisphosphate (IP(3)) receptor (IP(3)R) channel is crucial for the generation and modulation of intracellular Ca(2+) signals in animal cells. To gain insight into the complicated ligand regulation of this ubiquitous channel, we constructed a simple quantitative continuous-time Markov-chain model from the data. Our model accounts for most experimentally observed gating behaviors of single native IP(3)R channels from insect Sf9 cells. Ligand (Ca(2+) and IP(3)) dependencies of channel activity established six main ligand-bound channel complexes, where a complex consists of one or more states with the same ligand stoichiometry and open or closed conformation. Channel gating in three distinct modes added one complex and indicated that three complexes gate in multiple modes. This also restricted the connectivity between channel complexes. Finally, latencies of channel responses to abrupt ligand concentration changes defined a model with specific network topology between 9 closed and 3 open states. The model with 28 parameters can closely reproduce the equilibrium gating statistics for all three gating modes over a broad range of ligand concentrations. It also captures the major features of channel response latency distributions. The model can generate falsifiable predictions of IP(3)R channel gating behaviors and provide insights to both guide future experiment development and improve IP(3)R channel gating analysis. Maximum likelihood estimates of the model parameters and of the parameters in the De Young-Keizer model yield strong statistical evidence in favor of our model. Our method is simple and easily applicable to the dynamics of other ion channels and molecules.
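
    For readers unfamiliar with this modelling style, the sketch below simulates a toy three-state gating scheme (C1 <-> C2 <-> O) with invented, ligand-scaled rates using the Gillespie algorithm; it is only an illustration of a continuous-time Markov-chain channel model, not the 9-closed/3-open, 28-parameter IP(3)R model of the paper.

```python
# Toy continuous-time Markov chain for channel gating: a hypothetical
# C1 <-> C2 <-> O scheme with invented, ligand-scaled rates (per second),
# simulated with the Gillespie algorithm.
import numpy as np

def gillespie_occupancy(q, t_end, rng, start=0):
    """Fraction of time spent in each state of a CTMC with rate matrix q."""
    t, state = 0.0, start
    occupancy = np.zeros(q.shape[0])
    while t < t_end:
        rates = q[state].copy()
        rates[state] = 0.0
        total = rates.sum()
        dwell = rng.exponential(1.0 / total)
        occupancy[state] += min(dwell, t_end - t)
        t += dwell
        state = rng.choice(len(rates), p=rates / total)
    return occupancy / t_end

ip3 = 2.0                                    # hypothetical ligand level (uM)
q = np.array([[0.0,  5.0 * ip3,  0.0],       # C1 -> C2 scales with ligand
              [20.0, 0.0,        50.0],      # C2 -> C1 and C2 -> O
              [0.0,  10.0,       0.0]])      # O  -> C2

rng = np.random.default_rng(3)
occ = gillespie_occupancy(q, t_end=200.0, rng=rng)
print("fraction of time in C1, C2, O:", np.round(occ, 3))
print("open probability:", round(occ[2], 3))
```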

  19. Discrete dynamical models: combinatorics, statistics and continuum approximations

    CERN Document Server

    Kornyak, Vladimir V

    2015-01-01

    This essay advocates the view that any problem that has meaningful empirical content can be formulated in constructive, more specifically finite, terms. We consider combinatorial models of dynamical systems and approaches to the statistical description of such models. We demonstrate that many concepts of continuous physics, such as continuous symmetries, the principle of least action, Lagrangians and deterministic evolution equations, can be obtained from combinatorial structures as a result of the large number approximation. We propose a constructive description of quantum behavior that provides, in particular, a natural explanation of the appearance of complex numbers in the formalism of quantum mechanics. Some approaches to the construction of discrete models of quantum evolution that involve gauge connections are discussed.

  20. Exploiting linkage disequilibrium in statistical modelling in quantitative genomics

    DEFF Research Database (Denmark)

    Wang, Lei

    Alleles at two loci are said to be in linkage disequilibrium (LD) when they are correlated or statistically dependent. Genomic prediction and gene mapping rely on the existence of LD between genetic markers and causal variants of complex traits. In the first part of the thesis, a novel method...... to quantify and visualize local variation in LD along chromosomes is described, and applied to characterize LD patterns at the local and genome-wide scale in three Danish pig breeds. In the second part, different ways of taking LD into account in genomic prediction models are studied. One approach is to use...... the recently proposed antedependence models, which treat neighbouring marker effects as correlated; another approach involves use of haplotype block information derived using the program Beagle. The overall conclusion is that taking LD information into account in genomic prediction models potentially improves...

  1. Statistical modelling and deconvolution of yield meter data

    DEFF Research Database (Denmark)

    Tøgersen, Frede Aakmann; Waagepetersen, Rasmus Plenge

    2004-01-01

    This paper considers the problem of mapping spatial variation of yield in a field using data from a yield monitoring system on a combine harvester. The unobserved yield is assumed to be a Gaussian random field and the yield monitoring system data is modelled as a convolution of the yield...... and an impulse response function. This results in an unusual spatial covariance structure (depending on the driving pattern of the combine harvester) for the yield monitoring system data. Parameters of the impulse response function and the spatial covariance function of the yield are estimated using maximum...... likelihood methods. The fitted model is assessed using certain empirical directional covariograms and the yield is finally predicted using the inferred statistical model....
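
    A tiny one-dimensional illustration of the measurement mechanism described above: the paper works with a spatial Gaussian random field and estimates the impulse response by maximum likelihood, whereas here the AR(1) yield surrogate, the exponential impulse response and the noise level are all assumed values.

```python
# One-dimensional illustration: the yield monitor signal is the true yield
# convolved with the combine's impulse response, plus noise. All shapes and
# parameter values are invented for the sketch.
import numpy as np

rng = np.random.default_rng(13)
n = 300

true_yield = np.empty(n)                    # smooth "true" yield along the pass
true_yield[0] = 8.0
for i in range(1, n):
    true_yield[i] = 8.0 + 0.95 * (true_yield[i - 1] - 8.0) + rng.normal(0.0, 0.15)

lags = np.arange(30)                        # assumed grain-flow impulse response
h = np.exp(-lags / 6.0)
h /= h.sum()

monitor = np.convolve(true_yield, h, mode="full")[:n] + rng.normal(0.0, 0.1, n)

print("true yield mean %.2f, monitor mean %.2f" % (true_yield.mean(), monitor.mean()))
print("correlation of monitor with true yield: %.2f"
      % np.corrcoef(true_yield, monitor)[0, 1])
```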

  2. RM-structure alignment based statistical machine translation model

    Institute of Scientific and Technical Information of China (English)

    Sun Jiadong; Zhao Tiejun

    2008-01-01

    A novel model based on structure alignments is proposed for statistical machine translation in this paper. A meta-structure and a sequence of meta-structures for a parse tree are defined. During the translation process, a parse tree is decomposed to deal with structure divergence, and the alignments can be constructed at different levels of recombination of meta-structure (RM). This method can perform structure mapping across sub-tree structures between languages. As a result, we obtain not only the translation for the target language, but also the sequence of meta-structures of its parse tree at the same time. Experiments show that the model, in the framework of a log-linear model, has better generative ability and significantly outperforms Pharaoh, a phrase-based system.

  3. Dynamic statistical models of biological cognition: insights from communications theory

    Science.gov (United States)

    Wallace, Rodrick

    2014-10-01

    Maturana's cognitive perspective on the living state, Dretske's insight on how information theory constrains cognition, the Atlan/Cohen cognitive paradigm, and models of intelligence without representation, permit construction of a spectrum of dynamic necessary conditions statistical models of signal transduction, regulation, and metabolism at and across the many scales and levels of organisation of an organism and its context. Nonequilibrium critical phenomena analogous to physical phase transitions, driven by crosstalk, will be ubiquitous, representing not only signal switching, but the recruitment of underlying cognitive modules into tunable dynamic coalitions that address changing patterns of need and opportunity at all scales and levels of organisation. The models proposed here, while certainly providing much conceptual insight, should be most useful in the analysis of empirical data, much as are fitted regression equations.

  4. Modeling, dependence, classification, united statistical science, many cultures

    CERN Document Server

    Parzen, Emanuel

    2012-01-01

    Breiman (2001) proposed to statisticians awareness of two cultures: 1. Parametric modeling culture, pioneered by R.A.Fisher and Jerzy Neyman; 2. Algorithmic predictive culture, pioneered by machine learning research. Parzen (2001), as a part of discussing Breiman (2001), proposed that researchers be aware of many cultures, including the focus of our research: 3. Nonparametric, quantile based, information theoretic modeling. Our research seeks to unify statistical problem solving in terms of comparison density, copula density, measure of dependence, correlation, information, new measures (called LP score comoments) that apply to long-tailed distributions without finite second order moments. A very important goal is to unify methods for discrete and continuous random variables. We are actively developing these ideas, which have a history of many decades, since Parzen (1979, 1983) and Eubank et al. (1987). Our research extends these methods to modern high dimensional data modeling.

  5. Hybrid Perturbation methods based on Statistical Time Series models

    CERN Document Server

    San-Juan, Juan Félix; Pérez, Iván; López, Rosario

    2016-01-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of a...

  6. How Good Are Statistical Models at Approximating Complex Fitness Landscapes?

    Science.gov (United States)

    du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian

    2016-01-01

    Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564

  7. How Good Are Statistical Models at Approximating Complex Fitness Landscapes?

    Science.gov (United States)

    du Plessis, Louis; Leventhal, Gabriel E; Bonhoeffer, Sebastian

    2016-09-01

    Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations.
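
    The regression idea can be sketched on a toy random landscape (binary alleles, additive effects plus sparse pairwise epistasis, all invented), fitting single-site and pairwise terms by least squares and checking out-of-sample quality; this is not the RNA landscape or the sampling regimes analysed in the paper.

```python
# Toy random landscape: additive single-mutation effects plus sparse pairwise
# epistasis, all invented. A least-squares fit with single-site and pairwise
# terms is then assessed out of sample.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
L = 8
genotypes = rng.integers(0, 2, size=(2000, L))

a = rng.normal(0.0, 1.0, L)                                  # additive effects
B = np.triu(rng.normal(0.0, 0.5, (L, L)) * (rng.random((L, L)) < 0.3), k=1)
fitness = genotypes @ a + np.einsum('ni,ij,nj->n', genotypes, B, genotypes)
fitness = fitness + rng.normal(0.0, 0.1, len(genotypes))     # measurement noise

def design(x):
    """Design matrix with intercept, single-site and pairwise terms."""
    pairs = [x[:, i] * x[:, j] for i, j in combinations(range(L), 2)]
    return np.column_stack([np.ones(len(x)), x] + pairs)

X = design(genotypes)
train, test = slice(0, 1500), slice(1500, None)
coef, *_ = np.linalg.lstsq(X[train], fitness[train], rcond=None)

pred = X[test] @ coef
ss_res = np.sum((fitness[test] - pred) ** 2)
ss_tot = np.sum((fitness[test] - fitness[test].mean()) ** 2)
print(f"out-of-sample R^2 of the pairwise regression: {1.0 - ss_res / ss_tot:.3f}")
```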

  8. Symmetry Energy Effects in a Statistical Multifragmentation Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lei; GAO Yuan; ZHANG Hong-Fei; CHEN Xi-Meng; YU Mei-Ling; LI Jun-Qing

    2011-01-01

    The symmetry energy effects on the nuclear disintegration mechanisms of the neutron-rich system (A0 = 200, Z0 = 78) are studied in the framework of the statistical multifragmentation model (SMM) within its micro-canonical ensemble. A modified symmetry energy term with consideration of the volume and surface asymmetry is adopted instead of the original invariable value in the standard SMM model. The results indicate that as the volume and surface asymmetries are considered, the neutron-rich system translates to a fission-like process from evaporation earlier than the original standard SMM model at lower excitation energies, and its mass distribution has larger probabilities in the medium-heavy nuclei range so that the system breaks up more averagely. When the excitation energy becomes higher, the volume and surface asymmetry lead to a smaller average multiplicity.

  9. S-channel dark matter simplified models and unitarity

    Science.gov (United States)

    Englert, Christoph; McCullough, Matthew; Spannowsky, Michael

    2016-12-01

    The ultraviolet structure of s-channel mediator dark matter simplified models at hadron colliders is considered. In terms of commonly studied s-channel mediator simplified models it is argued that at arbitrarily high energies the perturbative description of dark matter production in high energy scattering can break down. This is analogous to the well documented breakdown of an EFT description of dark matter collider production. With this in mind, to diagnose whether or not the use of simplified models at the LHC is valid, perturbative unitarity of the scattering amplitude in the processes relevant to LHC dark matter searches is studied. The results are as one would expect: at the LHC and future proton colliders the simplified model descriptions of dark matter production are in general valid. As a result of the general discussion, a simple class of 'Fermiophobic Scalar' simplified models is proposed, in which a scalar mediator couples to electroweak vector bosons. The Fermiophobic simplified model is well motivated and exhibits interesting collider and direct detection phenomenology.

  10. Random matrices as models for the statistics of quantum mechanics

    Science.gov (United States)

    Casati, Giulio; Guarneri, Italo; Mantica, Giorgio

    1986-05-01

    Random matrices from the Gaussian unitary ensemble generate in a natural way unitary groups of evolution in finite-dimensional spaces. The statistical properties of this time evolution can be investigated by studying the time autocorrelation functions of dynamical variables. We prove general results on the decay properties of such autocorrelation functions in the limit of infinite-dimensional matrices. We discuss the relevance of random matrices as models for the dynamics of quantum systems that are chaotic in the classical limit.
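
    A quick numerical sketch of the quantity discussed above, with an assumed matrix size and an arbitrary diagonal observable: draw a GUE matrix H, evolve the observable under U(t) = exp(-iHt), and watch the normalized autocorrelation C(t) = Tr[A(t)A]/Tr[A A] decay.

```python
# Numerical sketch: GUE matrix, unitary evolution of an observable, and the
# decay of its normalized time autocorrelation. Matrix size and observable
# are assumptions for the illustration.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)
N = 200
M = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
H = (M + M.conj().T) / 2.0                   # GUE-distributed Hermitian matrix

A = np.diag(rng.normal(size=N))              # arbitrary diagonal observable
norm = np.trace(A @ A).real

for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    U = expm(-1j * H * t)
    At = U.conj().T @ A @ U
    c = np.trace(At @ A).real / norm
    print(f"t = {t:4.1f}   C(t) = {c:+.3f}")
```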

  11. Efficiency of a statistical transport model for turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    In developing its theory for turbulent dispersion transport, the Litchford and Jeng (1991) statistical transport model for turbulent particle dispersion took a generalized approach in which the perturbing influence of each turbulent eddy on subsequent interactions was transported through all subsequent eddies. Nevertheless, examination of this transport relation shows that it can decay rapidly; this implies that additional computational efficiency may be obtained via truncation of unnecessary transport terms. Attention is here given to the criterion for truncation, as well as to the expected efficiency gains.

  12. Stochastical modeling for Viral Disease: Statistical Mechanics and Network Theory

    Science.gov (United States)

    Zhou, Hao; Deem, Michael

    2007-04-01

    Theoretical methods of statistical mechanics are developed and applied to study the immunological response against viral diseases such as dengue. We use this theory to show how the immune response to the four different dengue serotypes may be sculpted. It is the ability of avian influenza to change and to mix that has given rise to the fear of a new human flu pandemic. Here we propose to utilize a stochastic model based on a scale-free network to investigate mitigation strategies and analyze the risk.

  13. A Probabilistic Rain Diagnostic Model Based on Cyclone Statistical Analysis

    OpenAIRE

    Iordanidou, V.; A. G. Koutroulis; I. K. Tsanis

    2014-01-01

    Data from a dense network of 69 daily precipitation gauges over the island of Crete and a cyclone climatological analysis over the middle-eastern Mediterranean are combined in a statistical approach to develop a rain diagnostic model. Regarding the dataset, the 0.5° × 0.5°, 33-year (1979–2011) European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis (ERA-Interim) is used. The cyclone tracks and their characteristics are identified with the aid of the Melbourne University algorithm (MS scheme). T...

  14. Social inequality: from data to statistical physics modeling

    Science.gov (United States)

    Chatterjee, Arnab; Ghosh, Asim; Inoue, Jun-ichi; Chakrabarti, Bikas K.

    2015-09-01

    Social inequality has been a topic of interest for ages, and has attracted researchers across disciplines to ponder over its origin, manifestation, characteristics, consequences, and finally, the question of how to cope with it. It is manifested across different strata of human existence, and is quantified in several ways. In this review we discuss the origins of social inequality, the historical and commonly used non-entropic measures such as the Lorenz curve, the Gini index and the recently introduced k index. We also discuss some analytical tools that aid in understanding and characterizing them. Finally, we argue how statistical physics modeling helps in reproducing the results and interpreting them.
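
    As a small illustration of the non-entropic measures mentioned above, the sketch below computes the Lorenz curve and the Gini index from a synthetic income sample; the lognormal income assumption is only for illustration.

```python
# Lorenz curve and Gini index from a synthetic income sample. For a lognormal
# with sigma = 1 the theoretical Gini is about 0.52, which the sample estimate
# should reproduce.
import numpy as np

rng = np.random.default_rng(14)
income = np.sort(rng.lognormal(mean=0.0, sigma=1.0, size=100_000))
n = len(income)

cum_pop = np.arange(1, n + 1) / n               # cumulative population share
cum_inc = np.cumsum(income) / income.sum()      # Lorenz curve L(p)

# sample Gini index (equivalent to 1 - 2 * area under the Lorenz curve)
gini = 2.0 * np.sum(np.arange(1, n + 1) * income) / (n * income.sum()) - (n + 1) / n

print(f"Lorenz curve value at p = 0.5: {cum_inc[n // 2 - 1]:.3f}")
print(f"Gini index of the sample:      {gini:.3f}")
```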

  15. Social inequality: from data to statistical physics modeling

    CERN Document Server

    Chatterjee, Arnab; Inoue, Jun-ichi; Chakrabarti, Bikas K

    2015-01-01

    Social inequality has been a topic of interest for ages, and has attracted researchers across disciplines to ponder over its origin, manifestation, characteristics, consequences, and finally, the question of how to cope with it. It is manifested across different strata of human existence, and is quantified in several ways. In this review we discuss the origins of social inequality, the historical and commonly used non-entropic measures such as the Lorenz curve, the Gini index and the recently introduced $k$ index. We also discuss some analytical tools that aid in understanding and characterizing them. Finally, we argue how statistical physics modeling helps in reproducing the results and interpreting them.

  16. Statistical Inference for Point Process Models of Rainfall

    Science.gov (United States)

    Smith, James A.; Karr, Alan F.

    1985-01-01

    In this paper we develop maximum likelihood procedures for parameter estimation and model selection that apply to a large class of point process models that have been used to model rainfall occurrences, including Cox processes, Neyman-Scott processes, and renewal processes. The statistical inference procedures are based on the stochastic intensity λ(t) = lim_{s→0, s>0} (1/s) E[N(t + s) − N(t) | N(u), u ≤ t]. The likelihood function of a point process is shown to have a simple expression in terms of the stochastic intensity. The main result of this paper is a recursive procedure for computing stochastic intensities; the procedure is applicable to a broad class of point process models, including renewal Cox process with Markovian intensity processes and an important class of Neyman-Scott processes. The model selection procedure we propose, which is based on likelihood ratios, allows direct comparison of two classes of point processes to determine which provides a better model for a given data set. The estimation and model selection procedures are applied to two data sets of simulated Cox process arrivals and a data set of daily rainfall occurrences in the Potomac River basin.
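
    For the simplest member of that class, an inhomogeneous Poisson process, the likelihood-in-terms-of-intensity idea reduces to log L = Σ_i log λ(t_i) − ∫ λ(t) dt. The sketch below evaluates it on synthetic occurrence times and forms a likelihood ratio between a seasonal and a constant intensity; the intensities and rates are assumptions for the example.

```python
# Likelihood in terms of the intensity for an inhomogeneous Poisson process:
# log L = sum_i log(lambda(t_i)) - integral of lambda over [0, T]. Synthetic
# "rainfall occurrence" times are generated by thinning.
import numpy as np

rng = np.random.default_rng(6)
T = 365.0
lam_seasonal = lambda t: 0.3 + 0.2 * np.cos(2.0 * np.pi * t / 365.0)

# simulate event times by thinning a rate-0.5 homogeneous Poisson process
cand = rng.uniform(0.0, T, rng.poisson(0.5 * T))
events = np.sort(cand[rng.random(len(cand)) < lam_seasonal(cand) / 0.5])

def poisson_loglik(lam, events, T, n_grid=10_000):
    grid = np.linspace(0.0, T, n_grid)
    integral = np.sum(lam(grid)) * (T / n_grid)      # compensator (Riemann sum)
    return np.sum(np.log(lam(events))) - integral

ll_seasonal = poisson_loglik(lam_seasonal, events, T)
ll_constant = poisson_loglik(lambda t: np.full_like(t, len(events) / T), events, T)
print(f"events: {len(events)}")
print(f"log-likelihood ratio, seasonal vs constant: {ll_seasonal - ll_constant:.2f}")
```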

  17. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

    The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal ...... of International Business Communication. His current research interests are related to the investigation of human translation processes and how advanced computer tools (such as machine translation) can fruitfully complement and support the human (translation) activities. Furthermore, he is the director of the Center for Research and Innovation in Translation and Translation Technology (CRITT) at IBC....

  18. A Unified Channel Charges Expression for Analytic MOSFET Modeling

    Directory of Open Access Journals (Sweden)

    Hugues Murray

    2012-01-01

    Full Text Available Based on a 1D Poisson's equation resolution, we present an analytic model of inversion charges allowing calculation of the drain current and transconductance in the Metal Oxide Semiconductor Field Effect Transistor. The drain current and transconductance are described by analytical functions including mobility corrections and short channel effects (CLM, DIBL). The comparison with the Pao-Sah integral shows excellent accuracy of the model in all inversion modes, from strong to weak inversion, in submicron MOSFETs. All calculations are encoded with a simple C program and give instantaneous results that provide an efficient tool for microelectronics users.

  19. Coupled-channel optical model potential for rare earth nuclei

    CERN Document Server

    Herman, M; Palumbo, A; Dietrich, F S; Brown, D; Hoblit, S

    2013-01-01

    Inspired by the recent work by Dietrich et al., substantiating the validity of the adiabatic assumption in coupled-channel calculations, we explore the possibility of generalizing a global spherical optical model potential (OMP) to make it usable in coupled-channel calculations on statically deformed nuclei. The generalization consists in adding the coupling of the ground-state rotational band, deforming the potential by introducing appropriate quadrupole and hexadecapole deformations, and correcting the OMP radius to preserve the volume integral of the spherical OMP. We choose isotopes of three rare-earth elements (W, Ho, Gd), which are known to be nearly perfect rotors, to perform a consistent test of our conjecture on integrated cross sections as well as on angular distributions for elastic and inelastic neutron scattering. When doing this we employ the well-established Koning-Delaroche global spherical potential and experimentally determined deformations without any adjustments. We observe a dramatically improved a...

  20. Statistical modeling and visualization of localized prostate cancer

    Science.gov (United States)

    Wang, Yue J.; Xuan, Jianhua; Sesterhenn, Isabell A.; Hayes, Wendelin S.; Ebert, David S.; Lynch, John H.; Mun, Seong K.

    1997-05-01

    In this paper, a statistically significant master model of localized prostate cancer is developed with pathologically-proven surgical specimens to spatially guide specific points in the biopsy technique for a higher rate of prostate cancer detection and the best possible representation of tumor grade and extension. Based on 200 surgical specimens of the prostates, we have developed a surface reconstruction technique to interactively visualize the clinically significant objects of interest such as the prostate capsule, urethra, seminal vesicles, ejaculatory ducts and the different carcinomas, for each of these cases. In order to investigate the complex disease pattern including the tumor distribution, volume, and multicentricity, we created a statistically significant master model of localized prostate cancer by fusing these reconstructed computer models together, followed by a quantitative formulation of the 3D finite mixture distribution. Based on the reconstructed prostate capsule and internal structures, we have developed a technique to align all surgical specimens through elastic matching. By labeling the voxels of localized prostate cancer by '1' and the voxels of other internal structures by '0', we can generate a 3D binary image of the prostate that is simply a mutually exclusive random sampling of the underlying distribution of localized prostate cancer characteristics. In order to quantify the key parameters such as distribution, multicentricity, and volume, we used a finite generalized Gaussian mixture to model the histogram, and estimate the parameter values through information theoretical criteria and a probabilistic self-organizing mixture. Utilizing minimally-immersive and stereoscopic interactive visualization, an augmented reality can be developed to allow the physician to virtually hold the master model in one hand and use the dominant hand to probe data values and perform a simulated needle biopsy. An adaptive self-organizing

  1. An Extended Clustering Algorithm for Statistical Language Models

    CERN Document Server

    Ueberla, J P

    1994-01-01

    Statistical language models frequently suffer from a lack of training data. This problem can be alleviated by clustering, because it reduces the number of free parameters that need to be trained. However, clustered models have the following drawback: if there is ``enough'' data to train an unclustered model, then the clustered variant may perform worse. On currently used language modeling corpora, e.g. the Wall Street Journal corpus, how do the performances of a clustered and an unclustered model compare? While trying to address this question, we develop the following two ideas. First, to get a clustering algorithm with potentially high performance, an existing algorithm is extended to deal with higher order N-grams. Second, to make it possible to cluster large amounts of training data more efficiently, a heuristic to speed up the algorithm is presented. The resulting clustering algorithm can be used to cluster trigrams on the Wall Street Journal corpus and the language models it produces can compete with exi...

  2. Comparison of Statistical Multifragmentation Model simulations with Canonical Thermodynamical Model results: a few representative cases

    CERN Document Server

    Botvina, A; Gupta, S Das; Mishustin, I

    2008-01-01

    The statistical multifragmentation model (SMM) has been widely used to explain experimental data of intermediate energy heavy ion collisions. A later entrant in the field is the canonical thermodynamic model (CTM) which is also being used to fit experimental data. The basic physics of both the models is the same, namely that fragments are produced according to their statistical weights in the available phase space. However, they are based on different statistical ensembles, and the methods of calculation are different: while the SMM uses Monte-Carlo simulations, the CTM solves recursion relations. In this paper we compare the predictions of the two models for a few representative cases.

  3. Energy Level Statistics in Particle—Rotor Model

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xian-Rong; MENG Jie; et al.

    2002-01-01

    Energy level statistics of a system consisting of six particles interacting by a delta force in a two-j model coupled with a deformed core is studied in the particle-rotor model. For the single-j shell (i13/2) and the two-j shell (g7/2+d5/2), the exact energies for our statistical analysis are obtained from a full diagonalization of the Hamiltonian, while in the two-j case (i13/2+g9/2) a configuration truncation is used. The nearest-neighbor distribution of energy levels and the spectral rigidity are studied as a function of spin. The results for the single-j shell are compared with those for the two-j case. It is shown that the system becomes more regular when the single-j space (i13/2) is replaced by the two-j shell (g7/2+d5/2), although the basis size of the configuration space is unchanged. The degree of chaoticity of the system, however, changes slightly when the configuration space is enlarged by extending the single-j shell (i13/2) to the two-j shell (i13/2+g9/2).

  4. An efficient simulator of 454 data using configurable statistical models

    Directory of Open Access Journals (Sweden)

    Persson Bengt

    2011-10-01

    Full Text Available Abstract Background Roche 454 is one of the major 2nd generation sequencing platforms. The particular characteristics of 454 sequence data pose new challenges for bioinformatic analyses, e.g. assembly and alignment search algorithms. Simulation of these data is therefore useful, in order to further assess how bioinformatic applications and algorithms handle 454 data. Findings We developed a new application named 454sim for simulation of 454 data at high speed and accuracy. The program is multi-thread capable and is available as C++ source code or pre-compiled binaries. Sequence reads are simulated by 454sim using a set of statistical models for each chemistry. 454sim simulates recorded peak intensities, peak quality deterioration and it calculates quality values. All three generations of the Roche 454 chemistry ('GS20', 'GS FLX' and 'Titanium') are supported and defined in external text files for easy access and tweaking. Conclusions We present a new platform independent application named 454sim. 454sim is generally 200 times faster compared to previous programs and it allows for simple adjustments of the statistical models. These improvements make it possible to carry out more complex and rigorous algorithm evaluations in a reasonable time scale.

  5. Editorial to: Six papers on Dynamic Statistical Models

    DEFF Research Database (Denmark)

    2014-01-01

    This issue of SJS contains a quite diverse collection of six papers from the conference: Spectral Estimation of Covolatility from Noisy Observations Using Local Weights (Markus Bibinger and Markus Reiß); One-Way Anova for Functional Data via Globalizing the Pointwise F-test (Jin-Ting Zhang and Xuehua Liang); Weakly Decomposable Regularization Penalties and Structured Sparsity (Sara van de Geer); Integrative Analysis of Cancer Diagnosis Studies with Composite Penalization (Jin Liu, Shuangge Ma and Jian Huang); Inference in Targeted Group-Sequential Covariate-Adjusted Randomized Clinical Trials (Antoine Chambaz and Mark J. van der Laan); and Estimation of Causal Odds of Concordance using the Aalen Additive Model (Torben Martinussen and Christian Bressen Pipper). We would like to acknowledge the financial support from the University...... areas working with frontier research topics in statistics for dynamic models.

  6. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    Science.gov (United States)

    di Clemente, Riccardo; Pietronero, Luciano

    2012-07-01

    We introduce a statistical agent based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, and this permits a direct comparison with all available data. We show that certain elements have a great importance in starting the use of drugs, for example the rare events in personal experience which occasionally permit overcoming the barrier to drug use. The analysis of how the system reacts to perturbations is very important to understand its key elements, and it provides strategies for effective policy making. The present model represents the first step of a realistic description of this phenomenon and can be easily generalized in various directions.

  7. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    CERN Document Server

    Di Clemente, Riccardo; 10.1038/srep00532

    2012-01-01

    We introduce a statistical agent based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, and this permits a direct comparison with all available data. We show that certain elements have a great importance in starting the use of drugs, for example the rare events in personal experience which occasionally permit overcoming the barrier to drug use. The analysis of how the system reacts to perturbations is very important to understand its key elements, and it provides strategies for effective policy making. The present model represents the first step of a realistic description of this phenomenon and can be easily generalized in various directions.

  8. Statistical Modeling of Robotic Random Walks on Different Terrain

    Science.gov (United States)

    Naylor, Austin; Kinnaman, Laura

    Issues of public safety, especially with crowd dynamics and pedestrian movement, have been modeled by physicists using methods from statistical mechanics over the last few years. Complex decision making of humans moving on different terrains can be modeled using random walks (RW) and correlated random walks (CRW). The effect of different terrains, such as a constant increasing slope, on RW and CRW was explored. LEGO robots were programmed to make RW and CRW with uniform step sizes. Level ground tests demonstrated that the robots had the expected step size distribution and correlation angles (for CRW). The mean square displacement was calculated for each RW and CRW on different terrains and matched expected trends. The step size distribution was determined to change based on the terrain; theoretical predictions for the step size distribution were made for various simple terrains.
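
    A simple sketch of the level-ground comparison described above, with unit step sizes and an assumed von Mises turning-angle concentration for the correlated walk; terrain effects are not modelled here.

```python
# Level-ground comparison of a random walk (RW) and a correlated random walk
# (CRW) with unit steps. The turning-angle concentration and the walker and
# step counts are assumed values for the sketch.
import numpy as np

rng = np.random.default_rng(7)
n_walkers, n_steps = 500, 200

def walk(correlated, kappa=4.0):
    """Positions of shape (n_walkers, n_steps, 2) for a RW or CRW."""
    if correlated:
        turns = rng.vonmises(0.0, kappa, (n_walkers, n_steps))  # small turns
        headings = np.cumsum(turns, axis=1)
    else:
        headings = rng.uniform(-np.pi, np.pi, (n_walkers, n_steps))
    steps = np.stack([np.cos(headings), np.sin(headings)], axis=-1)
    return np.cumsum(steps, axis=1)

for label, corr in [("RW ", False), ("CRW", True)]:
    pos = walk(corr)
    msd = np.mean(np.sum(pos ** 2, axis=-1), axis=0)   # mean square displacement
    print(f"{label} MSD after 10 / 50 / 200 steps:", np.round(msd[[9, 49, 199]], 1))
```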

  9. Statistical properties of cloud lifecycles in cloud-resolving models

    Directory of Open Access Journals (Sweden)

    R. S. Plant

    2008-12-01

    Full Text Available A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 min, but such events can lengthen the typical lifetime considerably.

  10. Image Watermarking Using Visual Perception Model and Statistical Features

    Directory of Open Access Journals (Sweden)

    Mrs.C.Akila

    2010-06-01

    Full Text Available This paper presents an effective method for image watermarking using a visual perception model based on statistical features in the low frequency domain. In the image watermarking community, watermark resistance to geometric attacks is an important issue. Most countermeasures proposed in the literature usually focus on the problem of global affine transforms such as rotation, scaling and translation (RST), but few are resistant to challenging cropping and random bending attacks (RBAs). Normally in the case of watermarking there may be an occurrence of distortion in the form of artifacts. A visual perception model is proposed to quantify the localized tolerance to noise for arbitrary imagery, which achieves a reduction of artifacts. As a result, the watermarking system provides a satisfactory performance for those content-preserving geometric deformations and image processing operations, including JPEG compression, low pass filtering, cropping and RBAs.

  11. Helicity statistics in homogeneous and isotropic turbulence and turbulence models

    Science.gov (United States)

    Sahoo, Ganapati; De Pietro, Massimo; Biferale, Luca

    2017-02-01

    We study the statistical properties of helicity in direct numerical simulations of fully developed homogeneous and isotropic turbulence and in a class of turbulence shell models. We consider correlation functions based on combinations of vorticity and velocity increments that are not invariant under mirror symmetry. We also study the scaling properties of high-order structure functions based on the moments of the velocity increments projected on a subset of modes with either positive or negative helicity (chirality). We show that mirror symmetry is recovered at small scales, i.e., chiral terms are subleading and they are well captured by a dimensional argument plus anomalous corrections. These findings are also supported by a high Reynolds numbers study of helical shell models with the same chiral symmetry of Navier-Stokes equations.

  12. Role of scaling in the statistical modelling of finance

    Indian Academy of Sciences (India)

    Attilio L Stella; Fulvio Baldovin

    2008-08-01

    Modelling the evolution of a financial index as a stochastic process is a problem awaiting a full, satisfactory solution since it was first formulated by Bachelier in 1900. Here it is shown that the scaling with time of the return probability density function sampled from the historical series suggests a successful model. The resulting stochastic process is a heteroskedastic, non-Markovian martingale, which can be used to simulate index evolution on the basis of an autoregressive strategy. Results are fully consistent with volatility clustering and with the multiscaling properties of the return distribution. The idea of basing the process construction on scaling, and the construction itself, are closely inspired by the probabilistic renormalization group approach of statistical mechanics and by a recent formulation of the central limit theorem for sums of strongly correlated random variables.

  13. Statistical analysis and model of spread F occurrence in China

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The spread F data obtained over Lanzhou (36.1°N, 103.9°E), Chongqing (29.5°N, 106.4°E) and Haikou (20.0°N, 110.3°E) in China during the period from 1978 to 1997 are used to analyze the occurrence characteristics. The statistical results show that over the Haikou station the post-midnight spread F occurrence is maximum during the summer solstice months of the lower solar activity period, while post-sunset spread F is dominant in equinoxes of the higher solar activity period. Over the Chongqing and Lanzhou stations, spread F mostly occurs after midnight and is negatively correlated with solar activity. Using a regression method and a Fourier expansion, a preliminary single-station model of spread F occurrence is established and the accuracy of the model is evaluated.
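
    A hypothetical sketch of such a single-station occurrence model: monthly occurrence rate regressed on a solar activity index plus low-order Fourier terms in month. The synthetic data and the coefficients used to generate them are invented and only illustrate the regression-plus-Fourier-expansion structure.

```python
# Hypothetical single-station occurrence model: monthly spread F occurrence
# rate regressed on a solar activity index (F10.7) plus first- and second-
# harmonic Fourier terms in month. All numbers are invented.
import numpy as np

rng = np.random.default_rng(8)
months = np.arange(240)                          # 20 years of monthly values
f107 = 120.0 + 60.0 * np.sin(2.0 * np.pi * months / 132.0) + rng.normal(0.0, 10.0, 240)
season = 2.0 * np.pi * (months % 12) / 12.0

# synthetic "observed" occurrence: seasonal peak plus negative solar dependence
occ = 0.4 + 0.15 * np.cos(season - 0.5) - 0.0015 * f107 + rng.normal(0.0, 0.03, 240)

X = np.column_stack([np.ones(240), f107,
                     np.cos(season), np.sin(season),
                     np.cos(2.0 * season), np.sin(2.0 * season)])
coef, *_ = np.linalg.lstsq(X, occ, rcond=None)
resid = occ - X @ coef

print("fitted coefficients:", np.round(coef, 4))
print("RMS residual of the occurrence model: %.3f" % np.sqrt(np.mean(resid ** 2)))
```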

  14. Helicity statistics in homogeneous and isotropic turbulence and turbulence models

    CERN Document Server

    Sahoo, Ganapati; Biferale, Luca

    2016-01-01

    We study the statistical properties of helicity in direct numerical simulations of fully developed homogeneous and isotropic turbulence and in a class of turbulence shell models. We consider correlation functions based on combinations of vorticity and velocity increments that are not invariant under mirror symmetry. We also study the scaling properties of high-order structure functions based on the moments of the velocity increments projected on a subset of modes with either positive or negative helicity (chirality). We show that mirror symmetry is recovered at small-scales, i.e. chiral terms are always subleading and they are well captured by a dimensional argument plus a small anomalous correction. We confirm these findings with numerical study of helical shell models at high Reynolds numbers.

  15. Isospin dependence of nuclear multifragmentation in statistical model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lei; XIE Dong-Zhu; ZHANG Yan-Ping; GAO Yuan

    2011-01-01

    The evolution of nuclear disintegration mechanisms with increasing excitation energy, from compound nucleus to multifragmentation, has been studied by using the Statistical Multifragmentation Model (SMM) within a micro-canonical ensemble. We discuss the observable characteristics as functions of excitation energy in multifragmentation, concentrating on the isospin dependence of the model in its decaying mechanism and break-up fragment configuration by comparing the A = 200, Z = 78 and A = 200, Z = 100 systems. The calculations indicate that the neutron-rich system (Z = 78) translates to a fission-like process from evaporation later than the symmetric nucleus at a lower excitation energy, but gets a larger average multiplicity as the excitation energy increases above 1.0 MeV/u.

  16. A review of radio channel models for body centric communications

    Science.gov (United States)

    Cotton, Simon L.; D'Errico, Raffaele; Oestges, Claude

    2014-06-01

    The human body is an extremely challenging environment for the operation of wireless communications systems, not least because of the complex antenna-body electromagnetic interaction effects which can occur. This is further compounded by the impact of movement and the propagation characteristics of the local environment which all have an effect upon body centric communications channels. As the successful design of body area networks (BANs) and other types of body centric system is inextricably linked to a thorough understanding of these factors, the aim of this paper is to conduct a survey of the current state of the art in relation to propagation and channel models primarily for BANs but also considering other types of body centric communications. We initially discuss some of the standardization efforts performed by the Institute of Electrical and Electronics Engineers 802.15.6 task group before focusing on the two most popular types of technologies currently being considered for BANs, namely narrowband and Ultrawideband (UWB) communications. For narrowband communications the applicability of a generic path loss model is contended, before presenting some of the scenario specific models which have proven successful. The impacts of human body shadowing and small-scale fading are also presented alongside some of the most recent research into the Doppler and time dependencies of BANs. For UWB BAN communications, we again consider the path loss as well as empirical tap delay line models developed from a number of extensive channel measurement campaigns conducted by research institutions around the world. Ongoing efforts within collaborative projects such as Committee on Science and Technology Action IC1004 are also described. Finally, recent years have also seen significant developments in other areas of body centric communications such as off-body and body-to-body communications. We highlight some of the newest relevant research in these areas as well as discussing

  17. A review of radio channel models for body centric communications.

    Science.gov (United States)

    Cotton, Simon L; D'Errico, Raffaele; Oestges, Claude

    2014-06-01

    The human body is an extremely challenging environment for the operation of wireless communications systems, not least because of the complex antenna-body electromagnetic interaction effects which can occur. This is further compounded by the impact of movement and the propagation characteristics of the local environment which all have an effect upon body centric communications channels. As the successful design of body area networks (BANs) and other types of body centric system is inextricably linked to a thorough understanding of these factors, the aim of this paper is to conduct a survey of the current state of the art in relation to propagation and channel models primarily for BANs but also considering other types of body centric communications. We initially discuss some of the standardization efforts performed by the Institute of Electrical and Electronics Engineers 802.15.6 task group before focusing on the two most popular types of technologies currently being considered for BANs, namely narrowband and Ultrawideband (UWB) communications. For narrowband communications the applicability of a generic path loss model is contended, before presenting some of the scenario specific models which have proven successful. The impacts of human body shadowing and small-scale fading are also presented alongside some of the most recent research into the Doppler and time dependencies of BANs. For UWB BAN communications, we again consider the path loss as well as empirical tap delay line models developed from a number of extensive channel measurement campaigns conducted by research institutions around the world. Ongoing efforts within collaborative projects such as Committee on Science and Technology Action IC1004 are also described. Finally, recent years have also seen significant developments in other areas of body centric communications such as off-body and body-to-body communications. We highlight some of the newest relevant research in these areas as well as discussing

  18. A statistically predictive model for future monsoon failure in India

    Science.gov (United States)

    Schewe, Jacob; Levermann, Anders

    2012-12-01

    Indian monsoon rainfall is vital for a large share of the world’s population. Both reliably projecting India’s future precipitation and unraveling abrupt cessations of monsoon rainfall found in paleorecords require improved understanding of its stability properties. While details of monsoon circulations and the associated rainfall are complex, full-season failure is dominated by large-scale positive feedbacks within the region. Here we find that in a comprehensive climate model, monsoon failure is possible but very rare under pre-industrial conditions, while under future warming it becomes much more frequent. We identify the fundamental intraseasonal feedbacks that are responsible for monsoon failure in the climate model, relate these to observational data, and build a statistically predictive model for such failure. This model provides a simple dynamical explanation for future changes in the frequency distribution of seasonal mean all-Indian rainfall. Forced only by global mean temperature and the strength of the Pacific Walker circulation in spring, it reproduces the trend as well as the multidecadal variability in the mean and skewness of the distribution, as found in the climate model. The approach offers an alternative perspective on large-scale monsoon variability as the result of internal instabilities modulated by pre-seasonal ambient climate conditions.
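
    Conceptually, such a statistically predictive model can be sketched as a logistic dependence of the seasonal failure probability on global mean temperature and a spring Walker-circulation index; the coefficients and ranges below are invented and are not the fit reported in the paper.

```python
# Conceptual sketch: a logistic model for the probability of seasonal monsoon
# failure, forced only by a global mean temperature anomaly (gmt) and a
# standardized spring Walker-circulation index (walker). All numbers invented.
import numpy as np

rng = np.random.default_rng(9)
n_years = 2000
gmt = rng.uniform(0.0, 4.0, n_years)          # warming anomaly [K], assumed range
walker = rng.normal(0.0, 1.0, n_years)        # spring Walker index (standardized)

logit = -4.0 + 0.9 * gmt - 1.2 * walker       # assumed logistic dependence
p_fail = 1.0 / (1.0 + np.exp(-logit))
failure = rng.random(n_years) < p_fail

for lo, hi in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    sel = (gmt >= lo) & (gmt < hi)
    print(f"warming {lo}-{hi} K: simulated failure frequency = {failure[sel].mean():.3f}")
```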

  19. 2D numerical modelling of meandering channel formation

    Indian Academy of Sciences (India)

    Y Xiao; G Zhou; F S Yang

    2016-03-01

    A 2D depth-averaged model for hydrodynamic sediment transport and river morphological adjustment was established. The sediment transport submodel takes into account the influence of non-uniform sediment with bed surface armoring and considers the impact of secondary flow on the direction of bed-load transport and the transverse slope of the river bed. The bank erosion submodel incorporates a simple simulation method for updating bank geometry during either degradational or aggradational bed evolution. Comparisons of the results obtained by the extended model with experimental and field data, and with numerical predictions, validate that the proposed model can simulate grain sorting in river bends and duplicate the characteristics of a meandering river and its development. The results illustrate that, through its control factors, the improved numerical model can be applied to simulate channel evolution under different scenarios and improve understanding of patterning processes.

  20. Improved virtual channel noise model for transform domain Wyner-Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2009-01-01

    Distributed video coding (DVC) has been proposed as a new video coding paradigm to deal with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. A virtual channel noise model is utilized at the decoder to estimate the noise distribution between the side information frame and the original frame. This is one of the most important aspects influencing the coding performance of DVC. Noise models with different granularity have been proposed. In this paper, an improved noise model for transform domain Wyner-Ziv video coding is proposed, which utilizes cross-band correlation to estimate the Laplacian parameters more accurately. Experimental results show that the proposed noise model can improve the rate-distortion (RD) performance.

  1. A Simplified Geometric Channel Model for Mobile-to-Mobile Communications

    Directory of Open Access Journals (Sweden)

    K. B. Baltzis

    2011-12-01

    In Mobile-to-Mobile (M2M) communications, the communicating nodes are surrounded by scatterers and equipped with low elevation antennas. This paper proposes a simple 2-D geometric scattering model for M2M channels. The model is also applicable in cellular systems when we employ low height base station antennas. In our approach, the scatterers are uniformly distributed in ellipses with arbitrary size and orientation around each communicating node. We provide simple formulas for the calculation of the angular spread and delay variation of the propagating signal. Simulation results verify the accuracy of the model. In order to validate the generalization of the approach, we compare it against notable models in the literature. As an application example, we investigate the impact of scatterer distribution and separation between mobiles on the angle and time of arrival statistics of the multipaths.
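
    To make the geometry concrete, the sketch below draws scatterers uniformly inside arbitrarily sized and oriented ellipses around each node and estimates single-bounce angle-of-arrival and excess-delay statistics by Monte Carlo. Node positions and ellipse parameters are assumed for illustration only; the paper's closed-form expressions are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
C = 3e8  # speed of light (m/s)

def scatterers_in_ellipse(center, a, b, tilt, n):
    """Draw n points uniformly inside an ellipse with semi-axes a, b,
    rotated by 'tilt' radians and centred at 'center'."""
    r = np.sqrt(rng.uniform(size=n))          # uniform over the disc area
    phi = rng.uniform(0, 2 * np.pi, n)
    x, y = a * r * np.cos(phi), b * r * np.sin(phi)
    rot = np.array([[np.cos(tilt), -np.sin(tilt)],
                    [np.sin(tilt),  np.cos(tilt)]])
    return center + (rot @ np.vstack([x, y])).T

# Hypothetical geometry: Tx at the origin, Rx 300 m away, one ellipse per node.
tx, rx = np.array([0.0, 0.0]), np.array([300.0, 0.0])
pts = np.vstack([scatterers_in_ellipse(tx, 60.0, 25.0, np.pi / 6, 5000),
                 scatterers_in_ellipse(rx, 40.0, 40.0, 0.0, 5000)])

# Single-bounce paths: Tx -> scatterer -> Rx.
d_total = np.linalg.norm(pts - tx, axis=1) + np.linalg.norm(pts - rx, axis=1)
excess_delay = (d_total - np.linalg.norm(rx - tx)) / C   # seconds
aoa = np.arctan2(pts[:, 1] - rx[1], pts[:, 0] - rx[0])    # angle of arrival at Rx

print(f"RMS excess delay : {excess_delay.std() * 1e9:6.1f} ns")
print(f"AoA spread (std) : {np.degrees(aoa.std()):6.1f} deg")
```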

  2. River channel's predisposition to ice jams: a geospatial model

    Science.gov (United States)

    De Munck, S.; Gauthier, Y.; Bernier, M.; Légaré, S.

    2012-04-01

    When dynamic breakup occurs on rivers, ice moving downstream may eventually stop at an obstacle when the volume of moving ice exceeds the transport capacity of the river, resulting in an ice jam. The suddenness and unpredictability of these ice jams are a constant danger to local populations, so forecasting methods are necessary to provide an early warning. Nonetheless, the morphological and hydrological factors controlling where and how the ice will jam are numerous and complex, and existing studies on this topic are highly site specific. Therefore, the goal of this work is to develop a simplified geospatial model that estimates the predisposition of any river channel to ice jams. The question here is not to predict when the ice will break up but rather to know where the released ice would be susceptible to jam. This paper presents the development and preliminary results of the proposed approach. First, several main factors identified in the literature as potential causes of an ice jam were selected: presence of an island, narrowing of the channel, sinuosity, presence of a bridge, confluence of rivers and slope break. The second step was to spatially represent, in 2D, the physical characteristics of the channel and to translate these characteristics into potential ice jamming factors. The Chaudiere River, south of Quebec City (Canada), was chosen as a test site. Tools from the GIS-based FRAZIL system have been used to generate these factors from readily available geospatial data and to calculate an "ice jam predisposition index" over regularly spaced segments along the entire channel. The resulting map was validated against historical observations and local knowledge, collected in collaboration with the Ministry of Public Security.
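
    As an illustration of how such a segment-level index might be assembled, the sketch below normalises a handful of per-segment factors and combines them with assumed weights. The factor values and weights are hypothetical and do not correspond to the FRAZIL system's actual implementation.

```python
import numpy as np

# Hypothetical per-segment factors for a reach split into equal-length segments.
# Each row: [island present, narrowing ratio, sinuosity, bridge present,
#            confluence present, slope-break magnitude]
segments = np.array([
    [0, 0.10, 1.05, 0, 0, 0.0],
    [1, 0.35, 1.40, 0, 1, 0.2],
    [0, 0.05, 1.10, 1, 0, 0.0],
    [0, 0.50, 1.80, 0, 0, 0.6],
])

# Normalise each factor to [0, 1] so the factors can be combined.
mins, maxs = segments.min(axis=0), segments.max(axis=0)
span = np.where(maxs > mins, maxs - mins, 1.0)
scores = (segments - mins) / span

# Assumed relative weights for the six factors (in practice these would be
# calibrated against historical ice-jam observations).
weights = np.array([0.20, 0.25, 0.15, 0.10, 0.15, 0.15])

index = scores @ weights
for i, v in enumerate(index):
    print(f"segment {i}: ice-jam predisposition index = {v:.2f}")
```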

  3. Analysis on MIMO Channel Model

    Institute of Scientific and Technical Information of China (English)

    王圆晨; 袁雪莲; 段红光

    2013-01-01

    The wireless channel, as the transmission medium of mobile communication, carries all of the transmitted information. To make full use of the spectrum resource and to maximize the quality and capacity of the transmitted information, the channel characteristics must be clearly understood. Moreover, research on 4G mobile communication technology in complex wireless environments requires a MIMO channel model, so analysis of MIMO channel models is necessary. This paper briefly introduces the classification of MIMO channel models and focuses on the fundamental SCM model, which is based on a geometrically distributed statistical channel model. Finally, simulations of the Doppler spectrum, delay spread and angle spread verify the validity of SCM channel modelling.

  4. Assessing the relative effectiveness of statistical downscaling and distribution mapping in reproducing rainfall statistics based on climate model results

    Science.gov (United States)

    Langousis, Andreas; Mamalakis, Antonios; Deidda, Roberto; Marrocu, Marino

    2016-01-01

    To improve the skill of climate models (CMs) in reproducing the statistics of daily rainfall at a basin level, two types of statistical approaches have been suggested. One is statistical correction of CM rainfall outputs based on historical series of precipitation. The other, usually referred to as statistical rainfall downscaling, is the use of stochastic models to conditionally simulate rainfall series, based on large-scale atmospheric forcing from CMs. While promising, the latter approach has attracted less attention in recent years, since the developed downscaling schemes involve complex weather identification procedures while demonstrating limited success in reproducing several statistical features of rainfall. In a recent effort, Langousis and Kaleris () developed a statistical framework for simulation of daily rainfall intensities conditional on upper-air variables, which is simpler to implement and more accurately reproduces several statistical properties of actual rainfall records. Here we study the relative performance of: (a) direct statistical correction of CM rainfall outputs using nonparametric distribution mapping, and (b) the statistical downscaling scheme of Langousis and Kaleris (), in reproducing the historical rainfall statistics, including rainfall extremes, at a regional level. This is done for an intermediate-sized catchment in Italy, i.e., the Flumendosa catchment, using rainfall and atmospheric data from four CMs of the ENSEMBLES project. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the CM used and the characteristics of the calibration period. This is particularly the case for yearly rainfall maxima.
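
    For approach (a), nonparametric distribution (quantile) mapping replaces each model value with the observed value at the same empirical quantile of the calibration period. The sketch below illustrates the idea on synthetic gamma-distributed "rainfall"; it is our own illustration under assumed distributions, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: observed daily rainfall and (biased) climate-model rainfall
# over a common calibration period, plus a future CM series to be corrected.
obs_cal = rng.gamma(shape=0.6, scale=8.0, size=5000)    # "historical" observations
cm_cal  = rng.gamma(shape=0.9, scale=4.0, size=5000)    # CM output, same period
cm_fut  = rng.gamma(shape=0.9, scale=4.5, size=3000)    # CM output, future period

def quantile_map(x, model_cal, obs_calib, n_q=99):
    """Empirical (nonparametric) distribution mapping:
    replace each model value by the observed value at the same quantile."""
    probs = np.linspace(0.01, 0.99, n_q)
    q_model = np.quantile(model_cal, probs)
    q_obs = np.quantile(obs_calib, probs)
    # Interpolate the correction; values outside the calibrated range are clipped.
    return np.interp(x, q_model, q_obs)

cm_fut_corr = quantile_map(cm_fut, cm_cal, obs_cal)

print("mean (obs / raw CM / corrected CM):",
      round(obs_cal.mean(), 2), round(cm_fut.mean(), 2), round(cm_fut_corr.mean(), 2))
print("95th pct (obs / raw CM / corrected CM):",
      round(np.quantile(obs_cal, 0.95), 2),
      round(np.quantile(cm_fut, 0.95), 2),
      round(np.quantile(cm_fut_corr, 0.95), 2))
```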

  5. Emerging Trends and Statistical Analysis in Computational Modeling in Agriculture

    Directory of Open Access Journals (Sweden)

    Sunil Kumar

    2015-03-01

    In this paper the authors describe emerging trends in computational modelling used in the sphere of agriculture. Agricultural computational modelling, which uses intelligence techniques to compute agricultural output from minimal input data, is gaining momentum because it saves time by cutting down multi-locational field trials and reduces labour and other inputs. Development of locally suitable integrated farming systems (IFS) is the utmost need of the day, particularly in India, where about 95% of farms are of small and marginal holding size. Optimization of the size and number of the various enterprises in the desired IFS model for a particular agro-climate is an essential component of research to sustain agricultural productivity, not only to feed the burgeoning population of the country but also to enhance nutritional security, farm returns and quality of life. A review of the literature on emerging trends in computational modelling applied to agriculture is presented to clarify its mechanisms, behaviour and applications. Computational modelling is increasingly effective for the design and analysis of systems, and it is an important tool for analysing the effect of different climate and management scenarios on farming systems and the interactions among them. The authors also highlight applications of computational modelling in integrated farming systems, crops, weather, soil, climate, horticulture and statistical techniques used in agriculture, which can show a path for agricultural researchers and the rural farming community to replace some traditional techniques.

  6. A Statistical Model for Regional Tornado Climate Studies.

    Directory of Open Access Journals (Sweden)

    Thomas H Jagger

    Tornado reports are locally rare, often clustered, and of variable quality, making it difficult to use them directly to describe regional tornado climatology. Here a statistical model is demonstrated that overcomes some of these difficulties and produces a smoothed regional-scale climatology of tornado occurrences. The model is applied to data aggregated at the level of counties. These data include annual population, annual tornado counts and an index of terrain roughness. The model has a term to capture the smoothed frequency relative to the state average. The model is used to examine whether terrain roughness is related to tornado frequency and whether there are differences in tornado activity by County Warning Area (CWA). A key finding is that tornado reports increase by 13% for a two-fold increase in population across Kansas after accounting for improvements in rating procedures. Independent of this relationship, tornadoes have been increasing at an annual rate of 1.9%. Another finding is the pattern of correlated residuals showing more Kansas tornadoes in a corridor of counties running roughly north to south across the west central part of the state, consistent with the dryline climatology. The model is significantly improved by adding terrain roughness. The effect amounts to an 18% reduction in the number of tornadoes for every ten meter increase in elevation standard deviation. The model indicates that tornadoes are 51% more likely to occur in counties served by the CWAs of DDC and GID than elsewhere in the state. Flexibility of the model is illustrated by fitting it to data from Illinois, Mississippi, South Dakota, and Ohio.
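
    A minimal sketch of the kind of count regression described here: a Poisson model for county tornado reports with log-population and terrain roughness as covariates, fitted by a hand-rolled Newton (IRLS) loop. The counties, coefficients and counts are invented for illustration and are not the study's data or exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200                                    # hypothetical counties

log_pop = rng.normal(10.0, 1.0, n)         # log annual population
elev_sd = rng.gamma(2.0, 10.0, n)          # terrain roughness (elevation std, m)
X = np.column_stack([np.ones(n), log_pop, elev_sd])

beta_true = np.array([-1.0, 0.18, -0.02])  # assumed "true" effects
y = rng.poisson(np.exp(X @ beta_true))     # simulated tornado counts

# Poisson regression (log link) via Newton-Raphson / IRLS.
beta = np.zeros(3)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                  # score vector
    hess = X.T @ (X * mu[:, None])         # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

print("estimated coefficients:", np.round(beta, 3))
print("report increase for a doubling of population: "
      f"{(2 ** beta[1] - 1) * 100:.1f}%")
print("change per 10 m of elevation standard deviation: "
      f"{(np.exp(10 * beta[2]) - 1) * 100:.1f}%")
```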

  7. A Generalized Statistical Uncertainty Model for Satellite Precipitation Products

    Science.gov (United States)

    Sarachi, S.

    2013-12-01

    A mixture model of the Generalized Normal Distribution and the Gamma distribution (GND-G) is used to model the joint probability distribution of satellite-based and stage IV radar rainfall at a given spatial and temporal resolution (e.g. 1°x1° and daily rainfall). The distribution parameters of GND-G are extended across various rainfall rates and spatial and temporal resolutions. In this study, GND-G is used to describe the uncertainty of the estimates from the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) algorithm. The stage IV-based multi-sensor precipitation estimates (MPE) are used as reference measurements. The study area for constructing the uncertainty model covers a 15°×15° box of 0.25°×0.25° cells over the eastern United States for the summers of 2004 to 2009. Cells are aggregated in space and time to obtain data with different resolutions for the construction of the model's parameter space. Results show that GND-G fits the reference precipitation data better than other statistical uncertainty models, such as the Gaussian and Gamma distributions. The impact of precipitation uncertainty on streamflow is further demonstrated by Monte Carlo simulation of precipitation forcing in a hydrologic model. The NWS DMIP2 basin of the Illinois River south of Siloam is selected for this case study, with data covering the period 2006 to 2008. The resulting uncertainty range of streamflow derived from the GND-G precipitation distributions is calculated and discussed.

  8. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM.New to the Second Edition A new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models Power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...

  9. A hybrid random field model for scalable statistical learning.

    Science.gov (United States)

    Freno, A; Trentin, E; Gori, M

    2009-01-01

    This paper introduces hybrid random fields, which are a class of probabilistic graphical models aimed at allowing for efficient structure learning in high-dimensional domains. Hybrid random fields, along with the learning algorithm we develop for them, are especially useful as a pseudo-likelihood estimation technique (rather than a technique for estimating strict joint probability distributions). In order to assess the generality of the proposed model, we prove that the class of pseudo-likelihood distributions representable by hybrid random fields strictly includes the class of joint probability distributions representable by Bayesian networks. Once we establish this result, we develop a scalable algorithm for learning the structure of hybrid random fields, which we call 'Markov Blanket Merging'. On the one hand, we characterize some complexity properties of Markov Blanket Merging both from a theoretical and from the experimental point of view, using a series of synthetic benchmarks. On the other hand, we evaluate the accuracy of hybrid random fields (as learned via Markov Blanket Merging) by comparing them to various alternative statistical models in a number of pattern classification and link-prediction applications. As the results show, learning hybrid random fields by the Markov Blanket Merging algorithm not only reduces significantly the computational cost of structure learning with respect to several considered alternatives, but it also leads to models that are highly accurate as compared to the alternative ones.

  10. Testing the DGP model with gravitational lensing statistics

    Science.gov (United States)

    Zhu, Zong-Hong; Sereno, M.

    2008-09-01

    Aims: The self-accelerating braneworld model (DGP) appears to provide a simple alternative to the standard ΛCDM cosmology to explain the current cosmic acceleration, which is strongly indicated by measurements of type Ia supernovae, as well as other concordant observations. Methods: We investigate observational constraints on this scenario provided by gravitational-lensing statistics using the Cosmic Lens All-Sky Survey (CLASS) lensing sample. Results: We show that a substantial part of the parameter space of the DGP model agrees well with the radio-source gravitational lensing sample. Conclusions: In the flat case, Ω_K = 0, the likelihood is maximized, L = L_max, for Ω_M = 0.30 (+0.19, −0.11). If we relax the prior on Ω_K, the likelihood peaks at (Ω_M, Ω_rc) ≃ (0.29, 0.12), slightly in the region of open models. The confidence contours are, however, elongated such that we are unable to discard any of the closed, flat or open models.

  11. Glass viscosity calculation based on a global statistical modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Fluegel, Alex

    2007-02-01

    A global statistical glass viscosity model was developed for predicting the complete viscosity curve, based on more than 2200 composition-property data of silicate glasses from the scientific literature, including soda-lime-silica container and float glasses, TV panel glasses, borosilicate fiber wool and E type glasses, low expansion borosilicate glasses, glasses for nuclear waste vitrification, lead crystal glasses, binary alkali silicates, and various further compositions from over half a century. It is shown that within a measurement series from a specific laboratory the reported viscosity values are often over-estimated at higher temperatures due to alkali and boron oxide evaporation during the measurement and glass preparation, including data by Lakatos et al. (1972) and the recently published High temperature glass melt property database for process modeling by Seward et al. (2005). Similarly, in the glass transition range many experimental data of borosilicate glasses are reported too high due to phase separation effects. The developed global model corrects those errors. The model standard error was 9-17°C, with R^2 = 0.985-0.989. The prediction 95% confidence interval for glass in mass production largely depends on the glass composition of interest, the composition uncertainty, and the viscosity level. New insights into the mixed-alkali effect are provided.

  12. Energy Level Statistics in Particle-Rotor Model

    Institute of Scientific and Technical Information of China (English)

    ZHOU Xian-Rong; GUO Lu; MENG Jie; ZHAO En-Guang

    2002-01-01

    Energy level statistics of a system consisting of six particles interacting by a delta force in a two-j model coupled with a deformed core is studied in the particle-rotor model. For the single-j shell (i13/2) and the two-j shell (g7/2 + d5/2) the exact energies for our statistical analysis are obtained from a full diagonalization of the Hamiltonian, while in the two-j case (i13/2 + g9/2) configuration truncation is used. The nearest-neighbor distribution of energy levels and the spectral rigidity are studied as a function of spin. The results of the single-j shell are compared with those of the two-j case. It is shown that the system becomes more regular when the single-j space (i13/2) is replaced by the two-j shell (g7/2 + d5/2) although the basis size of the configuration space is unchanged. The degree of chaoticity of the system, however, changes slightly when the configuration space is enlarged by extending the single-j shell (i13/2) to the two-j shell (i13/2 + g9/2).

  13. Composite and Cascaded Generalized-K Fading Channel Modeling and Their Diversity and Performance Analysis

    KAUST Repository

    Ansari, Imran Shafique

    2010-12-01

    The introduction of new schemes that are based on the communication among nodes has motivated the use of composite fading models due to the fact that the nodes experience different multipath fading and shadowing statistics, which subsequently determines the required statistics for the performance analysis of different transceivers. The end-to-end signal-to-noise-ratio (SNR) statistics plays an essential role in the determination of the performance of cascaded digital communication systems. In this thesis, a closed-form expression for the probability density function (PDF) of the end-to-end SNR for independent but not necessarily identically distributed (i.n.i.d.) cascaded generalized-K (GK) composite fading channels is derived. The developed PDF expression in terms of the Meijer-G function allows the derivation of subsequent performance metrics, applicable to different modulation schemes, including outage probability, bit error rate for coherent as well as non-coherent systems, and average channel capacity that provides insights into the performance of a digital communication system operating in an N cascaded GK composite fading environment. Another line of research that was motivated by the introduction of composite fading channels is the error performance. Error performance is one of the main performance measures and derivation of its closed-form expression has proved to be quite involved for certain systems. Hence, in this thesis, a unified closed-form expression, applicable to different binary modulation schemes, for the bit error rate of dual-branch selection diversity based systems undergoing i.n.i.d. GK fading is derived in terms of the extended generalized bivariate Meijer G-function.
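
    The GK power gain can be generated as the product of two unit-mean gamma variates (multipath fading and shadowing), and an N-link cascade as the product of N such gains. The Monte Carlo sketch below estimates the outage probability empirically for illustrative, non-identical link parameters; it does not evaluate the thesis's closed-form Meijer-G expressions.

```python
import numpy as np

rng = np.random.default_rng(3)

def gk_power(m, k, mean_power, size):
    """Generalized-K distributed power gain: gamma multipath fading (parameter m)
    modulated by gamma shadowing (parameter k), built from unit-mean components
    and scaled to the desired mean power."""
    multipath = rng.gamma(m, 1.0 / m, size)     # unit-mean gamma
    shadowing = rng.gamma(k, 1.0 / k, size)     # unit-mean gamma
    return mean_power * multipath * shadowing

# Assumed parameters for N = 3 cascaded, non-identical GK links.
links = [dict(m=2.0, k=1.5), dict(m=1.0, k=2.5), dict(m=3.0, k=1.0)]
n_samples = 200_000
avg_snr_db = 15.0                               # target average end-to-end SNR

# End-to-end SNR is proportional to the product of the per-link power gains.
gain = np.ones(n_samples)
for p in links:
    gain *= gk_power(p["m"], p["k"], 1.0, n_samples)
snr = 10 ** (avg_snr_db / 10) * gain / gain.mean()

for thr_db in (0.0, 5.0, 10.0):
    p_out = np.mean(snr < 10 ** (thr_db / 10))
    print(f"outage probability at {thr_db:4.1f} dB threshold: {p_out:.4f}")
```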

  14. Mathematical modeling and statistical analysis of calcium-regulated insulin granule exocytosis in ß-cells from mice and humans

    DEFF Research Database (Denmark)

    Pedersen, Morten Gram; Cortese, Giuliana; Eliasson, Lena

    2011-01-01

    on depolarization-evoked Ca2+-currents and corresponding capacitance measurements. Using a statistical mixed-effects model, we show that the data indicate that pool depletion is negligible in response to short depolarizations in mouse ß-cells. We then review mathematical models of granule dynamics and exocytosis in rodent ß-cells and present a mathematical description of Ca2+-evoked exocytosis in human ß-cells, which shows clear differences to their rodent counterparts. The model suggests that L- and P/Q-type Ca2+-channels are involved to a similar degree in exocytosis during electrical activity in human ß-cells.

  15. STATISTICAL MECHANICS MODELING OF MESOSCALE DEFORMATION IN METALS

    Energy Technology Data Exchange (ETDEWEB)

    Anter El-Azab

    2013-04-08

    The research under this project focused on theoretical and computational modeling of the dislocation dynamics of mesoscale deformation in metal single crystals. Specifically, the work aimed to implement a continuum statistical theory of dislocations to understand strain hardening and cell structure formation under monotonic loading. These aspects of crystal deformation are manifestations of the evolution of the underlying dislocation system under mechanical loading. The project had three research tasks: 1) Investigating the statistical characteristics of dislocation systems in deformed crystals. 2) Formulating kinetic equations of dislocations and coupling these kinetic equations with crystal mechanics. 3) Computational solution of coupled crystal mechanics and dislocation kinetics. Comparison of dislocation dynamics predictions with experimental results in the area of statistical properties of dislocations and their field was also a part of the proposed effort. In the first research task, the dislocation dynamics simulation method was used to investigate the spatial, orientation, velocity, and temporal statistics of dynamical dislocation systems, and the results from this investigation were used to complete the kinetic description of dislocations. The second task focused on completing the formulation of a kinetic theory of dislocations that respects the discrete nature of crystallographic slip and the physics of dislocation motion and dislocation interaction in the crystal. Part of this effort also targeted the theoretical basis for establishing the connection between discrete and continuum representations of dislocations and the analysis of discrete dislocation simulation results within the continuum framework. This part of the research enables the enrichment of the kinetic description with information representing the behavior of discrete dislocation systems. The third task focused on the development of physics-inspired numerical methods of solution of the coupled

  16. Impact of harbor navigation channels on waves: a numerical modelling guideline

    NARCIS (Netherlands)

    Dusseljee, D.W.; Klopman, G.; Van Vledder, G.P.; Van Riezebos, H.J.

    2014-01-01

    This study presents an intercomparison of a SWAN and SWASH wave model and 3D laboratory experiments for an existing navigation channel towards a harbor. Results show that the spectral refraction model SWAN underestimates the wave conditions in the channel and at the lee side of the channel especially

  18. Statistical Models and Methods for Network Meta-Analysis.

    Science.gov (United States)

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS.

  19. A statistical downscaling model for summer rainfall over Pakistan

    Science.gov (United States)

    Kazmi, Dildar Hussain; Li, Jianping; Ruan, Chengqing; Zhao, Sen; Li, Yanjie

    2016-10-01

    A statistical approach is utilized to construct an interannual model for summer (July-August) rainfall over the western parts of the South Asian Monsoon region. Observed monthly rainfall data for selected stations of Pakistan for the last 55 years (1960-2014) are taken as the predictand. Recommended climate indices, along with oceanic and atmospheric data on global scales for the period April-June, are employed as predictors. The first 40 years of data are taken as the training period and the rest as the validation period. A cross-validated stepwise regression approach is adopted to select robust predictors. The upper tropospheric zonal wind at 200 hPa over the northeastern Atlantic is finally selected as the best predictor for the interannual model. In addition, the next possible candidate, geopotential height in the upper troposphere, is taken as an indirect predictor, being a source of energy transport from the core region (northeast Atlantic/western Europe) to the study area. The model performs well for both the training and the validation periods, with a correlation coefficient of 0.71 and tolerable root mean square errors. Cross-validation of the model is carried out by incorporating JRA-55 data for potential predictors in addition to NCEP and by splitting the study period into five non-overlapping test samples. Subsequently, to verify the outcome of the model on physical grounds, observational analyses as well as model simulations are incorporated. It is revealed that, originating from the jet exit region through large vorticity gradients, zonally dominating waves may transport energy and momentum to the downstream areas of west-central Asia, which ultimately affects the interannual variability of the specific rainfall. It is found that both the circumglobal teleconnection and Rossby wave propagation play vital roles in modulating the proposed mechanism.
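
    A minimal sketch of the screening-and-regression idea described above: candidate spring predictors are ranked by leave-one-out cross-validated skill over a training window, and the best one is used in a linear model verified on the remaining years. The predictors and rainfall series below are synthetic stand-ins, not the NCEP/JRA-55 fields or station data used in the study.

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1960, 2015)
n = len(years)

# Synthetic candidate predictors (spring means) and the summer rainfall anomaly.
u200_atl = rng.normal(size=n)                # stand-in: 200 hPa zonal wind, NE Atlantic
z500_eur = rng.normal(size=n)                # stand-in: upper-tropospheric geopotential height
nino34 = rng.normal(size=n)
rain = 0.7 * u200_atl + 0.3 * rng.normal(size=n)   # rainfall mostly tied to the first predictor

candidates = {"u200_atl": u200_atl, "z500_eur": z500_eur, "nino34": nino34}
train = years < 2000                          # first 40 years for training

def loo_corr(x, y):
    """Leave-one-out cross-validated correlation of a one-predictor regression."""
    preds = np.empty(len(y))
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        b1, b0 = np.polyfit(x[keep], y[keep], 1)
        preds[i] = b0 + b1 * x[i]
    return np.corrcoef(preds, y)[0, 1]

skill = {name: loo_corr(x[train], rain[train]) for name, x in candidates.items()}
best = max(skill, key=skill.get)
print("cross-validated skill:", {k: round(v, 2) for k, v in skill.items()})

# Fit on the training window and verify on the remaining years.
b1, b0 = np.polyfit(candidates[best][train], rain[train], 1)
pred_val = b0 + b1 * candidates[best][~train]
r_val = np.corrcoef(pred_val, rain[~train])[0, 1]
print(f"selected predictor: {best}, validation correlation: {r_val:.2f}")
```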

  20. Over-sampling basis expansion model aided channel estimation for OFDM systems with ICI

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The rapid variation of the channel can induce intercarrier interference (ICI) in orthogonal frequency-division multiplexing (OFDM) systems. Intercarrier interference significantly increases the difficulty of OFDM channel estimation because too many channel coefficients need to be estimated. In this article, a novel channel estimator is proposed to resolve this problem. The estimator consists of two parts: the channel parameter estimation unit (CPEU), which is used to estimate the number of channel taps and the multipath time delays, and the channel coefficient estimation unit (CCEU), which estimates the channel coefficients using the channel parameters provided by the CPEU. In the CCEU, an over-sampling basis expansion model is used to cope with the large number of channel coefficients that need to be estimated. Finally, simulation results are given to assess the performance of the proposed scheme.
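
    The sketch below is a minimal least-squares illustration of the basic idea behind an over-sampled complex-exponential basis expansion model (BEM) for a single time-varying channel tap observed through known pilot symbols. The block length, Doppler frequencies and oversampling factor are assumed values, and the article's full estimator (which also estimates the number of taps and their delays) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 256                  # block length (samples)
K = 2                    # oversampling factor of the basis (K = 1 is the critically sampled CE-BEM)
Q = 8                    # number of basis functions on each side of DC
n = np.arange(N)

# "True" time-varying tap: a sum of a few Doppler-shifted complex exponentials.
fd = np.array([-0.010, 0.004, 0.012])          # normalised Doppler frequencies
h_true = (rng.normal(size=3) + 1j * rng.normal(size=3)) @ \
         np.exp(2j * np.pi * np.outer(fd, n))

# Known pilot (training) symbols and noisy received samples y(n) = h(n) * p(n) + w(n).
pilots = np.exp(1j * np.pi / 2 * rng.integers(0, 4, N))   # unit-modulus QPSK pilots
noise = 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))
y = h_true * pilots + noise

# Over-sampled complex-exponential BEM: basis frequencies q / (K * N).
q = np.arange(-Q, Q + 1)
B = np.exp(2j * np.pi * np.outer(n, q) / (K * N))          # N x (2Q+1) basis matrix

# Least-squares fit of the BEM coefficients from A c = y with A = diag(pilots) B.
A = pilots[:, None] * B
c, *_ = np.linalg.lstsq(A, y, rcond=None)
h_hat = B @ c

nmse = np.mean(np.abs(h_hat - h_true) ** 2) / np.mean(np.abs(h_true) ** 2)
print(f"normalised MSE of the BEM channel estimate: {nmse:.4f}")
```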

  1. Statistical shape model-based femur kinematics from biplane fluoroscopy

    DEFF Research Database (Denmark)

    Baka, N.; de Bruijne, Marleen; Walsum, T. van;

    2012-01-01

    Studying joint kinematics is of interest to improve prosthesis design and to characterize postoperative motion. State of the art techniques register bones segmented from prior computed tomography or magnetic resonance scans with X-ray fluoroscopic sequences. Elimination of the prior 3D acquisition could potentially lower costs and radiation dose. Therefore, we propose to substitute the segmented bone surface with a statistical shape model based estimate. A dedicated dynamic reconstruction and tracking algorithm was developed estimating the shape based on all frames, and pose per frame … on the distal femur using eight biplane fluoroscopic drop-landing sequences. The proposed dynamic prior and features increased the convergence rate of the reconstruction from 71% to 91%, using a convergence limit of 3 mm. The achieved root mean square point-to-surface accuracy at the converged frames was 1...

  2. Population stratification using a statistical model on hypergraphs

    CERN Document Server

    Vazquez, Alexei

    2007-01-01

    Population stratification is a problem encountered in several areas of biology and public health. We tackle this problem by mapping a population and its elements' attributes into a hypergraph, a natural extension of the concept of graph or network to encode associations among any number of elements. On this hypergraph, we construct a statistical model reflecting our intuition about how the elements' attributes can emerge from a postulated population structure. Finally, we introduce the concept of stratification representativeness as a means to identify the simplest stratification already containing most of the information about the population structure. We demonstrate the power of this framework by stratifying an animal and a human population based on phenotypic and genotypic properties, respectively.

  3. Quantum statistics of Raman scattering model with Stokes mode generation

    Science.gov (United States)

    Tanatar, Bilal; Shumovsky, Alexander S.

    1994-01-01

    The model describing three coupled quantum oscillators with decay of the Rayleigh mode into the Stokes and vibration (phonon) modes is examined. Due to the Manley-Rowe relations the problem of exact eigenvalues and eigenstates is reduced to the calculation of new orthogonal polynomials defined both by difference and differential equations. The quantum statistical properties are examined for the case when initially the Stokes mode is in the vacuum state, the Rayleigh mode is in a number state, and the vibration mode is in a number or squeezed state. Collapses and revivals are obtained for different initial conditions, as well as the change in time from a sub-Poissonian to a super-Poissonian distribution and vice versa.

  4. Improving Statistical Language Model Performance with Automatically Generated Word Hierarchies

    CERN Document Server

    McMahon, J; Mahon, John Mc

    1995-01-01

    An automatic word classification system has been designed which processes word unigram and bigram frequency statistics extracted from a corpus of natural language utterances. The system implements a binary top-down form of word clustering which employs an average class mutual information metric. Resulting classifications are hierarchical, allowing variable class granularity. Words are represented as structural tags --- unique $n$-bit numbers the most significant bit-patterns of which incorporate class information. Access to a structural tag immediately provides access to all classification levels for the corresponding word. The classification system has successfully revealed some of the structure of English, from the phonemic to the semantic level. The system has been compared --- directly and indirectly --- with other recent word classification systems. Class based interpolated language models have been constructed to exploit the extra information supplied by the classifications and some experiments have sho...

  5. A context dependent pair hidden Markov model for statistical alignment

    CERN Document Server

    Arribas-Gil, Ana

    2011-01-01

    This article proposes a novel approach to statistical alignment of nucleotide sequences by introducing a context dependent structure on the substitution process in the underlying evolutionary model. We propose to estimate alignments and context dependent mutation rates relying on the observation of two homologous sequences. The procedure is based on a generalized pair-hidden Markov structure, where conditional on the alignment path, the nucleotide sequences follow a Markov distribution. We use a stochastic approximation expectation maximization (saem) algorithm to give accurate estimators of parameters and alignments. We provide results both on simulated data and vertebrate genomes, which are known to have a high mutation rate from CG dinucleotide. In particular, we establish that the method improves the accuracy of the alignment of a human pseudogene and its functional gene.

  6. Statistical model on the surface elevation of waves with breaking

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In the surface wind drift layer with constant momentum flux, two sets of consistent surface elevation expressions with breaking and the occurrence conditions for breaking are deduced from the first integrals of the energy and vortex variations and from the kinetic and mathematical breaking criteria; the expression of the surface elevation with wave breaking is then established by using the Heaviside function. On the basis of the form of the sea surface elevation with wave breaking and the understanding of small-slope sea waves, a triple composite function of real sea waves is presented, including the functions for the breaking, weakly nonlinear and basic waves. The expression of the triple composite function and the normal distribution of basic waves constitute the expected theoretical model for surface elevation statistics.

  7. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals using SEMs. Results indicate that in many studies, power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
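
    Power for the RMSEA test of close fit is commonly computed with the noncentral chi-square approach of MacCallum, Browne and Sugawara. The sketch below is a minimal illustration of that general approach with assumed degrees of freedom, sample sizes and RMSEA values; it is not the authors' own procedure.

```python
from scipy.stats import ncx2

def rmsea_power(df, n, eps0=0.05, eps_a=0.08, alpha=0.05):
    """Power of the RMSEA test of close fit (H0: RMSEA <= eps0) against an
    assumed alternative RMSEA of eps_a, for a model with 'df' degrees of
    freedom and sample size 'n' (noncentral chi-square approach)."""
    nc0 = (n - 1) * df * eps0 ** 2          # noncentrality under H0
    nc_a = (n - 1) * df * eps_a ** 2        # noncentrality under the alternative
    crit = ncx2.ppf(1 - alpha, df, nc0)     # critical value of the test statistic
    return ncx2.sf(crit, df, nc_a)          # P(reject H0 | alternative is true)

# Illustrative values: a model with 40 degrees of freedom at several sample sizes.
for n in (100, 200, 400, 800):
    print(f"N = {n:4d}: power = {rmsea_power(df=40, n=n):.2f}")
```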

  8. Terminal-Dependent Statistical Inference for the FBSDEs Models

    Directory of Open Access Journals (Sweden)

    Yunquan Song

    2014-01-01

    The original stochastic differential equations (OSDEs) and forward-backward stochastic differential equations (FBSDEs) are often used to model complex dynamic processes that arise in financial, ecological, and many other areas. The main difference between OSDEs and FBSDEs is that the latter is designed to depend on a terminal condition, which is a key factor in some financial and ecological circumstances. It is interesting but challenging to estimate FBSDE parameters from noisy data and the terminal condition. However, to the best of our knowledge, terminal-dependent statistical inference for such a model has not been explored in the existing literature. We propose a nonparametric terminal control variables estimation method to address this problem. The reason why we use the terminal control variables is that the newly proposed inference procedures inherit the terminal-dependent characteristic. Through this new method, the estimators of the functional coefficients of the FBSDEs model are obtained. The asymptotic properties of the estimators are also discussed. Simulation studies show that the proposed method gives satisfying estimates for the FBSDE parameters from noisy data and the terminal condition. A simulation is performed to test the feasibility of our method.

  9. System models for PET statistical iterative reconstruction: A review.

    Science.gov (United States)

    Iriarte, A; Marabini, R; Matej, S; Sorzano, C O S; Lewitt, R M

    2016-03-01

    Positron emission tomography (PET) is a nuclear imaging modality that provides in vivo quantitative measurements of the spatial and temporal distribution of compounds labeled with a positron emitting radionuclide. In the last decades, a tremendous effort has been put into the field of mathematical tomographic image reconstruction algorithms that transform the data registered by a PET camera into an image that represents slices through the scanned object. Iterative image reconstruction methods often provide higher quality images than conventional direct analytical methods. Aside from taking into account the statistical nature of the data, the key advantage of iterative reconstruction techniques is their ability to incorporate detailed models of the data acquisition process. This is mainly realized through the use of the so-called system matrix, that defines the mapping from the object space to the measurement space. The quality of the reconstructed images relies to a great extent on the accuracy with which the system matrix is estimated. Unfortunately, an accurate system matrix is often associated with high reconstruction times and huge storage requirements. Many attempts have been made to achieve realistic models without incurring excessive computational costs. As a result, a wide range of alternatives to the calculation of the system matrix exists. In this article we present a review of the different approaches used to address the problem of how to model, calculate and store the system matrix.

  10. The statistical multifragmentation model: Origins and recent advances

    Science.gov (United States)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions and heat capacity, among others. Some experimental measurements of the caloric curve confirmed SMM predictions made over 10 years earlier, leading to a surge in interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular to a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  11. Critical, statistical, and thermodynamical properties of lattice models

    Energy Technology Data Exchange (ETDEWEB)

    Varma, Vipin Kerala

    2013-10-15

    In this thesis we investigate zero temperature and low temperature properties - critical, statistical and thermodynamical - of lattice models in the contexts of bosonic cold atom systems, magnetic materials, and non-interacting particles on various lattice geometries. We study quantum phase transitions in the Bose-Hubbard model with higher body interactions, as relevant for optical lattice experiments of strongly interacting bosons, in one and two dimensions; the universality of the Mott insulator to superfluid transition is found to remain unchanged for even large three body interaction strengths. A systematic renormalization procedure is formulated to fully re-sum these higher (three and four) body interactions into the two body terms. In the strongly repulsive limit, we analyse the zero and low temperature physics of interacting hard-core bosons on the kagome lattice at various fillings. Evidence for a disordered phase in the Ising limit of the model is presented; in the strong coupling limit, the transition between the valence bond solid and the superfluid is argued to be first order at the tip of the solid lobe.

  12. Multivariate Statistical Modelling of Drought and Heat Wave Events

    Science.gov (United States)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is the interaction between drought and heat wave events. Soil moisture has both a local and a non-local effect on the occurrence of heat waves: it strongly controls the latent heat flux, affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning and, vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow in theory for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas.
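
    Pair copula constructions assemble a multivariate model from bivariate copula building blocks. The sketch below illustrates only such a bivariate block, using a Gaussian copula (one possible family among many) to estimate the joint probability of a dry soil-moisture anomaly and a hot temperature anomaly from their rank dependence; all data are synthetic and the marginal distributions are assumed, so this is not the project's model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic daily summer anomalies: negatively dependent soil moisture and temperature.
z = rng.multivariate_normal([0, 0], [[1, -0.6], [-0.6, 1]], size=4000)
soil_moisture = stats.gamma.ppf(stats.norm.cdf(z[:, 0]), a=4, scale=0.05)  # assumed margin
temperature = stats.norm.ppf(stats.norm.cdf(z[:, 1]), loc=24, scale=4)     # assumed margin

# 1) Transform to pseudo-observations (empirical ranks on (0, 1)).
def pseudo_obs(x):
    return stats.rankdata(x) / (len(x) + 1)

u, v = pseudo_obs(soil_moisture), pseudo_obs(temperature)

# 2) Fit a Gaussian copula: its parameter follows from Kendall's tau, rho = sin(pi*tau/2).
tau, _ = stats.kendalltau(u, v)
rho = np.sin(np.pi * tau / 2)

# 3) Joint probability of the compound event "soil moisture in its lowest decile
#    AND temperature in its highest decile", by simulating the fitted copula.
sim = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=200_000)
us, vs = stats.norm.cdf(sim[:, 0]), stats.norm.cdf(sim[:, 1])
p_joint = np.mean((us < 0.10) & (vs > 0.90))
p_indep = 0.10 * 0.10

print(f"fitted copula correlation rho   = {rho:.2f}")
print(f"joint probability (copula)      = {p_joint:.4f}")
print(f"joint probability (independent) = {p_indep:.4f}")
```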

  13. Robust model selection and the statistical classification of languages

    Science.gov (United States)

    García, J. E.; González-López, V. A.; Viola, M. L. L.

    2012-10-01

    In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we will focus on the family of variable length Markov chain models, which include the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample conformed by the concatenation of sub-samples of two or more stochastic processes, with most of the subsamples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty on this problem is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure to deal with this problem has been to choose a subset of the original sample which seems to best represent each language. The selection is made by listening to the samples. In our application we use the full dataset without any preselection of samples. We apply our robust methodology estimating

  14. Feature and Statistical Model Development in Structural Health Monitoring

    Science.gov (United States)

    Kim, Inho

    All structures suffer wear and tear because of impact, excessive load, fatigue, corrosion, etc., in addition to inherent defects arising during their manufacturing processes and their exposure to various environmental effects. These structural degradations are often imperceptible, but they can severely affect the structural performance of a component and thereby decrease its service life. Although previous studies of Structural Health Monitoring (SHM) have produced extensive knowledge on parts of the SHM process, such as operational evaluation, data processing, and feature extraction, few studies have addressed it from a systematic perspective, namely statistical model development. The first part of this dissertation reviews ultrasonic guided wave-based structural health monitoring problems in terms of the characteristics of inverse scattering problems, such as ill-posedness and nonlinearity. The distinctive features and the selection of the domain of analysis are investigated by analytically deriving the conditions for uniqueness of solutions and are validated experimentally. Based on these distinctive features, a novel wave packet tracing (WPT) method for damage localization and size quantification is presented. This method involves creating time-space representations of the guided Lamb waves (GLWs), collected at a series of locations with a spatially dense distribution along paths at pre-selected angles with respect to the direction normal to the direction of wave propagation. The fringe patterns due to wave dispersion, which depends on the phase velocity, are selected as the primary features that carry information regarding wave propagation and scattering. The following part of this dissertation presents a novel damage-localization framework using a fully automated process. In order to construct the statistical model for autonomous damage localization, deep-learning techniques, such as the restricted Boltzmann machine and deep belief network

  15. Channel Measurement and Modeling for 5G Urban Microcellular Scenarios

    Directory of Open Access Journals (Sweden)

    Michael Peter

    2016-08-01

    In order to support the development of channel models for higher frequency bands, multiple urban microcellular measurement campaigns have been carried out in Berlin, Germany, at 60 and 10 GHz. In this paper, the collected data are uniformly analyzed with focus on the path loss (PL) and the delay spread (DS). The analysis reveals that the ground reflection has a dominant impact on the fading behavior. For line-of-sight conditions, the PL exponents are close to free space propagation at 60 GHz, but slightly smaller (1.62) for the street canyon at 10 GHz. The DS shows a clear dependence on the scenario (median values between 16 and 38 ns) and a strong distance dependence for the open square and the wide street canyon. The dependence is less distinct for the narrow street canyon with residential buildings. This behavior is consistent with complementary ray tracing simulations, though the simplified model tends to overestimate the DS.
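
    The quantities reported here are conventionally obtained by fitting the log-distance path loss model PL(d) = PL(d0) + 10·n·log10(d/d0) + X_sigma by least squares and by computing the RMS delay spread from a power delay profile. The sketch below illustrates both computations on invented measurement values, not the Berlin data.

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Path loss exponent fit (log-distance model) -----------------------------
d0 = 1.0                                    # reference distance (m)
d = rng.uniform(10, 200, 150)               # hypothetical Tx-Rx distances (m)
n_true, pl_d0, sigma = 1.9, 68.0, 3.0       # assumed exponent, PL(d0), shadowing std (dB)
pl = pl_d0 + 10 * n_true * np.log10(d / d0) + rng.normal(0, sigma, d.size)

# Least-squares fit of PL = PL(d0) + 10 n log10(d/d0).
A = np.column_stack([np.ones(d.size), 10 * np.log10(d / d0)])
(pl_d0_hat, n_hat), *_ = np.linalg.lstsq(A, pl, rcond=None)
shadow_std = np.std(pl - A @ np.array([pl_d0_hat, n_hat]))
print(f"fitted PL exponent n = {n_hat:.2f}, PL(d0) = {pl_d0_hat:.1f} dB, "
      f"shadowing std = {shadow_std:.1f} dB")

# --- RMS delay spread from a power delay profile ------------------------------
tau = np.arange(0, 400, 5) * 1e-9                    # delay axis (s)
pdp = np.exp(-tau / 40e-9) + 0.2 * np.exp(-(tau - 120e-9) ** 2 / (2 * (15e-9) ** 2))
p = pdp / pdp.sum()                                  # normalised power delay profile
mean_delay = np.sum(p * tau)
rms_ds = np.sqrt(np.sum(p * (tau - mean_delay) ** 2))
print(f"RMS delay spread = {rms_ds * 1e9:.1f} ns")
```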

  16. Fermionic dark matter in a simple t-channel model

    Energy Technology Data Exchange (ETDEWEB)

    Goyal, Ashok; Kumar, Mukesh [National Institute for Theoretical Physics, School of Physics and Mandelstam Institute for Theoretical Physics, University of the Witwatersrand, Johannesburg, Wits 2050 (South Africa)

    2016-11-02

    We consider a fermionic dark matter (DM) particle in renormalizable Standard Model (SM) gauge interactions in a simple t-channel model. The DM particle's interactions with SM fermions are through the exchange of scalar and vector mediators which carry colour or lepton number. In the case of the coloured mediators considered in this study, we find that if the DM is thermally produced and accounts for the observed relic density, almost the entire parameter space is ruled out by the direct detection observations. The bounds from the monojet plus missing energy searches at the Large Hadron Collider are less stringent in this case. In contrast, for the case of Majorana DM, we obtain strong bounds from the monojet searches which rule out DM particles of mass less than about a few hundred GeV for both the scalar and vector mediators.

  17. Statistical modelling of monthly mean sea level at coastal tide gauge stations along the Indian subcontinent

    Digital Repository Service at National Institute of Oceanography (India)

    Srinivas, K.; Das, V.K.; DineshKumar, P.K.

    This study investigates the suitability of statistical models for their predictive potential for the monthly mean sea level at different stations along the west and east coasts of the Indian subcontinent. Statistical modelling of the monthly mean...

  18. Statistical methods in joint modeling of longitudinal and survival data

    Science.gov (United States)

    Dempsey, Walter

    Survival studies often generate not only a survival time for each patient but also a sequence of health measurements at annual or semi-annual check-ups while the patient remains alive. Such a sequence of random length accompanied by a survival time is called a survival process. Ordinarily robust health is associated with longer survival, so the two parts of a survival process cannot be assumed independent. The first part of the thesis is concerned with a general technique---reverse alignment---for constructing statistical models for survival processes. A revival model is a regression model in the sense that it incorporates covariate and treatment effects into both the distribution of survival times and the joint distribution of health outcomes. The revival model also determines a conditional survival distribution given the observed history, which describes how the subsequent survival distribution is determined by the observed progression of health outcomes. The second part of the thesis explores the concept of a consistent exchangeable survival process---a joint distribution of survival times in which the risk set evolves as a continuous-time Markov process with homogeneous transition rates. A correspondence with the de Finetti approach of constructing an exchangeable survival process by generating iid survival times conditional on a completely independent hazard measure is shown. Several specific processes are detailed, showing how the number of blocks of tied failure times grows asymptotically with the number of individuals in each case. In particular, we show that the set of Markov survival processes with weakly continuous predictive distributions can be characterized by a two-dimensional family called the harmonic process. The outlined methods are then applied to data, showing how they can be easily extended to handle censoring and inhomogeneity among patients.

  19. MIMO capacity for deterministic channel models: sublinear growth

    DEFF Research Database (Denmark)

    Bentosela, Francois; Cornean, Horia; Marchetti, Nicola

    2013-01-01

    This is the second paper by the authors in a series concerned with the development of a deterministic model for the transfer matrix of a MIMO system. In our previous paper, we started from the Maxwell equations and described the generic structure of such a deterministic transfer matrix. Under some generic assumptions, we prove that the capacity grows much more slowly than linearly with the number of antennas. These results reinforce previous heuristic results obtained from statistical models of the transfer matrix, which also predict a sublinear behavior.

  20. Mathematical-statistical models of generated hazardous hospital solid waste.

    Science.gov (United States)

    Awad, A R; Obeidat, M; Al-Shareef, M

    2004-01-01

    This research work was carried out under the assumption that wastes generated from hospitals in Irbid, Jordan, were hazardous. The hazardous and non-hazardous wastes generated in the different divisions of the three hospitals under consideration were not separated during the collection process. Three hospitals in Irbid, Princess Basma hospital (public), Princess Bade'ah hospital (teaching), and Ibn Al-Nafis hospital (private), were selected for this study. The research took into account the amount of solid waste accumulated in each division and also determined the total amount generated by each hospital. Generation rates were determined (kilograms per patient per day; kilograms per bed per day) for the three hospitals and compared with those of similar hospitals in Europe. The evaluation suggested that the current management of these wastes in the three studied hospitals needs revision, as the hospitals do not follow the waste disposal methods practiced in developed countries that would reduce risk to human health and the environment. Statistical analysis was carried out to develop models for predicting the quantity of waste generated at each hospital (public, teaching, private). In these models, the number of patients, the number of beds, and the type of hospital were found to be significant factors for the quantity of waste generated. Multiple regression was also used to estimate the quantities of wastes generated by similar divisions in the three hospitals (surgery, internal diseases, and maternity).
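
    A minimal sketch of the kind of multiple regression described here, predicting daily waste quantity from the numbers of patients and beds. The records, coefficients and generation rates below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical daily records: patients, beds, and generated waste (kg/day).
n = 120
patients = rng.integers(80, 400, n)
beds = rng.integers(100, 500, n)
waste = 0.85 * patients + 0.40 * beds + rng.normal(0, 25, n)   # assumed relationship

# Multiple linear regression: waste ~ intercept + patients + beds.
X = np.column_stack([np.ones(n), patients, beds])
coef, *_ = np.linalg.lstsq(X, waste, rcond=None)
resid = waste - X @ coef
r2 = 1 - resid.var() / waste.var()

print("intercept, kg/patient, kg/bed:", np.round(coef, 2))
print(f"R^2 = {r2:.2f}")
print("generation rate (kg per patient per day):",
      round(waste.sum() / patients.sum(), 2))
```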