WorldWideScience

Sample records for extreme order statistics

  1. On the Limit Distribution of Lower Extreme Generalized Order Statistics

    Indian Academy of Sciences (India)

    H M Barakat; Magdy E El-Adll

    2012-05-01

    In a wide subclass of generalized order statistics ($gOs$), which contains most of the known and important models of ordered random variables, weak convergence of lower extremes is developed. A recent result in the extreme value theory of $m$-$gOs$ (as well as the classical extreme value theory of ordinary order statistics) yields three types of limit distributions under linear normalization. In this paper, a similar classification of limit distributions is shown to hold for extreme $gOs$, where the parameters $\gamma_j$, $j=1,\ldots,n$, are assumed to be pairwise different. Two illustrative examples are given to demonstrate the practical importance of some of the obtained results.
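
    For reference, the three classical limit types for linearly normalized maxima (the classification the abstract appeals to) are, in standard form,

    $$ G_1(x) = \exp\{-e^{-x}\},\ x \in \mathbb{R} \quad \text{(Gumbel)}, $$
    $$ G_2(x) = \exp\{-x^{-\alpha}\},\ x > 0,\ \alpha > 0 \quad \text{(Fréchet)}, $$
    $$ G_3(x) = \exp\{-(-x)^{\alpha}\},\ x \le 0,\ \alpha > 0 \quad \text{(Weibull)}; $$

    the corresponding laws for lower extremes (minima) follow by the reflection $L(x) = 1 - G(-x)$.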

  2. Statistics of extremes

    CERN Document Server

    Gumbel, E J

    2012-01-01

    This classic text covers order statistics and their exceedances; exact distribution of extremes; the 1st asymptotic distribution; uses of the 1st, 2nd, and 3rd asymptotes; more. 1958 edition. Includes 44 tables and 97 graphs.

  3. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge on wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: Statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines.

  4. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...

  5. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.

    2015-04-10

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event of interest may be very limited, efficient methods of inference play an important role. This article reviews this domain, emphasizing current research topics. We first sketch the classical theory of extremes for maxima and threshold exceedances of stationary series. We then review multivariate theory, distinguishing asymptotic independence and dependence models, followed by a description of models for spatial and spatiotemporal extreme events. Finally, we discuss inference and describe two applications. Animations illustrate some of the main ideas. © 2015 by Annual Reviews. All rights reserved.

  6. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  7. Applied extreme-value statistics

    Energy Technology Data Exchange (ETDEWEB)

    Kinnison, R.R.

    1983-05-01

    The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognise all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.

  8. Observed Statistics of Extreme Waves

    Science.gov (United States)

    2006-12-01

    (Fragmented excerpt; page and figure residue removed. Figure 5 caption: an energy-stealing wave as a solution to the NLS equation, from Dysthe and ...) It was shown that nonlinear interaction between four colliding waves can produce extreme wave behavior; the NLS equation was utilized in the numerical study. ... (2000) demonstrated the formation of extreme waves using the Korteweg-de Vries (KdV) equation, which is valid in shallow water. It was shown in the ...

  9. Statistics of extremes theory and applications

    CERN Document Server

    Beirlant, Jan; Segers, Johan; Teugels, Jozef; De Waal, Daniel; Ferro, Chris

    2006-01-01

    Research in the statistical analysis of extreme values has flourished over the past decade: new probability models, inference and data analysis techniques have been introduced; and new application areas have been explored. Statistics of Extremes comprehensively covers a wide range of models and application areas, including risk and insurance: a major area of interest and relevance to extreme value theory. Case studies are introduced providing a good balance of theory and application of each model discussed, incorporating many illustrated examples and plots of data. The last part of the book covers some interesting advanced topics, including time series, regression, multivariate and Bayesian modelling of extremes, the use of which has huge potential.

  10. Order statistics & inference estimation methods

    CERN Document Server

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is the consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering. A co...

  11. Extreme value statistics in coupled lasers

    CERN Document Server

    Fridman, Moti; Nixon, Micha; Friesem, Asher A; Davidson, Nir

    2010-01-01

    An experimental configuration for investigating the dynamics and the statistics of the phase-locking level of coupled lasers that have no common frequency is presented. The results reveal that the probability distribution of the phase-locking level of such coupled lasers fits a Gumbel distribution, which describes the extreme value statistics of Gaussian processes. A simple model, based on the spectral response of the coupled lasers, is also described, and the calculated results are in good agreement with the experimental results.
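
    The reported Gumbel behaviour is easy to reproduce in spirit. A minimal sketch (synthetic Gaussian block maxima, not the laser data; scipy assumed available):

    ```python
    # Maxima of Gaussian samples are expected to follow a Gumbel law; we
    # check this by fitting scipy's gumbel_r to simulated block maxima.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    block = 1000                       # samples per block
    maxima = rng.standard_normal((5000, block)).max(axis=1)

    loc, scale = stats.gumbel_r.fit(maxima)           # ML fit of the Gumbel law
    ks = stats.kstest(maxima, 'gumbel_r', args=(loc, scale))
    print(f"loc={loc:.3f} scale={scale:.3f}  KS p-value={ks.pvalue:.3f}")
    ```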

  12. On order statistics from nonidentical discrete random variables

    Directory of Open Access Journals (Sweden)

    Yüzbaşı Bahadır

    2016-01-01

    In this study, the probability function (pf) and distribution function (df) of a single order statistic of nonidentical discrete random variables are obtained. These functions are also expressed in integral form. Finally, the pf and df of the extremes of order statistics of random variables for the nonidentical discrete case are given.
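
    For independent (not necessarily identically distributed) $X_1,\ldots,X_n$ with dfs $F_1,\ldots,F_n$, the dfs of the extremes take the product form

    $$ F_{\max}(x) = \prod_{i=1}^{n} F_i(x), \qquad F_{\min}(x) = 1 - \prod_{i=1}^{n}\big(1 - F_i(x)\big), $$

    and in the discrete case the pf follows by differencing, e.g. $p_{\max}(x) = F_{\max}(x) - F_{\max}(x^-)$.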

  13. An engineering primer on extreme value statistics

    Energy Technology Data Exchange (ETDEWEB)

    Novog, D.R.; Hoppe, F. [McMaster Univ., Hamilton, Ontario (Canada); Nainer, O. [Bruce Power, Tiverton, Ontario (Canada); Phan, B. [Ontario Power Generation, Toronto, Ontario (Canada)

    2009-07-01

    This primer is intended for individuals interested in gaining an understanding of Extreme Value Statistics (EVS). This work provides an explanation of EVS at a level accessible to most people with an engineering or science background. While this work represents a simplification of the discussions from Reference 1, it is hoped that the authors will forgive any liberties taken in this paper. Some of the simplifications presented here may not be rigorous in all aspects, but the sacrifice in rigour is intended to aid the fundamental understanding of the EVS formulation and its basic application. (author)

  14. Are extreme events (statistically) special? (Invited)

    Science.gov (United States)

    Main, I. G.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A. F.; McCloskey, J.

    2009-12-01

    We address the generic problem of testing for scale-invariance in extreme events, i.e. are the biggest events in a population simply a scaled model of those of smaller size, or are they in some way different? Are large earthquakes for example ‘characteristic’, do they ‘know’ how big they will be before the event nucleates, or is the size of the event determined only in the avalanche-like process of rupture? In either case what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in Earth and Environmental sciences. Using frequency data, however, introduces a number of problems in data analysis. The inevitably small number of data points for extreme events and, more generally, the non-Gaussian statistical properties strongly affect the validity of prior assumptions about the nature of uncertainties in the data. The simple use of traditional least squares (still common in the literature) introduces an inherent bias to the best fit result. We show first that the sampled frequency in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converges to a central limit only very slowly due to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes. In this sense the confidence limits are scale-invariant. A systematic sample bias effect due to counting whole numbers in a finite catalogue makes a ‘characteristic’-looking extreme event distribution a likely outcome of an underlying scale-invariant probability distribution. This highlights the tendency of ‘eyeball’ fits to unconsciously (but
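
    The least-squares bias mentioned here is easy to demonstrate on synthetic data. A hedged sketch comparing Aki's maximum-likelihood b-value with a least-squares fit to the cumulative frequency-magnitude curve (catalogue size, completeness magnitude and grid spacing are illustrative choices, not taken from the abstract):

    ```python
    # Synthetic Gutenberg-Richter catalogue: magnitudes above completeness
    # m_min are exponential with scale log10(e)/b. Compare the Aki (1965)
    # MLE with an ordinary least-squares fit to log10 N(>=M).
    import numpy as np

    rng = np.random.default_rng(1)
    b_true, m_min = 1.0, 2.0
    mags = m_min + rng.exponential(np.log10(np.e) / b_true, size=2000)

    b_mle = np.log10(np.e) / (mags.mean() - m_min)     # Aki (1965) estimator

    m_grid = np.arange(m_min, mags.max(), 0.1)
    logN = np.log10([np.sum(mags >= m) for m in m_grid])
    b_ls = -np.polyfit(m_grid, logN, 1)[0]             # slope of the GR line

    print(f"b_MLE = {b_mle:.2f}   b_LS = {b_ls:.2f}   (true b = {b_true})")
    ```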

  15. A comparative assessment of statistical methods for extreme weather analysis

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing partial duration series, PDS) being superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was neither visible from the square-root criterion, nor from standardly used graphical diagnosis (mean residual life plot), but from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus
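
    A sketch of the two estimation routes being compared, under stated assumptions (synthetic annual maxima rather than the Austrian station data; the L-moment route uses Hosking's rational approximation for the GEV shape parameter):

    ```python
    # Fit a GEV to annual maxima by L-moments and by maximum likelihood,
    # then compare 100-year return levels. scipy's genextreme shape c
    # matches Hosking's k (positive shape = bounded upper tail).
    import numpy as np
    from math import gamma, log
    from scipy import stats

    def sample_lmoments(x):
        """First three sample L-moments via unbiased probability-weighted moments."""
        x = np.sort(x); n = len(x); j = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((j - 1) / (n - 1) * x) / n
        b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
        return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

    def gev_lmom(x):
        """GEV parameters from L-moments (Hosking's rational approximation)."""
        l1, l2, l3 = sample_lmoments(x)
        c = 2.0 / (3.0 + l3 / l2) - log(2) / log(3)
        k = 7.8590 * c + 2.9554 * c**2                   # shape (scipy's c)
        a = l2 * k / ((1 - 2.0**(-k)) * gamma(1 + k))    # scale
        xi = l1 - a * (1 - gamma(1 + k)) / k             # location
        return k, xi, a

    rng = np.random.default_rng(2)
    ams = stats.genextreme.rvs(0.1, loc=30, scale=8, size=60, random_state=rng)

    k, xi, a = gev_lmom(ams)
    k_ml, xi_ml, a_ml = stats.genextreme.fit(ams)
    T = 100  # return period in years
    print("L-mom 100-yr level:", stats.genextreme.ppf(1 - 1/T, k, xi, a))
    print("ML    100-yr level:", stats.genextreme.ppf(1 - 1/T, k_ml, xi_ml, a_ml))
    ```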

  16. Statistical convergence of order $\\alpha$ in probability

    OpenAIRE

    Pratulananda Das; Sanjoy Ghosal; Sumit Som

    2016-01-01

    In this paper, ideas of different types of convergence of a sequence of random variables in probability, namely, statistical convergence of order $\alpha$ in probability, strong $p$-Cesàro summability of order $\alpha$ in probability, lacunary statistical convergence or $S_{\theta}$-convergence of order $\alpha$ in probability, and ${N_{\theta}}$-convergence of order $\alpha$ in probability, are introduced and certain of their basic properties are studied.
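
    For orientation, the first of these notions is usually defined along the following lines (standard formulation, stated here as background rather than quoted from the paper): a sequence $(X_k)$ converges to $X$ statistically of order $\alpha$ ($0 < \alpha \le 1$) in probability if, for every $\varepsilon, \delta > 0$,

    $$ \lim_{n\to\infty} \frac{1}{n^{\alpha}} \Big|\big\{ k \le n : \Pr(|X_k - X| \ge \varepsilon) \ge \delta \big\}\Big| = 0. $$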

  17. ON LACUNARY STATISTICAL CONVERGENCE OF ORDER α

    Institute of Scientific and Technical Information of China (English)

    Hacer SENGUL; Mikail ET

    2014-01-01

    In this article, we introduce the concept of lacunary statistical convergence of order $\alpha$ of real number sequences and give some inclusion relations between the sets of lacunary statistical convergence of order $\alpha$ and strong $N_\theta^\alpha(p)$-summability. Furthermore, some relations between the spaces $N_\theta^\alpha(p)$ and $S_\theta^\alpha$ are examined.

  18. Extreme value statistics of weak lensing shear peak counts

    CERN Document Server

    Reischke, Robert; Bartelmann, Matthias

    2015-01-01

    The statistics of peaks in weak gravitational lensing maps is a promising technique to constrain cosmological parameters in present and future surveys. Here we investigate its power when using general extreme value statistics, which are very sensitive to the exponential tail of the halo mass function. To this end, we use an analytic method to quantify the number of weak lensing peaks caused by galaxy clusters, large-scale structures and observational noise. In doing so, we further improve the method in the regime of high signal-to-noise ratios dominated by non-linear structures, by accounting for the embedding of those counts into the surrounding shear caused by large-scale structures. We derive the extreme value and order statistics for both over-densities (positive peaks) and under-densities (negative peaks) and provide an optimized criterion to split a wide-field survey into sub-fields, in order to sample the distribution of extreme values such that the expected objects causing the largest signals are mostly due ...

  19. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
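
    For context, a minimal sketch of the Benjamini-Hochberg step-up procedure to which the result is applied (generic implementation, not the paper's code):

    ```python
    # Benjamini-Hochberg: reject the k smallest p-values, where k is the
    # largest index i with p_(i) <= q * i / m.
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Return a boolean mask of rejected hypotheses at FDR level q."""
        p = np.asarray(pvals)
        order = np.argsort(p)                  # sort p-values ascending
        m = len(p)
        thresh = q * np.arange(1, m + 1) / m
        below = p[order] <= thresh
        k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
        reject = np.zeros(m, dtype=bool)
        reject[order[:k]] = True               # reject the k smallest p-values
        return reject

    print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6]))
    ```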

  20. Characterization through distributional properties of dual generalized order statistics

    Directory of Open Access Journals (Sweden)

    A.H. Khan

    2012-10-01

    Distributional properties of two non-adjacent dual generalized order statistics have been used to characterize distributions. Further, one-sided contraction and dilation for the dual generalized order statistics are discussed, and the results are then deduced for generalized order statistics, order statistics, lower record statistics, upper record statistics and adjacent dual generalized order statistics.

  1. Order statistics and the linear assignment problem

    NARCIS (Netherlands)

    J.B.G. Frenk (Hans); M. van Houweninge; A.H.G. Rinnooy Kan (Alexander)

    1987-01-01

    Under mild conditions on the distribution function F, we analyze the asymptotic behavior in expectation of the smallest order statistic, both for the case that F is defined on (-∞, +∞) and for the case that F is defined on (0, ∞). These results yield asymptotic estimates of the expected optimal...
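
    The analysis presumably starts from the textbook representation of the smallest order statistic (a standard identity, not quoted from the paper): for i.i.d. $X_1,\ldots,X_n \sim F$,

    $$ \Pr\big(X_{(1)} > x\big) = \big(1 - F(x)\big)^n, \qquad E\big[X_{(1)}\big] = \int_0^{\infty}\big(1 - F(x)\big)^n\,dx \quad (F \text{ on } (0,\infty)), $$

    so the expectation is governed by how fast $F$ rises from 0 near its lower endpoint.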

  2. Extreme value theory and statistics for heavy tail data

    NARCIS (Netherlands)

    S. Caserta; C.G. de Vries (Casper)

    2003-01-01

    A scientific way of looking beyond the worst-case return is to employ statistical extreme value methods. Extreme Value Theory (EVT) shows that the probability of very large losses is eventually governed by a simple function, regardless of the specific distribution that underlies the return...
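
    A standard estimator in this setting is the Hill estimator of the tail index; a hedged sketch (a generic EVT tool, not necessarily the authors' choice; synthetic Pareto losses):

    ```python
    # Heavy tails: P(X > x) ~ x^(-alpha). The Hill estimator recovers alpha
    # from the k largest order statistics.
    import numpy as np

    def hill_alpha(x, k):
        """Hill estimate of the tail index from the k largest observations."""
        xs = np.sort(x)[::-1]              # descending order statistics
        return 1.0 / np.mean(np.log(xs[:k]) - np.log(xs[k]))

    rng = np.random.default_rng(3)
    losses = rng.pareto(3.0, size=10_000) + 1.0   # true tail index alpha = 3
    print(hill_alpha(losses, k=500))              # should be close to 3
    ```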

  3. Adaptive Order-Statistic LMS Filters

    Directory of Open Access Journals (Sweden)

    S. Marchevsky

    2001-04-01

    The LMS-based adaptive order-statistic filters are presented in this paper. The adaptive Ll-filters, an extension of the adaptive L-filter, are also studied for two-dimensional filtering of noisy greyscale images. Their adaptation properties are studied under three types of noise: additive white Gaussian noise, impulsive noise, or both. Moreover, the impulsive noise has a fixed noise value (Salt & Pepper noise). The problem of pixel-value multiplicity, and of determining its position in the ordered input vector for the adaptive Ll-filter, is also addressed in this article. Two types of images with different image complexity are used to demonstrate the power of time-spatial ordering.
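
    A minimal sketch of the underlying filter family, under stated assumptions (an L-filter whose output is a weighted sum of the sorted window samples, with the weights adapted by the LMS rule; illustrative only, not the paper's exact Ll-filter):

    ```python
    # LMS-adapted L-filter on a 1-D signal: the window samples are sorted,
    # weighted, and the weights are updated from the error against a
    # training (desired) signal.
    import numpy as np

    def lms_l_filter(noisy, desired, win=5, mu=0.01):
        w = np.ones(win) / win                      # start as a moving average
        out = np.zeros_like(noisy)
        for i in range(win - 1, len(noisy)):
            xs = np.sort(noisy[i - win + 1:i + 1])  # order statistics of the window
            y = w @ xs
            e = desired[i] - y                      # error vs. training signal
            w += mu * e * xs                        # LMS weight update
            out[i] = y
        return out, w

    rng = np.random.default_rng(4)
    clean = np.sin(np.linspace(0, 8 * np.pi, 400))
    noisy = clean + 0.3 * rng.standard_normal(400)
    noisy[rng.integers(0, 400, 20)] = 3.0           # salt-type impulses
    filtered, weights = lms_l_filter(noisy, clean)
    ```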

  4. Chaos, Order Statistics and Unstable Periodic Orbits

    CERN Document Server

    Valsakumar, M C; Kanmani, S

    1999-01-01

    We present a new method for locating unstable periodic points of one-dimensional chaotic maps. This method is based on order statistics. The densities of various maxima of the iterates are discontinuous exactly at unstable periodic points of the map. This is illustrated using the logistic map, where densities corresponding to a small number of iterates have been obtained in closed form. This technique can be applied to the class of continuous-time systems where the successive maxima of the time series behave as if they were generated from a unimodal map. This is demonstrated using the Lorenz model.

  5. Adaptive filtering using Higher Order Statistics (HOS

    Directory of Open Access Journals (Sweden)

    Abdelghani Manseur

    2012-03-01

    The work performed in this study consists of studying adaptive filters and higher order statistics (HOS) to improve their performance, by extending the linear case to nonlinear filters via Volterra series. This study principally focuses on: the choice of the adaptation step and convergence conditions; the convergence rate; and adaptive variation of the convergence factor according to the input signal. The results obtained with real signals have shown computationally efficient and numerically stable algorithms for adaptive nonlinear filtering, while keeping the computational complexity relatively simple.

  6. Extreme event statistics in a drifting Markov chain

    Science.gov (United States)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present a detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
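
    The Sparre Andersen check is easy to reproduce numerically. A hedged sketch (Gaussian steps stand in for the experimental distance distribution; for zero drift the survival probability is universal for any symmetric continuous step law):

    ```python
    # Sparre Andersen theorem: for a zero-drift random walk with symmetric
    # continuous steps, P(S_1>0, ..., S_n>0) = C(2n, n) / 4^n, independent
    # of the step distribution. A drift breaks this universality.
    import numpy as np
    from math import comb

    rng = np.random.default_rng(5)
    n, walks = 20, 200_000

    for drift in (0.0, 0.05):
        steps = rng.standard_normal((walks, n)) + drift
        surv = (np.cumsum(steps, axis=1) > 0).all(axis=1).mean()
        print(f"drift={drift}: simulated survival {surv:.4f}")
    print("universal zero-drift value:", comb(2 * n, n) / 4**n)
    ```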

  7. Level density of a Bose gas and extreme value statistics.

    Science.gov (United States)

    Comtet, A; Leboeuf, P; Majumdar, Satya N

    2007-02-16

    We establish a connection between the level density of a gas of noninteracting bosons and the theory of extreme value statistics. Depending on the exponent that characterizes the growth of the underlying single-particle spectrum, we show that at a given excitation energy the limiting distribution function for the number of excited particles follows the three universal distribution laws of extreme value statistics, namely, the Gumbel, Weibull, and Fréchet distributions. Implications of this result, as well as general properties of the level density at different energies, are discussed.

  8. Efficient nonrigid registration using ranked order statistics

    DEFF Research Database (Denmark)

    Tennakoon, Ruwan B.; Bab-Hadiashar, Alireza; de Bruijne, Marleen

    2013-01-01

    Non-rigid image registration techniques are widely used in medical imaging applications. Due to the high computational complexity of these techniques, finding an appropriate registration method to both reduce the computation burden and increase the registration accuracy has become an intense area...... of research. In this paper we propose a fast and accurate non-rigid registration method for intra-modality volumetric images. Our approach exploits the information provided by an order statistics based segmentation method to find the important regions for registration, and uses an appropriate sampling scheme...... to target those areas and reduce the registration computation time. A unique advantage of the proposed method is its ability to identify the point of diminishing returns and stop the registration process. Our experiments on registration of real lung CT images, with expert annotated landmarks, show...

  9. EXTREME PROGRAMMING PROJECT PERFORMANCE MANAGEMENT BY STATISTICAL EARNED VALUE ANALYSIS

    OpenAIRE

    Wei Lu; Li Lu

    2013-01-01

    As an important project type in Agile Software Development, performance evaluation and prediction for eXtreme Programming projects has significant meaning. Targeting the short release life cycle and concurrent multitask features, a statistical earned value analysis model is proposed. Based on the traditional concept of earned value analysis, the statistical earned value analysis model introduces an Elastic Net regression function and a Laplacian hierarchical model to construct a Bayesian El...

  10. Extreme value statistics for dynamical systems with noise

    CERN Document Server

    Faranda, Davide; Lucarini, Valerio; Turchetti, Giorgio; Vaienti, Sandro

    2012-01-01

    We study the distribution of maxima (Extreme Value Statistics) for sequences of observables computed along orbits generated by random transformations. The underlying, deterministic, dynamical system can be regular or chaotic. In the former case, we will show that by perturbing rational or irrational rotations with additive noise, an extreme value law will appear, regardless of the intensity of the noise, while unperturbed rotations do not admit such limiting distributions. In the case of deterministic chaotic dynamics, we will consider observables specially designed to study the recurrence properties in the neighbourhood of periodic points. The exponential limiting law for the distribution of maxima is therefore modified by the presence of the extremal index, a positive parameter not larger than one, whose inverse gives the average size of the clusters of extreme events. The theory predicts that such a parameter is unitary when the system is perturbed randomly. We perform sophisticated numerical tests to asse...

  11. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  12. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  13. Statistical Downscaling of Summer Temperature Extremes in Northern China

    Institute of Scientific and Technical Information of China (English)

    FAN Lijun; Deliang CHEN; FU Congbin; YAN Zhongwei

    2013-01-01

    Two approaches to statistical downscaling were applied to indices of temperature extremes based on percentiles of daily maximum and minimum temperature observations at Beijing station in summer during 1960-2008. One was to downscale daily maximum and minimum temperatures by using EOF analysis and stepwise linear regression first, and then to calculate the indices of extremes; the other was to directly downscale the percentile-based indices by using seasonal large-scale temperature and geopotential height records. The cross-validation results showed that the latter approach performs better than the former. Then, the latter approach was applied to 48 meteorological stations in northern China. The cross-validation results for all 48 stations showed close correlation between the percentile-based indices and the seasonal large-scale variables. Finally, future scenarios of indices of temperature extremes in northern China were projected by applying the statistical downscaling to Hadley Centre Coupled Model Version 3 (HadCM3) simulations under the Representative Concentration Pathway 4.5 (RCP 4.5) scenario of the Fifth Coupled Model Intercomparison Project (CMIP5). The results showed that the 90th percentile of daily maximum temperatures will increase by about 1.5°C, and the 10th percentile of daily minimum temperatures will increase by about 2°C, during the period 2011-35 relative to 1980-99.

  14. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    Science.gov (United States)

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event
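
    Several of the compared methods share one ingredient: empirical quantile mapping between the model and observed climatologies. A minimal sketch of that ingredient (simplified; not the exact BCCAQ/BCSD implementations):

    ```python
    # Empirical quantile mapping: map future model values through the
    # historical model->observation quantile relation.
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_fut):
        """Transfer function built from matched historical quantiles."""
        q = np.linspace(0.01, 0.99, 99)
        mq = np.quantile(model_hist, q)        # model climatology quantiles
        oq = np.quantile(obs_hist, q)          # observed climatology quantiles
        return np.interp(model_fut, mq, oq)    # piecewise-linear mapping

    rng = np.random.default_rng(6)
    obs = rng.gamma(2.0, 3.0, 5000)            # "observed" precipitation
    mod = rng.gamma(2.0, 4.0, 5000)            # biased model counterpart
    print(quantile_map(mod, obs, np.array([10.0, 30.0, 60.0])))
    ```

    Note that np.interp clamps values outside the calibration range, which is one reason pure empirical mapping can distort the very extremes these tests target.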

  15. Statistical distributions of extreme dry spell in Peninsular Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Jemain, Abdul Aziz

    2010-11-01

    Statistical distributions of annual extreme (AE) series and partial duration (PD) series for dry-spell events are analyzed for a database of daily rainfall records of 50 rain-gauge stations in Peninsular Malaysia, with recording periods extending from 1975 to 2004. The three-parameter generalized extreme value (GEV) and generalized Pareto (GP) distributions are considered to model both series. In both cases, the parameters of these two distributions are fitted by means of the L-moments method, which provides robust estimates of them. The goodness-of-fit (GOF) between empirical data and theoretical distributions is then evaluated by means of the L-moment ratio diagram and several goodness-of-fit tests for each of the 50 stations. It is found that for the majority of stations, the AE and PD series are well fitted by the GEV and GP models, respectively. Based on the models that have been identified, we can reasonably predict the risks associated with extreme dry spells for various return periods.

  16. A probabilistic analysis of wind gusts using extreme value statistics

    Energy Technology Data Exchange (ETDEWEB)

    Friederichs, Petra; Bentzien, Sabrina; Lenz, Anne; Krampitz, Rebekka [Meteorological Inst., Univ. of Bonn (Germany); Goeber, Martin [Deutscher Wetterdienst, Offenbach (Germany)

    2009-12-15

    The spatial variability of wind gusts is probably as large as that of precipitation, but the observational weather station network is much less dense. The lack of an area-wide observational analysis hampers the forecast verification of wind gust warnings. This article develops and compares several approaches to derive a probabilistic analysis of wind gusts for Germany. Such an analysis provides a probability that a wind gust exceeds a certain warning level. To that end we have 5 years of observations of hourly wind maxima at about 140 weather stations of the German weather service at our disposal. The approaches are based on linear statistical modeling using generalized linear models, extreme value theory and quantile regression. Warning level exceedance probabilities are estimated in response to predictor variables such as the observed mean wind or the operational analysis of the wind velocity at a height of 10 m above ground provided by the European Centre for Medium Range Weather Forecasts (ECMWF). The study shows that approaches that apply to the differences between the recorded wind gust and the mean wind perform better in terms of the Brier skill score (which measures the quality of a probability forecast) than those using the gust factor or the wind gusts only. The study points to the benefit from using extreme value theory as the most appropriate and theoretically consistent statistical model. The most informative predictors are the observed mean wind, but also the observed gust velocities recorded at the neighboring stations. Out of the predictors used from the ECMWF analysis, the wind velocity at 10 m above ground is the most informative predictor, whereas the wind shear and the vertical velocity provide no additional skill. For illustration the results for January 2007 and during the winter storm Kyrill are shown. (orig.)

  17. Orographic Signature on Multiscale Statistics of Extreme Rainfall: Conditional downscaling with emphasis on extremes

    Science.gov (United States)

    Foufoula-Georgiou, E.; Ebtehaj, M.

    2010-09-01

    Rainfall intensity and spatio-temporal patterns often show a strong dependency on the underlying terrain. The main objective of this work is to study the statistical signature imprinted by orography on the spatial structure of rainfall and its temporal evolution at multiple scales, with the aim of developing a consistent theoretical basis for conditional downscaling of precipitation given the topographic information of the underlying terrain. The results of an extensive analysis of the high-resolution stage II Doppler radar data of the Rapidan storm, June 1995, over the Appalachian Mountains are reported in this study. The orographic signature on the elementary statistical structure of the precipitation fields is studied via a variable-intensity thresholding scheme. This signature is further explored at multiple scales via analysis of the dependence of precipitation fields on the underlying terrain, both in the Fourier and wavelet domains. The Generalized Normal distribution is found to be a suitable probability model to explain the variability of the rainfall wavelet coefficients and its dependence on the underlying elevations. These results provide a new perspective for more accurate statistical downscaling of orographic precipitation over complex terrain, with emphasis on extremes.

  18. Uncertainty analysis in statistical modeling of extreme hydrological events

    NARCIS (Netherlands)

    Xu, Yue-Ping; Booij, Martijn J.; Tong, Yang-Bin

    2010-01-01

    With the increase of both magnitude and frequency of hydrological extreme events such as drought and flooding, the significance of adequately modeling hydrological extreme events is fully recognized. Estimation of extreme rainfall/flood for various return periods is of prime importance for hydrologi

  19. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Brunsell, Nathaniel [Univ. of Kansas, Lawrence, KS (United States); Mechem, David [Univ. of Kansas, Lawrence, KS (United States); Ma, Chunsheng [Wichita State Univ., KS (United States)

    2015-02-20

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). This effort will serve to establish the baseline behavior of climate extremes, the

  20. Exponential order statistic models of software reliability growth

    Science.gov (United States)

    Miller, D. R.

    1986-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, nonidentically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
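
    For instance, the Jelinski-Moranda model is recovered when the underlying exponentials are i.i.d. (a standard fact about that model, stated here as background): if $T_1,\ldots,T_N$ are i.i.d. Exp($\phi$), the observed failure times are the order statistics $T_{(1)} \le \cdots \le T_{(N)}$, and the hazard while waiting for the $i$-th failure is

    $$ z_i = \phi\,(N - i + 1), $$

    i.e. each repair removes one fault's worth of hazard. Nonidentical rates yield the other named models in the class.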

  1. A Statistical Framework to Evaluate Extreme Weather Definitions from A Health Perspective

    Science.gov (United States)

    Vaidyanathan, Ambarish; Kegler, scott R.; Saha, Shubhayu S.; Mulholland, James A.

    2017-01-01

    A statistical framework for evaluating definitions of extreme weather phenomena can help weather agencies and health departments identify the definition(s) most applicable for alerts and other preparedness operations related to extreme weather episodes. PMID:28883666

  2. Extreme weather exposure identification for road networks - a comparative assessment of statistical methods

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing PDS) being superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was visible from neither the square-root criterion nor standardly used graphical diagnosis (mean residual life plot) but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of

  3. Extreme value statistics and thermodynamics of earthquakes: aftershock sequences

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    The Gutenberg-Richter magnitude-frequency law takes into account the minimum detectable magnitude, and treats aftershocks as if they were independent and identically distributed random events. A new magnitude-frequency relation is proposed which takes into account the magnitude of the main shock, and the degree to which aftershocks depend on the main shock makes them appear clustered. In certain cases, there can be two branches in the order-statistics of aftershock sequences: for energies below threshold, the Pareto law applies and the asymptotic distribution of magnitude is the double-exponential distribution, while energies above threshold follow a one-parameter beta distribution, whose exponent is the cluster dimension, and the asymptotic Gompertz distribution predicts a maximum magnitude. The 1957 Aleutian Islands aftershock sequence exemplifies such dual behavior. A thermodynamics of aftershocks is constructed on the analogy between the non-conservation of the number of aftershocks and that of the particle number in degenerate gases.

  4. An Order Statistics Approach to the Halo Model for Galaxies

    Science.gov (United States)

    Paul, Niladri; Paranjape, Aseem; Sheth, Ravi K.

    2017-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models - one in which this luminosity function p(L) is universal - naturally produces a number of features associated with previous analyses based on the `central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts no luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a halo mass dependent luminosity function p(L|m), is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-predicts central luminosities. This brings into focus the idea that central galaxies constitute a distinct population that is affected by different physical processes than are the satellites. We model this physical difference as a statistical brightening of the central luminosities, over and above the order statistics prediction. The magnitude gap between the brightest and second brightest group galaxy is predicted as a by-product, and is also in good agreement with observations. We propose that this order statistics framework provides a useful language in which to compare the Halo Model for galaxies with more physically motivated galaxy formation models.

  5. A statistical methodology for the estimation of extreme wave conditions for offshore renewable applications

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kalogeri, Christina; Galanis, George

    2015-01-01

    ...... Rev, which is located in the North Sea, west of Denmark. The post-processing targets correcting the modeled time series of the significant wave height, in order to match the statistics of the corresponding measurements, including not only conventional parameters such as the mean and standard...... as a characteristic index of extreme wave conditions. The results from the proposed methodology seem to be in good agreement with the measurements at both the relatively deep, open-water and the shallow, coastal-water sites, providing a potentially useful tool for offshore renewable energy applications. © 2015...

  6. Detection of small target using recursive higher order statistics

    Science.gov (United States)

    Hou, Wang; Sun, Hongyuan; Lei, Zhihui

    2014-02-01

    In this paper, a recursive higher order statistics algorithm is proposed for small target detection in the temporal domain. First, the background of the image sequence is normalized. Then, the higher order statistics are computed recursively over the image sequence to obtain a feature image. Finally, the feature image is segmented by thresholding to detect the small target. To validate the proposed algorithm, five simulated and one semi-simulated image sequences are created. ROC curves are employed for evaluation of the experimental results. The results show that our method is very effective for small target detection.
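
    A hedged sketch of the general idea (not the paper's exact recursion): per-pixel moment estimates are updated recursively over the frames, a kurtosis-like higher-order feature image is formed, and thresholding flags candidate targets.

    ```python
    # Exponentially-weighted recursive 2nd/4th central moments per pixel,
    # combined into a kurtosis-like feature image for target detection.
    import numpy as np

    def hos_feature(frames, lam=0.05, eps=1e-6):
        mean = frames[0].astype(float)
        m2 = np.ones_like(mean)
        m4 = np.ones_like(mean)
        for f in frames[1:]:
            d = f - mean
            mean += lam * d                      # recursive mean
            m2 = (1 - lam) * m2 + lam * d**2     # recursive 2nd central moment
            m4 = (1 - lam) * m4 + lam * d**4     # recursive 4th central moment
        return m4 / (m2**2 + eps)                # kurtosis-like feature image

    rng = np.random.default_rng(7)
    frames = rng.normal(100, 5, (50, 64, 64))
    frames[40:, 32, 32] += 25                    # implant a dim "target" pixel
    feature = hos_feature(frames)
    detections = feature > feature.mean() + 5 * feature.std()
    print("flagged pixels:\n", np.argwhere(detections))
    ```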

  7. Analysis of linear weighted order statistics CFAR algorithm

    Institute of Scientific and Technical Information of China (English)

    孟祥伟; 关键; 何友

    2004-01-01

    The CFAR technique is widely used in radar target detection. The traditional algorithm is cell averaging (CA), which gives good detection performance in a relatively ideal environment. Recently, censoring techniques have been adopted to make the detector perform robustly, and ordered statistic (OS) and trimmed mean (TM) methods have been proposed. TM methods treat all reference samples that participate in the clutter power estimate equally, but this processing does not yield effective estimates of the clutter power. Therefore, in this paper a quasi-best weighted (QBW) order statistics algorithm is presented. In special cases, QBW reduces to CA and to the censored mean level detector (CMLD).
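
    For orientation, a minimal sketch contrasting the CA and OS clutter-power estimates on a one-dimensional power profile (window sizes and the scaling factor alpha are illustrative, not calibrated to a false-alarm rate; this is generic CA/OS-CFAR, not the paper's QBW detector):

    ```python
    # CA-CFAR averages the reference cells; OS-CFAR uses their k-th order
    # statistic, which is robust to interfering targets in the window.
    import numpy as np

    def cfar(power, guard=2, ref=16, k=12, alpha=8.0):
        """Return CA and OS threshold arrays for a 1-D power profile."""
        n = len(power)
        ca = np.full(n, np.inf); os_ = np.full(n, np.inf)
        half = ref // 2
        for i in range(half + guard, n - half - guard):
            lead = power[i - half - guard:i - guard]
            lag = power[i + guard + 1:i + guard + 1 + half]
            window = np.concatenate([lead, lag])     # reference cells only
            ca[i] = alpha * window.mean()            # cell-averaging estimate
            os_[i] = alpha * np.sort(window)[k - 1]  # k-th order statistic
        return ca, os_

    rng = np.random.default_rng(8)
    clutter = rng.exponential(1.0, 200)
    clutter[100] += 40.0                             # a target
    ca, os_ = cfar(clutter)
    print("CA detects:", np.nonzero(clutter > ca)[0],
          " OS detects:", np.nonzero(clutter > os_)[0])
    ```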

  8. Eigenvalue Order Statistics for Random Schrödinger Operators with Doubly-Exponential Tails

    Science.gov (United States)

    Biskup, M.; König, W.

    2016-01-01

    We consider random Schrödinger operators of the form Δ + ξ, where Δ is the lattice Laplacian on Z^d and ξ is an i.i.d. random field, and study the extreme order statistics of the Dirichlet eigenvalues for this operator restricted to large but finite subsets of Z^d. We show that, for ξ with a doubly-exponential type of upper tail, the upper extreme order statistics of the eigenvalues fall into the Gumbel max-order class, and the corresponding eigenfunctions are exponentially localized in regions where ξ takes large, and properly arranged, values. The picture we prove is thus closely connected with the phenomenon of Anderson localization at the spectral edge. Notwithstanding, our approach is largely independent of existing methods for proofs of Anderson localization, and it is based on studying individual eigenvalue/eigenfunction pairs and characterizing the regions where the leading eigenfunctions put most of their mass.

  9. Timing optimization utilizing order statistics and multichannel digital silicon photomultipliers

    NARCIS (Netherlands)

    Mandai, S.; Venialgo, E.; Charbon, E.

    2014-01-01

    We present an optimization technique utilizing order statistics with a multichannel digital silicon photomultiplier (MD-SiPM) for timing measurements. Accurate timing measurements are required by 3D rangefinding and time-of-flight positron emission tomography, to name a few applications. We have

  12. Concomitants of Order Statistics from Bivariate Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aleem

    2006-01-01

    The probability density function (pdf) of the rth, 1 ≤ r ≤ n, concomitant of order statistics, and the joint pdf of the rth and sth, 1 ≤ r < s ≤ n, concomitants of order statistics, are derived for the Bivariate Inverse Rayleigh Distribution, and their moments and product moments are obtained. Its percentiles are also obtained.

  13. Useful experimental designs and rank order statistics in educational research

    Directory of Open Access Journals (Sweden)

    Zendler, Andreas

    2013-01-01

    Experimental educational research is of great impact because it illuminates cause-and-effect relationships by accumulating empirical evidence. The present article does not propose new methods but brings three useful experimental designs, as well as appropriate statistical procedures (rank order statistics), to the attention of the reader for conducting educational experiments, even with small samples. By means of their systematic use, combined with the process-product paradigm of experimental educational research, the influence of essential variables (teacher, context, and process variables) in schools, universities, and other educational institutions can be investigated. The statistical procedures described in this article guarantee that small samples (e.g. a school class) can be successfully used, and that product variables (e.g. knowledge, comprehension, transfer) are only required to meet the criteria of an ordinal scale. The experimental designs and statistical procedures are exemplified by hypothetical data and detailed calculations.

  14. Statistics of Extreme Events with Application to Climate

    Science.gov (United States)

    1992-01-01

    ... costs associated with global warming will be measured in terms of changes in the frequency and intensity of extreme events such as droughts, floods ... in climate studies or in discussions of greenhouse warming, despite the obvious importance of large deviations from the mean. The theory and ... (Figure residue removed; recoverable labels: globally averaged temperature range for a Gaussian distribution; dew point temperature; sea surface temperature.)

  15. Recent trends in heavy precipitation extremes over Germany: A thorough intercomparison between different statistical approaches

    Science.gov (United States)

    Donner, Reik; Passow, Christian

    2016-04-01

    ... linear trends in the location parameter are sufficient (which are significant for 803 stations). In summary, quantile regression provides a prospective alternative tool for statistically identifying and quantifying robust trends in hydro-meteorological extremes, which has certain conceptual benefits that allow its application to shorter time series than commonly required in the context of studies on time-dependent extremes. However, further systematic inter-comparisons with classical approaches based on extreme value theory are necessary in order to identify the fundamental causes of deviations between the trends revealed by both concepts.
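
    The quantile-regression idea is straightforward to sketch with statsmodels (synthetic series, not the German station data; the 0.95 quantile and the linear-in-time design are illustrative choices):

    ```python
    # Trend in an extreme quantile via quantile regression rather than an
    # extreme-value model fit.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    years = np.arange(1951, 2011)
    # synthetic "annual extreme precipitation" with a mild upper-tail trend
    y = rng.gamma(4, 10, len(years)) + 0.15 * (years - years[0])

    X = sm.add_constant(years - years[0])
    res = sm.QuantReg(y, X).fit(q=0.95)       # regression of the 95th percentile
    print(res.params)                         # slope = trend of the extreme quantile
    ```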

  16. Parameter estimation of stable distribution based on zero-order statistics

    Science.gov (United States)

    Chen, Jian; Chen, Hong; Cai, Xiaoxia; Weng, Pengfei; Nie, Hao

    2017-08-01

    With the increasing complexity of the channel, many impulse noise signals are present in the real channel. The statistical properties of such processes deviate significantly from the Gaussian distribution, and the Alpha-stable distribution provides a very useful theoretical tool for this process. This paper focuses on parameter estimation methods for the Alpha-stable distribution. First, the basic theory of the Alpha-stable distribution is introduced. Then, the concepts of the logarithmic moment and the geometric power are proposed. Finally, parameter estimation of the Alpha-stable distribution is realized based on zero-order statistics (ZOS). This method has better robustness and precision.

  17. Dynamic process analysis by moments of extreme orders

    Science.gov (United States)

    Šimberová, S.; Suk, T.

    2016-01-01

    Dynamic processes in astronomical observations are captured in various video sequences. The image datacubes are represented by datasets of random variables. Diagnostics of a fast-developing event is based on the specific behavior of the high-order moments (HOM) in time. The moment curves computed from an image video sequence give valuable information about the various phases of the phenomenon and the significant periods in the frequency analysis. The proposed method uses statistical moments of high and very high orders to describe and investigate the dynamic process in progress. Since these moments are highly correlated, the method of principal component analysis (PCA) has been suggested for the subsequent frequency analysis. PCA can be used both for decorrelation of the moments and for determination of the number of moments used. We experimentally illustrate the performance of the method on simulated data. A typical development of the dynamic phenomenon is modeled by the moment time curve. Then applications to real data sequences follow: solar active regions observed in the spectral line Hα (wavelength 6563 Å; Ondřejov and Kanzelhöhe observatories) in two different angular resolutions. The frequency analysis of the first few principal components showed common periods or quasi-periods of all examined events as well as periods specific to individual events. A detailed analysis of the moment methodology can contribute to observational mode settings. The method can be applied to video sequences obtained by observing systems with various angular resolutions. It is robust to noise and works with a wide range of sampling frequencies.
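
    The pipeline sketched above (per-frame high-order moments, then PCA to decorrelate the moment curves) can be illustrated on synthetic data; the datacube, the event shape, and the choice of moment orders 3-10 below are all made-up stand-ins, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 200, 32, 32
t = np.arange(T)

# synthetic datacube: Gaussian noise plus a transient event near frame 120
frames = rng.normal(0.0, 1.0, (T, H, W))
frames[110:130, 10:20, 10:20] += 5.0 * np.exp(
    -((t[110:130, None, None] - 120) / 5.0) ** 2)

orders = np.arange(3, 11)                  # high and very high orders
curves = np.empty((T, orders.size))
for i in range(T):
    z = (frames[i].ravel() - frames[i].mean()) / frames[i].std()
    curves[i] = [np.mean(z**k) for k in orders]

# PCA via SVD decorrelates the highly correlated moment curves
c = curves - curves.mean(axis=0)
u, s, vt = np.linalg.svd(c, full_matrices=False)
pc1 = u[:, 0] * s[0]
print("frame of peak |PC1|:", int(np.argmax(np.abs(pc1))))  # near 120
```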

  18. PERFORMANCE ANALYSIS OF SECOND-ORDER STATISTICS FOR CYCLOSTATIONARY SIGNALS

    Institute of Scientific and Technical Information of China (English)

    姜鸣; 陈进

    2002-01-01

    The second-order statistics of cyclostationary signals are introduced and their performance is discussed. Particular attention is paid to the time-lag characteristic of the cyclic autocorrelation function and the spectral correlation characteristic of the spectral correlation density function. It is pointed out that these functions can be used to extract the time-varying information of this kind of non-stationary signal. Using the time lag-cyclic frequency and frequency-cyclic frequency relations independently, vibration signals of a rolling element bearing measured on a test bed were analyzed. The results indicate that second-order cyclostationary statistics may provide a powerful tool for feature extraction and fault diagnosis of rolling element bearings.
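
    A minimal sketch of the cyclic autocorrelation function referred to above, using the asymmetric-lag convention R_x^α(τ) = ⟨x(t+τ)x(t)e^{-j2παt}⟩ and synthetic amplitude-modulated noise whose cyclic frequency is known in advance; the bearing-signal specifics of the paper are not reproduced:

```python
import numpy as np

def cyclic_autocorr(x, alpha, tau, fs):
    """R_x^alpha(tau) = <x(t + tau) x(t) exp(-j 2 pi alpha t)>
    for a real signal, asymmetric-lag convention."""
    n = len(x) - tau
    tt = np.arange(n) / fs
    return np.mean(x[tau:tau + n] * x[:n] * np.exp(-2j * np.pi * alpha * tt))

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(2)
f_mod = 25.0   # the periodic modulation makes the noise cyclostationary
x = (1.0 + 0.8 * np.cos(2 * np.pi * f_mod * t)) * rng.normal(size=t.size)

alphas = np.arange(0.0, 100.0, 1.0)
mags = np.array([abs(cyclic_autocorr(x, a, 0, fs)) for a in alphas])
print("peak cyclic frequency (Hz):", alphas[1:][np.argmax(mags[1:])])  # ~25
```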

  19. An Order Statistics Approach to the Halo Model for Galaxies

    CERN Document Server

    Paul, Niladri; Sheth, Ravi K

    2016-01-01

    We use the Halo Model to explore the implications of assuming that galaxy luminosities in groups are randomly drawn from an underlying luminosity function. We show that even the simplest of such order statistics models -- one in which this luminosity function $p(L)$ is universal -- naturally produces a number of features associated with previous analyses based on the `central plus Poisson satellites' hypothesis. These include the monotonic relation of mean central luminosity with halo mass, the Lognormal distribution around this mean, and the tight relation between the central and satellite mass scales. In stark contrast to observations of galaxy clustering, however, this model predicts $\\textit{no}$ luminosity dependence of large scale clustering. We then show that an extended version of this model, based on the order statistics of a $\\textit{halo mass dependent}$ luminosity function $p(L|m)$, is in much better agreement with the clustering data as well as satellite luminosities, but systematically under-pre...

  20. Research on Life Signals Detection Based on Higher Order Statistics

    Directory of Open Access Journals (Sweden)

    Jian-Jun Li

    2012-10-01

    Full Text Available Life signals are modeled as harmonics because of their low frequency, quasi-periodicity, low SNR, and the ease with which they are submerged in strong clutter noise. A method for detecting life signals based on adaptive filtering and higher-order statistics is presented, in which neither a Gaussian assumption on the observed signal nor prior information about its waveform and arrival time is required. The principle of the method is to separate the spectrum of the input signal into many narrow frequency bands; each sub-band signal is then subjected to a short-time estimation of higher-order statistics so as to suppress Gaussian noise. Simulated results show that the method can effectively detect life signals in noise with good convergence speed and stability, and greatly improves the signal quality with respect to the LMS method.

  1. First- and second-order statistics of optical near fields.

    Science.gov (United States)

    Apostol, Adela; Dogariu, Aristide

    2004-02-01

    The statistical properties of the intensity in close proximity to highly scattering, randomly inhomogeneous media are investigated. Whereas the intensity probability density function obeys the same law irrespective of the distance z from the interface, the second-order intensity correlation length changes for distances smaller than the wavelength. Contrary to predictions of the conventional coherence theory, the corresponding field correlation length can be smaller than the wavelength of light.

  2. Statistics of extreme objects in the Juropa Hubble Volume simulation

    CERN Document Server

    Watson, W A; Diego, J M; Gottlöber, S; Knebe, A; Martínez-González, E; Yepes, G

    2013-01-01

    We present the first results from the JUropa huBbLE volumE (Jubilee) project, based on the output from a large N-body, dark matter-only cosmological simulation with a volume of V=(6Gpc/h)^3, containing 6000^3 particles, performed within the concordance Lambda-CDM cosmological model. The simulation volume is sufficient to probe extremely large length scales in the universe, whilst at the same time the particle count is high enough so that dark matter haloes down to 1.5x10^12 M_sun/h can be resolved. At z = 0 we identify over 400 million haloes, and the first haloes in the simulation form at z = 11. We present an all-sky map of the Integrated Sachs Wolfe signal calculated from the gravitational potential in the box between z = 0-1.4. The cluster mass function is derived using three different halofinders and compared to fitting functions in the literature, with results being consistent with previous studies across most of the mass-range of the simulation. We compare simulated clusters of maximal mass across reds...

  3. Extreme value statistics and traveling fronts: application to computer science.

    Science.gov (United States)

    Majumdar, Satya N; Krapivsky, P L

    2002-03-01

    We study the statistics of height and balanced height in the binary search tree problem in computer science. The search tree problem is first mapped to a fragmentation problem that is then further mapped to a modified directed polymer problem on a Cayley tree. We employ the techniques of traveling fronts to solve the polymer problem and translate back to derive exact asymptotic properties of the original search tree problem. The second mapping allows us not only to rederive the known results for random binary trees but also to obtain exact results for search trees where the entries arrive according to an arbitrary distribution, not necessarily randomly. Besides, it allows us to derive the asymptotic shape of the full probability distribution of the height and not just its moments. Our results are then generalized to m-ary search trees with arbitrary distribution.
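
    The known result for random binary trees mentioned above is that the expected height grows like c·ln n with c ≈ 4.311, the constant that comes out of the traveling-front analysis. A quick simulation (iterative insertion, so no recursion limit) shows the logarithmic growth, keeping in mind that the asymptotic constant is approached slowly:

```python
import math
import random

def bst_height(keys):
    """Insert keys in order into an unbalanced BST and return its height.
    Nodes are rows [key, left_index, right_index]; -1 means no child."""
    nodes = [[keys[0], -1, -1]]
    height = 1
    for k in keys[1:]:
        i, depth = 0, 1
        while True:
            side = 1 if k < nodes[i][0] else 2
            depth += 1
            if nodes[i][side] == -1:
                nodes[i][side] = len(nodes)
                nodes.append([k, -1, -1])
                break
            i = nodes[i][side]
        height = max(height, depth)
    return height

random.seed(3)
for n in (1_000, 10_000, 100_000):
    hs = [bst_height(random.sample(range(10 * n), n)) for _ in range(5)]
    # expected height ~ 4.311 * ln(n) asymptotically (approached slowly)
    print(n, sum(hs) / 5, round(4.311 * math.log(n), 1))
```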

  4. Statistical analysis of extreme values from insurance, finance, hydrology and other fields

    CERN Document Server

    Reiss, Rolf-Dieter

    1997-01-01

    The statistical analysis of extreme data is important for various disciplines, including hydrology, insurance, finance, engineering and environmental sciences. This book provides a self-contained introduction to the parametric modeling, exploratory analysis and statistical inference for extreme values. The entire text of this third edition has been thoroughly updated and rearranged to meet the new requirements. Additional sections and chapters, elaborated on more than 100 pages, are particularly concerned with topics like dependencies, the conditional analysis and the multivariate modeling of extreme data. Parts I–III about the basic extreme value methodology remain unchanged to a large extent, yet notable are, e.g., the new sections about "An Overview of Reduced-Bias Estimation" (co-authored by M.I. Gomes), "The Spectral Decomposition Methodology", and "About Tail Independence" (co-authored by M. Frick), and the new chapter about "Extreme Value Statistics of Dependent Random Variables" (co-authored ...

  5. Most Likely Response Waves for Estimation of Extreme Value Ship Response Statistics

    DEFF Research Database (Denmark)

    Dietz, Jesper Skjoldager; Friis-Hansen, Peter; Jensen, Jørgen Juncher

    2004-01-01

    Fast and accurate methods for estimation of non-linear extreme value ship response statistics using 2D or 3D time-domain codes are of interest. The present study illustrates a new approach using Most Likely Response Waves (MLRW) to estimate the entire non-linear extreme response value distribution...

  6. Detection of a diffusive cloak via second-order statistics

    CERN Document Server

    Koirala, Milan

    2016-01-01

    We propose a scheme to detect the diffusive cloak proposed by Schittny et al [Science 345, 427 (2014)]. We exploit the fact that diffusion of light is an approximation that disregards wave interference. The long-range contribution to intensity correlation is sensitive to locations of paths crossings and the interference inside the medium, allowing one to detect the size and position, including the depth, of the diffusive cloak. Our results also suggest that it is possible to separately manipulate the first- and the second-order statistics of wave propagation in turbid media.

  7. Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation

    Science.gov (United States)

    Tchiguirinskaia, Ioulia; Scherzer, Daniel

    2016-04-01

    Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating the extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. Alternative approaches to nonlinear variability are based on a fundamental property of the non-linear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. Therefore, it is no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence, and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a certain way, this approach is similar to that of Lorenz, who studied how the flap of a butterfly wing could generate

  8. Exact Extremal Statistics in the Classical 1D Coulomb Gas

    Science.gov (United States)

    Dhar, Abhishek; Kundu, Anupam; Majumdar, Satya N.; Sabhapandit, Sanjib; Schehr, Grégory

    2017-08-01

    We consider a one-dimensional classical Coulomb gas of N like charges in a harmonic potential, also known as the one-dimensional one-component plasma. We compute, analytically, the probability distribution of the position xmax of the rightmost charge in the limit of large N. We show that the typical fluctuations of xmax around its mean are described by a nontrivial scaling function, with asymmetric tails. This distribution is different from the Tracy-Widom distribution of xmax for Dyson's log gas. We also compute the large deviation functions of xmax explicitly and show that the system exhibits a third-order phase transition, as in the log gas. Our theoretical predictions are verified numerically.

  9. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Science.gov (United States)

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called the conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index, so that the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contain several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real data-sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
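
    For the exponential special case, the connection mentioned above is explicit: with mean lifetime θ, CL = (μ − L)/σ = 1 − L/θ and ηL = e^{−L/θ} = e^{CL − 1}. A sketch with complete (uncensored) data; the paper itself works with generalised order statistics, which this does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, L = 2.0, 0.5                  # true mean lifetime, lower limit
x = rng.exponential(theta, 10_000)   # complete sample, no censoring

theta_hat = x.mean()                 # MLE of the exponential mean
CL_hat = 1.0 - L / theta_hat         # lifetime performance index
eta_hat = np.exp(CL_hat - 1.0)       # implied conforming rate P(X >= L)

# eta_hat should agree with the empirical conforming rate
print(round(CL_hat, 3), round(eta_hat, 3), round((x >= L).mean(), 3))
```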

  10. Extreme-value statistics of fractional Brownian motion bridges.

    Science.gov (United States)

    Delorme, Mathieu; Wiese, Kay Jörg

    2016-11-01

    Fractional Brownian motion is a self-affine, non-Markovian, and translationally invariant generalization of Brownian motion, depending on the Hurst exponent H. Here we investigate fractional Brownian motion where both the starting and the end point are zero, commonly referred to as bridge processes. Observables are the time t_{+} the process is positive, the maximum m it achieves, and the time t_{max} when this maximum is taken. Using a perturbative expansion around Brownian motion (H=1/2), we give the first-order result for the probability distribution of these three variables and the joint distribution of m and t_{max}. Our analytical results are tested and found to be in excellent agreement with extensive numerical simulations for both H>1/2 and H<1/2. This precision is achieved by sampling processes with a free end point and then converting each realization to a bridge process, generalizing what is usually done for Brownian motion.
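
    The bridge construction described above (sample a free-end process, then convert) works for any Gaussian process: subtracting the regression of the endpoint, Y(t) = X(t) − Cov(X_t, X_1)X(1) since Var(X_1) = 1, yields a bridge realization. A Cholesky-based sketch, where grid size and H are arbitrary choices:

```python
import numpy as np

def fbm_bridge_observables(H, n=512, rng=None):
    """Sample a fractional Brownian bridge on [0, 1]: draw free-end fBm
    by Cholesky, subtract the Gaussian regression on X(1) (Var(X_1) = 1),
    and return (t_plus, m, t_max)."""
    rng = rng or np.random.default_rng()
    t = np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    Lc = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
    x = Lc @ rng.standard_normal(n)        # free-end fBm with X(0) = 0
    y = x - cov[:, -1] * x[-1]             # bridge: regression subtracted
    return (y > 0).mean(), y.max(), t[np.argmax(y)]

rng = np.random.default_rng(5)
print(fbm_bridge_observables(0.7, rng=rng))   # (t_plus, m, t_max)
```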

  11. Multicomponent seismic noise attenuation with multivariate order statistic filters

    Science.gov (United States)

    Wang, Chao; Wang, Yun; Wang, Xiaokai; Xun, Chao

    2016-10-01

    The vector relationship between multicomponent seismic data is highly important for multicomponent processing and interpretation, but this vector relationship can be damaged when each component is processed individually. To overcome the drawback of standard component-by-component filtering, multivariate order statistic filters are introduced and extended to attenuate the noise of multicomponent seismic data by treating such a dataset as a vector wavefield rather than a set of scalar fields. According to the characteristics of seismic signals, we implement this type of multivariate filtering along local events. First, the optimal local events are recognized according to the similarity between the vector signals which are windowed from neighbouring seismic traces with a sliding time window along each trial trajectory. An efficient strategy is used to reduce the computational cost of similarity measurement for vector signals. Next, one vector sample from each of the neighbouring traces is extracted along the optimal local event as the input data for a multivariate filter. Different multivariate filters are optimal for different noise. The multichannel modified trimmed mean (MTM) filter, as one of the multivariate order statistic filters, is applied to synthetic and field multicomponent seismic data to test its performance in attenuating white Gaussian noise. The results indicate that the multichannel MTM filter can attenuate noise while preserving the relative amplitude information of multicomponent seismic data more effectively than a single-channel filter.
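
    The paper's exact MTM variant is not reproduced here; as a sketch, a generic multichannel modified trimmed mean takes the sample vector median of the window (the input vector minimizing the summed distance to the others), trims vectors farther than a threshold q from it, and averages the survivors:

```python
import numpy as np

def multichannel_mtm(window, q):
    """Modified trimmed mean of a window of vector samples (rows):
    take the sample vector median, drop rows farther than q from it,
    and average the rest."""
    d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
    med = window[np.argmin(d.sum(axis=1))]
    keep = np.linalg.norm(window - med, axis=1) <= q
    return window[keep].mean(axis=0)

rng = np.random.default_rng(6)
window = np.array([1.0, 2.0, 0.5]) + 0.1 * rng.normal(size=(9, 3))
window[4] += 5.0                        # impulsive outlier on one sample
print(multichannel_mtm(window, q=1.0))  # close to (1.0, 2.0, 0.5)
```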

  12. Statistic analysis of annual total ozone extremes for the period 1964-1988

    Science.gov (United States)

    Krzyscin, Janusz W.

    1994-01-01

    Annual extremes of total column amount of ozone (in the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviations. The extremes have been selected from the direct-Sun total ozone observations only. Extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of three forms of the Fisher-Tippett extreme value distribution by the nonlinear least squares method (Levenberg-Marquardt method). We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. The extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by the Fisher-Tippett type III and the composite minima by the Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the ozone downward trend. Extreme value prognoses for the period 1964-2014 (derived from the data taken at all analyzed stations, the North American, and the European stations) have revealed that the prognostic extremes are close to the largest annual extremes in the period 1964-1988 and there are only small regional differences in the prognoses.
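
    The three Fisher-Tippett types are unified by the generalized extreme value (GEV) distribution, so a modern version of such a fit is a single GEV fit whose shape parameter identifies the type; in scipy's genextreme convention c > 0 corresponds to type III, c = 0 to type I (Gumbel) and c < 0 to type II (scipy's sign is opposite to the usual ξ). The annual maxima below are synthetic stand-ins, not the ozone data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# stand-in annual extremes: maxima of 365 standardized daily deviations
annual_max = stats.norm.rvs(size=(25, 365), random_state=rng).max(axis=1)

c, loc, scale = stats.genextreme.fit(annual_max)
# scipy sign convention: c > 0 -> Fisher-Tippett type III,
# c = 0 -> type I (Gumbel), c < 0 -> type II (Frechet)
print("shape c:", round(c, 3))
print("50-yr level:", round(stats.genextreme.ppf(1 - 1/50, c, loc, scale), 2))
```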

  13. Inter-comparison of statistical downscaling methods for projection of extreme flow indices across Europe

    DEFF Research Database (Denmark)

    Hundecha, Yeshewatesfa; Sunyer Pinya, Maria Antonia; Lawrence, Deborah;

    2016-01-01

    The effect of methods of statistical downscaling of daily precipitation on changes in extreme flow indices under a plausible future climate change scenario was investigated in 11 catchments selected from 9 countries in different parts of Europe. The catchments vary from 67 to 6171 km2 in size. Most of the implemented downscaling methods project an increase in the extreme flow indices in most of the catchments. The catchments where the extremes are expected to increase have a rainfall-dominated flood regime. In these catchments, the downscaling methods also project an increase in the extreme precipitation in the seasons when the extreme flows occur. In catchments where the flooding is mainly caused by spring/summer snowmelt, the downscaling methods project a decrease in the extreme flows in three of the four catchments considered. A major portion of the variability in the projected changes in the extreme flow indices is attributable to the variability of the climate model ensemble.

  14. Timing optimization utilizing order statistics and multichannel digital silicon photomultipliers.

    Science.gov (United States)

    Mandai, Shingo; Venialgo, Esteban; Charbon, Edoardo

    2014-02-01

    We present an optimization technique utilizing order statistics with a multichannel digital silicon photomultiplier (MD-SiPM) for timing measurements. Accurate timing measurements are required by 3D rangefinding and time-of-flight positron emission tomography, to name a few applications. We have demonstrated the ability of the MD-SiPM to detect multiple photons, and we verified the advantage of detecting multiple photons assuming incoming photons follow a Gaussian distribution. We have also shown the advantage of utilizing multiple timestamps for estimating times of arrival more accurately. This estimation technique can be widely applied in various applications that have a certain probability density function of incoming photons, such as scintillator-based detectors or laser sources.
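
    The gain from combining timestamps can be seen with a toy version of the idea: simulate events whose photon arrival times are Gaussian (as assumed above), then compare the spread of the first timestamp with that of the mean of the first k order statistics, a crude stand-in for the paper's optimal estimator; both estimators carry a bias that is calibratable:

```python
import numpy as np

rng = np.random.default_rng(8)
n_events, n_photons, sigma = 20_000, 50, 0.1   # sigma: single-photon jitter

# per event: sorted photon timestamps around the true arrival time 0
arrivals = np.sort(rng.normal(0.0, sigma, (n_events, n_photons)), axis=1)

first_only = arrivals[:, 0]                   # first-photon estimator
mean_first_k = arrivals[:, :10].mean(axis=1)  # combine 10 order statistics

print("std, first photon    :", round(first_only.std(), 4))
print("std, mean of first 10:", round(mean_first_k.std(), 4))
```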

  15. Nonrigid registration of volumetric images using ranked order statistics

    DEFF Research Database (Denmark)

    Tennakoon, Ruwan; Bab-Hadiashar, Alireza; Cao, Zhenwei

    2014-01-01

    Non-rigid image registration techniques using intensity based similarity measures are widely used in medical imaging applications. Due to the high computational complexity of these techniques, particularly for volumetric images, finding appropriate registration methods that both reduce the computational burden and increase the registration accuracy has become an intensive area of research. In this paper we propose a fast and accurate non-rigid registration method for intra-modality volumetric images. Our approach exploits the information provided by an order statistics based segmentation method to find the important regions for registration, and uses an appropriate sampling scheme to target those areas and reduce the registration computation time. A unique advantage of the proposed method is its ability to identify the point of diminishing returns and stop the registration process. Our experiments...

  16. Higher order statistical moment application for solar PV potential analysis

    Science.gov (United States)

    Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan

    2016-10-01

    Solar photovoltaic energy could serve as an alternative to fossil fuels, which are depleting and pose a global warming problem. However, this renewable energy source is too variable and intermittent to be relied on blindly, so knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system there. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm change dynamically, the Pearson system, in which a probability distribution is selected by matching its theoretical moments with the empirical moments of the data, is suitable for this purpose. Taking advantage of the Pearson system in MATLAB, software has been developed to help in data processing for distribution fitting and potential analysis for future projection of the amount of AC power and solar irradiance availability.
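
    One concrete piece of the Pearson-system machinery is the classical criterion κ, computed from β1 (squared skewness) and β2 (kurtosis), which selects the family member; the formula below is the textbook one, and the lognormal sample is a hypothetical stand-in for PV output, not the paper's data:

```python
import numpy as np
from scipy import stats

def pearson_kappa(x):
    """Pearson's criterion kappa from beta1 = skewness^2 and beta2 =
    (non-excess) kurtosis. Roughly: kappa < 0 -> Type I, 0 < kappa < 1
    -> Type IV, kappa > 1 -> Type VI; the gamma (Type III) family sits
    on the boundary 2*beta2 - 3*beta1 - 6 = 0, where kappa diverges."""
    b1 = stats.skew(x) ** 2
    b2 = stats.kurtosis(x, fisher=False)
    return b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))

rng = np.random.default_rng(9)
ac_power = rng.lognormal(0.0, 0.5, 50_000)   # hypothetical stand-in data
print("kappa:", round(pearson_kappa(ac_power), 2))  # > 1: Type VI region
```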

  17. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

    Full Text Available Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing stroke, the second leading cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher order statistics have been an extensive field of research as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third-moment skewness and fourth-moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to the emboli. Applied to real signals, the detection of microemboli through the skewness and kurtosis signals outperformed detection through standard methods. The sensitivities and specificities reached 78% and 91% for the skewness detector and 80% and 90% for the kurtosis detector, respectively.
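
    A sketch of the windowed third- and fourth-moment detectors described above, on a synthetic energy signal with one injected high-intensity transient; the window length and amplitudes are arbitrary choices, not the paper's settings:

```python
import numpy as np
from scipy.stats import kurtosis, skew

def windowed_hos(energy, win):
    """Sliding-window skewness and kurtosis of an energy signal;
    peaks flag candidate embolic events."""
    sk = np.array([skew(energy[i:i + win]) for i in range(len(energy) - win)])
    ku = np.array([kurtosis(energy[i:i + win]) for i in range(len(energy) - win)])
    return sk, ku

rng = np.random.default_rng(10)
energy = rng.normal(1.0, 0.1, 5_000) ** 2   # embolus-free background
energy[2500:2510] += 3.0                    # injected short transient
sk, ku = windowed_hos(energy, win=200)
print("skewness peak window starts at:", int(np.argmax(sk)))   # ~2300-2500
print("kurtosis peak window starts at:", int(np.argmax(ku)))
```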

  18. Bridging centrality and extremity : Refining empirical data depth using extreme value statistics

    NARCIS (Netherlands)

    Einmahl, John; Li, Jun; R.Y., Liu

    2015-01-01

    Data depth measures the centrality of a point with respect to a given distribution or data cloud. It provides a natural center-outward ordering of multivariate data points and yields a systematic nonparametric multivariate analysis scheme. In particular, the halfspace depth is shown to have many des

  19. On Extreme Value Statistics: maximum likelihood; portfolio optimization; extremal rainfall; internet auctions

    NARCIS (Netherlands)

    C. Zhou (Chen)

    2008-01-01

    In the 18th century, statisticians sometimes worked as consultants to gamblers. In order to answer questions like "If a fair coin is flipped 100 times, what is the probability of getting 60 or more heads?", Abraham de Moivre discovered the so-called "normal curve". Independently, Pierre-
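
    De Moivre's question has a definite answer, and it can be checked both exactly and with the normal curve he discovered (using a continuity correction):

```python
from scipy.stats import binom, norm

# P(X >= 60) for X ~ Binomial(100, 1/2)
exact = binom.sf(59, 100, 0.5)        # ~0.0284
# de Moivre's normal approximation (mean 50, sd 5, continuity-corrected)
approx = norm.sf((59.5 - 50) / 5)     # ~0.0287
print(exact, approx)
```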

  20. Inter-comparison of statistical downscaling methods for projection of extreme flow indices across Europe

    Science.gov (United States)

    Hundecha, Yeshewatesfa; Sunyer, Maria A.; Lawrence, Deborah; Madsen, Henrik; Willems, Patrick; Bürger, Gerd; Kriaučiūnienė, Jurate; Loukas, Athanasios; Martinkova, Marta; Osuch, Marzena; Vasiliades, Lampros; von Christierson, Birgitte; Vormoor, Klaus; Yücel, Ismail

    2016-10-01

    The effect of methods of statistical downscaling of daily precipitation on changes in extreme flow indices under a plausible future climate change scenario was investigated in 11 catchments selected from 9 countries in different parts of Europe. The catchments vary from 67 to 6171 km2 in size and cover different climate zones. 15 regional climate model outputs and 8 different statistical downscaling methods, which are broadly categorized as change factor and bias correction based methods, were used for the comparative analyses. Different hydrological models were implemented in different catchments to simulate daily runoff. A set of flood indices were derived from daily flows and their changes have been evaluated by comparing their values derived from simulations corresponding to the current and future climate. Most of the implemented downscaling methods project an increase in the extreme flow indices in most of the catchments. The catchments where the extremes are expected to increase have a rainfall-dominated flood regime. In these catchments, the downscaling methods also project an increase in the extreme precipitation in the seasons when the extreme flows occur. In catchments where the flooding is mainly caused by spring/summer snowmelt, the downscaling methods project a decrease in the extreme flows in three of the four catchments considered. A major portion of the variability in the projected changes in the extreme flow indices is attributable to the variability of the climate model ensemble, although the statistical downscaling methods contribute 35-60% of the total variance.

  1. Statistical analysis of extreme values for geomagnetic and geoelectric field variations for Canada

    Science.gov (United States)

    Nikitina, Lidia; Trichtchenko, Larisa; Boteler, David

    2016-04-01

    Disturbances of the geomagnetic field produced by space weather events cause variable geoelectric fields at Earth's surface which drive electric currents in power systems, resulting in hazardous impacts on electric power transmission. In extreme cases, as during the magnetic storm in March 13, 1989, this can result in burnt-out transformers and power blackouts. To make assessment of geomagnetic and geoelectric activity in Canada during extreme space weather events, extreme value statistical analysis has been applied to more than 40 years of magnetic data from the Canadian geomagnetic observatories network. This network has archived digital data recordings for observatories located in sub-auroral, auroral, and polar zones. Extreme value analysis was applied to hourly ranges of geomagnetic variations as an index of geomagnetic activity and to hourly maximum of rate-of-change of geomagnetic field. To estimate extreme geoelectric fields, the minute geomagnetic data were used together with Earth conductivity models for different Canadian locations to calculate geoelectric fields. The extreme value statistical analysis was applied to hourly maximum values of the horizontal geoelectric field. This assessment provided extreme values of geomagnetic and geoelectric activity which are expected to happen once per 50 years and once per 100 years. The results of this analysis are designed to be used to assess the geomagnetic hazard to power systems and help the power industry mitigate risks from extreme space weather events.

  2. Independent Component Analysis of Complex Valued Signals Based on First-order Statistics

    Directory of Open Access Journals (Sweden)

    P.C. Xu

    2013-12-01

    Full Text Available This paper proposes a novel method based on first-order statistics, aimed at solving the problem of independent component extraction of complex-valued signals in instantaneous linear mixtures. Single-step and iterative algorithms are proposed and discussed in the context of engineering practice. A theoretical performance analysis of the asymptotic interference-to-signal ratio (ISR) and the probability of correct support estimation (PCE) is carried out. Simulation examples validate the theoretical analysis and demonstrate that the single-step algorithm is extremely effective. Moreover, the iterative algorithm is more efficient than complex FastICA under certain circumstances.

  3. Relation of the fourth-order statistical invariants of velocity gradient tensor in isotropic turbulence

    Science.gov (United States)

    Fang, L.; Zhang, Y. J.; Fang, J.; Zhu, Y.

    2016-08-01

    We show by direct numerical simulations (DNSs) that in different types of isotropic turbulence, the fourth-order statistical invariants have an approximately linear relation, which can be represented by a straight line in the phase plane passing through two extreme states: the Gaussian state and the restricted Euler state. Also, each DNS case corresponds to an equilibrium region that is roughly Reynolds-dependent. In addition, both the time reversal and the compressibility effect lead to nonequilibrium transition processes in this phase plane. This observation adds a new restriction on the mean-field theory.

  4. Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke

    2005-01-01

    ... proceeding in an acceptable manner if flooding of these levels has an average return period bigger than a predefined value. This practice is also often used in functional analysis of existing sewer systems. Whether a sewer system can fulfil recommended flooding frequencies or not can only be verified by performing long term simulations - using a sewer flow simulation model - and drawing up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties related to the input parameters of rainfall runoff models will give rise to uncertainties related to the extreme event statistics ... covering return periods of as much as 33 years. By comparing these two different extreme event statistics it is evident that these to a great extent depend on the uncertainties related to the input parameters of the rainfall runoff model.

  5. Extreme-value statistics of intensities in a cw-pumped random fiber laser

    Science.gov (United States)

    Lima, Bismarck C.; Pincheira, Pablo I. R.; Raposo, Ernesto P.; Menezes, Leonardo de S.; de Araújo, Cid B.; Gomes, Anderson S. L.; Kashyap, Raman

    2017-07-01

    We report on the extreme-value statistics of output intensities in a one-dimensional cw-pumped erbium-doped random fiber laser, with a strongly scattering disordered medium consisting of randomly spaced Bragg gratings. The experimental findings from the analysis of a large number of emission spectra are well described by the Gumbel distribution below and above the laser threshold, whereas the Fréchet distribution, typical of strongly fluctuating extreme events with heavy power-law probability tails, provides good support to the data near the threshold. We establish a close connection, relying on theoretical arguments, between the reported extreme-value statistics and the shifts in the statistics of intensity fluctuations, from the Gaussian to the Lévy distribution at the threshold and back to the Gaussian well above threshold.

  6. Bipartite Diametrical Graphs of Diameter 4 and Extreme Orders

    Directory of Open Access Journals (Sweden)

    Salah Al-Addasi

    2008-01-01

    in which this upper bound is attained, this graph can be viewed as a generalization of the Rhombic Dodecahedron. Then we show that for any ≥2, the graph (2,2 is the unique (up to isomorphism bipartite diametrical graph of diameter 4 and partite sets of cardinalities 2 and 2, and hence in particular, for =3, the graph (6,8 which is just the Rhombic Dodecahedron is the unique (up to isomorphism bipartite diametrical graph of such a diameter and cardinalities of partite sets. Thus we complete a characterization of -graphs of diameter 4 and cardinality of the smaller partite set not exceeding 6. We prove that the neighborhoods of vertices of the larger partite set of (2,2 form a matroid whose basis graph is the hypercube . We prove that any -graph of diameter 4 is bipartite self complementary, thus in particular (2,2. Finally, we study some additional properties of (2,2 concerning the order of its automorphism group, girth, domination number, and when being Eulerian.

  7. Higher order statistical frequency domain decomposition for operational modal analysis

    Science.gov (United States)

    Nita, G. M.; Mahgoub, M. A.; Sharyatpanahi, S. G.; Cretu, N. C.; El-Fouly, T. M.

    2017-02-01

    Experimental methods based on modal analysis under ambient vibrational excitation are often employed to detect structural damage in mechanical systems. Many such frequency domain methods, such as Basic Frequency Domain (BFD), Frequency Domain Decomposition (FDD), or Enhanced Frequency Domain Decomposition (EFDD), use as a first step a Fast Fourier Transform (FFT) estimate of the power spectral density (PSD) associated with the response of the system. In this study it is shown that higher order statistical estimators such as Spectral Kurtosis (SK) and Sample to Model Ratio (SMR) may be successfully employed not only to more reliably discriminate the response of the system against the ambient noise fluctuations, but also to better identify and separate contributions from closely spaced individual modes. It is shown that an SMR-based Maximum Likelihood curve fitting algorithm may improve the accuracy of the spectral shape and location of the individual modes and, when combined with the SK analysis, provides efficient means to categorize such individual spectral components according to their temporal dynamics as coherent or incoherent system responses to unknown ambient excitations.

  8. Order Statistics Theory of Unfolding of Multimeric Proteins

    Science.gov (United States)

    Zhmurov, A.; Dima, R.I.; Barsegov, V.

    2010-01-01

    Dynamic force spectroscopy has become indispensable for the exploration of the mechanical properties of proteins. In force-ramp experiments, performed by utilizing a time-dependent pulling force, the peak forces for unfolding transitions in a multimeric protein (D)N are used to map the free energy landscape for unfolding of a protein domain D. We show that theoretical modeling of unfolding transitions, based on combining the observed first (f1), second (f2), …, Nth (fN) unfolding forces for a protein tandem of fixed length N and pooling the force data for tandems of different length, n1 < … < N, resolves the molecular characteristics that determine the unfolding micromechanics. We present a simple method of estimation of the parent distribution, ψD(f), based on analyzing the force data for a tandem (D)n of arbitrary length n. Order statistics theory is exemplified through a detailed analysis and modeling of the unfolding forces obtained from pulling simulations of the monomer and oligomers of the all-β-sheet WW domain. PMID:20858442

  9. Extreme events statistics in a two-layer quasi-geostrophic atmospheric model

    Science.gov (United States)

    Galfi, Vera Melinda; Bodai, Tamas; Lucarini, Valerio

    2016-04-01

    Extreme events statistics provides a theoretical framework to analyze and predict extreme events based on the convergence of the distribution of the extremes to some limiting distribution. In this work we analyze the convergence of the distribution of extreme events to the Generalized Extreme Value (GEV) distribution and to the Generalized Pareto Distribution (GPD), using a two-layer quasi-geostrophic atmospheric model, and compare our results with theoretical findings from the field of extreme value theory for dynamical systems. We study the behavior of the GEV shape parameter by increasing the block size and of the GPD shape parameter by increasing the threshold, and compare the inferred parameters with a theoretical shape parameter that depends only on the geometrical properties of the attractor. The main objective is to find out whether this theoretical shape parameter can be used to evaluate extreme event analysis based on model output. For this, we perform very long simulations. We run our system with two different levels of forcing determined by two different meridional temperature gradients, one inducing a medium level of chaos and the other one a high level of chaos. We analyze in both cases extremes of energy variables.

  10. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    Science.gov (United States)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
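
    The causal chain above (the largest pore in a specimen sets its strength, via a fracture mechanics model of the form σf = K_Ic/(Y√(πa))) is easy to emulate; every number below (the pore-size law, K_Ic, the geometry factor Y) is a hypothetical stand-in, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(11)
K_Ic, Y = 5.0, 1.13                  # MPa*sqrt(m); hypothetical values
n_spec, pores = 500, 10_000

# lognormal pore radii (m); the largest pore per specimen controls failure
radii = rng.lognormal(np.log(2e-6), 0.4, (n_spec, pores))
a_max = radii.max(axis=1)                        # extreme-value pore sizes
strength = K_Ic / (Y * np.sqrt(np.pi * a_max))   # crack-around-pore model, MPa

# Weibull plot coordinates ln(-ln(1-F)) vs ln(strength): typically curved
F = (np.arange(1, n_spec + 1) - 0.5) / n_spec
slope = np.polyfit(np.log(np.sort(strength)), np.log(-np.log(1 - F)), 1)[0]
print("mean strength (MPa):", round(strength.mean(), 1))
print("apparent Weibull modulus:", round(slope, 1))
```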

  11. A Statistical Methodology for Determination of Safety Systems Actuation Setpoints Based on Extreme Value Statistics

    Directory of Open Access Journals (Sweden)

    D. R. Novog

    2008-01-01

    Full Text Available This paper provides a novel and robust methodology for determination of nuclear reactor trip setpoints which accounts for uncertainties in input parameters and models, as well as accounting for the variations in operating states that periodically occur. Further it demonstrates that in performing best estimate and uncertainty calculations, it is critical to consider the impact of all fuel channels and instrumentation in the integration of these uncertainties in setpoint determination. This methodology is based on the concept of a true trip setpoint, which is the reactor setpoint that would be required in an ideal situation where all key inputs and plant responses were known, such that during the accident sequence a reactor shutdown will occur which just prevents the acceptance criteria from being exceeded. Since this true value cannot be established, the uncertainties in plant simulations and plant measurements as well as operational variations which lead to time changes in the true value of initial conditions must be considered. This paper presents the general concept used to determine the actuation setpoints considering the uncertainties and changes in initial conditions, and allowing for safety systems instrumentation redundancy. The results demonstrate unique statistical behavior with respect to both fuel and instrumentation uncertainties which has not previously been investigated.

  12. EXISTENCE OF EXTREME SOLUTION TO FIRST-ORDER IMPULSIVE DIFFERENTIAL EQUATIONS WITH NONLINEAR BOUNDARY CONDITIONS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    This paper is concerned with the existence of extreme solutions to three-point boundary value problems with nonlinear boundary conditions for a class of first order impulsive differential equations. We obtain sufficient conditions for the existence of extreme solutions by the upper and lower solutions method coupled with a monotone iterative technique.

  13. An empirical study to order citation statistics between subject fields

    CERN Document Server

    van Zyl, J Martin

    2012-01-01

    An empirical study is conducted to compare citations-per-publication statistics and observed Hirsch indexes between subject fields using summary statistics of countries. No distributional assumptions are made and ratios are calculated. These ratios can be used to make approximate comparisons between researchers of different subject fields with respect to the Hirsch index.

  14. Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall

    Science.gov (United States)

    Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall is one of the climatic elements with high diversity and has many negative impacts, especially extreme rainfall. Therefore, several methods are required to minimize the damage that may occur. So far, general circulation models (GCMs) are the best tools to forecast global climate change, including extreme rainfall. Statistical downscaling (SD) is a technique to develop the relationship between GCM output as global-scale independent variables and rainfall as a local-scale response variable. Using GCM output directly is difficult when assessed against observations because it is high dimensional and exhibits multicollinearity between variables. Common methods used to handle this problem are principal component analysis (PCA) and partial least squares regression. A newer method that can be used is the lasso. The lasso has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can be used to detect extreme rainfall at the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results show that extreme rainfall (extreme wet in January, February and December) in Indramayu can be predicted properly by the model at the 90th quantile.
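
    Quantile regression with a lasso (L1) penalty, as described above, is available directly in scikit-learn; a sketch at the 90th quantile with made-up predictors standing in for the GCM output:

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(12)
n, p = 400, 25                        # p stand-in GCM-derived predictors
X = rng.normal(size=(n, p))
# only three predictors actually drive rainfall; noise is right-skewed
y = 10 + 2 * X[:, 0] - 1.5 * X[:, 3] + X[:, 7] + rng.gamma(2.0, 2.0, n)

# L1 penalty (lasso) gives sparse coefficients at the 90th quantile
model = QuantileRegressor(quantile=0.9, alpha=0.05, solver="highs").fit(X, y)
# ideally a set close to [0, 3, 7]
print("selected predictors:", np.flatnonzero(np.abs(model.coef_) > 1e-6))
```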

  15. Present-day and future mediterranean precipitation extremes assessed by different statistical approaches

    Science.gov (United States)

    Paxian, A.; Hertig, E.; Seubert, S.; Vogt, G.; Jacobeit, J.; Paeth, H.

    2015-02-01

    The Mediterranean area is strongly vulnerable to future changes in temperature and precipitation, particularly concerning extreme events, and has been identified as a climate change hot spot. This study performs a comprehensive investigation of present-day and future Mediterranean precipitation extremes based on station data, gridded observations and simulations of the regional climate model (REMO) driven by the coupled global general circulation model ECHAM5/MPI-OM. Extreme value estimates from different statistical methods—quantile-based indices, generalized Pareto distribution (GPD) based return values and data from a weather generator—are compared and evaluated. Dynamical downscaling reveals improved small-scale topographic structures and more realistic higher rainfall totals and extremes over mountain ranges and in summer. REMO tends to overestimate gridded observational data in winter but is closer to local station information. The dynamical-statistical weather generator provides virtual station rainfall from gridded REMO data that overcomes typical discrepancies between area-averaged model rainfall and local station information, e.g. overestimated numbers of rainy days and underestimated extreme intensities. Concerning future rainfall amounts, strong summer and winter drying over the northern and southern Mediterranean, respectively, is confronted with winter wetting over the northern part. In contrast, precipitation extremes tend to increase in even more Mediterranean areas, implying regions with decreasing totals but intensifying extremes, e.g. southern Europe and Turkey in winter and the Balkans in summer. The GPD based return values reveal slightly larger regions of increasing rainfall extremes than quantile-based indices, and the virtual stations from the weather generator show even stronger increases.

  16. Entanglement between two subsystems, the Wigner semicircle and extreme value statistics

    CERN Document Server

    Bhosale, Udaysinh T; Lakshminarayan, Arul

    2012-01-01

    The entanglement between two arbitrary subsystems of random pure states is studied via properties of the density matrix's partial transpose, $\\rho_{12}^{T_2}$. The density of states of $\\rho_{12}^{T_2}$ is close to the semicircle law when both subsystems have dimensions which are not too small and are of the same order. A simple random matrix model for the partial transpose is found to capture the entanglement properties well, including a transition across a critical dimension. Log-negativity is used to quantify entanglement between subsystems and approximate analytic formulas for this are derived. The skewness of the eigenvalue density of $\\rho_{12}^{T_2}$ is derived analytically, using the average of the third moment that is also shown to be related to a generalization of the Kempe invariant. Extreme value statistics, especially the Tracy-Widom distribution, is found to be useful in calculating the fraction of entangled states at critical dimensions. These results are tested in a quantum dynamical system of...
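
    Numerically, the partial transpose is a few lines of reshaping; the sketch below follows the setup as far as the abstract states it, drawing a random pure state on two subsystems of equal dimension and computing the log-negativity log2 of the trace norm of ρ^T2 (the dimensions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(13)
n1, n2 = 32, 32                       # comparable subsystem dimensions

# random pure state on H1 (x) H2 (unitarily invariant measure)
psi = rng.normal(size=n1 * n2) + 1j * rng.normal(size=n1 * n2)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())

# partial transpose on subsystem 2: rho_{(i,j),(k,l)} -> rho_{(i,l),(k,j)}
rt2 = rho.reshape(n1, n2, n1, n2).transpose(0, 3, 2, 1).reshape(n1 * n2, -1)
ev = np.linalg.eigvalsh(rt2)

log_negativity = np.log2(np.abs(ev).sum())    # log2 ||rho^T2||_1
print("log-negativity:", round(log_negativity, 3))
```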

  17. Statistical Downscaling of Gusts During Extreme European Winter Storms Using Radial-Basis-Function Networks

    Science.gov (United States)

    Voigt, M.; Lorenz, P.; Kruschke, T.; Osinski, R.; Ulbrich, U.; Leckebusch, G. C.

    2012-04-01

    Winter storms and related gusts can cause extensive socio-economic damage. Knowledge about the occurrence and the small-scale structure of such events may help to make regional estimates of storm losses. For a high spatial and temporal representation, the use of dynamical downscaling methods (RCM) is a cost-intensive and time-consuming option and therefore only applicable to a limited number of events. The current study explores a methodology to provide a statistical downscaling which derives small-scale structured gust fields from an extended set of large-scale structured events. Radial-basis-function (RBF) networks in combination with bidirectional Kohonen (BDK) maps are used to generate the gust fields on a spatial resolution of 7 km from the 6-hourly mean sea level pressure field of ECMWF reanalysis data. BDK maps are a kind of neural network which handles supervised classification problems. In this study they are used to provide prototypes for the RBF network and give a first order approximation for the output data. A further interpolation is done by the RBF network. For the training process the 50 most extreme storm events over the North Atlantic area from 1957 to 2011 are used, which have been selected from the ECMWF reanalysis datasets ERA40 and ERA-Interim by an objective wind-based tracking algorithm. These events were downscaled dynamically by application of the DWD model chain GME → COSMO-EU. Different model parameters and their influence on the quality of the generated high-resolution gust fields are studied. It is shown that the statistical RBF network approach delivers reasonable results in modeling the regional gust fields for untrained events.

  18. Multivariate Regression Analysis and Statistical Modeling for Summer Extreme Precipitation over the Yangtze River Basin, China

    Directory of Open Access Journals (Sweden)

    Tao Gao

    2014-01-01

    Full Text Available Extreme precipitation is likely to be one of the most severe meteorological disasters in China; however, studies on the physical factors affecting precipitation extremes, and corresponding prediction models, are still lacking. From a new point of view, the sensible heat flux (SHF) and latent heat flux (LHF), which have significant impacts on summer extreme rainfall in the Yangtze River basin (YRB), have been quantified and selections of the impact factors conducted. Firstly, a regional extreme precipitation index was applied to determine Regions of Significant Correlation (RSC) by analyzing the spatial distribution of correlation coefficients between this index and SHF, LHF, and sea surface temperature (SST) on the global ocean scale; the time series of SHF, LHF, and SST in the RSCs during 1967-2010 were then selected. Furthermore, other factors that significantly affect variations in precipitation extremes over the YRB were also selected. The methods of multiple stepwise regression and leave-one-out cross-validation (LOOCV) were utilized to analyze and test the influencing factors and the statistical prediction model. The correlation coefficient between the observed regional extreme index and the model simulation result is 0.85, significant at the 99% level. This suggests that the forecast skill is acceptable, although many aspects of the prediction model should be improved.
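
    Leave-one-out cross-validation of a regression-based prediction, the validation step used above, is a one-liner with scikit-learn; the predictors and coefficients below are synthetic stand-ins for the SHF/LHF/SST factors:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(15)
n = 44                              # one value per year, 1967-2010
X = rng.normal(size=(n, 4))         # stand-ins for SHF/LHF/SST factors
y = X @ np.array([0.8, -0.5, 0.3, 0.0]) + 0.5 * rng.normal(size=n)

pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
print("LOOCV correlation:", round(np.corrcoef(y, pred)[0, 1], 2))
```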

  19. Quantum Statistical Entropy of Non-extreme and Nearly Extreme Black Holes in Higher-Dimensional Space-Time

    Institute of Scientific and Technical Information of China (English)

    XU Dian-Yan

    2003-01-01

    The free energy and entropy of Reissner-Nordström black holes in higher-dimensional space-time are calculated by the quantum statistical method with a brick wall model. The space-time of the black holes is divided into three regions: region 1 (r > r0), region 2 (r0 > r > ri), and region 3 (ri > r > 0), where r0 is the radius of the outer event horizon and ri is the radius of the inner event horizon. Detailed calculation shows that the entropy contributed by region 2 is zero, the entropy contributed by region 1 is positive and proportional to the outer event horizon area, and the entropy contributed by region 3 is negative and proportional to the inner event horizon area. The total entropy contributed by all three regions is positive and proportional to the area difference between the outer and inner event horizons. As ri approaches r0 in the nearly extreme case, the total quantum statistical entropy approaches zero.

  20. Conditional Second Order Short-crested Water Waves Applied to Extreme Wave Episodes

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2005-01-01

    A derivation of the mean second order short-crested wave pattern and associated wave kinematics, conditional on a given magnitude of the wave crest, is presented. The analysis is based on the second order Sharma and Dean finite water wave theory. A comparison with a measured extreme wave profile......, the Draupner New Year Wave, shows a good agreement in the mean, indicating that this second order wave can be a good identifier of the shape and occurrence of extreme wave events. A discussion on its use as an initial condition for a fully non-linear three-dimensional surface wave analysis is given....

  1. Non-Gaussian statistics and extreme waves in a nonlinear optical cavity.

    Science.gov (United States)

    Montina, A; Bortolozzo, U; Residori, S; Arecchi, F T

    2009-10-23

    A unidirectional optical oscillator is built by using a liquid crystal light valve that couples a pump beam with the modes of a nearly spherical cavity. For sufficiently high pump intensity, the cavity field presents complex spatiotemporal dynamics, accompanied by the emission of extreme waves and large deviations from the Gaussian statistics. We identify a mechanism of spatial symmetry breaking, due to a hypercycle-type amplification through the nonlocal coupling of the cavity field.

  2. Parameter estimation for the Pearson type 3 distribution using order statistics

    Science.gov (United States)

    Rocky Durrans, S.

    1992-05-01

    The Pearson type 3 distribution and its relatives, the log Pearson type 3 and gamma family of distributions, are among the most widely applied in the field of hydrology. Parameter estimation for these distributions has been accomplished using the method of moments, the methods of mixed moments and generalized moments, and the methods of maximum likelihood and maximum entropy. This study evaluates yet another estimation approach, which is based on the use of the properties of an extreme-order statistic. Based on the hypothesis that the population is distributed as Pearson type 3, this estimation approach yields both parameter and 100-year quantile estimators that have lower biases and variances than those of the method of moments approach as recommended by the US Water Resources Council.
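
    The record does not give the extreme-order-statistic estimator itself, but the method-of-moments baseline it is compared against is easy to state with scipy's pearson3, which is parameterized so that loc and scale are the mean and standard deviation; the 100-year quantile is then a single ppf evaluation (the data below are synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
# stand-in annual peak flows from a Pearson type 3 population
flows = stats.pearson3.rvs(1.0, loc=100.0, scale=30.0, size=80,
                           random_state=rng)

# method of moments: match sample mean, standard deviation and skewness
g = stats.skew(flows)
q100 = stats.pearson3.ppf(1 - 1/100, g, loc=flows.mean(),
                          scale=flows.std(ddof=1))
print("estimated skew:", round(g, 2))
print("100-year quantile (MoM):", round(q100, 1))
```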

  3. Formulation of the third-order Grueneisen parameter at extreme compression

    Energy Technology Data Exchange (ETDEWEB)

    Shanker, J. [Department of Physics, Institute of Basic Sciences, Dr. B.R. Ambedkar University, Khandari Campus, Agra 282 002 (India); Sunil, K., E-mail: k.sunil.ibs@gmail.com [Department of Physics, Institute of Basic Sciences, Dr. B.R. Ambedkar University, Khandari Campus, Agra 282 002 (India); Sharma, B.S. [Department of Physics, Institute of Basic Sciences, Dr. B.R. Ambedkar University, Khandari Campus, Agra 282 002 (India)

    2012-06-15

    We present a direct method using the basic principles of calculus to derive the expression for the third-order Grueneisen parameter in terms of the pressure derivatives of bulk modulus at extreme compression. The derivation presented here does not depend on the assumptions regarding the values of free-volume parameter and its variation with pressure. The identities used in the present analysis are valid at extreme compression for all physically acceptable equations of state.

  4. Inferences on weather extremes and weather-related disasters: a review of statistical methods

    Directory of Open Access Journals (Sweden)

    H. Visser

    2012-02-01

    The study of weather extremes and their impacts, such as weather-related disasters, plays an important role in climate change research. Due to the great societal consequences of extremes – historically, now and in the future – the peer-reviewed literature on this theme has been growing enormously since the 1980s. Data sources have a wide origin, from century-long climate reconstructions from tree rings to relatively short (30 to 60 yr) databases with disaster statistics and human impacts.

    When scanning the peer-reviewed literature on weather extremes and their impacts, it is noticeable that many different methods are used to make inferences. However, discussions of these methods are rare. Such discussions are important, since a particular methodological choice might substantially influence the inferences made. A calculation of a return period of once in 500 yr based on a normal distribution will deviate from one based on a Gumbel distribution. And the particular choice between a linear or a flexible trend model might influence inferences as well.

    In this article, a concise overview of statistical methods applied in the field of weather extremes and weather-related disasters is given. Methods have been evaluated as to stationarity assumptions, the choice of specific probability density functions (PDFs) and the availability of uncertainty information. As for stationarity assumptions, the outcome was that good testing is essential. Inferences on extremes may be wrong if data are assumed stationary while they are not. The same holds for the block-stationarity assumption. As for PDF choices it was found that often more than one PDF shape fits to the same data. From a simulation study the conclusion can be drawn that both the generalized extreme value (GEV) distribution and the log-normal PDF fit very well to a variety of indicators. The application of the normal and Gumbel distributions is more limited. As for uncertainty, it is

  5. Inferences on weather extremes and weather-related disasters: a review of statistical methods

    Directory of Open Access Journals (Sweden)

    H. Visser

    2011-09-01

    The study of weather extremes and their impacts, such as weather-related disasters, plays an important role in climate-change research. Due to the great societal consequences of extremes – historically, now and in the future – the peer-reviewed literature on this theme has been growing enormously since the 1980s. Data sources have a wide origin, from century-long climate reconstructions from tree rings to short (30 to 60 yr) databases with disaster statistics and human impacts.

    In scanning the peer-reviewed literature on weather extremes and their impacts, we noticed that many different methods are used to make inferences. However, discussions of methods are rare. Such discussions are important, since a particular methodological choice might substantially influence the inferences made. A calculation of a return period of once in 500 yr based on a normal distribution will deviate from one based on a Gumbel distribution. And the particular choice between a linear or a flexible trend model might influence inferences as well.

    In this article we give a concise overview of statistical methods applied in the field of weather extremes and weather-related disasters. Methods have been evaluated as to stationarity assumptions, the choice of specific probability density functions (PDFs) and the availability of uncertainty information. As for stationarity we found that good testing is essential. Inferences on extremes may be wrong if data are assumed stationary while they are not. The same holds for the block-stationarity assumption. As for PDF choices we found that often more than one PDF shape fits to the same data. From a simulation study we conclude that both the generalized extreme value (GEV) distribution and the log-normal PDF fit very well to a variety of indicators. The application of the normal and Gumbel distributions is more limited. As for uncertainty it is advised to test conclusions on extremes for assumptions underlying
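
    The point both versions of this review make about PDF choice, that a 500-yr return level from a normal fit deviates from a Gumbel fit to the same data, is easy to check numerically. A minimal sketch with synthetic annual maxima (all values hypothetical):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Synthetic 60-yr annual-maximum series; real studies use observed indicators.
    x = stats.gumbel_r.rvs(loc=30.0, scale=5.0, size=60, random_state=rng)

    T = 500.0                        # return period in years
    p = 1.0 - 1.0 / T                # non-exceedance probability

    normal_level = stats.norm.ppf(p, loc=x.mean(), scale=x.std(ddof=1))

    loc, scale = stats.gumbel_r.fit(x)
    gumbel_level = stats.gumbel_r.ppf(p, loc=loc, scale=scale)

    print(f"500-yr level, normal fit: {normal_level:.1f}")
    print(f"500-yr level, Gumbel fit: {gumbel_level:.1f}")
    ```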

  6. Statistics of predictions with missing higher order corrections

    CERN Document Server

    Berthier, Laure

    2016-01-01

    Effective operators have been used extensively to understand small deviations from the Standard Model in the search for new physics. So far there has been no general method to fit for small parameters when higher order corrections in these parameters are present but unknown. We present a new technique that solves this problem, allowing for an exact p-value calculation under the assumption that higher order theoretical contributions can be treated as Gaussian distributed random variables. The method we propose is general and may be used in the analysis of any perturbative theoretical prediction, i.e. a truncated power series. We illustrate this new method by performing a fit of the Standard Model Effective Field Theory parameters, which include e.g. anomalous gauge and four-fermion couplings.

  7. Second order statistics of bilinear forms of robust scatter estimators

    KAUST Repository

    Kammoun, Abla

    2015-08-12

    This paper lies in the lineage of recent works studying the asymptotic behaviour of robust scatter estimators in the case where the number of observations and the dimension of the population covariance matrix grow to infinity at the same pace. In particular, we analyze the fluctuations of bilinear forms of the robust shrinkage estimator of the covariance matrix. We show that this result can be leveraged to improve the design of robust detection methods. As an example, we provide an improved generalized likelihood ratio based detector which combines robustness to impulsive observations with optimality across the shrinkage parameter, the optimality being considered for false alarm regulation.

  8. Convergence of Extreme Value Statistics in a Two-Layer Quasi-Geostrophic Atmospheric Model

    Directory of Open Access Journals (Sweden)

    Vera Melinda Gálfi

    2017-01-01

    We search for the signature of universal properties of extreme events, theoretically predicted for Axiom A flows, in a chaotic and high-dimensional dynamical system. We study the convergence of GEV (Generalized Extreme Value) and GP (Generalized Pareto) shape parameter estimates to the theoretical value, which is expressed in terms of the partial information dimensions of the attractor. We consider a two-layer quasi-geostrophic atmospheric model of the mid-latitudes, adopt two levels of forcing, and analyse the extremes of different types of physical observables (local energy, zonally averaged energy, and globally averaged energy). We find good agreement in the shape parameter estimates with the theory only in the case of more intense forcing, corresponding to a strong chaotic behaviour, for some observables (the local energy at every latitude). Due to the limited (though very large) data size and to the presence of serial correlations, it is difficult to obtain robust statistics of extremes in the case of the other observables. In the case of weak forcing, which leads to weaker chaotic conditions with regime behaviour, we find, unsurprisingly, worse agreement with the theory developed for Axiom A flows.

  9. [Quantitative Evaluation of Metal Artifacts on CT Images on the Basis of Statistics of Extremes].

    Science.gov (United States)

    Kitaguchi, Shigetoshi; Imai, Kuniharu; Ueda, Suguru; Hashimoto, Naomi; Hattori, Shouta; Saika, Takahiro; Ono, Yoshifumi

    2016-05-01

    It is well known that metal artifacts have a harmful effect on the image quality of computed tomography (CT) images; however, their physical properties are still not well understood. In this study, we investigated the relationship between metal artifacts and tube current using statistics of extremes. A commercially available CT dose index phantom, 160 mm in diameter, was prepared, and a brass rod 13 mm in diameter was placed at the centerline of the phantom. This phantom was used as a target object to evaluate metal artifacts and was scanned using an area detector CT scanner at various tube currents under a constant tube voltage of 120 kV. Sixty parallel line segments with a length of 100 pixels were placed to cross the metal artifacts on the CT images, and the largest difference between two adjacent CT values in each of the 60 CT value profiles along these line segments was employed as a feature variable for measuring metal artifacts; these feature variables were analyzed on the basis of extreme value theory. The CT value variation induced by metal artifacts was statistically characterized by the Gumbel distribution, one of the extreme value distributions; that is, metal artifacts have the same statistical characteristics as streak artifacts. Therefore, the Gumbel evaluation method makes it possible to analyze not only streak artifacts but also metal artifacts. Furthermore, the location parameter of the Gumbel distribution was shown to be inversely proportional to the square root of the tube current. This result suggests that metal artifacts have the same dose dependence as image noise.
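
    A minimal sketch of the feature extraction and Gumbel fit described above, using a synthetic stand-in for the CT value profiles (the real feature variables come from the scanned phantom):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic stand-in for 60 CT value profiles of 100 pixels crossing the
    # artifact region; real feature variables come from the scanned phantom.
    profiles = rng.normal(40.0, 25.0, size=(60, 100))

    # Feature variable: largest difference between adjacent CT values per profile.
    features = np.abs(np.diff(profiles, axis=1)).max(axis=1)

    # Fit the Gumbel (extreme value type I) distribution to the feature values.
    loc, scale = stats.gumbel_r.fit(features)
    print(f"Gumbel location: {loc:.1f} HU, scale: {scale:.1f} HU")
    ```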

  10. Statistics of interacting networks with extreme preferred degrees: Simulation results and theoretical approaches

    Science.gov (United States)

    Liu, Wenjia; Schmittmann, Beate; Zia, R. K. P.

    2012-02-01

    Network studies have played a central role in understanding many systems in nature - e.g., physical, biological, and social. So far, much of the focus has been on the statistics of networks in isolation. Yet, many networks in the world are coupled to each other. Recently, we considered this issue in the context of two interacting social networks. In particular, we studied networks with two different preferred degrees, modeling, say, introverts vs. extroverts, with a variety of ``rules for engagement.'' As a first step towards an analytically accessible theory, we restrict our attention to an ``extreme scenario'': the introverts prefer zero contacts while the extroverts like to befriend everyone in the society. In this ``maximally frustrated'' system, the degree distributions, as well as the statistics of cross-links (between the two groups), can depend sensitively on how a node (individual) creates/breaks its connections. The simulation results can be reasonably well understood in terms of an approximate theory.

  11. Renormalization-group theory for finite-size scaling in extreme statistics.

    Science.gov (United States)

    Györgyi, G; Moloney, N R; Ozogány, K; Rácz, Z; Droz, M

    2010-04-01

    We present a renormalization-group (RG) approach to explain universal features of extreme statistics applied here to independent identically distributed variables. The outlines of the theory have been described in a previous paper, the main result being that finite-size shape corrections to the limit distribution can be obtained from a linearization of the RG transformation near a fixed point, leading to the computation of stable perturbations as eigenfunctions. Here we show details of the RG theory which exhibit remarkable similarities to the RG known in statistical physics. Besides the fixed points explaining universality, and the least stable eigendirections accounting for convergence rates and shape corrections, the similarities include marginally stable perturbations which turn out to be generic for the Fisher-Tippett-Gumbel class. Distribution functions containing unstable perturbations are also considered. We find that, after a transitory divergence, they return to the universal fixed line at the same or at a different point depending on the type of perturbation.

  12. STATISTICAL STUDY OF STRONG AND EXTREME GEOMAGNETIC DISTURBANCES AND SOLAR CYCLE CHARACTERISTICS

    Energy Technology Data Exchange (ETDEWEB)

    Kilpua, E. K. J. [Department of Physics, University of Helsinki (Finland); Olspert, N.; Grigorievskiy, A.; Käpylä, M. J.; Tanskanen, E. I.; Pelt, J. [ReSoLVE Centre of Excellence, Department of Computer Science, P.O. Box 15400, FI-00076 Aalto University (Finland); Miyahara, H. [Musashino Art University, 1-736 Ogawa-cho, Kodaira-shi, Tokyo 187-8505 (Japan); Kataoka, R. [National Institute of Polar Research, 10-3 Midori-cho, Tachikawa, Tokyo 190-8518 (Japan); Liu, Y. D. [State Key Laboratory of Space Weather, National Space Science Center, Chinese Academy of Sciences, Beijing 100190 (China)

    2015-06-20

    We study the relation between strong and extreme geomagnetic storms and solar cycle characteristics. The analysis uses an extensive geomagnetic index AA data set spanning over 150 yr, complemented by the Kakioka magnetometer recordings. We apply Pearson correlation statistics and estimate the significance of the correlation with a bootstrapping technique. We show that the correlation between storm occurrence and the strength of the solar cycle decreases from a clear positive correlation with increasing storm magnitude toward a negligible relationship. Hence, the quieter Sun can also launch superstorms that may lead to significant societal and economic impact. Our results show that while weaker storms occur most frequently in the declining phase, the stronger storms tend to occur near solar maximum. Our analysis suggests that the most extreme solar eruptions do not have a direct connection with the solar large-scale dynamo-generated magnetic field, but are rather associated with smaller-scale dynamo action and the resulting turbulent magnetic fields. On the other hand, the fact that the phase distributions of sunspots and storms become increasingly in phase with increasing storm strength may indicate that the extreme storms are related to the toroidal component of the solar large-scale field.

  13. Statistical extremes and peak factors in wind-induced vibration of tall buildings

    Institute of Scientific and Technical Information of China (English)

    Ming-feng HUANG; Chun-man CHAN; Wen-juan LOU; Kenny Chung-Siu KWOK

    2012-01-01

    In the structural design of tall buildings, peak factors have been widely used to predict mean extreme responses of tall buildings under wind excitations. Vanmarcke's peak factor is directly related to an explicit measure of structural reliability against a Gaussian response process. We review the use of this factor for time-variant reliability design by comparing it to the conventional Davenport's peak factor. Based on the asymptotic theory of statistical extremes, a new closed-form peak factor, the so-called Gamma peak factor, can be obtained for a non-Gaussian resultant response characterized by a Rayleigh distribution process. Using the Gamma peak factor, a combined peak factor method was developed for predicting the expected maximum resultant responses of a building undergoing lateral-torsional vibration. The effects of the standard deviation ratio of two sway components and the inter-component correlation on the evaluation of peak resultant response were also investigated. Utilizing wind tunnel data derived from synchronous multi-pressure measurements, we carried out a wind-induced time history response analysis of the Commonwealth Advisory Aeronautical Research Council (CAARC) standard tall building to validate the applicability of the Gamma peak factor to the prediction of the peak resultant acceleration. Results from the building example indicated that the use of the Gamma peak factor enables accurate predictions to be made of the mean extreme resultant acceleration responses for dynamic serviceability performance design of modern tall buildings.
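
    The paper's Gamma peak factor is not reproduced here, but the conventional Davenport peak factor it is benchmarked against has a simple closed form; a sketch with hypothetical crossing rate and duration:

    ```python
    import numpy as np

    def davenport_peak_factor(nu: float, T: float) -> float:
        """Expected peak factor of a Gaussian process with mean up-crossing
        rate nu (Hz) observed over a duration T (s)."""
        c = np.sqrt(2.0 * np.log(nu * T))
        return c + 0.5772 / c

    # Hypothetical values: 0.2 Hz effective crossing rate over a 10-min window.
    g = davenport_peak_factor(nu=0.2, T=600.0)
    print(f"Davenport peak factor: {g:.2f}")  # expected peak = g * response std
    ```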

  14. Inter-comparison of statistical downscaling methods for projection of extreme precipitation in Europe

    Science.gov (United States)

    Sunyer, M. A.; Hundecha, Y.; Lawrence, D.; Madsen, H.; Willems, P.; Martinkova, M.; Vormoor, K.; Bürger, G.; Hanel, M.; Kriaučiūnienė, J.; Loukas, A.; Osuch, M.; Yücel, I.

    2015-04-01

    Information on extreme precipitation for future climate is needed to assess the changes in the frequency and intensity of flooding. The primary source of information in climate change impact studies is climate model projections. However, due to the coarse resolution and biases of these models, they cannot be directly used in hydrological models. Hence, statistical downscaling is necessary to address climate change impacts at the catchment scale. This study compares eight statistical downscaling methods (SDMs) often used in climate change impact studies. Four methods are based on change factors (CFs), three are bias correction (BC) methods, and one is a perfect prognosis method. The eight methods are used to downscale precipitation output from 15 regional climate models (RCMs) from the ENSEMBLES project for 11 catchments in Europe. The overall results point to an increase in extreme precipitation in most catchments in both winter and summer. For individual catchments, the downscaled time series tend to agree on the direction of the change but differ in the magnitude. Differences between the SDMs vary between the catchments and depend on the season analysed. Similarly, general conclusions cannot be drawn regarding the differences between CFs and BC methods. The performance of the BC methods during the control period also depends on the catchment, but in most cases they represent an improvement compared to RCM outputs. Analysis of the variance in the ensemble of RCMs and SDMs indicates that at least 30% and up to approximately half of the total variance is derived from the SDMs. This study illustrates the large variability in the expected changes in extreme precipitation and highlights the need for considering an ensemble of both SDMs and climate models. Recommendations are provided for the selection of the most suitable SDMs to include in the analysis.
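
    Of the eight SDMs compared, the change-factor approach is the simplest to illustrate. A minimal sketch of a quantile-wise multiplicative CF applied to observations, with synthetic gamma-distributed series standing in for observed and RCM precipitation:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Synthetic daily precipitation (mm): observations, RCM control, RCM scenario.
    obs      = rng.gamma(0.6, 8.0, size=3650)
    rcm_ctrl = rng.gamma(0.5, 9.0, size=3650)
    rcm_scen = rng.gamma(0.5, 11.0, size=3650)

    # Quantile-wise multiplicative change factors derived from the RCM...
    q = np.linspace(0.01, 0.99, 99)
    cf = np.quantile(rcm_scen, q) / np.quantile(rcm_ctrl, q)

    # ...applied to the observed quantiles to produce downscaled future quantiles.
    future_q = np.quantile(obs, q) * cf
    print(f"Change factor at the 99th percentile: {cf[-1]:.2f}")
    ```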

  15. Second-order complex linear differential equations with special functions or extremal functions as coefficients

    Directory of Open Access Journals (Sweden)

    Xiubi Wu

    2015-05-01

    The classical problem of finding conditions on the entire coefficients A(z) and B(z) guaranteeing that all nontrivial solutions of $f''+A(z)f'+B(z)f=0$ are of infinite order is discussed. Two distinct approaches are used. In the first approach the coefficient A(z) itself is a solution of a differential equation $w''+P(z)w=0$, where P(z) is a polynomial. This assumption yields stability on the behavior of A(z) via Hille's classical method on asymptotic integration. In this case A(z) is a special function of which the Airy integral is one example. The second approach involves extremal functions. It is assumed that either A(z) is extremal for Yang's inequality or B(z) is extremal for Denjoy's conjecture. A combination of these two approaches is also discussed.

  16. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    Science.gov (United States)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    A phenomenon is considered "complex" when the phenomenological laws that describe the global behavior of the system are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation of universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that all three dynamical systems' observables can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each one of these different systems present striking quantitative

  17. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    Directory of Open Access Journals (Sweden)

    Işık Mahmut

    2017-01-01

    In this study, we introduce and examine the concepts of asymptotic lacunary statistical equivalence of order α in probability and strong asymptotic lacunary equivalence of order α in probability. We give some relations connected to these concepts.

  18. 76 FR 38360 - Workshop-Monitoring Changes in Extreme Storm Statistics: State of Knowledge; Notice of Open...

    Science.gov (United States)

    2011-06-30

    From the Federal Register Online via the Government Publishing Office. DEPARTMENT OF COMMERCE, National Oceanic and Atmospheric Administration. Workshop--Monitoring Changes in Extreme Storm Statistics: State of Knowledge; Notice of Open Public Workshop. AGENCY: National Environmental Satellite, Data,...

  19. Modified Inverse First Order Reliability Method (I-FORM) for Predicting Extreme Sea States.

    Energy Technology Data Exchange (ETDEWEB)

    Eckert-Gallup, Aubrey Celia; Sallaberry, Cedric Jean-Marie; Dallman, Ann Renee; Neary, Vincent Sinclair

    2014-09-01

    Environmental contours describing extreme sea states are generated as the input for numerical or physical model simulations as a part of the standard current practice for designing marine structures to survive extreme sea states. Such environmental contours are characterized by combinations of significant wave height and energy period values calculated for a given recurrence interval using a set of data based on hindcast simulations or buoy observations over a sufficient period of record. The use of the inverse first-order reliability method (IFORM) is standard design practice for generating environmental contours. In this paper, the traditional application of the IFORM to generating environmental contours representing extreme sea states is described in detail and its merits and drawbacks are assessed. The application of additional methods for analyzing sea state data, including the use of principal component analysis (PCA) to create an uncorrelated representation of the data under consideration, is proposed. A reexamination of the components of the IFORM application to the problem at hand, including the use of new distribution fitting techniques, is shown to contribute to the development of more accurate and reasonable representations of extreme sea states for use in survivability analysis for marine structures. Keywords: Inverse FORM, Principal Component Analysis, Environmental Contours, Extreme Sea State Characterization, Wave Energy Converters
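
    A heavily simplified sketch of the traditional IFORM construction discussed above: a circle of radius equal to the reliability index in standard-normal space is mapped through assumed marginal and conditional distributions for significant wave height and energy period. The distributions and their parameters below are hypothetical placeholders, not the paper's fitted models:

    ```python
    import numpy as np
    from scipy import stats

    # 50-yr contour from hourly sea states (stationarity assumed).
    n_states = 50 * 365.25 * 24                    # sea states per 50 years
    beta = stats.norm.ppf(1.0 - 1.0 / n_states)    # IFORM reliability index

    theta = np.linspace(0.0, 2.0 * np.pi, 360)
    u1, u2 = beta * np.cos(theta), beta * np.sin(theta)

    # Illustrative marginal for Hs (lognormal) and conditional Te given Hs
    # (lognormal with an Hs-dependent median); real contours use fitted models.
    hs = stats.lognorm.ppf(stats.norm.cdf(u1), s=0.6, scale=2.5)
    te = stats.lognorm.ppf(stats.norm.cdf(u2), s=0.2,
                           scale=4.0 + 2.0 * np.sqrt(hs))
    print(f"Max Hs on the 50-yr contour: {hs.max():.1f} m")
    ```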

  20. Some Characterization Results based on Conditional Expectation of function of Dual Generalized Order Statistics

    Directory of Open Access Journals (Sweden)

    Md Izhar Khan

    2012-11-01

    Two families of probability distributions are characterized through the conditional expectations of dual generalized order statistics, conditioned on a non-adjacent dual generalized order statistic. Also, a result based on the unconditional expectation and a conditional expectation is used to characterize a family of distributions. Further, some deductions from these results are also discussed.

  1. Response spectrum method for extreme wave loading with higher order components of drag force

    Science.gov (United States)

    Tabeshpour, Mohammad Reza; Fatemi Dezfouli, Mani; Dastan Diznab, Mohammad Ali; Mohajernasab, Saied; Seif, Mohammad Saied

    2017-01-01

    Response spectra of fixed offshore structures impacted by extreme waves are investigated based on the higher order components of the nonlinear drag force. In this way, steel jacket platforms are simplified as a mass attached to a light cantilever cylinder and their corresponding deformation response spectra are estimated by utilizing a generalized single degree of freedom system. Based on the wave data recorded in the Persian Gulf region, extreme wave loading conditions corresponding to different return periods are exerted on the offshore structures. Accordingly, the effect of the higher order components of the drag force is considered and compared to the linearized state for different sea surface levels. When the fundamental period of the offshore structure is about one third of the main period of wave loading, the results indicate the linearized drag term is not capable of achieving a reliable deformation response spectrum.

  2. Orographic signature on multiscale statistics of extreme rainfall: A storm-scale study

    Science.gov (United States)

    Ebtehaj, Mohammad; Foufoula-Georgiou, Efi

    2010-12-01

    Rainfall intensity and spatiotemporal patterns often show a strong dependence on the underlying terrain. The main objective of this work is to study the statistical signature imprinted by orography on the spatial structure of rainfall and its temporal evolution at multiple scales, with the aim of developing a consistent theoretical basis for conditional downscaling of precipitation given the topographic information of the underlying terrain. The results of an extensive analysis of the high-resolution stage II Doppler radar data of the Rapidan storm, June 1995, over the Appalachian Mountains are reported in this study. The orographic signature on the elementary statistical structure of the precipitation fields is studied via a variable-intensity thresholding scheme. This signature is further explored at multiple scales via analysis of the dependence of precipitation fields on the underlying terrain both in Fourier and wavelet domains. The generalized normal distribution is found to be a suitable probability model to explain the variability of the rainfall wavelet coefficients and its dependence on the underlying elevations. These results provide a new perspective for more accurate statistical downscaling of orographic precipitation over complex terrain with emphasis on preservation of extremes.

  3. Intensity changes in future extreme precipitation: A statistical event-based approach.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    Short-lived precipitation extremes are often responsible for hazards in urban and rural environments, with economic and environmental consequences. Precipitation intensity is expected to increase by about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, observations often show a much stronger increase in sub-daily values. In particular, the behavior of hourly summer precipitation from radar observations as a function of dew point temperature (the Pi-Td relation) for the Netherlands suggests that for moderate to warm days the intensification of precipitation can be even higher than 21% per degree of warming, that is, 3 times higher than the expected CC relation. The rate of change depends on the initial precipitation intensity: low percentiles increase at a rate below CC, the medium percentiles at 2CC, and the moderate-high and high percentiles at 3CC. This non-linear statistical Pi-Td relation is suggested for use as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied to a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convection-permitting weather model Harmonie. The initial and boundary conditions are altered to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which, depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low and the very high percentiles, and the clouds are somewhat displaced, due to small wind and convection changes. The total spatial cloud coverage in the model remains, as also in the statistical

  4. Comparison of different statistical downscaling methods to estimate changes in hourly extreme precipitation using RCM projections from ENSEMBLES

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Gregersen, Ida Bülow; Rosbjerg, Dan;

    2015-01-01

    Changes in extreme precipitation are expected to be one of the most important impacts of climate change in cities. Urban floods are mainly caused by short duration extreme events. Hence, robust information on changes in extreme precipitation at high-temporal resolution is required for the design...... of climate change adaptation measures. However, the quantification of these changes is challenging and subject to numerous uncertainties. This study assesses the changes and uncertainties in extreme precipitation at hourly scale over Denmark. It explores three statistical downscaling approaches: a delta...

  5. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
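
    A minimal sketch of the ROS idea for a single detection limit, assuming a lognormal population; the paper's software handles multiple detection limits, which requires a more careful plotting-position computation:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical concentrations with one detection limit (DL = 1.0).
    vals = np.array([0.4, 0.7, 1.2, 1.5, 2.3, 3.1, 4.8, 7.5])
    dl = 1.0
    obs = np.sort(vals[vals >= dl])          # uncensored observations
    n, n_cens = len(vals), int(np.sum(vals < dl))

    # Weibull plotting positions: censored values occupy the lowest ranks.
    pp_obs = np.arange(n_cens + 1, n + 1) / (n + 1.0)
    pp_cens = np.arange(1, n_cens + 1) / (n + 1.0)

    # Regress log concentration on normal quantiles (lognormal assumption).
    slope, intercept, *_ = stats.linregress(stats.norm.ppf(pp_obs), np.log(obs))

    # Impute censored values from the fitted line, then summarize the full set.
    imputed = np.exp(intercept + slope * stats.norm.ppf(pp_cens))
    full = np.concatenate([imputed, obs])
    print(f"ROS mean: {full.mean():.2f}, ROS std: {full.std(ddof=1):.2f}")
    ```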

  6. Carbon coatings for extreme-ultraviolet high-order laser harmonics

    Energy Technology Data Exchange (ETDEWEB)

    Coraggia, S.; Frassetto, F. [CNR-Institute of Photonics and Nanotechnologies, Laboratory for UV and X-Ray Optical Research, via Trasea 7, 35131 Padova (Italy); Aznarez, J.A.; Larruquert, J.I.; Mendez, J.A. [GOLD-Instituto de Optica-Consejo Superior de Investigaciones Cientificas, Serrano 144, 28006 Madrid (Spain); Negro, M.; Stagira, S.; Vozzi, C. [Department of Physics-Politecnico of Milano and CNR-Institute of Photonics and Nanotechnologies, Piazza Leonardo Da Vinci 32, 20133 Milano (Italy); Poletto, L., E-mail: poletto@dei.unipd.i [CNR-Institute of Photonics and Nanotechnologies, Laboratory for UV and X-Ray Optical Research, via Trasea 7, 35131 Padova (Italy)

    2011-04-11

    The experimental study of the optical properties of thin carbon films to be used as grazing-incidence coatings for extreme-ultraviolet high-order harmonics is presented. The carbon samples were deposited on plane glass substrates by the electron beam evaporation technique. The optical constants (real and imaginary parts of the refraction index) have been calculated through reflectivity measurements. The results are in good agreement with what is reported in the literature, and confirm that carbon-coated optics operated at grazing incidence have a remarkable gain over conventional metallic coatings in the extreme ultraviolet. Since the harmonics co-propagate with the intense infrared laser beam that generates them, the damage threshold of carbon under exposure to ultrashort infrared laser pulses has also been measured.

  7. The Complex Ambiguity Function Based on Downsampled Fourth-Order Statistics

    Institute of Scientific and Technical Information of China (English)

    MA Yongfeng; ZHANG Weiqiang; TAO Ran

    2004-01-01

    Aiming at the problem of target detection in passive radar in correlated noise environments, we present a new estimation of the Complex ambiguity function based on Downsampled fourth-order statistics (CAF-DFOS) to improve the performance of the traditional Complex ambiguity function based on Second-order statistics (CAF-SOS) and of the existing Complex ambiguity function based on Fourth-order statistics (CAF-FOS). Both theory and simulations show that in Gaussian noise suppression its performance is better than that of the CAF-SOS algorithm, and in estimation variance and frequency resolution its performance is better than that of the CAF-FOS algorithm.

  8. Statistical Perturbation Theory of Cosmic Fields; 1, Basic Formalism and Second-order Theory

    CERN Document Server

    Matsubara, T

    2000-01-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields, which we call ``Statistical Perturbation Theory''. The formalism is an extensive generalization of the method used by Matsubara (1994), who derived a weakly nonlinear formula for the genus statistic in a 3D density field. After describing the general method, we apply the formalism to analyses of more general genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clearly described. These examples are applied to some cosmic fields, including the 3D density field, 3D velocity field, 2D projected density field, and 2D weak lensing field. The results are detailed for the second order theory of the formalism. The reason why the genus curves etc. in CDM-like models exhibit smaller deviations from Gaussian predictions when t...

  9. Statistical Analysis of Wave Climate Data Using Mixed Distributions and Extreme Wave Prediction

    Directory of Open Access Journals (Sweden)

    Wei Li

    2016-05-01

    The investigation of various aspects of the wave climate at a wave energy test site is essential for the development of reliable and efficient wave energy conversion technology. This paper presents studies of the wave climate based on nine years of wave observations from the 2005–2013 period measured with a wave measurement buoy at the Lysekil wave energy test site located off the west coast of Sweden. A detailed analysis of the wave statistics is carried out to reveal the characteristics of the wave climate at this specific test site. The long-term extreme waves are estimated by applying the Peak over Threshold (POT) method to the measured wave data. The significant wave height and the maximum wave height at the test site for different return periods are also compared. In this study, a new approach using a mixed-distribution model is proposed to describe the long-term behavior of the significant wave height, and it shows an impressive goodness of fit to wave data from the test site. The mixed-distribution model is also applied to measured wave data from four other sites, which provides an illustration of the general applicability of the proposed model. The methodologies used in this paper can be applied to general wave climate analysis of wave energy test sites to estimate extreme waves for the survivability assessment of wave energy converters and to characterize the long-term wave climate to forecast the wave energy resource of the test sites and the energy production of the wave energy converters.
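
    A minimal sketch of the POT estimate of a long-term extreme described above, with a synthetic wave-height series standing in for the buoy record; threshold selection diagnostics and declustering of dependent exceedances are omitted:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Synthetic significant wave heights (m) standing in for 9 yr of buoy data.
    hs = 1.5 * rng.weibull(1.6, size=9 * 365 * 48)

    u = np.quantile(hs, 0.98)            # threshold; real studies use diagnostics
    exc = hs[hs > u] - u
    lam = exc.size / 9.0                 # mean number of exceedances per year

    # Fit the Generalized Pareto distribution to the threshold exceedances.
    xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)

    T = 50.0                             # return period in years
    h_T = u + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), xi, loc=0.0, scale=sigma)
    print(f"{T:.0f}-yr significant wave height: {h_T:.1f} m")
    ```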

  10. Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-03-01

    In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify-and-forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present a simple and efficient closed-form expression for the higher order moments of the channel capacity of a dual-hop transmission system over Rayleigh fading channels. In order to analyze the behavior of the higher order capacity statistics and investigate the usefulness of the mathematical analysis, some selected numerical and simulation results are presented. Our results are found to be in perfect agreement. © 2012 IEEE.
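
    The paper derives the moments in closed form; as a hedged cross-check, the sketch below estimates the first few moments of the dual-hop AF capacity by Monte Carlo, using the common CSI-assisted end-to-end SNR expression and hypothetical average per-hop SNRs:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    gbar1, gbar2 = 10.0, 10.0           # hypothetical average per-hop SNRs (linear)

    # Rayleigh fading: per-hop instantaneous SNRs are exponentially distributed.
    g1 = rng.exponential(gbar1, n)
    g2 = rng.exponential(gbar2, n)

    # End-to-end SNR of a CSI-assisted dual-hop AF relay.
    geq = g1 * g2 / (g1 + g2 + 1.0)
    C = np.log2(1.0 + geq)              # instantaneous capacity (bit/s/Hz)

    for k in range(1, 5):
        print(f"E[C^{k}] ~= {np.mean(C**k):.4f}")
    ```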

  11. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren

    2009-01-01

    In urban drainage modelling long term extreme statistics has become an important basis for decision-making e.g. in connection with renovation projects. Therefore it is of great importance to minimize the uncertainties concerning long term prediction of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme events statistics of max water levels in manholes and CSO...... gauges are located at a distance of max 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system which was launched in 1976. The paper describes to what extent the extreme events statistics based on these 9 series diverge from each other and how this diversity...

  12. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Rasmussen, Michael R.; Thorndahl, Søren

    2008-01-01

    In urban drainage modeling long term extreme statistics has become an important basis for decision-making e.g. in connection with renovation projects. Therefore it is of great importance to minimize the uncertainties concerning long term prediction of maximum water levels and combined sewer...... overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme events statistics of max water levels in manholes and CSO...... gauges are located at a distance of max 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system which was launched in 1976. The paper describes to what extent the extreme events statistics based on these 9 series diverge from each other and how this diversity...

  13. Extreme precipitation and temperature responses to circulation patterns in current climate: statistical approaches

    NARCIS (Netherlands)

    Photiadou, C.

    2015-01-01

    Climate change is likely to influence the frequency of extremes - temperature, precipitation and hydrological extremes - which implies increasing risks of flood and drought events in Europe. In current climate, European countries were often not sufficiently prepared to deal with the great so

  14. Describing supercontinuum noise and rogue wave statistics using higher-order moments

    DEFF Research Database (Denmark)

    Sørensen, Simon Toft; Bang, Ole; Wetzel, Benjamin

    2012-01-01

    We show that the noise properties of fiber supercontinuum generation and the appearance of long-tailed “rogue wave” statistics can be accurately quantified using statistical higher-order central moments. Statistical measures of skew and kurtosis, as well as the coefficient of variation provide...... improved insight into the nature of spectral fluctuations across the supercontinuum and allow regions of long-tailed statistics to be clearly identified. These moments – that depend only on analyzing intensity fluctuations – provide a complementary tool to phase-dependent coherence measures to interpret...... supercontinuum noise....
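
    A minimal sketch of the moment-based measures mentioned above, applied to synthetic long-tailed shot-to-shot intensities (a lognormal stand-in for supercontinuum spectral fluctuations):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    # Synthetic shot-to-shot spectral intensities at one wavelength; a lognormal
    # stand-in mimics the long-tailed fluctuations of supercontinuum spectra.
    I = rng.lognormal(mean=0.0, sigma=0.9, size=5000)

    cv = I.std(ddof=1) / I.mean()            # coefficient of variation
    skew = stats.skew(I)                     # third standardized central moment
    kurt = stats.kurtosis(I, fisher=True)    # excess kurtosis

    print(f"CV = {cv:.2f}, skew = {skew:.2f}, excess kurtosis = {kurt:.2f}")
    ```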

  15. The Statistical Distribution of Turbulence Driven Velocity Extremes in the Atmospheric Boundary Layer - Cartwright/Longuet-Higgins Revised

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose

    2007-01-01

    The statistical distribution of extreme wind excursions above a mean level, for a specified recurrence period, is of crucial importance in relation to design of wind sensitive structures. This is particularly true for wind turbine structures. Based on an assumption of a Gaussian "mother" distribution....

  16. Extreme value statistics for two-dimensional convective penetration in a pre-main sequence star

    Science.gov (United States)

    Pratt, J.; Baraffe, I.; Goffrey, T.; Constantino, T.; Viallet, M.; Popov, M. V.; Walder, R.; Folini, D.

    2017-08-01

    Context. In the interior of stars, a convectively unstable zone typically borders a zone that is stable to convection. Convective motions can penetrate the boundary between these zones, creating a layer characterized by intermittent convective mixing, and gradual erosion of the density and temperature stratification. Aims: We examine a penetration layer formed between a central radiative zone and a large convection zone in the deep interior of a young low-mass star. Using the Multidimensional Stellar Implicit Code (MUSIC) to simulate two-dimensional compressible stellar convection in a spherical geometry over long times, we produce statistics that characterize the extent and impact of convective penetration in this layer. Methods: We apply extreme value theory to the maximal extent of convective penetration at any time. We compare statistical results from simulations which treat non-local convection, throughout a large portion of the stellar radius, with simulations designed to treat local convection in a small region surrounding the penetration layer. For each of these situations, we compare simulations of different resolution, which have different velocity magnitudes. We also compare statistical results between simulations that radiate energy at a constant rate to those that allow energy to radiate from the stellar surface according to the local surface temperature. Results: Based on the frequency and depth of penetrating convective structures, we observe two distinct layers that form between the convection zone and the stable radiative zone. We show that the probability density function of the maximal depth of convective penetration at any time corresponds closely in space with the radial position where internal waves are excited. We find that the maximal penetration depth can be modeled by a Weibull distribution with a small shape parameter. Using these results, and building on established scalings for diffusion enhanced by large-scale convective motions, we
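
    A sketch of the final fitting step described above, with synthetic penetration depths standing in for the simulation output; the Weibull parameter values here are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Synthetic maximal penetration depths (in pressure scale heights) standing
    # in for the time series extracted from the MUSIC simulations.
    depth = stats.weibull_min.rvs(0.6, loc=0.0, scale=0.15,
                                  size=2000, random_state=rng)

    # Fit a Weibull distribution with the location fixed at zero; a shape
    # parameter below one concentrates mass near zero with rare deep events.
    shape, loc, scale = stats.weibull_min.fit(depth, floc=0.0)
    print(f"Weibull shape: {shape:.2f}, scale: {scale:.3f}")
    ```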

  17. Evaluation of stochastic weather generators for capturing the statistics of extreme precipitation events in the Catskill Mountain watersheds, New York State

    Science.gov (United States)

    Acharya, N.; Frei, A.; Owens, E. M.; Chen, J.

    2015-12-01

    Watersheds located in the Catskill Mountains area, part of the eastern plateau climate region of New York, contribute about 90% of New York City's municipal water supply, serving 9 million New Yorkers with about 1.2 billion gallons of clean drinking water each day. The New York City Department of Environmental Protection has an ongoing series of studies to assess the potential impacts of climate change on the availability of high quality water in this water supply system. Recent studies identify increasing trends in total precipitation and in the frequency of extreme precipitation events in this region. The objectives of the present study are to analyze the probabilistic structure of extreme precipitation based on historical observations, and to evaluate the ability of stochastic weather generators (WGs), statistical models that produce synthetic weather time series based on observed statistical properties at a particular location, to simulate the statistical properties of extreme precipitation events over this region. The generalized extreme value (GEV) distribution has been applied to the annual block maxima of precipitation for 60 years (1950 to 2009) of observed data in order to estimate the events with return periods of 50, 75, and 100 years. These results were then used to evaluate a total of 13 WGs: 12 parametric WGs comprising all combinations of three orders of Markov chain (MC) models (1st, 2nd and 3rd) and four probability distributions (exponential, gamma, skewed normal and mixed exponential), and one semi-parametric WG based on k-nearest neighbor bootstrapping. Preliminary results suggest that the three-parameter (skewed normal and mixed exponential distribution) and semi-parametric (k-nearest neighbor bootstrapping) WGs are more consistent with observations. It is also found that first order MC models perform as well as second or third order MC models.
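
    A minimal sketch of the simplest configuration evaluated above, a first-order two-state Markov chain for precipitation occurrence with gamma-distributed wet-day amounts; the transition probabilities and gamma parameters are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    p01, p11 = 0.3, 0.6            # hypothetical P(wet | dry) and P(wet | wet)
    shape, scale = 0.7, 12.0       # hypothetical gamma parameters for amounts (mm)

    days, wet = 3650, False
    precip = np.zeros(days)
    for t in range(days):
        # First-order Markov chain: today's occurrence depends only on yesterday.
        wet = rng.random() < (p11 if wet else p01)
        if wet:
            precip[t] = rng.gamma(shape, scale)

    # Annual block maxima, the quantity fed to the GEV analysis in the abstract.
    annual_max = precip.reshape(10, 365).max(axis=1)
    print(f"Mean annual maximum: {annual_max.mean():.1f} mm")
    ```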

  18. Blind equalization of underwater acoustic channels using implicit higher-order statistics

    NARCIS (Netherlands)

    Blom, Koen C.H.; Dol, Henry S.; Kokkeler, André B.J.; Smit, Gerard J.M.

    2016-01-01

    In order to reduce the length of transmission time slots and the energy consumption of underwater modems, this work focuses on equalization without the need for training sequences. This type of equalization is known as blind equalization. A blind equalizer cascade based on higher-order statistics is presented.

  19. Computing the Moments of Order Statistics from Truncated Pareto Distributions Based on the Conditional Expectation

    Directory of Open Access Journals (Sweden)

    Gökhan Gökdere

    2014-05-01

    In this paper, closed form expressions for the moments of the truncated Pareto order statistics are obtained by using the conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.

  20. Higher order antibunching and sub-Poissonian photon statistics in five wave mixing process

    CERN Document Server

    Verma, Amit

    2009-01-01

    We have investigated the possibility of observing higher order antibunching (HOA) and higher order sub-Poissonian photon statistics (HOSPS) in the five wave mixing and third harmonic generation processes. It is shown that both processes satisfy the criteria of HOA and HOSPS. Further, some observations on the nature of the interaction which produces HOA and HOSPS are reported.

  1. A new ordering principle for the classical statistical analysis of Poisson processes with background

    CERN Document Server

    Giunti, C

    1999-01-01

    Inspired by the recent proposal by Feldman and Cousins of a ``unified approach to the classical statistical analysis of small signals'' based on a choice of ordering in Neyman's construction of classical confidence intervals, I propose a new ordering principle for the classical statistical analysis of Poisson processes with background which minimizes the effect on the resulting confidence intervals of the observation of less background events than expected. The new ordering principle is applied to the calculation of the confidence region implied by the recent null result of the KARMEN neutrino oscillation experiment.

  2. New ordering principle for the classical statistical analysis of Poisson processes with background

    Science.gov (United States)

    Giunti, C.

    1999-03-01

    Inspired by the recent proposal by Feldman and Cousins of a ``unified approach to the classical statistical analysis of small signals'' based on a choice of ordering in Neyman's construction of classical confidence intervals, I propose a new ordering principle for the classical statistical analysis of Poisson processes with a background which minimizes the effect on the resulting confidence intervals of the observation of fewer background events than expected. The new ordering principle is applied to the calculation of the confidence region implied by the recent null result of the KARMEN neutrino oscillation experiment.

  3. Statistical modeling and trend detection of extreme sea level records in the Pearl River Estuary

    Science.gov (United States)

    Wang, Weiwen; Zhou, Wen

    2017-03-01

    Sea level rise has become an important issue in global climate change studies. This study investigates trends in sea level records, particularly extreme records, in the Pearl River Estuary, using measurements from two tide gauge stations in Macau and Hong Kong. Extremes in the original sea level records (daily higher high water heights) and in tidal residuals with and without the 18.6-year nodal modulation are investigated separately. Thresholds for defining extreme sea levels are calibrated based on extreme value theory. Extreme events are then modeled by peaks-over-threshold models. The model applied to extremes in the original sea level records does not include modeling of their durations, while a geometric distribution is added to model the duration of extremes in tidal residuals. Realistic modeling results are recommended in all stationary models. Parametric trends of extreme sea level records are then introduced into nonstationary models through a generalized linear model framework. The results show that in recent decades, since the 1960s, no significant trends can be found in any type of extreme at any station, which may be related to a reduction in the influence of tropical cyclones in the region. For the longer-term record since the 1920s at Macau, a regime shift of tidal amplitudes around the 1970s may partially explain the diverging trends of extremes in the original sea level records and tidal residuals.

  4. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images.

    Science.gov (United States)

    Hu, Qin; Victor, Jonathan D

    2016-09-01

    Natural image statistics play a crucial role in shaping biological visual systems, in understanding their function and design principles, and in designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study - largely because they are so numerous and varied. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a 1-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank.

  5. Fabrication of ordered honeycomb amphiphobic films with extremely low fluorine content.

    Science.gov (United States)

    Gao, Fei; Wang, Wei; Li, Xinxin; Li, Lei; Lin, Jiaping; Lin, Shaoliang

    2016-04-15

    A series of poly(methyl methacrylate)-block-poly(perfluoroalkyl ethyl acrylate) (PMMA-b-PFAEA) copolymers with various fluorine contents were employed to fabricate ordered honeycomb films via the breath figure strategy. The influences of temperature, concentration, relative humidity and fluorine content on the morphology of the porous films were investigated. Wetting behavior, including the hydrophobic properties and wetting state of the films, was studied. High surface roughness from the porous structure and low surface free energy from the increasing PFAEA fraction led to enhanced hydrophobicity. Additionally, the fabrication of porous films from mixtures of PMMA and PMMA-b-PFAEA was investigated. An ordered porous film with excellent hydrophobicity and oleophobicity was obtained with only 7 wt% of PMMA-b-PFAEA through the simultaneous action of the breath figure mechanism and phase separation. This work furthers our understanding of the breath figure mechanism and contributes to the fabrication of porous films from fluorinated copolymers. It also opens a new route to preparing films with excellent hydrophobicity and oleophobicity at extremely low fluorine content.

  6. Estimating statistics of European wet and dry spells and associated precipitation extremes - interannual variability and trends

    Science.gov (United States)

    Zolina, O.; Simmer, C.; Belyaev, K.; Gulev, S.; Koltermann, K. P.

    2013-12-01

    Probability distributions of the durations of wet and dry spells were modeled by applying a truncated geometric distribution. This has also been extended to a fractional truncated geometric distribution, which allows discrimination between the roles of a changing number of wet days and of a regrouping of wet and dry days in forming the synoptic structure of precipitation. Analyses were performed using two collections of daily rain gauge data, namely ECA (about 1000 stations) and the regional German DWD network (more than 6000 stations), for the period from 1950 to 2009. Wet spells exhibit a statistically significant lengthening over northern Europe and central European Russia, which is especially pronounced in winter, when the mean duration of wet periods increased by 15%-20%. In summer, wet spells become shorter over Scandinavia and northern Russia. The duration of dry spells decreases over Scandinavia and southern Europe in both winter and summer. Climate tendencies in extreme wet and dry spell durations may not necessarily follow those in the mean characteristics. The changing numbers of wet days cannot explain the long-term variability in the duration of wet and dry periods; the observed changes are mainly due to the regrouping of wet and dry days. The tendencies in the duration of wet and dry spells have been analyzed for a number of European areas. Over the Netherlands, both wet and dry periods lengthen during the cold and the warm season. A simultaneous shortening of wet and dry periods is found in southern Scandinavia in summer. Over France and central southern Europe during both winter and summer, and over the Scandinavian Atlantic coast in summer, opposite tendencies in the duration of wet and dry spells were identified. Growing durations of wet spells are associated with more intense precipitation events, while precipitation during shorter wet spells becomes weaker. Both analyses of relatively coarse resolution ECA data and high resolution DWD station network
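
    A minimal sketch of fitting a truncated geometric distribution to wet-spell durations by maximum likelihood; the durations and the season-length truncation point below are hypothetical:

    ```python
    import numpy as np

    def trunc_geom_pmf(k, p, kmax):
        """Geometric pmf on 1..kmax, renormalized after truncation."""
        return p * (1.0 - p) ** (k - 1) / (1.0 - (1.0 - p) ** kmax)

    # Hypothetical wet-spell durations (days) for one station and season.
    durations = np.array([1, 1, 2, 1, 3, 2, 1, 4, 2, 1, 6, 2, 3, 1, 2])
    kmax = 90                      # season length caps observable durations

    # One-parameter maximum likelihood by grid search over p.
    grid = np.linspace(0.01, 0.99, 981)
    loglik = [np.log(trunc_geom_pmf(durations, p, kmax)).sum() for p in grid]
    p_hat = grid[int(np.argmax(loglik))]
    print(f"Estimated p: {p_hat:.2f}; mean duration: {durations.mean():.2f} d")
    ```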

  7. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    Science.gov (United States)

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    The statistics of a visual stimulus are fundamental parameters that provide the reference for studying visual coding rules. In this study, multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore how neural responses change with stimulus statistics. Changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish changes in high-order statistics, such as skewness and kurtosis, based only on the neuronal firing rate. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, obtained from either neuronal ON spikes or OFF spikes, could exhibit significant changes when the high-order stimulus statistics were changed. These results indicate that in the retina, these neuronal response properties may be reliable and powerful in carrying some complex and subtle visual information.
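
    To make the distinction between low- and high-order stimulus statistics concrete, this sketch (illustrative only; synthetic luminance values, not the paper's stimuli) constructs two ensembles with matched mean and contrast that differ only in skewness and kurtosis:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
n = 100_000

# two luminance ensembles with (approximately) equal mean and contrast,
# differing only in high-order statistics (skewness and kurtosis)
gaussian_stim = rng.normal(0.5, 0.1, n)
skewed_stim = 0.5 + 0.1 * (rng.exponential(1.0, n) - 1.0)

for name, s in [("gaussian", gaussian_stim), ("skewed", skewed_stim)]:
    print(f"{name:8s} mean={s.mean():.3f} sd={s.std():.3f} "
          f"skew={skew(s):+.2f} kurtosis={kurtosis(s):+.2f}")
```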

  8. Leading Order Response of Statistical Averages of a Dynamical System to Small Stochastic Perturbations

    Science.gov (United States)

    Abramov, Rafail V.

    2017-03-01

    The classical fluctuation-dissipation theorem predicts the average response of a dynamical system to an external deterministic perturbation via time-lagged statistical correlation functions of the corresponding unperturbed system. In this work we develop a fluctuation-response theory and test a computational framework for the leading order response of statistical averages of a deterministic or stochastic dynamical system to an external stochastic perturbation. In the case of a stochastic unperturbed dynamical system, we compute the leading order fluctuation-response formulas for two different cases: when the existing stochastic term is perturbed, and when a new, statistically independent, stochastic perturbation is introduced. We numerically investigate the effectiveness of the new response formulas for an appropriately rescaled Lorenz 96 system, in both the deterministic and stochastic unperturbed dynamical regimes.
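
    For reference, the Lorenz 96 system used as the test model above can be sketched as follows. This is a minimal unperturbed integration; the 40-site, F = 8 configuration is the common default and an assumption here, as are the step size and run length.

```python
import numpy as np

def lorenz96_rhs(x, forcing=8.0):
    """Tendency dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F of the Lorenz 96 model."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def step_rk4(x, dt):
    k1 = lorenz96_rhs(x)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2)
    k4 = lorenz96_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# long unperturbed run to accumulate the statistical averages whose
# response to a small stochastic perturbation would then be studied
x = np.random.default_rng(1).standard_normal(40)  # 40 sites, a common choice
dt, n_steps = 0.05, 20_000
mean_state = np.zeros_like(x)
for _ in range(n_steps):
    x = step_rk4(x, dt)
    mean_state += x / n_steps
print("time-mean of first 5 sites:", mean_state[:5].round(3))
```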

  9. Second order Statistical Texture Features from a New CSLBPGLCM for Ultrasound Kidney Images Retrieval

    Directory of Open Access Journals (Sweden)

    Chelladurai CALLINS CHRISTIYANA

    2013-12-01

    Full Text Available This work proposes a new method called Center Symmetric Local Binary Pattern Grey Level Co-occurrence Matrix (CSLBPGLCM) for extracting second order statistical texture features from ultrasound kidney images. These features are then fed into an ultrasound kidney image retrieval system for medical applications. The new GLCM combines the benefits of CSLBP and the conventional GLCM. Its main intention is to reduce the number of grey levels in an image, not by simply accumulating grey levels but by incorporating another statistical texture feature. The proposed approach is carefully evaluated in an ultrasound kidney image retrieval system and compared with the conventional GLCM. It is experimentally shown that the proposed method increases the retrieval efficiency and accuracy and reduces the time complexity of the retrieval system by means of second order statistical texture features.
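
    For readers unfamiliar with the underlying construction, a grey level co-occurrence matrix simply counts how often pairs of grey levels co-occur at a fixed displacement, and second order features such as contrast are moments of the normalized matrix. Below is a minimal pure-NumPy sketch; it is illustrative only and is not the paper's CSLBPGLCM, which would first replace grey levels with CSLBP codes.

```python
import numpy as np

def glcm(image, levels, dx=1, dy=0):
    """Grey level co-occurrence matrix for a single displacement (dx, dy >= 0)."""
    g = np.zeros((levels, levels), dtype=np.int64)
    rows, cols = image.shape
    for r in range(rows - dy):
        for c in range(cols - dx):
            g[image[r, c], image[r + dy, c + dx]] += 1
    return g

# toy 4-level image; the paper's CSLBPGLCM would instead use CSLBP codes
# as the (much smaller) set of "grey levels" entering the matrix
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, levels=4).astype(float)
p /= p.sum()                          # normalized co-occurrence probabilities
i, j = np.indices(p.shape)
contrast = ((i - j) ** 2 * p).sum()   # a standard second order texture feature
print(f"contrast = {contrast:.3f}")
```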

  10. Speckle reduction in ultrasound medical images using adaptive filter based on second order statistics.

    Science.gov (United States)

    Thakur, A; Anand, R S

    2007-01-01

    This article discusses an adaptive filtering technique for reducing speckle using second order statistics of the speckle pattern in ultrasound medical images. Several region-based adaptive filter techniques have been developed for speckle noise suppression, but there are no specific criteria for selecting the region-growing size in the post-processing stage of the filter. A size appropriate for one local region may not be appropriate for others, so selecting the correct region size involves a trade-off between speckle reduction and edge preservation. Generally, a large region size is used to smooth speckle and a small one to preserve the edges in an image. In this paper, the smoothing procedure combines first order statistics of speckle for the homogeneity test with second order statistics for the selection of filters and the desired region growth. A grey level co-occurrence matrix (GLCM) is calculated for every region during region contraction and region growing to obtain the second order statistics. These GLCM features then determine the appropriate filter for region smoothing. The performance of this approach is compared with the aggressive region-growing filter (ARGF) using edge preservation and speckle reduction tests. The processed image results show that the proposed method effectively reduces speckle noise and preserves edge details.

  11. 75 FR 28825 - Order Granting Temporary Conditional Exemption for Nationally Recognized Statistical Rating...

    Science.gov (United States)

    2010-05-24

    ... COMMISSION Order Granting Temporary Conditional Exemption for Nationally Recognized Statistical Rating... hired NRSRO to resist such pressure by increasing the likelihood that any steps taken to inappropriately... 63834. \\24\\ 17 CFR 240.17g-5(a)(3). IV. Description of the Conditional Temporary Exemption...

  12. Kriging with cumulative distribution function of order statistics for delineation of heavy-metal contaminated soils

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.W.; Lee, D.Y.; Hsiao, C.K. [National Taiwan Univ., Taipei (Taiwan, Province of China)]

    1998-10-01

    Accurate delineation of contaminated soils is essential for risk assessment and remediation. The probability of pollutant concentrations lower than a cutoff value is more important than the best estimate of pollutant concentrations for unsampled locations in delineating contaminated soils. In this study, a new method, kriging with the cumulative distribution function (CDF) of order statistics (CDF kriging), is introduced and compared with indicator kriging. It is used to predict the probability that extractable concentrations of Zn will be less than a cut-off value for soils to be declared hazardous. The 0.1 M HCl-extractable Zn concentrations of topsoil of a paddy field having an area of about 2000 ha located in Taiwan are used. A comparison of the CDF of order statistics and indicator function transformation shows that the variance and the coefficient of variation (CV) of the CDF of order statistics transformed data are smaller than those of the indicator function transformed data. This suggests that the CDF of order statistics transformation possesses less variability than does the indicator function transformation. In addition, based on cross-validation, CDF kriging is found to reduce the mean squared errors of estimations by about 30% and to reduce the mean kriging variances by about 26% compared with indicator kriging.

  13. Hypothesis setting and order statistic for robust genomic meta-analysis.

    Science.gov (United States)

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (the rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and identify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculations and simulation show better performance of rOP compared with the classical Fisher's method, Stouffer's method, the minimum p-value method and the maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found to be connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples covering major depressive disorder, brain cancer and diabetes. The results demonstrate that rOP is a more generalizable, robust and sensitive statistical framework for detecting disease-related markers.
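
    A minimal sketch of the rOP statistic itself (illustrative; the p-values are synthetic, and the Beta null distribution follows from the classical theory of uniform order statistics rather than from this paper's procedure for estimating r):

```python
import numpy as np
from scipy.stats import beta

def rop(pvals, r):
    """rth ordered p-value (rOP) per gene across K studies (r is 1-based)."""
    return np.sort(pvals, axis=1)[:, r - 1]

# hypothetical p-values for 5 genes across K = 6 studies
rng = np.random.default_rng(2)
p = rng.uniform(size=(5, 6))
p[0] = [0.001, 0.01, 0.02, 0.03, 0.6, 0.7]  # DE in the majority of studies

# under the null, the rth order statistic of K uniform p-values follows
# Beta(r, K - r + 1), giving a closed-form p-value for the rOP statistic
r, K = 4, 6
p_rop = beta.cdf(rop(p, r), r, K - r + 1)
print(np.round(p_rop, 4))
```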

  14. Review on Order Statistics and Record Values from F^{α} Distributions

    Directory of Open Access Journals (Sweden)

    Mohammad Shakil

    2012-01-01

    Full Text Available Abstract: Both order statistics and records arise naturally in many fields of study, such as climatology, sports, science, engineering, medicine, traffic, and industry, among others. Studies of their properties and applications play important roles in many areas of statistical research, for example, statistical inference and nonparametric statistics. In recent years, the F^{α} distributions have been widely studied in statistics because of their wide applicability in the modeling and analysis of lifetime data. An absolutely continuous positive random variable X is said to have an F^{α} or exponentiated distribution if its cumulative distribution function (cdf) is given by G(x) = F^{α}(x) = [F(x)]^{α}, α > 0, x > 0, which is the αth power of the baseline distribution function F(x). Many researchers and authors have developed various classes of F^{α} distributions. It appears from the literature that not much attention has been paid to the analysis of record values and order statistics from these members of the F^{α} family of distributions, which therefore need further study.
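
    Because G(x) = [F(x)]^{α} is explicit whenever F is, inverse-transform sampling from an F^{α} distribution is immediate: draw U uniform and set X = F^{-1}(U^{1/α}). A minimal sketch with an exponential baseline (an illustrative choice, not one singled out by the review):

```python
import numpy as np

# inverse-transform sampling from an F^alpha (exponentiated) distribution:
# G(x) = F(x)**alpha  =>  X = F_inv(U**(1/alpha)), U ~ Uniform(0, 1)
rng = np.random.default_rng(3)
alpha, lam, n = 2.5, 1.0, 100_000

u = rng.uniform(size=n)
# baseline F: exponential with rate lam, so F_inv(q) = -log(1 - q) / lam
x = -np.log(1.0 - u ** (1.0 / alpha)) / lam

# check the sample against the closed-form cdf G(x) = (1 - exp(-lam * x))**alpha
x0 = 1.0
print("empirical P(X <= 1):", (x <= x0).mean())
print("analytic  P(X <= 1):", (1.0 - np.exp(-lam * x0)) ** alpha)
```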

  16. Long-term statistics of extreme tsunami height at Crescent City

    Science.gov (United States)

    Dong, Sheng; Zhai, Jinjin; Tao, Shanshan

    2017-06-01

    Historically, Crescent City is one of the communities along the west coast of the United States most vulnerable to tsunamis, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and loss of life. Determining return values of tsunami height from relatively short-term observational data is therefore of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probability distributions, namely the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the better-fitting distribution is selected. Assuming that the occurrence frequency of tsunamis in each year follows the Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then point and interval estimates of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits the tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
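
    As an illustration of the fitting step, the sketch below fits a generalized extreme value distribution to annual maxima by maximum likelihood and reads off return levels. The data are synthetic stand-ins for the 1938-2015 record, and scipy's genextreme parameterization (shape c = -ξ relative to the usual GEV convention) is assumed.

```python
import numpy as np
from scipy.stats import genextreme

# synthetic annual-maximum tsunami heights (m) standing in for 1938-2015 data
annual_max = genextreme.rvs(c=-0.1, loc=0.8, scale=0.4, size=78,
                            random_state=np.random.default_rng(4))

# maximum likelihood fit of the generalized extreme value distribution
c, loc, scale = genextreme.fit(annual_max)

# N-year return level: the height exceeded on average once every N years
for n_years in (10, 50, 100):
    level = genextreme.ppf(1.0 - 1.0 / n_years, c, loc=loc, scale=scale)
    print(f"{n_years:4d}-yr return height ~ {level:.2f} m")
```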

  17. Statistical analysis and ANN modeling for predicting hydrological extremes under climate change scenarios: the example of a small Mediterranean agro-watershed.

    Science.gov (United States)

    Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P

    2015-05-01

    The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates, accurately and efficiently, river flow on an hourly basis. This model is based on a methodology that attempts to resolve a very difficult problem: the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided into two subsets, one for the dry and one for the wet periods of the hydrological year. In this way, two ANNs were created, trained, tested and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component, a statistical downscaling tool was used to create meteorological data according to the higher- and lower-emission climate change scenarios A2 and B1. These data are used as input to the ANN for forecasting river flow for the next two decades. The final component is the application of a meteorological index to the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events.

  18. Handbook of tables for order statistics from lognormal distributions with applications

    CERN Document Server

    Balakrishnan, N

    1999-01-01

    Lognormal distributions are one of the most commonly studied models in the statistical literature while being most frequently used in the applied literature. The lognormal distributions have been used in problems arising from such diverse fields as hydrology, biology, communication engineering, environmental science, reliability, agriculture, medical science, mechanical engineering, material science, and pharmacology. Though the lognormal distributions have been around from the beginning of this century (see Chapter 1), much of the work concerning inferential methods for the parameters of lognormal distributions has been done in the recent past. Most of these methods of inference, particularly those based on censored samples, involve extensive use of numerical methods to solve some nonlinear equations. Order statistics and their moments have been discussed quite extensively in the literature for many distributions. It is very well known that the moments of order statistics can be derived explicitly only...

  19. Statistical distribution of surface elevation for the fourth order nonlinear random sea waves

    Institute of Scientific and Technical Information of China (English)

    管长龙; 孙孚

    1997-01-01

    Based upon a nonlinear model of random sea waves, the statistical distribution of wave surface elevation exact to the fourth order is derived as a truncated Gram-Charlier series by directly calculating each order of moment. The phenomenon found by Huang et al., that the agreement between observed data and the investigated series deteriorates much more when the series is kept to λ8, is explained. The effect of the approximation order on the truncation of the series and the determination of its coefficients is investigated. For the mth order approximation, the derived series is truncated at H_{3m-3} with H_{3m-4} absent, and the coefficients of H_{3m-3} and H_{3m-6} are connected by a simple algebraic relation.

  20. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images

    Science.gov (United States)

    Hu, Qin; Victor, Jonathan D.

    2016-01-01

    Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features, but they are challenging to study – largely because their number and variety is large. Here, via the use of two-dimensional Hermite (TDH) functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions that are organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis) of the distribution of filter coefficients depends only on the projection of the function onto a 1-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank. PMID:27713838

  1. Two-Dimensional Hermite Filters Simplify the Description of High-Order Statistics of Natural Images

    Directory of Open Access Journals (Sweden)

    Qin Hu

    2016-09-01

    Full Text Available Natural image statistics play a crucial role in shaping biological visual systems, understanding their function and design principles, and designing effective computer-vision algorithms. High-order statistics are critical for conveying local features but they are challenging to study, largely because their number and variety is large. Here, via the use of two-dimensional Hermite (TDH functions, we identify a covert symmetry in high-order statistics of natural images that simplifies this task. This emerges from the structure of TDH functions, which are an orthogonal set of functions that are organized into a hierarchy of ranks. Specifically, we find that the shape (skewness and kurtosis of the distribution of filter coefficients depends only on the projection of the function onto a one-dimensional subspace specific to each rank. The characterization of natural image statistics provided by TDH filter coefficients reflects both their phase and amplitude structure, and we suggest an intuitive interpretation for the special subspace within each rank.

  2. Computing the Performance Measures in Queueing Models via the Method of Order Statistics

    Directory of Open Access Journals (Sweden)

    Yousry H. Abdelkader

    2011-01-01

    Full Text Available This paper focuses on new measures of performance in a single-server Markovian queueing system. These measures depend on the moments of order statistics. The expected value and the variance of the maximum (minimum) number of customers in the system, as well as the expected value and the variance of the minimum (maximum) waiting time, are presented. An application to an M/M/1 model is given to illustrate the idea and the applicability of the proposed measures.

  3. Regularized learning of linear ordered-statistic constant false alarm rate filters (Conference Presentation)

    Science.gov (United States)

    Havens, Timothy C.; Cummings, Ian; Botts, Jonathan; Summers, Jason E.

    2017-05-01

    The linear ordered statistic (LOS) is a parameterized ordered statistic (OS) that is a weighted average of a rank-ordered sample. LOS operators are useful generalizations of aggregation as they can represent any linear aggregation, from minimum to maximum, including conventional aggregations such as the mean and median. In the fuzzy logic field, these aggregations are called ordered weighted averages (OWAs). Here, we present a method for learning LOS operators from training data, viz., data for which the output of the desired LOS is known. We then extend the learning process with regularization so that a lower-complexity or sparse LOS can be learned. We discuss what 'lower complexity' means in this context and how to represent it in the optimization procedure. Finally, we apply our learning methods to the well-known constant false alarm rate (CFAR) detection problem, specifically for the case of background levels modeled by long-tailed distributions such as the K-distribution. These backgrounds arise in several pertinent imaging problems, including the modeling of clutter in synthetic aperture radar and sonar (SAR and SAS) and in wireless communications.
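
    The LOS/OWA operator itself is simple to state: sort the sample, then take a weighted average of the order statistics. A minimal sketch (illustrative; the learning and regularization machinery of the paper is not reproduced here):

```python
import numpy as np

def los(samples, weights):
    """Linear ordered statistic: weighted average of the rank-ordered sample."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                    # OWA weights are non-negative and sum to 1
    return np.sort(samples)[::-1] @ w  # sorted descending, per the OWA convention

x = np.array([3.0, 9.0, 1.0, 7.0, 5.0])
print(los(x, [1, 0, 0, 0, 0]))  # -> 9.0, the maximum
print(los(x, [0, 0, 0, 0, 1]))  # -> 1.0, the minimum
print(los(x, [1, 1, 1, 1, 1]))  # -> 5.0, the mean
print(los(x, [0, 0, 1, 0, 0]))  # -> 5.0, the median
```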

  4. Higher-order generalized hydrodynamics: Foundations within a nonequilibrium statistical ensemble formalism.

    Science.gov (United States)

    Silva, Carlos A B; Rodrigues, Clóves G; Ramos, J Galvão; Luzzi, Roberto

    2015-06-01

    The construction, in the framework of a nonequilibrium statistical ensemble formalism, of a higher-order generalized hydrodynamics, also referred to as mesoscopic hydrothermodynamics, is presented; it covers phenomena involving the motion of fluids displaying variations short in space and fast in time, that is, unrestricted values of the Knudsen number. In that way, an approach is provided enabling the coupling and simultaneous treatment of the kinetic and hydrodynamic levels of description. It is based on a complete thermostatistical approach in terms of the densities of matter and energy and their fluxes of all orders, covering systems arbitrarily driven away from equilibrium. The set of coupled nonlinear integrodifferential hydrodynamic equations is derived. They are the evolution equations of the Grad-like moments of all orders, derived from a generalized kinetic equation built in the framework of the nonequilibrium statistical ensemble formalism. For illustration, the case of a system of particles embedded in a fluid acting as a thermal bath is fully described. The resulting enormous set of coupled evolution equations is of unmanageable proportions, thus requiring in practice the introduction of an appropriate description using the smallest possible number of variables. We have obtained a hierarchy of Maxwell times, associated with the set of all the higher-order fluxes, which has particular relevance in providing criteria for establishing the contraction of the description.

  5. Higher-order-statistics-based radial basis function networks for signal enhancement.

    Science.gov (United States)

    Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei

    2007-05-01

    In this paper, a higher-order-statistics (HOS)-based radial basis function (RBF) network for signal enhancement is introduced. In the proposed scheme, higher order cumulants of the reference signal are used as the input of the HOS-based RBF. An HOS-based supervised learning algorithm, with the mean square error obtained from higher order cumulants of the desired input and the system output as the learning criterion, is used to adapt the weights. The motivation is that HOS can effectively suppress Gaussian and symmetrically distributed non-Gaussian noise, so the influence of Gaussian noise on the input of the HOS-based RBF and on the HOS-based learning algorithm is mitigated. Simulation results indicate that the HOS-based RBF can provide better performance for signal enhancement under different noise levels, and that its performance is insensitive to the selection of learning rates. Moreover, the efficiency of the HOS-based RBF under nonstationary Gaussian noise is stable.

  6. Nonlinear Fitting Method of Long-Term Distributions for Statistical Analysis of Extreme Negative Surge Elevations

    Institute of Scientific and Technical Information of China (English)

    DONG Sheng; LI Fengli; JIAO Guiying

    2003-01-01

    Hydrologic frequency analysis plays an important role in coastal and ocean engineering for structural design and disaster prevention in coastal areas. This paper proposes a Nonlinear Least Squares Method (NLSM), which estimates the three unknown parameters of the Weibull distribution simultaneously by an iterative method. Statistical tests show that the NLSM fits each data sample well. The effects of different parameter-fitting methods, distribution models, and threshold values on the statistical analysis of storm set-down elevation are also discussed. The best-fitting probability distribution is given and the corresponding return values are estimated for engineering design.
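
    A hedged sketch of the idea behind the NLSM: fit all three Weibull parameters at once by least squares against empirical plotting positions. The synthetic data, the Gringorten plotting positions, and scipy's solver are assumptions; the paper's own iteration scheme may differ.

```python
import numpy as np
from scipy.optimize import least_squares

def weibull_cdf(x, scale, shape, loc):
    """Three-parameter Weibull cumulative distribution function."""
    return 1.0 - np.exp(-(np.maximum(x - loc, 0.0) / scale) ** shape)

# hypothetical ordered sample of storm set-down elevations (m)
x = np.sort([0.42, 0.55, 0.61, 0.68, 0.74, 0.81, 0.90, 1.02, 1.18, 1.41])
n = x.size
p_emp = (np.arange(1, n + 1) - 0.44) / (n + 0.12)  # Gringorten plotting positions

def residuals(theta):
    scale, shape, loc = theta
    return weibull_cdf(x, scale, shape, loc) - p_emp

# all three parameters estimated simultaneously by nonlinear least squares
fit = least_squares(residuals, x0=[0.5, 1.5, 0.2],
                    bounds=([1e-6, 1e-6, 0.0], [np.inf, np.inf, float(x.min())]))
print("scale, shape, location:", fit.x.round(3))
```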

  7. Inter-comparison of statistical downscaling methods for projection of extreme precipitation in Europe

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Hundecha, Y.; Lawrence, D.;

    2015-01-01

    Information on extreme precipitation for future climate is needed to assess the changes in the frequency and intensity of flooding. The primary source of information in climate change impact studies is climate model projections. However, due to the coarse resolution and biases of these models...... be drawn regarding the differences between CFs and BC methods. The performance of the BC methods during the control period also depends on the catchment, but in most cases they represent an improvement compared to RCM outputs. Analysis of the variance in the ensemble of RCMs and SDMs indicates...

  8. Extreme Value Statistical Characterization of Time Domain Pulse-to-Pulse Measurements

    CERN Document Server

    Arpaia, Pasquale; Martino, Michele

    2015-01-01

    An analytical method, based on Extreme Value Theory (EVT), for predicting the worst-case repeatability of time domain pulse-to-pulse measurements, modeled as independent and identically distributed random variables, is proposed. The method allows the noise level of a measurement system to be used for predicting the upcoming peak values over a given number of independent observations. The proposed analytical model is compared against simulated distributions generated in Matlab, showing a satisfying match for any sample size. The simulations are based on a case study on the characterization of a pulsed power supply for the klystron modulators of the Compact LInear Collider (CLIC) under study at CERN.
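
    The core EVT prediction can be sketched as follows: for n independent Gaussian pulse-to-pulse deviations with rms level sigma, the expected peak follows from the Gumbel limit of the sample maximum. This is a textbook approximation under assumed Gaussian noise, not the paper's exact model.

```python
import numpy as np

def expected_peak(sigma, n):
    """Gumbel-limit approximation to the expected maximum of n iid N(0, sigma^2) draws."""
    t = np.sqrt(2.0 * np.log(n))
    b_n = t - (np.log(np.log(n)) + np.log(4.0 * np.pi)) / (2.0 * t)  # Gumbel location
    a_n = 1.0 / t                                                    # Gumbel scale
    return sigma * (b_n + np.euler_gamma * a_n)

sigma = 1.0  # rms pulse-to-pulse noise level (placeholder value)
rng = np.random.default_rng(5)
for n in (10**3, 10**4, 10**5):
    simulated = rng.normal(0.0, sigma, size=(50, n)).max(axis=1).mean()
    print(f"n={n:>6}  predicted {expected_peak(sigma, n):.3f}  simulated {simulated:.3f}")
```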

  9. Statistical model with two order parameters for ductile and soft fiber bundles in nanoscience and biomaterials.

    Science.gov (United States)

    Rinaldi, Antonio

    2011-04-01

    Traditional fiber bundle models (FBMs) have been an effective tool for understanding brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling several simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially for exploring hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Finally, applicative examples are provided for illustrative purposes, discussing issues in inverse analysis (i.e., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (i.e., strength prediction).

  10. Automatic Assessment of Pathological Voice Quality Using Higher-Order Statistics in the LPC Residual Domain

    Directory of Open Access Journals (Sweden)

    JiYeoun Lee

    2009-01-01

    Full Text Available A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOSs) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions for characterizing pathological voice quality. 83 voice samples of sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.

  11. Conditional budgets of second-order statistics in nonpremixed and premixed turbulent combustion

    Science.gov (United States)

    Macart, Jonathan F.; Grenga, Temistocle; Mueller, Michael E.

    2016-11-01

    Combustion heat release modifies or introduces a number of new terms to the balance equations for second-order turbulence statistics (turbulent kinetic energy, scalar variance, etc.) compared to incompressible flow. A major modification is a significant increase in viscosity and dissipation in the high-temperature combustion products, but new terms also appear due to density variation and gas expansion (dilatation). Previous scaling analyses have hypothesized that dilatation effects are important in turbulent premixed combustion but are unimportant in turbulent nonpremixed combustion. To explore this hypothesis, a series of DNS calculations have been performed in the low Mach number limit for spatially evolving turbulent planar jet flames of hydrogen and air in both premixed and nonpremixed configurations. Unlike other studies exploring the effects of heat release on turbulence, the turbulence is not forced, and detailed chemical kinetics are used to describe hydrogen-air combustion. Budgets for second-order statistics are computed conditioned on progress variable in the premixed flame and on mixture fraction in the nonpremixed flame in order to locate regions with respect to the flame structure where dilatation effects are strongest.

  12. PERFORMANCE COMPARISON OF A LINEARLY COMBINED ORDERED-STATISTIC DETECTORS UNDER POSTDETECTION INTEGRATION AND NONHOMOGENEOUS SITUATIONS

    Institute of Scientific and Technical Information of China (English)

    Mohamed Bakry El_Mashade

    2006-01-01

    Several Constant False Alarm Rate (CFAR) architectures, which radar systems often employ to automatically adapt the detection threshold to the local background noise or clutter power in an attempt to maintain an approximately constant rate of false alarm, have recently been proposed to estimate the unknown noise power level. Since the Ordered-Statistics (OS) based algorithm has some advantages over the Cell-Averaging (CA) technique, we are concerned here with this type of CFAR detector. The Linearly Combined Ordered-Statistic (LCOS) processor, which sets the threshold by processing weighted ordered range samples within a finite moving window, may actually perform somewhat better than the conventional OS detector. Our objective in this paper is to analyze the LCOS processor along with the conventional OS scheme for the case where the radar receiver incorporates a postdetection integrator and the operating environment contains a number of secondary interfering targets along with the primary target of concern, with both target types fluctuating in accordance with the Swerling II model, and to compare their performances under various operating conditions.

  13. A new simple model for composite fading channels: Second order statistics and channel capacity

    KAUST Repository

    Yilmaz, Ferkan

    2010-09-01

    In this paper, we introduce the most general composite fading distribution to model the envelope and the power of the received signal in such fading channels as millimeter wave (60 GHz or above) fading channels and free-space optical channels, which we term extended generalized-K (EGK) composite fading distribution. We obtain the second-order statistics of the received signal envelope characterized by the EGK composite fading distribution. Expressions for probability density function, cumulative distribution function, level crossing rate and average fade duration, moments, amount of fading and average capacity are derived. Numerical and computer simulation examples validate the accuracy of the presented mathematical analysis. © 2010 IEEE.

  14. Improved system blind identification based on second-order cyclostationary statistics: A group delay approach

    Indian Academy of Sciences (India)

    P V S Giridhar; S V Narasimhan

    2000-04-01

    An improved system blind identification method, based on second-order cyclostationary statistics and the properties of group delay, is proposed. This is achieved by applying a correction to the phase estimated (by the spectral correlation density of the system output) for the poles, in the group delay domain. The results indicate a significant improvement in system blind identification in terms of root mean square error. Depending upon the signal-to-noise ratio, the improvement in percentage normalized mean square error ranges between 20 and 50%.

  15. Characterizing the Sample Complexity of Large-Margin Learning With Second-Order Statistics

    CERN Document Server

    Sabato, Sivan; Tishby, Naftali

    2012-01-01

    We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization: We introduce the margin-adapted dimension, which is a simple function of the second order statistics of the data distribution, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the margin-adapted dimension of the data distribution. The upper bounds are universal, and the lower bounds hold for a rich family of sub-Gaussian distributions. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification.

  16. Cost-Effective Implementation of Order-Statistics Based Vector Filters Using Minimax Approximations

    CERN Document Server

    Celebi, M Emre; Lukac, Rastislav; Celiker, Fatih; 10.1364/JOSAA.26.001518

    2010-01-01

    Vector operators based on robust order statistics have proved successful in digital multichannel imaging applications, particularly color image filtering and enhancement, in dealing with impulsive noise while preserving edges and fine image details. These operators often have very high computational requirements, which limits their use in time-critical applications. This paper introduces techniques to speed up vector filters using minimax approximation theory. Extensive experiments on a large and diverse set of color images show that the proposed approximations achieve an excellent balance among ease of implementation, accuracy, and computational speed.

  17. Implementasi Order-Statistic Filters Untuk Mereduksi Noise Pada Citra Digital [Implementation of Order-Statistic Filters for Reducing Noise in Digital Images]

    OpenAIRE

    Sihotang, Juni Santo

    2014-01-01

    Salt-and-pepper noise and Gaussian noise are commonly found in digital images. Noise in an image usually arises from errors in the image acquisition technique or because the image has been stored for too long. To reduce noise, a proper filtering method is needed so that the resulting image accords with the original. The order-statistic filter is a non-linear filter whose output is determined by sorting the image pixels that fill the area within the scope of the filter...
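
    The median filter is the canonical order-statistic filter for salt-and-pepper noise. A minimal sketch (illustrative only, not the paper's implementation):

```python
import numpy as np

def order_statistic_filter(img, size=3, rank=None):
    """Replace each pixel with the rank-th value of its sorted neighborhood.
    The default rank (the middle one) gives the median filter."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    k = (size * size) // 2 if rank is None else rank
    out = np.empty_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            window = padded[r:r + size, c:c + size]
            out[r, c] = np.sort(window, axis=None)[k]
    return out

# toy image corrupted by salt-and-pepper noise: the outliers are removed,
# while the constant background is preserved
img = np.full((5, 5), 100, dtype=np.uint8)
img[1, 2], img[3, 1] = 255, 0
print(order_statistic_filter(img))
```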

  18. Linguistic Analysis of the Human Heartbeat Using Frequency and Rank Order Statistics

    Science.gov (United States)

    Yang, Albert C.-C.; Hseu, Shu-Shya; Yien, Huey-Wen; Goldberger, Ary L.; Peng, C.-K.

    2003-03-01

    Complex physiologic signals may carry unique dynamical signatures that are related to their underlying mechanisms. We present a method based on rank order statistics of symbolic sequences to investigate the profile of different types of physiologic dynamics. We apply this method to heart rate fluctuations, the output of a central physiologic control system. The method robustly discriminates patterns generated from healthy and pathologic states, as well as aging. Furthermore, we observe increased randomness in the heartbeat time series with physiologic aging and pathologic states and also uncover nonrandom patterns in the ventricular response to atrial fibrillation.

  19. High order statistics based blind deconvolution of bi-level images with unknown intensity values.

    Science.gov (United States)

    Kim, Jeongtae; Jang, Soohyun

    2010-06-07

    We propose a novel linear blind deconvolution method for bi-level images. The proposed method seeks an optimal point spread function and two parameters that maximize a high order statistics based objective function. Unlike existing minimum entropy deconvolution and least squares minimization methods, the proposed method requires neither unrealistic assumption that the pixel values of a bi-level image are independently identically distributed samples of a random variable nor tuning of regularization parameters.We demonstrate the effectiveness of the proposed method in simulations and experiments.

  20. Statistical properties of first-order bang-bang Pll with nonzero loop delay

    OpenAIRE

    Chun, Byungjin; Kennedy, Michael Peter

    2008-01-01

    A method to solve for the stationary state probability is presented for the first-order bang-bang phase-locked loop (BBPLL) with nonzero loop delay. This is based on a delayed Markov chain model and a state flow diagram for tracing the state history due to the loop delay. As a result, an eigenequation is obtained, and its closed form solutions are derived for some cases. After obtaining the state probability, statistical characteristics such as mean gain of the binary phase detector and timing err...

  1. Extreme value statistics of 2D Gaussian free field: effect of finite domains

    Science.gov (United States)

    Cao, X.; Rosso, A.; Santachiara, R.

    2016-01-01

    We study minima statistics of the 2D Gaussian free field (GFF) on circles in the unit disk with Dirichlet boundary condition. Free energy distributions of the associated random energy models are exactly calculated in the high temperature phase, and shown to satisfy the duality property, which enables us to predict the minima distribution by assuming the freezing scenario. Numerical tests are provided. Related questions concerning the GFF on a sphere are also considered.

  2. Improved LSB-matching Steganography for Preserving Second-order Statistics

    Directory of Open Access Journals (Sweden)

    Guangjie Liu

    2010-10-01

    Full Text Available In this paper, to enhance the security of traditional LSB matching, two improved LSB-matching methods are proposed. In the steganographic procedure, the Markov chain distance based on second-order statistics is chosen as the security metric to control the modification directions of ±1 embedding. The first method is based on stochastic modification, which directly determines the modification directions from the empirical Markov transition matrix of a cover image and a pseudorandom number generated by a pseudorandom number generator. The second is based on a genetic algorithm, which is used to find the optimum matching vector that makes the security metric as small as possible. Experiments show the proposed algorithms outperform LSB matching and LSB replacement in the sense of the first-order and second-order security metrics, and the adjacent calibrated COM-HCF steganalytic tests also show that the two algorithms are more secure than the traditional ones.

  3. Extreme-value statistics from Lagrangian convex hull analysis I. Validation for homogeneous turbulent Boussinesq convection and MHD convection

    CERN Document Server

    Pratt, J.; Müller, W.-C.; Chapman, S. C.; Watkins, N. W.

    2016-01-01

    We investigate the utility of the convex hull for analyzing physical questions related to the dispersion of a group of much more than four Lagrangian tracer particles in a turbulent flow. Validation of standard dispersion behaviors is a necessary preliminary step for using the convex hull to describe turbulent flows. In simulations of statistically homogeneous and stationary Navier-Stokes turbulence, neutral fluid Boussinesq convection, and MHD Boussinesq convection, we show that the convex hull can be used to reasonably capture the dispersive behavior of a large group of tracer particles. We validate dispersion results produced with convex hull analysis against scalings for Lagrangian particle pair dispersion. In addition to this basic validation study, we show that convex hull analysis provides information that particle pair dispersion does not, in the form of extreme value statistics, surface area, and volume for a cluster of particles. We use the convex hull surface area and volume to examine the degree of...
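
    In practice the hull-based measures are inexpensive to compute. A minimal sketch using scipy's ConvexHull on a synthetic particle cluster (illustrative; Gaussian clouds stand in for tracer positions from a turbulence simulation):

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(6)

# synthetic positions of a tracer-particle cluster at two times;
# the wider cloud mimics the cluster after dispersion
cluster_t0 = rng.normal(0.0, 1.0, size=(500, 3))
cluster_t1 = rng.normal(0.0, 2.5, size=(500, 3))

for label, pts in (("t0", cluster_t0), ("t1", cluster_t1)):
    hull = ConvexHull(pts)
    # for 3-D input, hull.volume is the enclosed volume, hull.area the surface area
    print(f"{label}: volume={hull.volume:9.2f}  surface area={hull.area:9.2f}")
```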

  4. Multivariate High Order Statistics of Measurements of the Temporal Evolution of Fission Chain-Reactions

    Energy Technology Data Exchange (ETDEWEB)

    Mattingly, J.K.

    2001-03-08

    The development of high order statistical analyses applied to measurements of the temporal evolution of fission chain-reactions is described. These statistics are derived via application of Bayes' rule to conditional probabilities describing a sequence of events in a fissile system, beginning with the initiation of a chain-reaction by source neutrons and ending with counting events in a collection of neutron-sensitive detectors. Two types of initiating neutron sources are considered: (1) a directly observable source introduced by the experimenter (active initiation), and (2) a source that is intrinsic to the system and is not directly observable (passive initiation). The resulting statistics describe the temporal distribution of the population of prompt neutrons in terms of the time delays between members of a collection (an n-tuplet) of correlated detector counts that, in turn, may be collectively correlated with a detected active source neutron emission. These developments are a unification and extension of Rossi-alpha, pulsed neutron, and neutron noise methods, each of which measures the temporal distribution of pairs of correlated events, to produce a method that measures the temporal distribution of n-tuplets of correlated counts of arbitrary dimension n. In general, the technique should expand present capabilities in the analysis of neutron counting measurements.

  5. Introducing Switching Ordered Statistic CFAR Type I in Different Radar Environments

    Directory of Open Access Journals (Sweden)

    Saeed Erfanian

    2009-01-01

    Full Text Available In this paper, a new CFAR detector based on a switching algorithm and OS-CFAR for nonhomogeneous background environments is introduced. The new detector is named Switching Ordered Statistic CFAR type I (SOS CFAR I). The SOS CFAR I selects a set of suitable cells and then, with the help of the ordering method, estimates the unknown background noise level. The proposed detector does not require any prior information about the background environment and uses cells with similar statistical specifications to estimate the background noise. The performance of SOS CFAR I is evaluated and compared with other detectors, such as CA-CFAR, GO-CFAR, SO-CFAR, and OS-CFAR, for the Swerling I target model in homogeneous and nonhomogeneous noise environments such as those with multiple interference and clutter edges. The results show that the SOS CFAR I detector considerably reduces the problem of excessive false alarm probability near clutter edges while maintaining good performance in other environments. Simulation results also confirm the achievement of an optimum detection threshold in homogeneous and nonhomogeneous radar environments by the proposed processor.

  6. Low-order statistics of effective permittivity and electric field fluctuations in two-phase heterostructures

    Science.gov (United States)

    Shamoon, D.; Lasquellec, S.; Brosseau, C.

    2017-07-01

    Understanding the collective, low-frequency dielectric properties of heterostructures is a major goal in condensed matter. In 1935, Bruggeman [Ann. Phys. Lpz. 24, 636 (1935)] conceived the concept of an effective medium approximation (EMA) involving a decoupling between the low-order statistics of the electric field fluctuations and the characteristic length scales. We report on and characterize, via finite element studies, the low-order statistics of the effective permittivity of two-phase 2D and 3D random and deterministic heterostructures as geometry, phase permittivity contrast, and inclusion content are varied. Since EMA analytical expressions become cumbersome even for simple shapes and arrangements, numerical approaches are more suitable for studying heterostructures with complex shapes and topologies. Our numerical study verifies the EMA analytic predictions when the scales are well separated. It also compares two approaches for calculating the effective permittivity, based on explicit calculations of local average fields and of energy, as geometry, phase permittivity contrast, and inclusion content are varied. We study the conditions under which these approaches give a reliable estimate of permittivity by comparing them with 2D/3D EMA analytical models and the duality relation. By considering 2D checkerboards consisting of a multitude of contiguous N × N square cells, the influence of the internal length scale (i.e., N) on permittivity is discussed.

  7. Wavelet and Higher order statistics as a tool for revealing Electromagnetic precursors of Earthquakes

    Science.gov (United States)

    Sondhiya, Deepak Kumar; Gwal, Ashok Kumar; Kumar, Sushil

    2016-07-01

    In recent years a number of scientists have reported correlations between observations of electromagnetic radiation and earthquakes. These observations of seismo-electromagnetic waves have been made both on the ground in earthquake regions and by spacecraft over earthquake regions. In this work, an attempt to develop a comprehensive approach to the problem of searching for electromagnetic earthquake precursor signatures is made on the basis of DEMETER satellite observations. The main focus is the analysis of electric field data in the Very Low Frequency (VLF) range using the wavelet transform and higher order statistics. We observed electromagnetic turbulence in the VLF range associated with three earthquakes that occurred at Kepulauan Talaud, Indonesia, from 2009 to 2011. It is probably due to the generation of an electric field in a forthcoming earthquake's epicentral zone penetrating into the ionosphere. Large values of kurtosis show a higher level of intermittency in the VLF signal before an earthquake. It is possible to conjecture that the sources of this intermittency are coherent structures (CS). For a better understanding of this behavior, the skewness parameter is used. The high energy at the large scales of the VLF turbulence due to the earthquake preparation process contributes to the creation of CS in the VLF signal. The results discussed were obtained during a very quiet time, and therefore no ionospheric or magnetospheric sources of perturbation were expected. The statistical behavior of the signal (intermittent) and the shape of the spectra suggest that the turbulence observed during this event is of the Kolmogorov type. Keywords: Turbulence, Higher order statistics and wave-wave interaction

  8. How Much Math Do Students Need to Succeed in Business and Economics Statistics? An Ordered Probit Analysis

    Science.gov (United States)

    Green, Jeffrey J.; Stone, Courtenay C.; Zegeye, Abera; Charles, Thomas A.

    2009-01-01

    Because statistical analysis requires the ability to use mathematics, students typically are required to take one or more prerequisite math courses prior to enrolling in the business statistics course. Despite these math prerequisites, however, many students find it difficult to learn business statistics. In this study, we use an ordered probit…

  10. Recurrence relations for the moments of order statistics from doubly truncated modified Makeham distribution and its characterization

    Directory of Open Access Journals (Sweden)

    Ali A. Ismail

    2014-07-01

    Full Text Available In this study, a general form of recurrence relations for a continuous function of the doubly truncated modified Makeham distribution is obtained. Recurrence relations between single and product moments of order statistics from the doubly truncated modified Makeham distribution are given. Also, a characterization of the modified Makeham distribution from the right and from the left is discussed through the properties of order statistics.

  11. Improving pan-European hydrological simulation of extreme events through statistical bias correction of RCM-driven climate simulations

    Directory of Open Access Journals (Sweden)

    R. Rojas

    2011-04-01

    Full Text Available In this work we assess the benefits of removing bias in climate forcing data used for hydrological climate change impact assessment at the pan-European scale, with emphasis on floods. Climate simulations from the HIRHAM5-ECHAM5 model driven by the SRES-A1B emission scenario are corrected for bias using a histogram equalization method. As predictand for the bias correction we employ gridded interpolated observations of precipitation and of average, minimum, and maximum temperature from the E-OBS data set. Bias removal transfer functions are derived for the control period 1961–1990. These are subsequently used to correct the climate simulations for the control period and, under the assumption of a stationary error model, for the future time window 2071–2100. Validation against the E-OBS climatology in the control period shows that the correction method successfully removes bias in the average and extreme statistics relevant for flood simulation over the majority of the European domain in all seasons. This translates into considerably improved simulations with the hydrological model of observed average and extreme river discharges at a majority of the 554 validation river stations across Europe. Probabilities of extreme events derived employing extreme value techniques are also more closely reproduced. The results indicate that projections of future flood hazard in Europe based on uncorrected climate simulations, both in terms of magnitude and recurrence interval, are likely subject to large errors. Notwithstanding the inherent limitations of the large-scale approach used herein, this study strongly advocates the removal of bias in climate simulations prior to their use in hydrological impact assessment.
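
    The histogram equalization (empirical quantile mapping) step can be sketched as follows; synthetic gamma-distributed "precipitation" stands in for the E-OBS and RCM fields, and the stationary-error assumption mirrors the study design.

```python
import numpy as np

def quantile_map(model_ctrl, obs_ctrl, model_scen):
    """Empirical quantile mapping (histogram equalization) bias correction.
    Each scenario value is mapped through the control-period model CDF onto
    the observed CDF; values outside the control range are clamped."""
    model_sorted = np.sort(model_ctrl)
    obs_sorted = np.sort(obs_ctrl)
    q = np.searchsorted(model_sorted, model_scen) / model_sorted.size
    return np.interp(np.clip(q, 0.0, 1.0),
                     np.linspace(0.0, 1.0, obs_sorted.size), obs_sorted)

rng = np.random.default_rng(7)
obs_ctrl = rng.gamma(2.0, 3.0, 10_000)    # "observed" control-period values
model_ctrl = rng.gamma(2.0, 4.5, 10_000)  # biased control-run simulation
model_scen = rng.gamma(2.0, 5.0, 10_000)  # future scenario run

corrected = quantile_map(model_ctrl, obs_ctrl, model_scen)
print(f"control bias (mean): {model_ctrl.mean() - obs_ctrl.mean():+.2f}")
print(f"corrected scenario mean: {corrected.mean():.2f} (obs mean {obs_ctrl.mean():.2f})")
```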

  12. Improving pan-European hydrological simulation of extreme events through statistical bias correction of RCM-driven climate simulations

    Directory of Open Access Journals (Sweden)

    R. Rojas

    2011-08-01

    Full Text Available In this work we assess the benefits of removing bias in climate forcing data used for hydrological climate change impact assessment at the pan-European scale, with emphasis on floods. Climate simulations from the HIRHAM5-ECHAM5 model driven by the SRES-A1B emission scenario are corrected for bias using a histogram equalization method. As target for the bias correction we employ gridded interpolated observations of precipitation and of average, minimum, and maximum temperature from the E-OBS data set. Bias removal transfer functions are derived for the control period 1961–1990. These are subsequently used to correct the climate simulations for the control period and, under the assumption of a stationary error model, for the future time window 2071–2100. Validation against the E-OBS climatology in the control period shows that the correction method successfully removes bias in the average and extreme statistics relevant for flood simulation over the majority of the European domain in all seasons. This translates into considerably improved simulations with the hydrological model of observed average and extreme river discharges at a majority of the 554 validation river stations across Europe. Probabilities of extreme events derived employing extreme value techniques are also more closely reproduced. The results indicate that projections of future flood hazard in Europe based on uncorrected climate simulations, both in terms of magnitude and recurrence interval, are likely subject to large errors. Notwithstanding the inherent limitations of the large-scale approach used herein, this study strongly advocates the removal of bias in climate simulations prior to their use in hydrological impact assessment.

  13. Global statistical maps of extreme-event magnetic observatory 1 min first differences in horizontal intensity

    Science.gov (United States)

    Love, Jeffrey J.; Coïsson, Pierdavide; Pulkkinen, Antti

    2016-05-01

    An analysis is made of the long-term statistics of three different measures of ground-level, storm-time geomagnetic activity: instantaneous 1 min first differences in horizontal intensity ΔBh, the root-mean-square of 10 consecutive 1 min differences S, and the ramp change R over 10 min. Geomagnetic latitude maps of the cumulative exceedances of these three quantities are constructed, giving the threshold (nT/min) for which activity within a 24 h period can be expected to occur once per year, decade, and century. Specifically, at geomagnetic latitude 55°, we estimate once-per-century ΔBh, S, and R exceedances and a site-to-site, proportional, 1 standard deviation range [1 σ, lower and upper] to be, respectively, 1000, [690, 1450]; 500, [350, 720]; and 200, [140, 280] nT/min. At 40°, we estimate once-per-century ΔBh, S, and R exceedances and 1 σ values to be 200, [140, 290]; 100, [70, 140]; and 40, [30, 60] nT/min.

  14. A statistical analysis of insurance damage claims related to rainfall extremes

    Directory of Open Access Journals (Sweden)

    M. H. Spekkers

    2012-10-01

    Full Text Available In this paper, a database of water-related insurance damage claims related to private properties and content was analysed. The aim was to investigate whether high numbers of damage claims were associated with high rainfall intensities. Rainfall data were used for the period of 2003–2010 in the Netherlands based on a network of 33 automatic rain gauges operated by the Royal Netherlands Meteorological Institute. Insurance damage data were aggregated to areas within 10-km range of the rain gauges. Through a logistic regression model, high claim numbers were linked to maximum rainfall intensities, with rainfall intensity based on 10-min to 4-h time windows. Rainfall intensity proved to be a significant damage predictor; however, the explained variance, approximated by a pseudo-R2 statistic, was at most 34% for property damage and at most 30% for content damage. When directly comparing predicted and observed values, the model was able to predict 5–17% more cases correctly compared to a random prediction. No important differences were found between relations with property and content damage data. A considerable fraction of the variance is left unexplained, which emphasizes the need to study damage generating mechanisms and additional explanatory variables.
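
    A minimal sketch of the modelling step: a binary "high claim numbers" indicator is linked to maximum rainfall intensity through logistic regression, and a McFadden-style pseudo-R² is reported. The paper's exact covariates and pseudo-R² variant are not reproduced here, so the names and choices below are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

def claims_model(X, y):
    """X: (n, 1) maximum rainfall intensity per event window;
    y: 1 if the number of damage claims in the area was 'high', else 0."""
    y = np.asarray(y, dtype=float)
    model = LogisticRegression().fit(X, y)
    ll_model = -log_loss(y, model.predict_proba(X)[:, 1], normalize=False)
    # Intercept-only null model predicts the base rate for every case.
    ll_null = -log_loss(y, np.full(len(y), y.mean()), normalize=False)
    pseudo_r2 = 1.0 - ll_model / ll_null   # McFadden's pseudo-R^2
    return model, pseudo_r2
```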

  15. Analysis of Slight Discrepancy Between Quantum Dynamics and Classical Statistical Dynamics For Second Order Field Theories

    CERN Document Server

    Werbos, P J

    2003-01-01

    Quantum Field Theory (QFT) makes predictions by combining two sets of assumptions: (1) quantum dynamics, such as a Schrodinger or Liouville equation; (2) quantum measurement, such as stochastic collapse to an eigenfunction of a measurement operator. A previous paper defined a classical density matrix R encoding the statistical moments of an ensemble of states of classical second-order Hamiltonian field theory. It proved Tr(RQ)=E(Q), etc., for the usual field operators as defined by Weinberg, and it proved that those observables of the classical system obey the usual Heisenberg dynamic equation. However, R itself obeys dynamics different from the usual Liouville equation! This paper derives those dynamics, and calculates the discrepancy between CFT and normal form QFT in predicting general observables g(Q,P). There is some preliminary evidence for the conjecture that the discrepancies disappear in equilibrium states (bound states and scattering states) for finite bosonic field theories. Even if not, they appea...

  16. Second-Order Statistics for Wave Propagation through Complex Optical Systems

    DEFF Research Database (Denmark)

    Yura, H.T.; Hanson, Steen Grüner

    1989-01-01

    Closed-form expressions are derived for various statistical functions that arise in optical propagation through arbitrary optical systems that can be characterized by a complex ABCD matrix in the presence of distributed random inhomogeneities along the optical path. Specifically, within the second-order Rytov approximation, explicit general expressions are presented for the mutual coherence function, the log-amplitude and phase correlation functions, and the mean-square irradiance that are obtained in propagation through an arbitrary paraxial ABCD optical system containing Gaussian-shaped limiting apertures. Additionally, we consider the performance of adaptive-optics systems through arbitrary real paraxial ABCD optical systems and derive an expression for the mean irradiance of an adaptive-optics laser transmitter through such systems. © 1989 Optical Society of America

  17. Prediction and reconstruction of future and missing unobservable modified Weibull lifetime based on generalized order statistics

    Directory of Open Access Journals (Sweden)

    Amany E. Aly

    2016-04-01

    Full Text Available When a system consists of independent components of the same type, appropriate actions may be taken as soon as a portion of them has failed. It is therefore important to be able to predict later failure times from earlier ones. One of the well-known failure distributions commonly used to model component life is the modified Weibull distribution (MWD). In this paper, two pivotal quantities are proposed to construct prediction intervals for future unobservable lifetimes based on generalized order statistics (gos) from the MWD. Moreover, a pivotal quantity is developed to reconstruct missing observations at the beginning of the experiment. Furthermore, Monte Carlo simulation studies are conducted and numerical computations are carried out to investigate the efficiency of the presented results. Finally, two illustrative examples for real data sets are analyzed.

  18. Atrial fibrillatory signal estimation using blind source extraction algorithm based on high-order statistics

    Institute of Scientific and Technical Information of China (English)

    WANG Gang; RAO NiNi; ZHANG Ying

    2008-01-01

    The analysis and characterization of atrial fibrillation (AF) requires, as a key preliminary step, the extraction of the atrial activity (AA) free from the 12-lead electrocardiogram (ECG). This contribution proposes a novel non-invasive approach for AA estimation in AF episodes. The method is based on blind source extraction (BSE) using high-order statistics (HOS). The validity and performance of this algorithm are confirmed by extensive computer simulations and experiments on real-world data. In contrast to blind source separation (BSS) methods, BSE extracts only the one desired signal, and it is easy for the machine to judge whether the extracted signal is the AA source by calculating its spectral concentration, whereas with a BSS method it is hard to judge which of the twelve separated signals is the AA source. Therefore, the proposed method is expected to have great potential in clinical monitoring.

  19. Fast and Adaptive Bidimensional Empirical Mode Decomposition Using Order-Statistics Filter Based Envelope Estimation

    Directory of Open Access Journals (Sweden)

    Jesmin F. Khan

    2008-08-01

    Full Text Available A novel approach for bidimensional empirical mode decomposition (BEMD) is proposed in this paper. BEMD decomposes an image into multiple hierarchical components known as bidimensional intrinsic mode functions (BIMFs). In each iteration of the process, two-dimensional (2D) interpolation is applied to a set of local maxima (minima) points to form the upper (lower) envelope. However, 2D scattered-data interpolation methods incur very long computation times and introduce artifacts into the decomposition. This paper suggests a simple but effective method of envelope estimation that replaces the surface interpolation. In this method, order-statistics filters are used to obtain the upper and lower envelopes, with the filter size derived from the data. Based on the properties of the proposed approach, it is considered a fast and adaptive BEMD (FABEMD). Simulation results demonstrate that FABEMD is not only faster and adaptive, but also outperforms the original BEMD in terms of the quality of the BIMFs.
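
    The envelope-estimation idea reduces to order-statistics (max/min) filtering followed by smoothing. The sketch below illustrates it with scipy.ndimage; the data-driven derivation of the window size from the spacing of local extrema is omitted, so `win` is assumed given.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, uniform_filter

def osf_envelopes(image, win):
    """Upper/lower envelope estimation in the spirit of FABEMD.

    A max (min) order-statistics filter of size `win` replaces the 2-D
    scattered-data interpolation of maxima (minima); a same-size moving
    average then smooths the result."""
    upper = uniform_filter(maximum_filter(image, size=win), size=win)
    lower = uniform_filter(minimum_filter(image, size=win), size=win)
    return upper, lower, 0.5 * (upper + lower)   # mean envelope for sifting
```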

  20. Detection of seizure and epilepsy using higher order statistics in the EMD domain.

    Science.gov (United States)

    Alam, S M Shafiul; Bhuiyan, M I H

    2013-03-01

    In this paper, a method using higher-order statistical moments of EEG signals calculated in the empirical mode decomposition (EMD) domain is proposed for detecting seizure and epilepsy. The appropriateness of these moments for distinguishing the EEG signals is investigated through an extensive analysis in the EMD domain. An artificial neural network is employed as the classifier of the EEG signals, with these moments used as features. The performance of the proposed method is studied using a publicly available benchmark database for various classification cases: healthy, interictal (seizure-free interval) and ictal (seizure); healthy and seizure; nonseizure and seizure; and interictal and ictal. It is compared with that of several recent methods based on time-frequency analysis and statistical moments. It is shown that the proposed method can provide, in almost all the cases, 100% accuracy, sensitivity, and specificity, especially when discriminating seizure activities from nonseizure ones in patients with epilepsy, while being much faster than the time-frequency analysis-based techniques.
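
    As an illustration of the feature-extraction step, the sketch below computes higher-order moments per intrinsic mode function; the decomposition itself is assumed to be produced by an external EMD implementation, and the exact feature set fed to the neural network is an assumption.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def hos_features(imfs):
    """imfs: (n_imfs, n_samples) intrinsic mode functions of one EEG segment.
    Returns per-IMF variance, skewness and kurtosis, concatenated into one
    feature vector for a downstream classifier."""
    feats = []
    for imf in imfs:
        feats.extend([np.var(imf), skew(imf), kurtosis(imf)])
    return np.asarray(feats)
```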

  1. De-trending of wind speed variance based on first-order and second-order statistical moments only

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2014-01-01

    The lack of efficient methods for de-trending of wind speed resource data may lead to erroneous wind turbine fatigue and ultimate load predictions. The present paper presents two models, which quantify the effect of an assumed linear trend on wind speed standard deviations as based on available statistical data only. The first model is a pure time series analysis approach, which quantifies the effect of non-stationary characteristics of ensemble mean wind speeds on the estimated wind speed standard deviations as based on mean wind speed statistics only. This model is applicable to statistics of arbitrary types of time series. The second model uses the full set of information and thus additionally includes observed wind speed standard deviations to estimate the effect of ensemble mean non-stationarities on wind speed standard deviations. This model takes advantage of a simple physical relationship
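
    For reference, when the raw record is available the effect of a linear trend is easy to make concrete: a trend of slope a over duration T contributes a²T²/12 to the apparent variance. The sketch below removes a fitted linear trend before computing the standard deviation; note that the two models in the paper work from first- and second-order statistical moments only, whereas this illustration assumes access to the raw series.

```python
import numpy as np

def detrended_std(u, dt=1.0):
    """u: raw wind-speed record (e.g. one 10-min block); dt: sample spacing (s).
    Returns the raw and linearly de-trended standard deviations and the slope;
    for a pure linear trend a*t over duration T, var_raw - var_detrended
    approaches a**2 * T**2 / 12."""
    t = np.arange(len(u)) * dt
    a, b = np.polyfit(t, u, 1)          # fitted linear trend
    return np.std(u), np.std(u - (a * t + b)), a
```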

  2. Early prediction of lung cancer recurrence after stereotactic radiotherapy using second order texture statistics

    Science.gov (United States)

    Mattonen, Sarah A.; Palma, David A.; Haasbeek, Cornelis J. A.; Senan, Suresh; Ward, Aaron D.

    2014-03-01

    Benign radiation-induced lung injury is a common finding following stereotactic ablative radiotherapy (SABR) for lung cancer, and is often difficult to differentiate from a recurring tumour due to the ablative doses and highly conformal treatment with SABR. Current approaches to treatment response assessment have shown limited ability to predict recurrence within 6 months of treatment. The purpose of our study was to evaluate the accuracy of second order texture statistics for prediction of eventual recurrence based on computed tomography (CT) images acquired within 6 months of treatment, and compare with the performance of first order appearance and lesion size measures. Consolidative and ground-glass opacity (GGO) regions were manually delineated on post-SABR CT images. Automatic consolidation expansion was also investigated to act as a surrogate for GGO position. The top features for prediction of recurrence were all texture features within the GGO and included energy, entropy, correlation, inertia, and first order texture (standard deviation of density). These predicted recurrence with 2-fold cross validation (CV) accuracies of 70–77% at 2–5 months post-SABR, with energy, entropy, and first order texture having leave-one-out CV accuracies greater than 80%. Our results also suggest that automatic expansion of the consolidation region could eliminate the need for manual delineation, and produced reproducible results when compared to manually delineated GGO. If validated on a larger data set, this could lead to a clinically useful computer-aided diagnosis system for prediction of recurrence within 6 months of SABR and allow for early salvage therapy for patients with recurrence.
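
    The second-order texture statistics named above (energy, entropy, correlation, inertia) are grey-level co-occurrence matrix (GLCM) features. A hedged sketch with scikit-image follows; the distance/angle choices and 8-bit quantization are assumptions, and entropy is computed directly since graycoprops does not expose it.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(roi):
    """roi: grayscale patch, e.g. the delineated GGO region on CT."""
    roi = roi.astype(np.uint8)                 # quantize to 8-bit grey levels
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    return {
        "energy": graycoprops(glcm, "energy")[0, 0],
        "correlation": graycoprops(glcm, "correlation")[0, 0],
        "inertia": graycoprops(glcm, "contrast")[0, 0],  # inertia == contrast
        "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
    }
```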

  3. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  4. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    Science.gov (United States)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, plausibly exists, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extract extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
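
    As a sketch of the Partial Duration Series step that the study found well modelled by the Generalized Pareto family, the snippet below fits a GPD to threshold exceedances and returns an exceedance-probability function; threshold choice and declustering are omitted and assumed handled beforehand.

```python
import numpy as np
from scipy.stats import genpareto

def fit_pds(series, threshold):
    """Fit a Generalized Pareto distribution to exceedances over `threshold`."""
    exc = series[series > threshold] - threshold
    shape, _, scale = genpareto.fit(exc, floc=0.0)
    rate = len(exc) / len(series)            # exceedance rate per observation
    def prob_exceed(x):                      # P(X > x), for x above threshold
        return rate * genpareto.sf(x - threshold, shape, loc=0.0, scale=scale)
    return shape, scale, prob_exceed
```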

  5. An evaluation of the uncertainty of extreme events statistics at the WMO/CIMO Lead Centre on precipitation intensity

    Science.gov (United States)

    Colli, M.; Lanza, L. G.; La Barbera, P.

    2012-12-01

    Improving the quality of point-scale rainfall measurements is a crucial issue fostered in recent years by the WMO Commission for Instruments and Methods of Observation (CIMO) by providing recommendations on the standardization of equipment and exposure, instrument calibration and data correction as a consequence of various comparative campaigns involving manufacturers and national meteorological services from the participating countries. The WMO/CIMO Lead Centre on Precipitation Intensity (LC) was recently constituted, in a joint effort between the Dep. of Civil, Chemical and Environmental Engineering of the University of Genova and the Italian Air Force Met Service, gathering the considerable asset of data and information achieved by past in-field and laboratory campaigns with the aim of researching novel methodologies for improving the accuracy of rainfall intensity (RI) measurement techniques. Among the ongoing experimental activities carried out by the LC laboratory, particular attention is paid to the reliability evaluation of extreme rainfall event statistics, a common tool in engineering practice for urban and non-urban drainage system design, based on real-world observations obtained from weighing gauges. Extreme event statistics were already proven to be highly affected by the measurement inaccuracy of traditional tipping-bucket rain gauges (La Barbera et al., 2002), and the time resolution of the available RI series certainly constitutes another key factor in the reliability of the derived hyetographs. The present work reports the LC laboratory efforts in assembling a rainfall simulation system to reproduce the inner temporal structure of the rainfall process by means of dedicated calibration and validation tests. This allowed testing of catching-type rain gauges under non-steady flow conditions and quantifying, in a first instance, the dynamic behaviour of the investigated instruments. Considerations about the influence of the dynamic response on

  6. Identification of Unknown Parameters and Orders via Cuckoo Search Oriented Statistically by Differential Evolution for Noncommensurate Fractional-Order Chaotic Systems

    OpenAIRE

    2013-01-01

    Identification of the unknown parameters and orders of fractional chaotic systems is of vital significance for the control and synchronization of fractional-order chaotic systems. In this paper, a novel non-Lyapunov approach is proposed to estimate the unknown parameters and orders together for non-commensurate and hyper fractional chaotic systems, based on cuckoo search oriented statistically by differential evolution (CSODE). Firstly, a novel Gao's mathematical model is proposed and analysed in t...

  7. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  8. Statistical modelling of wildfire size and intensity: a step toward meteorological forecasting of summer extreme fire risk

    Science.gov (United States)

    Hernandez, C.; Keribin, C.; Drobinski, P.; Turquety, S.

    2015-12-01

    In this article we investigate the use of statistical methods for wildfire risk assessment in the Mediterranean Basin using three meteorological covariates, the 2 m temperature anomaly, the 10 m wind speed and the January-June rainfall occurrence anomaly. We focus on two remotely sensed characteristic fire variables, the burnt area (BA) and the fire radiative power (FRP), which are good proxies for fire size and intensity respectively. Using the fire data we determine an adequate parametric distribution function which fits best the logarithm of BA and FRP. We reconstruct the conditional density function of both variables with respect to the chosen meteorological covariates. These conditional density functions for the size and intensity of a single event give information on fire risk and can be used for the estimation of conditional probabilities of exceeding certain thresholds. By analysing these probabilities we find two fire risk regimes different from each other at the 90 % confidence level: a "background" summer fire risk regime and an "extreme" additional fire risk regime, which corresponds to higher probability of occurrence of larger fire size or intensity associated with specific weather conditions. Such a statistical approach may be the ground for a future fire risk alert system.

  9. An MGF-based unified framework to determine the joint statistics of partial sums of ordered random variables

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Order statistics find applications in various areas of communications and signal processing. In this paper, we introduce a unified analytical framework to determine the joint statistics of partial sums of ordered random variables (RVs). With the proposed approach, we can systematically derive the joint statistics of any partial sums of ordered statistics, in terms of the moment generating function (MGF) and the probability density function (PDF). Our MGF-based approach applies not only when all the K ordered RVs are involved but also when only the Ks (Ks < K) best RVs are considered. In addition, we present closed-form expressions for the exponential RV special case. These results apply to the performance analysis of various wireless communication systems over fading channels. © 2006 IEEE.
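
    The exponential special case admits closed-form results; a quick Monte Carlo cross-check of, for example, the MGF of the sum of the Ks largest of K exponential RVs can be sketched as below (purely illustrative, not the framework itself; parameter names are assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

def mgf_partial_sum(K=5, Ks=3, s=0.2, n=200_000):
    """Monte Carlo estimate of E[exp(s * sum of the Ks largest of K i.i.d.
    unit-mean exponential RVs)], for comparison with closed-form results."""
    x = rng.exponential(1.0, size=(n, K))
    top = np.sort(x, axis=1)[:, -Ks:]     # the Ks best (largest) ordered RVs
    return np.exp(s * top.sum(axis=1)).mean()
```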

  10. Cooperative Transmission for Relay Networks Based on Second-Order Statistics of Channel State Information

    Science.gov (United States)

    Li, Jiangyuan; Petropulu, Athina P.; Poor, H. Vincent

    2011-03-01

    Cooperative beamforming in relay networks is considered, in which a source transmits to its destination with the help of a set of cooperating nodes. The source first transmits locally. The cooperating nodes that receive the source signal retransmit a weighted version of it in an amplify-and-forward (AF) fashion. Assuming knowledge of the second-order statistics of the channel state information, beamforming weights are determined so that the signal-to-noise ratio (SNR) at the destination is maximized subject to two different power constraints, i.e., a total (source and relay) power constraint, and individual relay power constraints. For the former constraint, the original problem is transformed into a problem of one variable, which can be solved via Newton's method. For the latter constraint, the original problem is transformed into a homogeneous quadratically constrained quadratic programming (QCQP) problem. In this case, it is shown that when the number of relays does not exceed three the global solution can always be constructed via semidefinite programming (SDP) relaxation and the matrix rank-one decomposition technique. For the cases in which the SDP relaxation does not generate a rank one solution, two methods are proposed to solve the problem: the first one is based on the coordinate descent method, and the second one transforms the QCQP problem into an infinity norm maximization problem in which a smooth finite norm approximation can lead to the solution using the augmented Lagrangian method.

  11. Going Beyond a Mean-field Model for the Learning Cortex: Second-Order Statistics

    Science.gov (United States)

    Steyn-Ross, Moira L.; Steyn-Ross, D. A.; Sleigh, J. W.

    2008-01-01

    Mean-field models of the cortex have been used successfully to interpret the origin of features on the electroencephalogram under situations such as sleep, anesthesia, and seizures. In a mean-field scheme, dynamic changes in synaptic weights can be considered through fluctuation-based Hebbian learning rules. However, because such implementations deal with population-averaged properties, they are not well suited to memory and learning applications where individual synaptic weights can be important. We demonstrate that, through an extended system of equations, the mean-field models can be developed further to look at higher-order statistics, in particular, the distribution of synaptic weights within a cortical column. This allows us to make some general conclusions on memory through a mean-field scheme. Specifically, we expect large changes in the standard deviation of the distribution of synaptic weights when fluctuations in the mean soma potential are large, such as during the transitions between the “up” and “down” states of slow-wave sleep. Moreover, a cortex that has low structure in its neuronal connections is most likely to decrease its standard deviation in the weights of excitatory to excitatory synapses, relative to the square of the mean, whereas a cortex with strongly patterned connections is most likely to increase this measure. This suggests that fluctuations are used to condense the coding of strong (presumably useful) memories into fewer, but dynamic, neuron connections, while at the same time removing weaker (less useful) memories. PMID:19669541

  12. Direction-of-Arrival Estimation Based on Sparse Recovery with Second-Order Statistics

    Directory of Open Access Journals (Sweden)

    H. Chen

    2015-04-01

    Full Text Available Traditional direction-of-arrival (DOA) estimation techniques perform Nyquist-rate sampling of the received signals and as a result require high storage. To reduce the sampling ratio, we introduce level-crossing (LC) sampling, which captures samples whenever the signal crosses predetermined reference levels; the LC-based analog-to-digital converter (LC ADC) has been shown to efficiently sample certain classes of signals. In this paper, we focus on the DOA estimation problem by using second-order statistics based on the LC samples recorded on one sensor, along with the synchronous samples of the other sensors; a sparse angle-space representation can be found by solving an $\ell_1$ minimization problem, giving the number of sources and their DOAs. The experimental results show that our proposed method, when compared with some existing norm-based constrained-optimization compressive sensing (CS) algorithms as well as a subspace method, improves the DOA estimation performance while using fewer samples than Nyquist-rate sampling and reducing sensor activity, especially for signals with long silent periods.

  13. Passive Steganalysis Based on Higher Order Image Statistics of Curvelet Transform

    Institute of Scientific and Technical Information of China (English)

    S.Geetha; Siva S.Sivatha Sindhu; N.Kamaraj

    2010-01-01

    Steganographic techniques accomplish covert communication by embedding secret messages into innocuous digital images in ways that are imperceptible to the human eye. This paper presents a novel passive steganalysis strategy in which the task is approached as a pattern classification problem. A critical part of the steganalyser design depends on the selection of informative features. This paper is aimed at proposing a novel attack with improved performance indices with the following implications: 1) employing higher order statistics from a curvelet sub-band image representation that offers better discrimination ability for detecting stego anomalies in images, as compared to other conventional wavelet transforms; 2) increasing the sensitivity and specificity of the system by the feature reduction phase; 3) realizing the system using an efficient classification engine, a neuro-C4.5 classifier, which provides a better classification rate. An extensive experimental evaluation on a database containing 5600 clean and stego images shows that the proposed scheme is a state-of-the-art steganalyser that outperforms other previous steganalytic methods.

  14. Object representation for multi-beam sonar image using local higher-order statistics

    Science.gov (United States)

    Li, Haisen; Gao, Jue; Du, Weidong; Zhou, Tian; Xu, Chao; Chen, Baowei

    2017-01-01

    Multi-beam sonar imaging has been widely used in various underwater tasks such as object recognition and object tracking. Problems remain, however, when the sonar images are characterized by low signal-to-noise ratio, low resolution, and amplitude alterations due to viewpoint changes. This paper investigates the capacity of local higher-order statistics (HOS) to represent objects in multi-beam sonar images. The Weibull distribution has been used for modeling the background of the image. Local HOS involving skewness is estimated using a sliding computational window, thus generating the local skewness image of which a square structure is associated with a potential object. The ability of object representation with different signal-to-noise ratio (SNR) between object and background is analyzed, and the choice of the computational window size is discussed. In the case of the object with high SNR, a novel algorithm based on background estimation is proposed to reduce side lobe and retain object regions. The performance of object representation has been evaluated using real data that provided encouraging results in the case of the object with low amplitude, high side lobes, or large fluctuant amplitude. In conclusion, local HOS provides more reliable and stable information relating to the potential object and improves the object representation in multi-beam sonar image.
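
    The core operation, estimating local skewness over a sliding window so that compact high-skewness structures flag potential objects, can be sketched as follows; the window size and reflective boundary handling are assumptions, and the Weibull background modelling is not reproduced here.

```python
import numpy as np
from scipy.ndimage import generic_filter
from scipy.stats import skew

def local_skewness(image, win=9):
    """Sliding-window skewness map of a sonar image; a compact square of high
    skewness indicates a potential object against the background. (Slow but
    simple; a production version would vectorize the moment computation.)"""
    return generic_filter(image.astype(float), lambda v: skew(v),
                          size=win, mode="reflect")
```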

  15. Statistical changes in lakes in urbanizing watersheds and lake return frequencies adjusted for trend and initial stage utilizing generalized extreme value theory

    Science.gov (United States)

    Paynter, Shayne

    Many water resources throughout the world are demonstrating changes in historic water levels. Potential reasons for these changes include climate shifts, anthropogenic alterations or basin urbanization. The focus of this research was threefold: (1) to determine the extent of spatio-temporal changes in regional precipitation patterns, (2) to determine the statistical changes that occur in lakes with urbanizing watersheds, and (3) to develop accurate predictions of trends and lake level return frequencies. To investigate rainfall patterns regionally, appropriate distributions, either gamma or generalized extreme value (GEV), were fitted to variables at a number of rainfall gages utilizing maximum likelihood estimation. The spatial distribution of rainfall variables was found to be quite homogeneous within the region in terms of an average annual expectation. Furthermore, the temporal distribution of rainfall variables was found to be stationary, with only one gage evidencing a significant trend. In order to study statistical changes of lake water surface levels in urbanizing watersheds, serial changes in time series parameters, autocorrelation and variance were evaluated, and a regression model to estimate weekly lake level fluctuations was developed. The following general conclusions about lakes in urbanizing watersheds were reached: (1) the statistical structure of lake level time series is systematically altered and is related to the extent of urbanization, (2) in the absence of other forcing mechanisms, autocorrelation and baseflow appear to decrease, and (3) the presence of wetlands adjacent to lakes can offset the reduction in baseflow. With regard to the third objective, the direction and magnitude of trends in flood and drought stages were estimated, and both long-term and short-term flood and drought stage return frequencies were predicted utilizing the generalized extreme value (GEV) distribution with time and starting-stage covariates. All of the lakes

  16. Downscaling of Extreme Precipitation: Proposing a New Statistical Approach and Investigating a Taken-for-Granted Assumption

    Science.gov (United States)

    Elshorbagy, Amin; Alam, Shahabul

    2015-04-01

    In spite of the ability of General Circulation Models (GCMs) to predict and generate atmospheric variables under pre-identified climate change scenarios, their coarse horizontal scale is an obstacle for impact studies. Therefore, downscaling of variables (e.g., precipitation) from coarse spatial and temporal scales to finer ones is inevitable. Downscaling methods are classified into various types, ranging from applications related to short-term numerical weather prediction to multidecadal global climate prediction. For engineering applications of impact assessment of climate change on infrastructure, the multidecadal global climate projection is the most widely used type. One of the important engineering applications of climate change impact assessment is the development and reconstruction of intensity-duration-frequency (IDF) curves under possible climate change. IDF curves are widely used for design and management of urban hydrosystems. Their construction requires accurate information about intense short-duration rainfall extremes, including sub-hourly ones. Previous attempts were made to construct IDF curves in various places under climate change using dynamical and statistical downscaling. The deficiency of GCMs, and even RCMs, in representing local surface conditions, especially extreme weather and convective precipitation in many areas, necessitates the use of statistical downscaling for IDF-related applications. In statistical downscaling methods, and in particular regression-based methods, the search is always for the optimum set of inputs at a coarser scale that act as predictors for the desired surface weather variable (predictand) at the local finer scale. The grid box nearest to the local site may not provide the optimum predictor-predictand relationship. In fact, even the set of predictors varies from one region to another. In this study, a novel approach using genetic programming (GP) for the specific application of downscaling annual maximum precipitation

  17. Spin-resolved photoelectron spectroscopy using femtosecond extreme ultraviolet light pulses from high-order harmonic generation

    Science.gov (United States)

    Plötzing, M.; Adam, R.; Weier, C.; Plucinski, L.; Eich, S.; Emmerich, S.; Rollinger, M.; Aeschlimann, M.; Mathias, S.; Schneider, C. M.

    2016-04-01

    The fundamental mechanism responsible for optically induced magnetization dynamics in ferromagnetic thin films has been under intense debate for almost two decades. Currently, numerous competing theoretical models are in strong need of decisive experimental confirmation, such as monitoring the triggered changes in the spin-dependent band structure on ultrashort time scales. Our approach explores the possibility of observing femtosecond band structure dynamics by giving access to extended parts of the Brillouin zone in a simultaneously time-, energy- and spin-resolved photoemission experiment. For this purpose, our setup uses a state-of-the-art, highly efficient spin detector and ultrashort, extreme ultraviolet light pulses created by laser-based high-order harmonic generation. In this paper, we present the setup and the first spin-resolved spectra obtained with our experiment within an acquisition time short enough to allow pump-probe studies. Further, we characterize the influence of the excitation with femtosecond extreme ultraviolet pulses by comparing the results with data acquired using a continuous-wave light source of similar photon energy. In addition, changes in the spectra induced by vacuum space-charge effects due to both the extreme ultraviolet probe and near-infrared pump pulses are studied by analyzing the resulting spectral distortions. The combination of energy resolution and electron count rate achieved in our setup confirms its suitability for spin-resolved studies of the band structure on ultrashort time scales.

  18. Extreme Response Predictions for Jack-up Units in Second Order Stochastic Waves by FORM

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher; Capul, Julien

    2006-01-01

    An efficient procedure for derivation of mean outcrossing rates for non-linear wave-induced responses in stationary sea states is presented and applied to an analysis of the horizontal deck sway of a jack-up unit. The procedure is based on the theory of random vibrations and uses the first order reliability method (FORM) to estimate the most probable set of wave components in the ocean wave system that will lead to exceedance of a specific response level, together with the associated mean outcrossing rate. The procedure bears some resemblance to the Constrained NewWave methodology, but is conceptually ... second order stochastic waves, not previously included in the analysis of jack-up units in stochastic seaways.

  19. Fission Multiplicity Detection with Temporal Gamma-Neutron Discrimination from Higher-Order Time Correlation Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Oberer, R.B.

    2002-11-12

    The current practice of nondestructive assay (NDA) of fissile materials using neutrons is dominated by the ³He detector. This has been the case since the mid 1980s when Fission Multiplicity Detection (FMD) was replaced with thermal well counters and neutron multiplicity counting (NMC). The thermal well counters detect neutrons by neutron capture in the ³He detector subsequent to moderation. The process of detection requires from 30 to 60 µs. As will be explained in Section 3.3, the rate of detecting correlated neutrons (signal) from the same fission is independent of this time, but the rate of accidental correlations (noise) is proportional to this time. The well counters are at a distinct disadvantage when there is a large source of uncorrelated neutrons present, from (α, n) reactions for example. Plastic scintillating detectors, as were used in FMD, require only about 20 ns to detect neutrons from fission. One thousandth as many accidental coincidences are therefore accumulated. The major problem with the use of fast-plastic scintillation detectors, however, is that both neutrons and gamma rays are detected. The pulses from the two are indistinguishable in these detectors. For this thesis, a new technique was developed to use higher-order time correlation statistics to distinguish combinations of neutron and gamma ray detections in fast-plastic scintillation detectors. A system of analysis to describe these correlations was developed based on simple physical principles. Other sources of correlations from non-fission events are identified and integrated into the analysis developed for fission events. A number of ratios and metrics are identified to determine physical properties of the source from the correlations. It is possible to determine both the quantity being measured and the detection efficiency from these ratios from a single measurement without a separate calibration. To account for detector dead-time, an alternative analytical technique

  20. High-order statistics of microtexton for HEp-2 staining pattern classification.

    Science.gov (United States)

    Han, Xian-Hua; Wang, Jian; Xu, Gang; Chen, Yen-Wei

    2014-08-01

    This study addresses the classification problem of the HEp-2 cell using indirect immunofluorescence (IIF) image analysis, which can indicate the presence of autoimmune diseases by finding antibodies in the patient serum. Generally, the method used for IIF analysis remains subjective, and depends too heavily on the experience and expertise of the physician. Recently, studies have shown that it is possible to identify the cell patterns using IIF image analysis and machine learning techniques. However, a large gap remains between their recognition rates and those of expert physicians. This paper explores an approach in which the discriminative features of HEp-2 cell images in IIF are extracted and the patterns of the HEp-2 cell are then identified using machine learning techniques. Motivated by progress in the research field of computer vision, as a result of which small local pixel pattern distributions can now be highly discriminative, the proposed strategy employs a parametric probability process to model local image patches (textons: microstructures in the cell image) and extracts the higher-order statistics of the model parameters for the image description. The proposed strategy can adaptively characterize the microtexton space of HEp-2 cell images as a generative probability model, and discover the parameters that yield a better fitting of the training space, which leads to a more discriminant representation of the cell image. A simple linear support vector machine is used for cell pattern identification because of its low computational cost, in particular for large-scale datasets. Experiments using the open HEp-2 cell dataset used in the ICIP2013 contest validate that the proposed strategy can achieve a much better performance than the widely used local binary pattern (LBP) histogram and its extensions, rotation invariant co-occurrence LBP and pairwise rotation invariant co-occurrence LBP, and that the achieved recognition error rate is even very

  1. A Single-Case Study of Resiliency After Extreme Incest in an Old Order Amish Family.

    Science.gov (United States)

    McGuigan, William M; Stephenson, Sarah J

    2015-01-01

    This exploratory research brief presents a single case study of the resiliency of "Mary B." She grew up in an Old Order Amish family where isolation, secrecy, and patriarchy masked repeated sexual assaults by her older brothers that began at age 7. By the age of 20, Mary alleged she had been raped on more than 200 separate occasions by members of her Amish family. After years of pleading with her mother and church officials to intervene, she sought therapy outside the Amish community. This led to three of her brothers being incarcerated. Her family disowned her and she was banned from the Amish community, leaving with an 8th grade education and little more than the clothes she was wearing. In less than 2 years, Mary had moved to a new town, completed her GED, obtained a car and driving license, maintained a small home, and worked as a certified nursing assistant. She consented to tape recorded interviews and completed several quantitative diagnostic measures. Scores on the diagnostic measures placed her within the normal range on self-esteem, competency, depression, stress, social support, and life skills. Analysis of interviews revealed Mary rebounded from her past by reframing her experiences. Themes identified within the interviews supported 6 of the 7 types of resiliencies (insight, independence, initiative, relationships, humor, and morality) outlined in the therapeutic Challenge Model.

  2. Nondestructive Evaluation (NDE) Technology Initiatives (NTIP). Delivery Order 0039: Statistical Comparison of Competing Material Models

    Science.gov (United States)

    2003-01-01

    DeGroot, Morris H., “Lindley’s Paradox: Comment,” Journal of the American Statistical Association, Vol. 77, No. 378, June 1982, pp. 336–339.

  3. Multivariate Stochastic Orderings of Spacings of Generalized Order Statistics

    Institute of Scientific and Technical Information of China (English)

    方兆本; 胡太忠; 吴耀华; 庄玮玮

    2006-01-01

    In this paper, we investigate conditions on the underlying distribution function and the parameters on which the generalized order statistics are based, to obtain stochastic comparisons of spacing vectors of generalized order statistics in the multivariate likelihood ratio and the usual multivariate stochastic orders. Some applications of the main results are also given.

  4. Higher-Order Moment Characterisation of Rogue Wave Statistics in Supercontinuum Generation

    DEFF Research Database (Denmark)

    Sørensen, Simon Toft; Bang, Ole; Wetzel, Benjamin

    2012-01-01

    The noise characteristics of supercontinuum generation are characterized using higher-order statistical moments. Measures of skew and kurtosis, and the coefficient of variation allow quantitative identification of spectral regions dominated by rogue-wave-like behaviour.
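
    Given an ensemble of simulated or measured spectra, the quoted moments are straightforward per-wavelength statistics; a minimal sketch (the array layout is an assumption) is:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def spectral_noise_moments(spectra):
    """spectra: (n_shots, n_wavelengths) ensemble of supercontinuum spectra.
    Returns per-wavelength coefficient of variation, skew and kurtosis;
    strongly positive skew/kurtosis marks long-tailed, rogue-wave-like regions."""
    cv = spectra.std(axis=0) / spectra.mean(axis=0)
    return cv, skew(spectra, axis=0), kurtosis(spectra, axis=0)
```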

  5. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we show that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  6. Internal frequency conversion extreme ultraviolet interferometer using mutual coherence properties of two high-order-harmonic sources

    Energy Technology Data Exchange (ETDEWEB)

    Dobosz, S.; Stabile, H.; Tortora, A.; Monot, P.; Reau, F.; Bougeard, M.; Merdji, H.; Carre, B.; Martin, Ph. [CEA, IRAMIS, Service des Photons Atomes et Molecules, F-91191 Gif-sur-Yvette (France); Joyeux, D.; Phalippou, D.; Delmotte, F.; Gautier, J.; Mercier, R. [Laboratoire Charles Fabry de l'Institut d'Optique, CNRS et Universite Paris Sud, Campus Polytechnique, RD 128, F-91127 Palaiseau cedex (France)

    2009-11-15

    We report on an innovative two-dimensional imaging extreme ultraviolet (XUV) interferometer operating at 32 nm based on the mutual coherence of two laser high-order harmonic (HOH) sources, separately generated in gas. We give the first evidence that the two mutually coherent HOH sources can be produced in two independent spatially separated gas jets, allowing for probing centimeter-sized objects. A magnification factor of 10 leads to a micron resolution associated with a subpicosecond temporal resolution. Single-shot interferograms with a fringe visibility better than 30% are routinely produced. As a test of the XUV interferometer, we measure a maximum electronic density of 3×10²⁰ cm⁻³ 1.1 ns after the creation of a plasma on an aluminum target.

  7. A nonlinear scalar model of extreme mass ratio inspirals in effective field theory I. Self force through third order

    CERN Document Server

    Galley, Chad R

    2010-01-01

    The motion of a small compact object in a background spacetime is investigated in the context of a model nonlinear scalar field theory. This model is constructed to have a perturbative structure analogous to the General Relativistic description of extreme mass ratio inspirals (EMRIs). We apply the effective field theory approach to this model and calculate the finite part of the self force on the small compact object through third order in the ratio of the size of the compact object to the curvature scale of the background (e.g., black hole) spacetime. We use well-known renormalization methods and demonstrate the consistency of the formalism in rendering the self force finite at higher orders within a point particle prescription for the small compact object. This nonlinear scalar model should be useful for studying various aspects of higher-order self force effects in EMRIs but within a comparatively simpler context than the full gravitational case. These aspects include developing practical schemes for highe...

  8. Second-order advantage from kinetic-spectroscopic data matrices in the presence of extreme spectral overlapping

    Energy Technology Data Exchange (ETDEWEB)

    Culzoni, Maria J. [Laboratorio de Desarrollo Analitico y Quimiometria (LADAQ), Catedra de Quimica Analitica I, Facultad de Bioquimica y Ciencias Biologicas, Universidad Nacional del Litoral, Ciudad Universitaria, Santa Fe S3000ZAA (Argentina); Goicoechea, Hector C. [Laboratorio de Desarrollo Analitico y Quimiometria (LADAQ), Catedra de Quimica Analitica I, Facultad de Bioquimica y Ciencias Biologicas, Universidad Nacional del Litoral, Ciudad Universitaria, Santa Fe S3000ZAA (Argentina)], E-mail: hgoico@fbcb.unl.edu.ar; Ibanez, Gabriela A.; Lozano, Valeria A. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario and Instituto de Quimica Rosario (IQUIR-CONICET), Suipacha 531, Rosario S2002LRK (Argentina); Marsili, Nilda R. [Laboratorio de Desarrollo Analitico y Quimiometria (LADAQ), Catedra de Quimica Analitica I, Facultad de Bioquimica y Ciencias Biologicas, Universidad Nacional del Litoral, Ciudad Universitaria, Santa Fe S3000ZAA (Argentina); Olivieri, Alejandro C. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario and Instituto de Quimica Rosario (IQUIR-CONICET), Suipacha 531, Rosario S2002LRK (Argentina)], E-mail: aolivier@fbioyf.unr.edu.ar; Pagani, Ariana P. [Departamento de Quimica Analitica, Facultad de Ciencias Bioquimicas y Farmaceuticas, Universidad Nacional de Rosario and Instituto de Quimica Rosario (IQUIR-CONICET), Suipacha 531, Rosario S2002LRK (Argentina)

    2008-04-28

    Multivariate curve resolution coupled to alternating least-squares (MCR-ALS) has been employed to model kinetic-spectroscopic second-order data, with focus on the achievement of the important second-order advantage, under conditions of extreme spectral overlapping among sample components. A series of simulated examples shows that MCR-ALS can conveniently handle the studied analytical problem unlike other second-order multivariate calibration algorithms, provided matrix augmentation is implemented in the spectral mode instead of in the usual kinetic mode. The approach has also been applied to three experimental examples, which involve the determination of: (1) the antiparkinsonian carbidopa (analyte) in the presence of levodopa as a potential interferent, both reacting with cerium (IV) to produce the fluorescent species cerium (III) with different kinetics; (2) Fe(II) (analyte) in the presence of the interferent Zn(II), both catalyzing the oxidation of methyl orange with potassium bromate; and (3) tartrazine (analyte) in the presence of the interferent brilliant blue, both oxidized with potassium bromate, with the interferent leading to a product with an absorption spectrum very similar to tartrazine. The results indicate good analytical performance towards the analytes, despite the intense spectral overlapping and the presence of unexpected constituents in the test samples.

  9. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime of a variety of fading environment. In addition, it reveals the existence of differences (i.e., constant capacity gaps in log-domain) among different fading environments. By asymptotically tight bound we mean that the high SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
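
    A numerical cross-check of such asymptotic expressions is straightforward by simulation; the sketch below estimates the first few moments E[C^k] of the capacity over Rayleigh fading (the fading model and parameter values are chosen here purely for illustration).

```python
import numpy as np

rng = np.random.default_rng(1)

def capacity_moments(snr_db=20.0, n=1_000_000, orders=(1, 2, 3)):
    """Monte Carlo higher-order ergodic capacity E[C^k], C = log2(1 + SNR*|h|^2),
    with |h|^2 ~ Exp(1) for Rayleigh fading."""
    snr = 10.0 ** (snr_db / 10.0)
    c = np.log2(1.0 + snr * rng.exponential(1.0, size=n))
    return {k: np.mean(c ** k) for k in orders}
```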

  10. Low-dimensional reduced-order models for statistical response and uncertainty quantification: Barotropic turbulence with topography

    Science.gov (United States)

    Qi, Di; Majda, Andrew J.

    2017-03-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty to changes in forcing in a barotropic turbulent system with topography involving interactions between small-scale motions and a large-scale mean flow. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve the optimal model parameters. Statistical theories about a Gaussian invariant measure and the exact statistical energy equations are also developed for the truncated barotropic equations that can be used to improve the imperfect model prediction skill. A stringent paradigm model of 57 degrees of freedom is used to display the feasibility of the reduced-order methods. This simple model creates large-scale zonal mean flow shifting directions from westward to eastward jets with an abrupt change in amplitude when perturbations are applied, and prototype blocked and unblocked patterns can be generated in this simple model similar to the real natural system. Principal statistical responses in mean and variance can be captured by the reduced-order models with desirable accuracy and efficiency with only 3 resolved modes. An even more challenging regime with non-Gaussian equilibrium statistics using the fluctuation equations is also tested in the reduced-order models with accurate prediction using the first 5 resolved modes. These reduced-order models also show potential for uncertainty quantification and prediction in more complex realistic geophysical turbulent dynamical systems.

  11. Higher order moment description of supercontinuum noise and rogue wave statistics

    DEFF Research Database (Denmark)

    Sørensen, Simon Toft; Bang, Ole; Dudley, John M.

    By carrying out multiple numerical simulations in the presence of noise, we demonstrate that the statistical moments of Coefficient of Variation, Skew and Kurtosis provide the necessary rigorous measure of the SC histograms to yield a clear means for identifying regions of long-tailed rogue wave like behaviour.

  12. Identification of Unknown Parameters and Orders via Cuckoo Search Oriented Statistically by Differential Evolution for Noncommensurate Fractional-Order Chaotic Systems

    Directory of Open Access Journals (Sweden)

    Fei Gao

    2013-01-01

    Full Text Available In this paper, a non-Lyapunov novel approach is proposed to estimate the unknown parameters and orders together for noncommensurate and hyper fractional chaotic systems based on cuckoo search oriented statistically by the differential evolution (CSODE). Firstly, a novel Gao's mathematical model is proposed and analyzed in three submodels, not only for the identification of unknown orders and parameters but also for the reconstruction of fractional chaos systems with or without time delays. Then the problem of fractional-order chaos identification is converted, through a proper translation, into the minimization of a multimodal nonnegative function that takes the fractional orders and parameters as its independent variables; the objective is to find the combination of fractional orders and systematic parameters of the fractional-order chaotic system that minimizes the objective function. Simulations are done to estimate a series of noncommensurate and hyper fractional chaotic systems with the new approach based on CSODE and, for comparison, with cuckoo search and a Genetic Algorithm. The experimental results show that the proposed identification mechanism based on CSODE for fractional orders and parameters is a successful method for fractional-order chaotic systems, with the advantages of high precision and robustness.

  13. Statistical identification with hidden Markov models of large order splitting strategies in an equity market

    Science.gov (United States)

    Vaglica, Gabriella; Lillo, Fabrizio; Mantegna, Rosario N.

    2010-07-01

    Large trades in a financial market are usually split into smaller parts and traded incrementally over extended periods of time. We address these large trades as hidden orders. In order to identify and characterize hidden orders, we fit hidden Markov models to the time series of the sign of the tick-by-tick inventory variation of market members of the Spanish Stock Exchange. Our methodology probabilistically detects trading sequences, which are characterized by a significant majority of buy or sell transactions. We interpret these patches of sequential buying or selling transactions as proxies of the traded hidden orders. We find that the time, volume and number of transaction size distributions of these patches are fat tailed. Long patches are characterized by a large fraction of market orders and a low participation rate, while short patches have a large fraction of limit orders and a high participation rate. We observe the existence of a buy-sell asymmetry in the number, average length, average fraction of market orders and average participation rate of the detected patches. The detected asymmetry is clearly dependent on the local market trend. We also compare the hidden Markov model patches with those obtained with the segmentation method used in Vaglica et al (2008 Phys. Rev. E 77 036110), and we conclude that the former ones can be interpreted as a partition of the latter ones.
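
    A toy version of the detection step can be sketched with a 2-state HMM on the sign series; hmmlearn is an assumed third-party dependency, and a Gaussian emission on the ±1 signs is a crude stand-in for the authors' exact model specification.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # assumed dependency

def detect_patches(signs):
    """signs: +/-1 tick-by-tick inventory-variation series of one market member.
    Runs of a constant Viterbi state are candidate hidden-order patches."""
    X = np.asarray(signs, dtype=float).reshape(-1, 1)
    model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100)
    model.fit(X)
    states = model.predict(X)
    cuts = np.flatnonzero(np.diff(states)) + 1     # state-change points
    return np.split(np.arange(len(states)), cuts)  # index runs = patches
```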

  14. Statistical Analysis of a Third-Order Cumulants Based Algorithm for Discrete-Time Errors-in-Variables Identification

    OpenAIRE

    2008-01-01

    This paper deals with the identification of dynamic discrete-time errors-in-variables systems. The statistical accuracy of a least squares estimator based on third-order cumulants is analyzed. In particular, the asymptotic covariance matrix of the estimated parameters is derived. The results are supported by numerical simulation studies.

  15. On the statistical implications of certain Random permutations in Markovian Arrival Processes (MAPs) and second order self-similar processes

    DEFF Research Database (Denmark)

    Andersen, Allan T.; Nielsen, Bo Friis

    2000-01-01

    ... The implications for the correlation structure when shuffling an exactly second-order self-similar process are examined. We apply the Markovian arrival process (MAP) as a tool to investigate whether general conclusions can be made with regard to the statistical implications of the shuffling experiments...

  16. Order-specific fertility rates for Germany: Estimates from perinatal statistics for the period 2001-2008

    NARCIS (Netherlands)

    M. Kreyenfeld (Michaela); Scholz, R. (Rembrandt); F. Peters (Frederick); Wlosnewski, I. (Ines)

    2010-01-01

    Until 2008, Germany's vital statistics did not include information on the biological order of each birth. This resulted in a dearth of important demographic indicators, such as the mean age at first birth and the level of childlessness. Researchers have tried to fill this gap by generating...

  17. A study on the influence of triggering pipe flow regarding mean and higher order statistics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmer, F; Egbers, C [Department of Aerodynamics and Fluid Mechanics, Brandenburg University of Technology Cottbus (Germany); Zanoun, E-S [Mechanical Engineering Department, British University in Egypt (BUE), Al-Shorouk City, Cairo (Egypt)

    2011-12-22

    The evolution of statistical pipe flow quantities, such as the turbulence intensity, the skewness, and the flatness, is investigated to clarify which development length is needed until the state of fully developed turbulence is achieved. These observations take place in a relatively large pipe test facility with an inner pipe diameter of D_i = 0.19 m and a total length of L = 27 m. The Reynolds number range reached is 1.5·10^5 ≤ Re_m ≤ 8.5·10^5. To quantify the mean and fluctuating velocity as well as the dependent statistical quantities, constant-temperature hot-wire anemometry is applied. Through these intensive centerline measurements we observe a development length of L = 70 D to ensure a fully developed turbulent flow state.

  18. A Molecular Diode with a Statistically Robust Rectification Ratio of Three Orders of Magnitude.

    Science.gov (United States)

    Yuan, Li; Breuer, Rochus; Jiang, Li; Schmittel, Michael; Nijhuis, Christian A

    2015-08-12

    This paper describes a molecular diode with high, statistically robust rectification ratios R of 1.1 × 10^3. These diodes operate with a new mechanism of charge transport based on sequential tunneling involving both the HOMO and HOMO-1, positioned asymmetrically inside the junction. In addition, the diodes are stable, withstanding 1500 voltage cycles, and the yield of working junctions is 90%.

  19. The statistical nature of the second order corrections to the thermal SZE

    OpenAIRE

    2004-01-01

    This paper shows that the accepted expressions for the second order corrections in the parameter $z$ to the thermal Sunyaev-Zel'dovich effect can be accurately reproduced by a simple convolution integral approach. This representation allows one to separate the second order SZE corrections into two types of components: one associated with a single line broadening, directly related to the even derivative terms present in the distortion intensity curve, while the other is related to a frequency shift, which is in turn related to the first derivative term.

  20. Ordered random variables theory and applications

    CERN Document Server

    Shahbaz, Muhammad Qaiser; Hanif Shahbaz, Saman; Al-Zahrani, Bander M

    2016-01-01

    Ordered random variables have attracted the attention of several authors. Their basic building block is order statistics, which has several applications in extreme value theory and ordered estimation. The general model for ordered random variables, known as generalized order statistics, was introduced relatively recently by Kamps (1995).

  1. Perspectives on the application of order-statistics in best-estimate plus uncertainty nuclear safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Robert P., E-mail: smsrpm@owt.co [AREVA NP, Inc., 3315 Old Forest Road, Lynchburg, VA (United States); Nutt, William T. [Nuclear Safety Analysis Services, 500 Aloha Street 403, Seattle, WA 98109 (United States)

    2011-01-15

    Research highlights: Historical recitation on the application of order-statistics models to nuclear power plant thermal-hydraulics safety analysis. Interpretation of regulatory language regarding the 10 CFR 50.46 reference to a 'high level of probability'. Derivation and explanation of order-statistics-based evaluation methodologies considering multi-variate acceptance criteria. Summary of order-statistics models and recommendations to the nuclear power plant thermal-hydraulics safety analysis community. - Abstract: The application of order-statistics in best-estimate plus uncertainty nuclear safety analysis has received a considerable amount of attention from methodology practitioners, regulators, and academia. At the root of the debate are two questions: (1) what is an appropriate quantitative interpretation of 'high level of probability' in regulatory language appearing in the LOCA rule, 10 CFR 50.46 and (2) how best to mathematically characterize the multi-variate case. An original derivation is offered to provide a quantitative basis for 'high level of probability.' At the root of the second question is whether one should recognize a probability statement based on the tolerance region method of Wald and Guba, et al., for multi-variate problems, one explicitly based on the regulatory limits, best articulated in the Wallis-Nutt 'Testing Method', or something else entirely. This paper reviews the origins of the different positions, key assumptions, limitations, and relationship to addressing acceptance criteria. It presents a mathematical interpretation of the regulatory language, including a complete derivation of uni-variate order-statistics (as credited in AREVA's Realistic Large Break LOCA methodology) and extension to multi-variate situations. Lastly, it provides recommendations for LOCA applications, endorsing the 'Testing Method' and addressing acceptance methods allowing for limited sample failures.
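
    For readers who want the arithmetic behind the uni-variate case, the sketch below reproduces the standard Wilks-type sample-size computation (a textbook calculation offered as background, not the paper's own derivation): the binomial tail gives the smallest number of runs for which the r-th largest result bounds the 95th percentile with 95% confidence.

```python
# Sketch: classic Wilks sample sizes for one-sided tolerance limits.
from scipy.stats import binom

def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
    """Smallest n such that the order-th largest of n runs bounds the
    coverage-quantile with the requested confidence."""
    n = order
    while True:
        # The order-th largest exceeds the quantile iff at least `order`
        # observations land above it; exceedances are Binomial(n, 1 - coverage).
        if binom.cdf(order - 1, n, 1 - coverage) <= 1 - confidence:
            return n
        n += 1

print(wilks_sample_size())           # 59: the classic 95/95 first-order answer
print(wilks_sample_size(order=2))    # 93: runs needed when the 2nd largest is used
```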

  2. A study on the influence of triggering pipe flow regarding mean and higher order statistics

    Science.gov (United States)

    Zimmer, F.; Zanoun, E.-S.; Egbers, C.

    2011-12-01

    The evolution of statistical pipe flow quantities, such as the turbulence intensity, the skewness, and the flatness, is investigated to clarify which development length is needed until the state of fully developed turbulence is achieved. These observations take place in a relatively large pipe test facility with an inner pipe diameter of Di = 0.19 m and a total length of L = 27 m. The Reynolds number range reached is 1.5·10^5 ≤ Re_m ≤ 8.5·10^5. To quantify the mean and fluctuating velocity as well as the dependent statistical quantities, constant-temperature hot-wire anemometry is applied. Through these intensive centerline measurements we observe a development length of L = 70 D to ensure a fully developed turbulent flow state.

  3. The statistical nature of the second order corrections to the thermal SZE

    CERN Document Server

    Sandoval-Villalbazo, A

    2004-01-01

    This paper shows that the accepted expressions for the second order corrections in the parameter $z$ to the thermal Sunyaev-Zel'dovich effect can be accurately reproduced by a simple convolution integral approach. This representation allows one to separate the second order SZE corrections into two types of components: one associated with a single line broadening, directly related to the even derivative terms present in the distortion intensity curve, while the other is related to a frequency shift, which is in turn related to the first derivative term.

  4. Technical Note: Higher-order statistical moments and a procedure that detects potentially anomalous years as two alternative methods describing alterations in continuous environmental data

    Science.gov (United States)

    Arismendi, I.; Johnson, S. L.; Dunham, J. B.

    2015-03-01

    Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
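
    A minimal sketch of the first approach on synthetic data (the series and the decade grouping are placeholders, not the stream-temperature records): higher-order moments summarized per decade with pandas and SciPy.

```python
# Sketch: decadal skewness/kurtosis of a continuous environmental record.
# The synthetic daily "temperature" series is an assumption for illustration.
import numpy as np
import pandas as pd
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
dates = pd.date_range("1980-01-01", "2019-12-31", freq="D")
temp = 10 + 5 * np.sin(2 * np.pi * dates.dayofyear / 365) + rng.gamma(2, 1, len(dates))
series = pd.Series(temp, index=dates)

decade = (series.index.year // 10) * 10
summary = series.groupby(decade).agg(["mean", "std", skew, kurtosis])
print(summary)   # shifts in the higher moments flag changes the mean alone misses
```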

  5. Identifying the Drivers and Occurrence of Historical and Future Extreme Air-quality Events in the United States Using Advanced Statistical Techniques

    Science.gov (United States)

    Porter, W. C.; Heald, C. L.; Cooley, D. S.; Russell, B. T.

    2013-12-01

    Episodes of air-quality extremes are known to be heavily influenced by meteorological conditions, but traditional statistical analysis techniques focused on means and standard deviations may not capture important relationships at the tails of these two respective distributions. Using quantile regression (QR) and extreme value theory (EVT), methodologies specifically developed to examine the behavior of heavy-tailed phenomena, we analyze extremes in the multi-decadal record of ozone (O3) and fine particulate matter (PM2.5) in the United States. We investigate observations from the Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) networks for connections to meteorological drivers, as provided by the National Center for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) product. Through regional characterization by quantile behavior and EVT modeling of the meteorological covariates most responsible for extreme levels of O3 and PM2.5, we estimate pollutant exceedance frequencies and uncertainties in the United States under current and projected future climates, highlighting those meteorological covariates and interactions whose influence on air-quality extremes differs most significantly from the behavior of the bulk of the distribution. As current policy may be influenced by air-quality projections, we then compare these estimated frequencies to those produced by NCAR's Community Earth System Model (CESM) identifying regions, covariates, and species whose extreme behavior may not be adequately captured by current models.
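
    The quantile-regression ingredient can be sketched as follows on assumed synthetic data rather than the AQS/NARR record; statsmodels' QuantReg contrasts a covariate's effect at the median with its effect at the 0.95 quantile.

```python
# Sketch: quantile regression of an ozone-like response on a meteorological
# covariate. Data are synthetic placeholders with a heavier upper-tail response.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
temperature = rng.uniform(10, 40, 1000)
ozone = 20 + 1.5 * temperature + rng.gamma(2, 2 + 0.3 * temperature)  # heteroscedastic

X = sm.add_constant(temperature)
fit_median = sm.QuantReg(ozone, X).fit(q=0.50)
fit_tail = sm.QuantReg(ozone, X).fit(q=0.95)
# A 0.95-quantile slope above the median slope means the covariate pushes the
# extremes harder than the bulk of the distribution.
print(fit_median.params[1], fit_tail.params[1])
```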

  6. First-order decomposition of thermal light in terms of a statistical mixture of pulses

    OpenAIRE

    Chenu, Aurélia; Brańczyk, Agata M.; J.E. Sipe

    2014-01-01

    We investigate the connection between thermal light and coherent pulses, constructing mixtures of single pulses that yield the same first-order, equal-space-point correlation function as thermal light. We present mixtures involving (i) pulses with a Gaussian lineshape and narrow bandwidths, and (ii) pulses with a coherence time that matches that of thermal light. We characterize the properties of the mixtures and pulses. Our results introduce an alternative description of thermal light in terms...

  7. Nearly best linear estimates of logistic parameters based on complete ordered statistics

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper deals with the determination of nearly best linear estimates of the location and scale parameters of a logistic population when both parameters are unknown, by introducing Blom's semi-empirical (α, β)-correction into the asymptotic mean and covariance formulae. Complete ordered samples are taken into consideration and various nearly best linear estimates are established. The high efficiency of these estimators relative to the best linear unbiased estimators (BLUEs) and other linear estimators makes them useful in practice.

  8. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    ... a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel, and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value, and the standard deviation the weakness can occur. In relation to practical applications the behaviour ... is probably rather infrequent. A simple example is shown as illustration, and to exemplify that for second order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.

  9. HELIOS—A laboratory based on high-order harmonic generation of extreme ultraviolet photons for time-resolved spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Plogmaker, S.; Terschlüsen, J. A.; Krebs, N.; Svanqvist, M.; Forsberg, J.; Cappel, U. B.; Rubensson, J.-E.; Siegbahn, H.; Söderström, J. [Department of Physics and Astronomy, Molecular and Condensed Matter Physics, Uppsala University, P.O. Box 516, 75120 Uppsala (Sweden)]

    2015-12-15

    In this paper, we present the HELIOS (High Energy Laser Induced Overtone Source) laboratory, an in-house high-order harmonic generation facility which generates extreme ultraviolet (XUV) photon pulses in the range of 15-70 eV with monochromatized XUV pulse lengths below 35 fs. HELIOS is a source for time-resolved pump-probe/two-color spectroscopy in the sub-50 fs range, which can be operated at 5 kHz or 10 kHz. An optical parametric amplifier is available for pump-probe experiments with wavelengths ranging from 240 nm to 20 000 nm. The produced XUV radiation is monochromatized by a grating in the so-called off-plane mount. Together with overall design parameters, first monochromatized spectra are shown with an intensity of 2·10^10 photons/s (at 5 kHz) in the 29th harmonic, after the monochromator. The XUV pulse duration is measured to be <25 fs after monochromatization.

  10. HELIOS—A laboratory based on high-order harmonic generation of extreme ultraviolet photons for time-resolved spectroscopy

    Science.gov (United States)

    Plogmaker, S.; Terschlüsen, J. A.; Krebs, N.; Svanqvist, M.; Forsberg, J.; Cappel, U. B.; Rubensson, J.-E.; Siegbahn, H.; Söderström, J.

    2015-12-01

    In this paper, we present the HELIOS (High Energy Laser Induced Overtone Source) laboratory, an in-house high-order harmonic generation facility which generates extreme ultraviolet (XUV) photon pulses in the range of 15-70 eV with monochromatized XUV pulse lengths below 35 fs. HELIOS is a source for time-resolved pump-probe/two-color spectroscopy in the sub-50 fs range, which can be operated at 5 kHz or 10 kHz. An optical parametric amplifier is available for pump-probe experiments with wavelengths ranging from 240 nm to 20 000 nm. The produced XUV radiation is monochromatized by a grating in the so-called off-plane mount. Together with overall design parameters, first monochromatized spectra are shown with an intensity of 2·10^10 photons/s (at 5 kHz) in the 29th harmonic, after the monochromator. The XUV pulse duration is measured to be <25 fs after monochromatization.

  11. HELIOS—A laboratory based on high-order harmonic generation of extreme ultraviolet photons for time-resolved spectroscopy.

    Science.gov (United States)

    Plogmaker, S; Terschlüsen, J A; Krebs, N; Svanqvist, M; Forsberg, J; Cappel, U B; Rubensson, J-E; Siegbahn, H; Söderström, J

    2015-12-01

    In this paper, we present the HELIOS (High Energy Laser Induced Overtone Source) laboratory, an in-house high-order harmonic generation facility which generates extreme ultraviolet (XUV) photon pulses in the range of 15-70 eV with monochromatized XUV pulse lengths below 35 fs. HELIOS is a source for time-resolved pump-probe/two-color spectroscopy in the sub-50 fs range, which can be operated at 5 kHz or 10 kHz. An optical parametric amplifier is available for pump-probe experiments with wavelengths ranging from 240 nm to 20,000 nm. The produced XUV radiation is monochromatized by a grating in the so-called off-plane mount. Together with overall design parameters, first monochromatized spectra are shown with an intensity of 2·10^10 photons/s (at 5 kHz) in the 29th harmonic, after the monochromator. The XUV pulse duration is measured to be <25 fs after monochromatization.

  12. Statistical learning of an auditory sequence and reorganization of acquired knowledge: A time course of word segmentation and ordering.

    Science.gov (United States)

    Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato

    2017-01-27

    Previous neural studies have supported the hypothesis that statistical learning mechanisms are used broadly across different domains such as language and music. However, these studies have only investigated a single aspect of statistical learning at a time, such as recognizing word boundaries or learning word-order patterns. In this study, we investigated neurally how the two levels of statistical learning, recognizing word boundaries and word ordering, are reflected in neuromagnetic responses, and how acquired statistical knowledge is reorganised when the syntactic rules are revised. Neuromagnetic responses to a Japanese-vowel sequence (a, e, i, o, and u), presented every 0.45 s, were recorded from 14 right-handed Japanese participants. The vowel order was constrained by a Markov stochastic model such that five nonsense words (aue, eao, iea, oiu, and uoi) were chained with an either-or rule: the probability of the forthcoming word was statistically defined (80% for one word; 20% for the other) by the most recent two words. All of the word transition probabilities (80% and 20%) were switched in the middle of the sequence. In the first and second quarters of the sequence, the neuromagnetic responses to the words that appeared with the higher transitional probability were significantly reduced compared with those that appeared with the lower transitional probability. After the switch of the word transition probabilities, the response reduction was replicated in the last quarter of the sequence. The responses to the final vowels in the words were significantly reduced compared with those to the initial vowels in the last quarter of the sequence. The results suggest that both within-word and between-word statistical learning are reflected in neural responses. The present study supports the hypothesis that listeners learn larger structures, such as phrases, first, and subsequently extract smaller structures, such as words, from the learned phrases. The present...

  13. On the computation of the higher-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-12-01

    The higher-order statistics (HOS) of the channel capacity, μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on the correlated generalized fading environments. © 2012 IEEE.
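
    As a brute-force companion to the MGF formalism (an illustration, not the letter's analysis), the HOS of the capacity can be estimated by Monte Carlo for an MRC receiver in i.i.d. Rayleigh fading; the branch count, mean SNR, and the choice of base-2 logarithm are assumptions.

```python
# Sketch: Monte Carlo estimate of mu_n = E[log^n(1 + gamma_end)] for MRC over
# i.i.d. Rayleigh fading (branch SNRs are exponential, MRC sums them).
import numpy as np

rng = np.random.default_rng(2)
L, mean_snr, N = 3, 5.0, 1_000_000
gamma_end = rng.exponential(mean_snr, size=(N, L)).sum(axis=1)

log_cap = np.log2(1 + gamma_end)
for n in (1, 2, 3):
    print(f"mu_{n} ~= {np.mean(log_cap ** n):.4f}")
# mu_1 is the ergodic capacity; mu_2 - mu_1**2 is the capacity variance.
```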

  14. Restoration and enhancement of textural properties in SAR images using second-order statistics

    Science.gov (United States)

    Nezry, Edmond; Kohl, Hans-Guenther; De Groof, Hugo

    1994-12-01

    Local second-order properties, describing spatial relations between pixels, are introduced into the single-point speckle adaptive filtering process in order to account for the effects of speckle spatial correlation and to enhance scene textural properties in the restored image. To this end, texture measures originating first from local grey-level co-occurrence matrices (GLCM) and second from local autocorrelation functions (ACF) are used. Results obtained on 3-look processed ERS-1 FDC and PRI spaceborne images illustrate the performance allowed by the introduction of these texture measures in the structure-retaining speckle adaptive filters. The observable texture in remote sensing images is related to the physical spatial resolution of the sensor. For this reason, other spatial speckle decorrelation methods that are simpler and easier to implement, for example post-filtering and linear image resampling, are also presented in this paper. In the particular case of spaceborne SAR imagery, all these methods lead to visually similar results. They produce filtered (radar reflectivity) images visually comparable to optical images.
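
    The GLCM measurement step can be sketched with scikit-image on one synthetic local window (the window size, grey-level count, distances and angles are all assumptions; an actual filter would embed such measures in the speckle-adaptive loop):

```python
# Sketch: local GLCM texture measures on a single sliding window.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
patch = rng.integers(0, 32, size=(15, 15), dtype=np.uint8)   # one local window

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                    levels=32, symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())              # averaged over angles
```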

  15. How extreme are extremes?

    Science.gov (United States)

    Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro

    2016-04-01

    High temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. Heat-wave indicators have been mainly developed with the aim of capturing the potential impacts on specific sectors (agriculture, health, wildfires, transport, power generation and distribution). However, the ability to capture the occurrence of extreme temperature events is an essential property of a multi-hazard extreme climate indicator. The aim of this study is to develop a standardized heat-wave indicator that can be combined with other indices in order to describe multiple hazards in a single indicator. The proposed approach can be used to obtain a quantified indicator of the strength of a given extreme. As a matter of fact, extremes are usually distributed according to exponential or exponential-exponential functions, and it is difficult to quickly assess how strong an extreme event was considering only its magnitude. The proposed approach simplifies the quantitative and qualitative communication of extreme magnitude.

  16. Bootstrap and Order Statistics for Quantifying Thermal-Hydraulic Code Uncertainties in the Estimation of Safety Margins

    Directory of Open Access Journals (Sweden)

    Enrico Zio

    2008-01-01

    In the present work, the uncertainties affecting the safety margins estimated from thermal-hydraulic code calculations are captured quantitatively by resorting to order statistics and the bootstrap technique. The proposed framework of analysis is applied to the estimation of the safety margin, with its confidence interval, of the maximum fuel cladding temperature reached during a complete group distribution blockage scenario in an RBMK-1500 nuclear reactor.
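
    A minimal sketch of the combination (placeholder numbers, not the RBMK-1500 results): the margin is estimated from the sample maximum of N code runs, and bootstrap resampling attaches a confidence interval to that order-statistic estimate.

```python
# Sketch: bootstrap confidence interval for an order-statistic safety margin.
import numpy as np

rng = np.random.default_rng(4)
peak_temp = rng.normal(1000, 40, size=59)        # pretend: 59 thermal-hydraulic runs
safety_limit = 1200.0

margin_hat = safety_limit - peak_temp.max()      # margin from the sample maximum
boot = np.array([
    safety_limit - rng.choice(peak_temp, size=peak_temp.size, replace=True).max()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"margin ~ {margin_hat:.1f} K, 95% bootstrap CI [{lo:.1f}, {hi:.1f}] K")
```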

  17. Wavelet Transform Based Higher Order Statistical Analysis of Wind and Wave Time Histories

    Science.gov (United States)

    Habib Huseni, Gulamhusenwala; Balaji, Ramakrishnan

    2017-10-01

    Wind blowing on the surface of the ocean imparts the energy to generate waves, and understanding wind-wave interactions is essential for an oceanographer. This study involves higher-order spectral analyses of wind speed and significant wave height time histories, extracted from the European Centre for Medium-Range Weather Forecasts database at an offshore location off the Mumbai coast, through the continuous wavelet transform. The time histories were divided by season (pre-monsoon, monsoon, post-monsoon and winter) and the analyses were carried out on the individual data sets to assess the effect of the various seasons on the wind-wave interactions. The analysis revealed the frequency coupling of wind speeds and wave heights across the seasons. The details of the data, the analysis technique and the results are presented in this paper.
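
    A hedged sketch of the analysis style on synthetic series (not the ECMWF data): continuous wavelet transforms of wind speed and wave height combined into a cross-wavelet power map whose ridges indicate frequency coupling.

```python
# Sketch: cross-wavelet power between two coupled series via PyWavelets.
import numpy as np
import pywt

rng = np.random.default_rng(12)
t = np.arange(1024)
wind = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(t.size)
wave = np.sin(2 * np.pi * t / 64 + 0.8) + 0.5 * rng.standard_normal(t.size)  # lagged

scales = np.arange(2, 128)
cw, freqs = pywt.cwt(wind, scales, "cmor1.5-1.0")   # complex Morlet wavelet
cv, _ = pywt.cwt(wave, scales, "cmor1.5-1.0")

power = np.abs(cw * np.conj(cv))                     # cross-wavelet power
print("dominant scale:", scales[np.unravel_index(power.argmax(), power.shape)[0]])
```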

  18. Unvoiced/voiced classification and voiced harmonic parameters estimation using the third-order statistics

    Institute of Scientific and Technical Information of China (English)

    YING Na; ZHAO Xiao-hui; DONG Jing

    2007-01-01

    Unvoiced/voiced classification of speech is a challenging problem, especially under conditions of low signal-to-noise ratio or non-white, nonstationary noise. To solve this problem, an algorithm for speech classification and a technique for the estimation of pairwise magnitude frequency in voiced speech are proposed. The algorithm uses the third-order spectrum of the speech signal to remove noise, obtains a refined pitch from the least spectrum difference, and determines the maximum harmonic number. It also utilizes the spectral envelope to estimate the signal-to-noise ratio of the speech harmonics. Speech classification, voicing probability, and the harmonic parameters of a voiced frame can thus be obtained. Simulation results indicate that, under complicated background noise, especially Gaussian noise, the proposed algorithm can effectively classify speech with high accuracy for the voicing probability and the voiced parameters.

  19. CLASSIFICATION OF GEAR FAULTS USING HIGHER-ORDER STATISTICS AND SUPPORT VECTOR MACHINES

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Gears alternately mesh and detach in the driving process, so their working conditions change continually and they are prone to spalling and wear. Because of additive Gaussian measurement noise, the signal-to-noise ratio is low and their fault features are difficult to extract. This study proposes an approach to gear fault classification using cumulants and support vector machines. The cumulants can suppress the additive Gaussian noise and boost the signal-to-noise ratio. The generalization ability of support vector machines, which employ the structural risk minimization principle, is superior to that of conventional neural networks, which employ the traditional empirical risk minimization principle. With support vector machines as the classifier and the third- and fourth-order cumulants as input, gear faults are successfully recognized. The experimental results show that the method of fault classification combining cumulants with support vector machines is very effective.
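
    A hedged sketch of the pipeline on synthetic signals (the fault model, record length and SVM settings are assumptions): third- and fourth-order k-statistics, which vanish for Gaussian noise, serve as features for an SVM classifier.

```python
# Sketch: cumulant features (scipy k-statistics) + SVM for two-class fault data.
import numpy as np
from scipy.stats import kstat
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(5)

def features(signal):
    return [kstat(signal, 3), kstat(signal, 4)]   # 3rd/4th-order cumulant estimates

healthy = [features(rng.normal(0, 1, 512)) for _ in range(100)]
faulty = [features(rng.normal(0, 1, 512) + rng.gamma(1, 0.5, 512)) for _ in range(100)]

X = np.array(healthy + faulty)
y = np.array([0] * 100 + [1] * 100)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```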

  20. SyntEyes KTC: higher order statistical eye model for developing keratoconus.

    Science.gov (United States)

    Rozema, Jos J; Rodriguez, Pablo; Ruiz Hidalgo, Irene; Navarro, Rafael; Tassignon, Marie-José; Koppen, Carina

    2017-05-01

    To present and validate a stochastic eye model for developing keratoconus in order to, e.g., improve optical corrective strategies. This could be particularly useful for researchers who do not have access to original keratoconic data. The Scheimpflug tomography, ocular biometry and wavefront of 145 keratoconic right eyes were collected. These data were processed using principal component analysis for parameter reduction, followed by a multivariate Gaussian fit that produces a stochastic model for keratoconus (SyntEyes KTC). The output of this model is filtered to remove the occasional incorrect topography patterns by either an automatic or a manual procedure. Finally, the output of this keratoconus model is matched to that of the original model for normal eyes using the non-corneal biometry to obtain a description of keratoconus development. The synthetic data generated by the model were found to be significantly equal to the original data (non-parametric Mann-Whitney equivalence test; 145/154 passed). The variability of the synthetic data, however, was often significantly less than that of the original data, especially for the higher order Zernike terms of corneal elevation (non-parametric Levene test; p ...) ... keratoconus progression. The synthetic data provided by the proposed keratoconus model closely resemble actual clinical data and may be used for a range of research applications when (sufficient) real data are not available. © 2017 The Authors. Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  1. Statistical mechanics of random geometric graphs: Geometry-induced first-order phase transition.

    Science.gov (United States)

    Ostilli, Massimo; Bianconi, Ginestra

    2015-04-01

    Random geometric graphs (RGGs) can be formalized as hidden-variables models where the hidden variables are the coordinates of the nodes. Here we develop a general approach to extract the typical configurations of a generic hidden-variables model and apply the resulting equations to RGGs. For any RGG, defined through a rigid or a soft geometric rule, the method reduces to a nontrivial satisfaction problem: given N nodes, a domain D, and a desired average connectivity 〈k〉, find, if any, the distribution of nodes having support in D and average connectivity 〈k〉. We find that, in the thermodynamic limit, nodes are either uniformly distributed or highly condensed in a small region, the two regimes being separated by a first-order phase transition characterized by a O(N) jump of 〈k〉. Other intermediate values of 〈k〉 correspond to very rare graph realizations. The phase transition is observed as a function of a parameter a ∈ [0,1] that tunes the underlying geometry. In particular, a = 1 indicates a rigid geometry where only close nodes are connected, while a = 0 indicates a rigid antigeometry where only distant nodes are connected. Consistently, when a = 1/2 there is no geometry and no phase transition. After discussing the numerical analysis, we provide a combinatorial argument to fully explain the mechanism inducing this phase transition and recognize it as an easy-hard-easy transition. Our result shows that, in general, ad hoc optimized networks can hardly be designed, unless one relies on specific heterogeneous constructions, not necessarily scale free.
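
    As a toy illustration of the object under study (not the authors' statistical-mechanics analysis), the snippet below builds a rigid RGG on the unit square with the connection radius tuned to a desired average connectivity:

```python
# Sketch: rigid random geometric graph with radius set for a target <k>.
import numpy as np

rng = np.random.default_rng(6)
N, k_target = 500, 8.0
radius = np.sqrt(k_target / (np.pi * N))        # <k> ~ pi r^2 N for uniform points

pts = rng.random((N, 2))
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
adj = (d < radius) & ~np.eye(N, dtype=bool)     # rigid rule: connect close nodes
print("measured <k>:", adj.sum() / N)           # near k_target, up to edge effects
```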

  2. Image reconstruction in phase-contrast tomography exploiting the second-order statistical properties of the projection data.

    Science.gov (United States)

    Chou, Cheng-Ying; Huang, Pin-Yu

    2011-11-21

    X-ray phase-contrast tomography (PCT) methods seek to quantitatively reconstruct separate images that depict an object's absorption and refractive contrasts. Most PCT reconstruction algorithms generally operate by explicitly or implicitly performing the decoupling of the projected absorption and phase properties at each tomographic view angle by use of a phase-retrieval formula. However, the presence of a zero-frequency singularity in the Fourier-based phase-retrieval formulas leads to strong noise amplification in the projection estimate and the subsequent refractive image obtained using conventional algorithms like filtered backprojection (FBP). Tomographic reconstruction by use of statistical methods can account for the noise model and a priori information, and thereby can produce images with better quality than conventional filtered backprojection algorithms. In this work, we demonstrate that an iterative image reconstruction method exploiting the second-order statistical properties of the projection data can mitigate noise amplification in PCT. The autocovariance function of the reconstructed refractive images was empirically computed and shows smaller and shorter noise correlation compared to those obtained using the FBP and unweighted penalized least-squares methods. Concepts from statistical decision theory are applied to demonstrate that the statistical properties of images produced by our method can improve signal detectability.

  3. Statistical modeling of CMIP5 projected changes in extreme wet spells over China in the late 21st century

    Science.gov (United States)

    Zhu, Lianhua; Li, Yun; Jiang, Zhihong

    2017-08-01

    The observed intensity, frequency, and duration (IFD) of summer wet spells, defined here as extreme events with one or more consecutive days in which daily precipitation exceeds a given threshold (the 95th percentile), and their future changes in RCP4.5 and RCP8.5 in the late 21st century over China, are investigated by using the wet spell model (WSM) and by extending the point process approach to extreme value analysis. Wet spell intensity is modeled by a conditional generalized Pareto distribution, frequency by a Poisson distribution, and duration by a geometric distribution, respectively. The WSM is able to realistically model summer extreme rainfall spells during 1961-2005, as verified with observations at 553 stations throughout China. To minimize the impact of systematic biases over China in the global climate models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5), the five best GCMs are selected based on their performance in reproducing the observed wet spell IFD and average precipitation during the historical period. Furthermore, a quantile-quantile scaling correction procedure is proposed and applied to produce ensemble projections of wet spell IFD and corresponding probability distributions. The results show that in the late 21st century, most of China will experience more extreme rainfall and less low-intensity rainfall. The intensity and frequency of wet spells are projected to increase considerably, while the duration of wet spells will increase, but to a much lesser extent. The IFD changes in RCP8.5 are in general much larger than those in RCP4.5.
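
    The intensity component of the WSM can be sketched as a generalized Pareto fit to exceedances over the 95th percentile; the synthetic rainfall series and the pinned location parameter are assumptions.

```python
# Sketch: GPD fit to daily-rainfall exceedances over the 95th percentile.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
rain = rng.gamma(0.4, 8.0, size=45 * 92)           # pretend: 45 summers of daily totals
threshold = np.quantile(rain, 0.95)
excess = rain[rain > threshold] - threshold

shape, loc, scale = genpareto.fit(excess, floc=0)  # location pinned at 0 for excesses
level_99 = threshold + genpareto.ppf(0.99, shape, loc=0, scale=scale)
print(f"shape={shape:.3f}, scale={scale:.2f}, 99th-percentile excess level={level_99:.1f}")
```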

  4. Injury Statistics

    Science.gov (United States)


  5. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging.

    Science.gov (United States)

    Juang, Kai-Wei; Lee, Dar-Yuan; Teng, Yun-Lung

    2005-11-01

    Correctly classifying "contaminated" areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the "contaminated" areas.

  6. Non-equilibrium statistical field theory for classical particles: Non-linear structure evolution with first-order interaction

    CERN Document Server

    Bartelmann, Matthias; Berg, Daniel; Kozlikin, Elena; Lilow, Robert; Viermann, Celia

    2014-01-01

    We calculate the power spectrum of density fluctuations in the statistical non-equilibrium field theory for classical, microscopic degrees of freedom to first order in the interaction potential. We specialise our result to cosmology by choosing appropriate initial conditions and propagators and show that the non-linear growth of the density power spectrum found in numerical simulations of cosmic structure evolution is reproduced well to redshift zero and for arbitrary wave numbers. The main difference between our approach and ordinary cosmological perturbation theory is that we do not perturb a dynamical equation for the density contrast. Rather, we transport the initial phase-space distribution of a canonical particle ensemble forward in time and extract any collective information from it at the time needed. Since even small perturbations of particle trajectories can lead to large fluctuations in density, our approach allows us to reach high density contrast already at first order in the perturbations of the particle...

  7. Higher-Order Statistics for the Detection of Small Objects in a Noisy Background Application on Sonar Imaging

    Directory of Open Access Journals (Sweden)

    M. Amate

    2007-01-01

    An original algorithm for the detection of small objects in a noisy background is proposed, and its application to underwater object detection by sonar imaging is addressed. This new method is based on the use of higher-order statistics (HOS) that are locally estimated on the images. The proposed algorithm is divided into two steps. In the first step, HOS (skewness and kurtosis) are estimated locally using a square sliding computation window. Small deterministic objects have different statistical properties from the background and are thus highlighted. The influence of the signal-to-noise ratio (SNR) on the results is studied in the case of Gaussian noise. Mathematical expressions of the estimators and of the expected performances are derived and are experimentally confirmed. In the second step, the results are focused by a matched filter using a theoretical model, which enables the precise localization of the regions of interest. The proposed method generalizes to other statistical distributions, and we derive the theoretical expressions of the HOS estimators in the case of a Weibull distribution (both when only noise is present and when a small deterministic object is present within the filtering window). This enables the application of the proposed technique to the processing of synthetic aperture sonar data containing underwater mines whose echoes have to be detected and located. Results on real data sets are presented and quantitatively evaluated using receiver operating characteristic (ROC) curves.
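
    Step one of the detector can be sketched with moment maps built from uniform filters (the window size, image and embedded object are synthetic assumptions; the matched-filter focusing of step two is omitted):

```python
# Sketch: local skewness/kurtosis maps over a square sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(8)
img = rng.normal(0, 1, (128, 128))
img[60:64, 60:64] += 4.0                    # small deterministic object

w = 9                                       # sliding-window side length
m1 = uniform_filter(img, w)                 # local raw moments
m2 = uniform_filter(img ** 2, w)
m3 = uniform_filter(img ** 3, w)
m4 = uniform_filter(img ** 4, w)

var = m2 - m1 ** 2
skew = (m3 - 3 * m1 * m2 + 2 * m1 ** 3) / var ** 1.5
kurt = (m4 - 4 * m1 * m3 + 6 * m1 ** 2 * m2 - 3 * m1 ** 4) / var ** 2 - 3

print("peak |skewness| at:", np.unravel_index(np.abs(skew).argmax(), skew.shape))
print("peak excess kurtosis at:", np.unravel_index(kurt.argmax(), kurt.shape))
```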

  8. Perspectives on the Application of Order-Statistics in Best-Estimate Plus Uncertainty Nuclear Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Robert P. Martin; William T. Nutt

    2011-01-01

    The application of order-statistics in best-estimate plus uncertainty nuclear safety analysis has received a considerable amount of attention from methodology practitioners, regulators, and academia. At the root of the debate are two questions: (1) what is an appropriate quantitative interpretation of “high level of probability” in regulatory language appearing in the LOCA rule, 10 CFR 50.46 and (2) how best to mathematically characterize the multi-variate case. An original derivation is offered to provide a quantitative basis for “high level of probability.” At the root of the second question is whether one should recognize a probability statement based on the tolerance region method of Wald and Guba, et al., for multi-variate problems, one explicitly based on the regulatory limits, best articulated in the Wallis–Nutt “Testing Method”, or something else entirely. This paper reviews the origins of the different positions, key assumptions, limitations, and relationship to addressing acceptance criteria. It presents a mathematical interpretation of the regulatory language, including a complete derivation of uni-variate order-statistics (as credited in AREVA’s Realistic Large Break LOCA methodology) and extension to multi-variate situations. Lastly, it provides recommendations for LOCA applications, endorsing the “Testing Method” and addressing acceptance methods allowing for limited sample failures.

  9. Joint statistics of partial sums of ordered exponential variates and performance of GSC RAKE receivers over rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2011-08-01

    Spread spectrum receivers with generalized selection combining (GSC) RAKE reception were proposed and have been studied as alternatives to the two classical fundamental schemes, maximal ratio combining and selection combining, because the number of diversity paths increases with the transmission bandwidth. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers still remain to be solved, such as the exact performance analysis of the capture probability and an exact assessment of the impact of self-interference on GSC RAKE receivers. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability and outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels, and compare them to those of partial RAKE receivers. © 2011 IEEE.
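
    A Monte Carlo companion to the closed-form results (illustrative parameters, not the paper's derivation): the GSC output SNR is exactly a partial sum of descending-ordered exponential variates, from which an outage probability follows directly.

```python
# Sketch: GSC RAKE output as a partial sum of ordered exponential SNRs.
import numpy as np

rng = np.random.default_rng(9)
L, Lc, mean_snr, N = 6, 3, 2.0, 1_000_000

snr = rng.exponential(mean_snr, size=(N, L))
snr_sorted = np.sort(snr, axis=1)[:, ::-1]      # descending order statistics
gsc_out = snr_sorted[:, :Lc].sum(axis=1)        # combine the Lc strongest paths

threshold = 3.0
print("outage probability ~", np.mean(gsc_out < threshold))
```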

  10. A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics

    Directory of Open Access Journals (Sweden)

    Guangkuo Lu

    2015-01-01

    Methods utilizing independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, which is assumed to be a piecewise higher-order stationary time series, is divided into a series of higher-order stationary segments by applying a modified segmentation algorithm. Then the state space is reconstructed and the single-channel signal is transformed into a pseudo multiple-input multiple-output (MIMO) mode using a method of nonlinear analysis based on higher-order statistics (HOS). In the last step, ICA is performed on the pseudo MIMO data to decompose the single-channel recording into its underlying independent components (ICs), and the ICs of interest are extracted. Finally, the effectiveness and performance of the higher-order single-channel ICA (SCICA) method are validated with measured data through experiments. The proposed method is also shown, via explicit formulae and simulations, to be more robust across different SNRs and embedding dimensions.

  11. Statistical modelling of wildfire size and intensity: a step toward meteorological forecasting of summer extreme fire risk

    OpenAIRE

    Hernandez, C; Keribin, C.; Drobinski, P.; Turquety, S.

    2015-01-01

    In this article we investigate the use of statistical methods for wildfire risk assessment in the Mediterranean Basin using three meteorological covariates: the 2 m temperature anomaly, the 10 m wind speed, and the January-June rainfall occurrence anomaly. We focus on two remotely sensed characteristic fire variables, the burnt area (BA) and the fire radiative power (FRP), which are good proxies for fire size and intensity, respectively. Using the fire data we determine...

  12. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    Science.gov (United States)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density leading to high vulnerability. Although significant scientific improvements have taken place in global weather-forecasting models, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm rates and low detection. There has been a need to improve the weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology with quantile regression, in which the reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in ensemble uncertainty of precipitation across realizations with respect to the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the need to implement such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.

  13. Study of higher order correlation functions and photon statistics using multiphoton-subtracted states and quadrature measurements

    Science.gov (United States)

    Bogdanov, Yu. I.; Katamadze, K. G.; Avosopyants, G. V.; Belinsky, L. V.; Bogdanova, N. A.; Kulik, S. P.; Lukichev, V. F.

    2016-12-01

    The estimation of high order correlation function values is an important problem in the field of quantum computation. We show that the problem can be reduced to preparation and measurement of optical quantum states resulting after annihilation of a set number of quanta from the original beam. We apply this approach to explore various photon bunching regimes in optical states with gamma-compounded Poisson photon number statistics. We prepare and perform measurement of the thermal quantum state as well as states produced by subtracting one to ten photons from it. Maximum likelihood estimation is employed for parameter estimation. The goal of this research is the development of highly accurate procedures for generation and quality control of optical quantum states.
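
    A toy numerical companion (not the experiment): photon-number statistics of a thermal state after k-photon subtraction, obtained by repeatedly applying the annihilation rule P'(n) ∝ (n+1)P(n+1) to a geometric (thermal) distribution; the mean photon number and the truncation are assumptions.

```python
# Sketch: mean photon number and g2(0) of k-photon-subtracted thermal states.
import numpy as np

nbar, nmax = 2.0, 200
n = np.arange(nmax)
p = nbar ** n / (1 + nbar) ** (n + 1)            # thermal (geometric) statistics

for k in range(4):
    mean = np.sum(n * p)
    g2 = np.sum(n * (n - 1) * p) / mean ** 2     # second-order correlation g2(0)
    print(f"{k}-photon-subtracted: <n> = {mean:.3f}, g2 = {g2:.3f}")
    p = (n + 1) * np.append(p[1:], 0.0)          # annihilate one photon ...
    p /= p.sum()                                 # ... and renormalize
```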

  14. A Fractional Lower Order Statistics-Based MIMO Detection Method in Impulse Noise for Power Line Channel

    Directory of Open Access Journals (Sweden)

    CHEN, Z.

    2014-11-01

    Impulse noise in the power line communication (PLC) channel seriously degrades the performance of multiple-input multiple-output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower-order statistics (FLOS) for the PLC channel with impulse noise is proposed in this paper. The alpha-stable distribution is used to model the impulse noise, and FLOS is applied to construct the criteria of MIMO detection. The optimal detection solution is then obtained by a recursive least squares algorithm, and the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method achieves better PLC MIMO detection performance than the existing ones in an impulsive noise environment.
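
    The statistic behind the detector can be sketched as follows (the stability index and moment orders are assumptions): fractional lower-order moments of alpha-stable noise stay finite and stable for p < alpha, while the sample variance is dominated by rare huge impulses.

```python
# Sketch: fractional lower-order moments E[|x|^p] under alpha-stable noise.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(10)
alpha = 1.5                                      # impulsiveness of the noise
noise = levy_stable.rvs(alpha, 0.0, size=100_000, random_state=rng)

for p in (0.5, 1.0, 1.4):                        # all below alpha, hence finite
    print(f"E[|x|^{p}] ~= {np.mean(np.abs(noise) ** p):.3f}")
print("sample variance:", noise.var())           # blows up with rare impulses
```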

  15. Planck 2013 results. XXI. Power spectrum and high-order statistics of the Planck all-sky Compton parameter map

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.

    2014-01-01

    We have constructed the first all-sky map of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 100 to 857 GHz frequency channel maps from the Planck survey. This map shows an obvious galaxy cluster tSZ signal that is well matched with blindly detected clusters in the Planck SZ catalogue. To characterize the signal in the tSZ map we have computed its angular power spectrum. At large angular scales (l < 60) the spectrum is dominated by thermal dust emission, while at small angular scales (l > 500) the clustered cosmic infrared background and residual point sources are the main contaminants. The non-Gaussianity of the Compton parameter map is further characterized by computing its 1D probability distribution function and its bispectrum. The measured tSZ power spectrum and high order statistics are used to place constraints on σ8.

  16. On the computation of the higher order statistics of the channel capacity for amplify-and-forward multihop transmission

    KAUST Repository

    Yilmaz, Ferkan

    2014-01-01

    Higher order statistics (HOS) of the channel capacity provide useful information regarding the level of reliability of signal transmission at a particular rate. In this paper, we propose a novel and unified analysis, which is based on the moment-generating function (MGF) approach, to efficiently and accurately compute the HOS of the channel capacity for amplify-and-forward (AF) multihop transmission over generalized fading channels. More precisely, our easy-to-use and tractable mathematical formalism requires only the reciprocal MGFs of the transmission hop signal-to-noise ratio (SNR). Numerical and simulation results, which are performed to exemplify the usefulness of the proposed MGF-based analysis, are shown to be in perfect agreement. © 2013 IEEE.

  17. GLOBAL APPROACH OF CHANNEL MODELING IN MOBILE AD HOC NETWORKS INCLUDING SECOND ORDER STATISTICS AND SYSTEM PERFORMANCES ANALYSIS

    Directory of Open Access Journals (Sweden)

    Basile L. AGBA

    2008-06-01

    Mobile ad hoc networks (MANETs) are very difficult to design in terms of scenario specification and propagation modeling, and all these aspects must be taken into account; cost-effective design requires powerful and accurate simulation tools. Our first contribution in this paper is to provide a global approach process (GAP) to channel modeling, combining scenarios and propagation in order to better analyze the physical layer and, finally, to improve the performance of the whole network. The GAP is implemented in an integrated simulation tool, Ad-SMPro. Moreover, channel statistics, throughput, and delay are some key points to be considered when studying mobile wireless networks. A careful analysis of mobility effects on second-order channel statistics and system performance is made, based on our optimized simulation tool, Ad-SMPro. The channel is modeled by large-scale fading and small-scale fading, including the Doppler spectrum due to the double mobility of the nodes. The level crossing rate and average duration of fades are simulated as functions of the double-mobility degree α, defined as the ratio of the nodes' speeds, and these results are compared to theoretical predictions. We demonstrate that, in mobile ad hoc networks, flat-fading channels and frequency-selective fading channels are affected differently. In addition, the bit error rate is analysed as a function of the ratio of the average bit energy to the thermal noise density. Other performance measures (such as throughput, delay, and routing traffic) are analysed, and conclusions related to the proposed simulation model and the mobility effects are drawn.

  18. Controlling the degree of caution in statistical inference with the Bayesian and frequentist approaches as opposite extremes

    CERN Document Server

    Bickel, David R

    2011-01-01

    In statistical practice, whether a Bayesian or frequentist approach is used in inference depends not only on the availability of prior information but also on the attitude taken toward partial prior information, with frequentists tending to be more cautious than Bayesians. The proposed framework defines that attitude in terms of a specified amount of caution, thereby enabling data analysis at the level of caution desired and on the basis of any prior information. The caution parameter represents the attitude toward partial prior information in much the same way as a loss function represents the attitude toward risk. When there is very little prior information and nonzero caution, the resulting inferences correspond to those of the candidate confidence intervals and p-values that are most similar to the credible intervals and hypothesis probabilities of the specified Bayesian posterior. On the other hand, in the presence of a known physical distribution of the parameter, inferences are based only on the corres...

  19. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.-W. [Department of Post-Modern Agriculture, MingDao University, Pitou, Changhua, Taiwan (China)]; Lee, D.-Y. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)]; Teng, Y.-L. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)]

    2005-11-15

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging.

  20. Improved Syndrome-Based Ordered Statistic Decoding Algorithm

    Institute of Scientific and Technical Information of China (English)

    董自健; 酆广增

    2011-01-01

    A syndrome-based ordered statistics decoding (SOSD) algorithm for LDPC codes is proposed, building on the ordered statistics decoding (OSD) algorithm and syndrome-based decoding. By sorting the log-likelihood ratios (LLRs) of the received sequence and performing Gaussian elimination, the least reliable independent positions (LRIPs) are obtained and the corresponding columns of the parity-check matrix are put into systematic form. Candidate codewords are then generated in the SOSD processing by a few binary (modulo-2) vector additions instead of a re-encoding process, which substantially reduces the computational cost. For the concatenation of belief propagation (BP) and SOSD, a block-accumulated LLR strategy is further proposed to reconstruct the ordered information sequence, which reduces the sensitivity of the performance to the choice of the accumulation parameter.

  1. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    Science.gov (United States)

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst ≥ 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having −Dst ≥ 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT.
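
    As a hedged illustration of this fitting-and-bootstrap pipeline (not the authors' code; the synthetic data and all names are ours), the following Python sketch fits a log-normal by maximum likelihood and bootstraps a 95% confidence interval on the per-century rate of storms exceeding −Dst = 850 nT.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        # Synthetic stand-in for the -Dst storm maxima (nT); the real 1957-2012
        # series is not reproduced in this record.
        dst_maxima = stats.lognorm.rvs(s=0.8, scale=120.0, size=300, random_state=rng)
        years = 56.0

        def rate_per_century(sample):
            # Closed-form MLE of a log-normal with location 0: mean/std of the logs.
            logs = np.log(sample)
            p_exceed = stats.norm.sf(np.log(850.0), logs.mean(), logs.std())
            return p_exceed * len(sample) / years * 100.0  # storms per century

        # Nonparametric bootstrap for a 95% confidence interval on the rate.
        boot = [rate_per_century(rng.choice(dst_maxima, size=dst_maxima.size))
                for _ in range(2000)]
        print(rate_per_century(dst_maxima), np.percentile(boot, [2.5, 97.5]))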

  2. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    Science.gov (United States)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having -Dst > 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.

  3. Probing the limitations of isotropic pair potentials to produce ground-state structural extremes via inverse statistical mechanics.

    Science.gov (United States)

    Zhang, G; Stillinger, F H; Torquato, S

    2013-10-01

    Inverse statistical-mechanical methods have recently been employed to design optimized short-range radial (isotropic) pair potentials that robustly produce novel targeted classical ground-state many-particle configurations. The target structures considered in those studies were low-coordinated crystals with a high degree of symmetry. In this paper, we further test the fundamental limitations of radial pair potentials by targeting crystal structures with appreciably less symmetry, including those in which the particles have different local structural environments. These challenging target configurations demanded that we modify previous inverse optimization techniques. In particular, we first find local minima of a candidate enthalpy surface and determine the enthalpy difference ΔH between such inherent structures and the target structure. Then we determine the lowest positive eigenvalue λ0 of the Hessian matrix of the enthalpy surface at the target configuration. Finally, we maximize λ0ΔH so that the target structure is both locally stable and globally stable with respect to the inherent structures. Using this modified optimization technique, we have designed short-range radial pair potentials that stabilize the two-dimensional kagome crystal, the rectangular kagome crystal, and rectangular lattices, as well as the three-dimensional structure of the CaF2 crystal inhabited by a single-particle species. We verify our results by cooling liquid configurations to absolute zero temperature via simulated annealing and ensuring that such states have stable phonon spectra. Except for the rectangular kagome structure, all of the target structures can be stabilized with monotonic repulsive potentials. Our work demonstrates that single-component systems with short-range radial pair potentials can counterintuitively self-assemble into crystal ground states with low symmetry and different local structural environments. Finally, we present general principles that offer

  4. Cosmological constraints with weak lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, Francois; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-01-01

    Peak statistics in weak lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. To prepare for the high precision afforded by next-generation weak lensing surveys, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w0de. In particular, we study how CAMELUS, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. We measure the abundance histogram of peaks in a mock shear catalogue of approximately 5000 deg2 using a multiscale mass map filtering technique, and we then constrain the parameters of the mock survey using CAMELUS combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. We find that peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, indicating the need to better understand and control the model's systematics before applying it to a real survey of this size or larger. We perform a calibration of the model to remove the bias and compare results to those from the two-point correlation functions (2PCF) measured on the same field. In this case, we find the derived parameter Σ8 = σ8(Ωm/0.27)^α = 0.76 (-0.03, +0.02) with α = 0.65 for peaks, while for the 2PCF the values are Σ8 = 0.76 (-0.01, +0.02) and α = 0.70. We conclude that the constraining power can therefore be comparable between the two weak lensing observables in large-field surveys. Furthermore, the tilt in the σ8-Ωm degeneracy direction for peaks with respect to that of the 2PCF suggests that a combined analysis would yield tighter constraints than either measure alone. As expected, w0de cannot be
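
    The approximate Bayesian computation step can be summarized by a rejection loop; the Python sketch below is a toy stand-in (the forward model, priors, and tolerance are all hypothetical; in the paper CAMELUS supplies the simulated peak histogram).

        import numpy as np

        # Toy stand-in: a made-up Poisson response plays the role of the
        # stochastic peak-count model evaluated at trial parameters.
        def simulate_peak_histogram(omega_m, sigma_8, rng):
            lam = 50.0 * sigma_8 * omega_m + 5.0   # hypothetical forward model
            return rng.poisson(lam, size=8)        # counts in 8 S/N bins

        rng = np.random.default_rng(1)
        observed = simulate_peak_histogram(0.27, 0.82, rng)

        accepted = []
        epsilon = 15.0                             # acceptance tolerance (ours)
        for _ in range(20000):
            om, s8 = rng.uniform(0.1, 0.5), rng.uniform(0.5, 1.1)  # flat priors
            trial = simulate_peak_histogram(om, s8, rng)
            if np.linalg.norm(trial - observed) < epsilon:
                accepted.append((om, s8))          # rejection ABC: keep close draws
        print(len(accepted), np.mean(accepted, axis=0))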

  5. Relationships between statistics of rainfall extremes and mean annual precipitation: an application for design-storm estimation in northern central Italy

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2006-01-01

    Several hydrological analyses need to be founded on a reliable estimate of the design storm, which is the expected rainfall depth corresponding to a given duration and probability of occurrence, usually expressed in terms of return period. The annual series of precipitation maxima for storm duration ranging from 15 min to 1 day, observed at a dense network of raingauges sited in northern central Italy, are analyzed using an approach based on L-moments. The analysis investigates the statistical properties of rainfall extremes and detects significant relationships between these properties and the mean annual precipitation (MAP). On the basis of these relationships, we developed a regional model for estimating the rainfall depth for a given storm duration and recurrence interval in any location of the study region. The applicability of the regional model was assessed through Monte Carlo simulations. The uncertainty of the model for ungauged sites was quantified through an extensive cross-validation.
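
    The L-moment machinery behind such regional analyses is standard (Hosking, 1990); a minimal Python sketch, with made-up annual-maximum rainfall depths, computes the first four sample L-moments from probability-weighted moments of the order statistics.

        import numpy as np

        def sample_l_moments(x):
            # Unbiased probability-weighted moments b_r of the sorted sample.
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            j = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((j - 1) * x) / (n * (n - 1))
            b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
            b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
            l1 = b0                       # L-location (mean)
            l2 = 2 * b1 - b0              # L-scale
            l3 = 6 * b2 - 6 * b1 + b0
            l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
            return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

        # Hypothetical annual maxima of storm rainfall depth (mm).
        annual_maxima = [41.2, 35.7, 52.3, 48.1, 60.4, 39.9, 44.0, 55.5, 37.2, 50.8]
        print(sample_l_moments(annual_maxima))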

  6. Musical-Noise Analysis in Methods of Integrating Microphone Array and Spectral Subtraction Based on Higher-Order Statistics

    Directory of Open Access Journals (Sweden)

    Kazunobu Kondo

    2010-01-01

    We conduct an objective analysis on musical noise generated by two methods of integrating microphone array signal processing and spectral subtraction. To obtain better noise reduction, methods of integrating microphone array signal processing and nonlinear signal processing have been researched. However, nonlinear signal processing often generates musical noise. Since such musical noise causes discomfort to users, it is desirable that musical noise is mitigated. Moreover, it has been recently reported that higher-order statistics are strongly related to the amount of musical noise generated. This implies that it is possible to optimize the integration method from the viewpoint of not only noise reduction performance but also the amount of musical noise generated. Thus, we analyze the simplest methods of integration, that is, the delay-and-sum beamformer and spectral subtraction, and fully clarify the features of musical noise generated by each method. As a result, it is clarified that a specific structure of integration is preferable from the viewpoint of the amount of generated musical noise. The validity of the analysis is shown via a computer simulation and a subjective evaluation.
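
    A higher-order-statistics proxy often used in this line of work is the kurtosis ratio of the power spectrum before and after processing; the Python sketch below (a toy white-noise example with our own flooring constants, not the authors' measurement procedure) shows how flooring in spectral subtraction raises the kurtosis, signalling musical noise.

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(0)
        noise = rng.normal(size=16000)                       # white-noise segment
        # Power spectra of 50 frames of 320 samples each.
        spec = np.abs(np.fft.rfft(noise.reshape(50, 320), axis=1)) ** 2

        # Spectral subtraction with flooring: clipping the distribution of the
        # residual power spectrum increases its kurtosis.
        noise_floor = spec.mean()
        processed = np.maximum(spec - 2.0 * noise_floor, 0.01 * spec)

        kt_before = kurtosis(spec.ravel(), fisher=False)
        kt_after = kurtosis(processed.ravel(), fisher=False)
        print("kurtosis ratio:", kt_after / kt_before)       # > 1: musical noise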

  7. Musical-Noise Analysis in Methods of Integrating Microphone Array and Spectral Subtraction Based on Higher-Order Statistics

    Science.gov (United States)

    Takahashi, Yu; Saruwatari, Hiroshi; Shikano, Kiyohiro; Kondo, Kazunobu

    2010-12-01

    We conduct an objective analysis on musical noise generated by two methods of integrating microphone array signal processing and spectral subtraction. To obtain better noise reduction, methods of integrating microphone array signal processing and nonlinear signal processing have been researched. However, nonlinear signal processing often generates musical noise. Since such musical noise causes discomfort to users, it is desirable that musical noise is mitigated. Moreover, it has been recently reported that higher-order statistics are strongly related to the amount of musical noise generated. This implies that it is possible to optimize the integration method from the viewpoint of not only noise reduction performance but also the amount of musical noise generated. Thus, we analyze the simplest methods of integration, that is, the delay-and-sum beamformer and spectral subtraction, and fully clarify the features of musical noise generated by each method. As a result, it is clarified that a specific structure of integration is preferable from the viewpoint of the amount of generated musical noise. The validity of the analysis is shown via a computer simulation and a subjective evaluation.

  8. Cosmological constraints with weak-lensing peak counts and second-order statistics in a large-field survey

    Science.gov (United States)

    Peel, Austin; Lin, Chieh-An; Lanusse, François; Leonard, Adrienne; Starck, Jean-Luc; Kilbinger, Martin

    2017-03-01

    Peak statistics in weak-lensing maps access the non-Gaussian information contained in the large-scale distribution of matter in the Universe. They are therefore a promising complementary probe to two-point and higher-order statistics to constrain our cosmological models. Next-generation galaxy surveys, with their advanced optics and large areas, will measure the cosmic weak-lensing signal with unprecedented precision. To prepare for these anticipated data sets, we assess the constraining power of peak counts in a simulated Euclid-like survey on the cosmological parameters Ωm, σ8, and w0de. In particular, we study how Camelus, a fast stochastic model for predicting peaks, can be applied to such large surveys. The algorithm avoids the need for time-costly N-body simulations, and its stochastic approach provides full PDF information of observables. Considering peaks with a signal-to-noise ratio ≥ 1, we measure the abundance histogram in a mock shear catalogue of approximately 5000 deg2 using a multiscale mass-map filtering technique. We constrain the parameters of the mock survey using Camelus combined with approximate Bayesian computation, a robust likelihood-free inference algorithm. Peak statistics yield a tight but significantly biased constraint in the σ8-Ωm plane, as measured by the width ΔΣ8 of the 1σ contour. We find Σ8 = σ8(Ωm/0.27)^α = 0.77 (-0.05, +0.06) with α = 0.75 for a flat ΛCDM model. The strong bias indicates the need to better understand and control the model systematics before applying it to a real survey of this size or larger. We perform a calibration of the model and compare results to those from the two-point correlation functions ξ± measured on the same field. We calibrate the ξ± result as well, since its contours are also biased, although not as severely as for peaks. In this case, we find for peaks Σ8 = 0.76 (-0.03, +0.02) with α = 0.65, while for the combined ξ+ and ξ- statistics the values are Σ8 = 0.76 (-0.01, +0.02) and α = 0.70.

  9. Efficacy and Physicochemical Evaluation of an Optimized Semisolid Formulation of Povidone Iodine Proposed by Extreme Vertices Statistical Design; a Practical Approach.

    Science.gov (United States)

    Lotfipour, Farzaneh; Valizadeh, Hadi; Shademan, Shahin; Monajjemzadeh, Farnaz

    2015-01-01

    One of the most significant stages in the pharmaceutical industry, prior to commercialization of a pharmaceutical preparation, is 'preformulation'. However, far too little attention has been paid to verification of software-assisted statistical designs in preformulation studies. The main aim of this study was to report a step-by-step preformulation approach for a semisolid preparation based on a statistical mixture design, and to verify the predictions made by the software with an in-vitro efficacy bioassay. An extreme vertices mixture design (4 factors, 4 levels) was applied to the preformulation of a semisolid povidone-iodine preparation, formulated as a water-removable ointment using different polyethylene glycols. Software-assisted (Minitab) analysis was then performed using four practically assessed response values: available iodine, viscosity (N index and yield value) and water absorption capacity. Subsequently, mixture analysis was performed and, finally, an optimized formulation was proposed. The efficacy of this formulation was bioassayed using in-vitro microbial tests, and MIC values were calculated for Escherichia coli, Pseudomonas aeruginosa, Staphylococcus aureus and Candida albicans. The results indicated acceptable conformity of the measured responses. Thus, it can be concluded that the proposed design had adequate power to predict the responses in practice. Stability studies showed no significant change for the optimized formulation during the one-year study. Efficacy was acceptable against all tested species, and in the case of Staphylococcus aureus the prepared semisolid formulation was even more effective.

  10. Counting statistics of transport through Coulomb blockade nanostructures: High-order cumulants and non-Markovian effects

    DEFF Research Database (Denmark)

    Flindt, Christian; Novotny, Tomás; Braggio, Alessandro

    2010-01-01

    Recent experimental progress has made it possible to detect, in real time, single electrons tunneling through Coulomb blockade nanostructures, thereby allowing for precise measurements of the statistical distribution of the number of transferred charges, the so-called full counting statistics...

  11. At the Early Brazilian Republic, Bulhoes Carvalho legalizes the statistical activity and put it into the State’s order

    Directory of Open Access Journals (Sweden)

    Nelson de Castro Senra

    2009-12-01

    Recreated in the early days of the Brazilian Republic (January 1890) as a central agency, the General Directory of Statistics (DGE) faced difficulties in consolidating Brazilian statistical activity. A new era was achieved only when José Luiz Sayão de Bulhões Carvalho (1866-1940), a medical doctor specializing in public health and dedicated to demographic studies and research, took its direction (on two occasions, for almost 17 years in total). His administration worked to facilitate the interaction of the DGE with similar agencies across the federation. He devised agreements with provincial statistical agencies; conceived councils (or committees) for making collective decisions; imagined a national statistical conference based on the experience he had accumulated in two conferences of the International Statistical Institute (ISI); attended to the formation of a professional statistical community; and encouraged and sponsored the translation of books, among other measures. The Census of 1920, which he directed, the only major census of the early Brazilian Republic, was guided by friendly relations with the federal and provincial statistical organizations and with society (the Catholic Church, media, unions, clubs, etc.). It was, one can say, a modern census, using processing machines and concluded in a timely fashion. The printed volumes of results were illustrated with pictorial graphics, also used at the Pavilion of Statistics, called by the press the Pavilion of Precise Science, which established the presence of the DGE at the International Exhibition of the Brazilian Independence Centennial (1822-1922). With his experience, he anticipated the basis of the Brazilian Institute of Geography and Statistics (IBGE), created in 1936 as the new central Brazilian statistical agency (still working with growing success), which can be seen, in sum, as a measure of the success of his long-term vision.

  12. Turbulence statistics of Couette Poiseuille turbulent flow. 2nd Report. Higher order turbulence statistics

    Energy Technology Data Exchange (ETDEWEB)

    Nakabayashi, K.; Kito, O.; Kato, Y. [Nagoya Institute of Technology, Nagoya (Japan)

    1998-10-25

    Turbulence statistics in Couette Poiseuille flow are obtained by measurements. These include the correlation coefficient, skewness and flatness factors, and a four-quadrant analysis of the Reynolds shear stress −ρuv. In the region y+ ≤ 30-40, the distributions of all these quantities are affected only by the non-dimensional parameter μ (≡ u*³/αν), as are the mean velocity and turbulence intensity profiles. The four-quadrant analysis shows that the fractional contribution from the 4th quadrant is affected largely by the parameter μ, whereas that from the 2nd quadrant remains unaffected. In the case 0 < μ ≤ 94, the fractional contribution from the 4th quadrant is greater than that from the 2nd quadrant, unlike the conventional wall turbulent flow. 8 refs., 17 figs., 2 tabs.

  13. Possible Kondo-Lattice-Enhanced Magnetic Ordering at Anomalously High Temperature in Nd Metal under Extreme Compression

    Science.gov (United States)

    Schilling, James S.; Song, Jing; Soni, Vikas; Lim, Jinhyuk

    Most elemental lanthanides order magnetically at temperatures To well below ambient, the highest being 292 K for Gd. Sufficiently high pressure is expected to destabilize the well-localized magnetic 4f state of the heavy lanthanides, leading to an increasing influence of Kondo physics on the RKKY interaction. For pressures above 80 GPa, To for Dy and Tb begins to increase dramatically, extrapolating for Dy to a record-high value near 400 K at 160 GPa. This anomalous increase may be a heretofore unrecognized feature of the Kondo lattice state; if so, one would expect To to pass through a maximum and fall rapidly at even higher pressures. A parallel is suggested to the ferromagnet CeRh3B2, where To = 115 K at ambient pressure, a temperature more than 100 times higher than anticipated from simple de Gennes scaling. Here we discuss recent experiments on Nd where anomalous behavior in To(P) is found to occur at lower pressures, perhaps reflecting the fact that Nd's 4f wave function is less localized. Work at Washington University is supported by NSF Grant DMR-1104742 and CDAC through NNSA/DOE Grant DE-FC52-08NA28554.

  14. Statistical methods in nonlinear dynamics

    Indian Academy of Sciences (India)

    K P N Murthy; R Harish; S V M Satyanarayana

    2005-03-01

    Sensitivity to initial conditions in nonlinear dynamical systems leads to exponential divergence of trajectories that are initially arbitrarily close, and hence to unpredictability. Statistical methods have been found to be helpful in extracting useful information about such systems. In this paper, we review briefly some statistical methods employed in the study of deterministic and stochastic dynamical systems. These include power spectral analysis and aliasing, extreme value statistics and order statistics, recurrence time statistics, the characterization of intermittency in the Sinai disorder problem, random walk analysis of diffusion in the chaotic pendulum, and long-range correlations in stochastic sequences of symbols.
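
    To make one of these diagnostics concrete, a minimal Python sketch (our own toy example, not taken from the paper) applies extreme value statistics to block maxima of the fully chaotic logistic map.

        import numpy as np
        from scipy import stats

        def logistic_orbit(x0, n):
            # Iterate the logistic map x -> 4x(1-x) in its fully chaotic regime.
            x = np.empty(n)
            x[0] = x0
            for i in range(1, n):
                x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
            return x

        orbit = logistic_orbit(0.1234, 100000)
        # Drop a transient, then take maxima over blocks of 100 iterates.
        block_maxima = orbit[1000:].reshape(-1, 100).max(axis=1)

        # Fit a generalized extreme value distribution to the block maxima.
        shape, loc, scale = stats.genextreme.fit(block_maxima)
        print(shape, loc, scale)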

  15. Future changes in extreme temperature events using the statistical downscaling model (SDSM) in the trans-boundary region of the Jhelum river basin

    Directory of Open Access Journals (Sweden)

    Rashid Mahmood

    2014-10-01

    On the whole, in the Jhelum basin, the intensity and frequency of warm temperature extremes are likely to increase, and the intensity and frequency of cold temperature extremes to decrease, in the future.

  16. Approximate Forward Difference Equations for the Lower Order Non-Stationary Statistics of Geometrically Non-Linear Systems subject to Random Excitation

    DEFF Research Database (Denmark)

    Köylüoglu, H. U.; Nielsen, Søren R. K.; Cakmak, A. S.

    Geometrically non-linear multi-degree-of-freedom (MDOF) systems subject to random excitation are considered. New semi-analytical approximate forward difference equations for the lower order non-stationary statistical moments of the response are derived from the stochastic differential equations of motion, and the accuracy of these equations is numerically investigated. For stationary excitations, the proposed method computes the stationary statistical moments of the response from the solution of non-linear algebraic equations.

  17. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    Science.gov (United States)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
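
    A minimal Python sketch of the threshold-and-count idea (the data and all parameters are ours, hypothetical): exceedance counts over a fixed bank of thresholds approximate the empirical CDF in a single pass, from which an order statistic such as the median noise power can be read off with limited precision but wide dynamic range.

        import numpy as np

        def threshold_and_count_quantile(samples, thresholds, q):
            # Single-pass approximation to the q-th quantile (an order statistic):
            # count exceedances of each threshold and return the threshold whose
            # exceedance fraction is closest to 1 - q.
            counts = np.array([(samples > t).sum() for t in thresholds])
            frac = counts / len(samples)
            return thresholds[np.argmin(np.abs(frac - (1.0 - q)))]

        rng = np.random.default_rng(7)
        power = rng.exponential(scale=2.0, size=100000)   # synthetic noise power
        # Several counters in parallel widen the dynamic range at low cost.
        bank = np.geomspace(0.1, 20.0, num=32)
        print(threshold_and_count_quantile(power, bank, q=0.5),
              np.quantile(power, 0.5))                    # compare with exact median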

  18. Impact of the second order self-forces on the dephasing of the gravitational waves from quasi-circular extreme mass-ratio inspirals

    CERN Document Server

    Isoyama, Soichiro; Sago, Norichika; Tagoshi, Hideyuki; Tanaka, Takahiro

    2012-01-01

    The accurate calculation of the long-term phase evolution of gravitational wave (GW) forms from extreme (intermediate) mass-ratio inspirals (E(I)MRIs) is an indispensable step in extracting information from these systems. To achieve this goal, it is believed that we need to understand the gravitational self-forces. However, it has not been quantitatively demonstrated that the second-order self-forces are necessary for this purpose. In this paper we revisit the problem of estimating the order of magnitude of the dephasing caused by the second-order self-forces on a small body in a quasi-circular orbit around a Kerr black hole, based on the knowledge of the post-Newtonian (PN) approximation and invoking the energy balance argument. In particular, we focus on the averaged dissipative part of the self-force, since it gives the leading-order contribution among their various components. To avoid the possibility that the energy flux of GWs becomes negative, we propose a new simple resummation called exponential resummation, which ass...

  19. High-order nonlinear optical processes in ablated carbon-containing materials: Recent approaches in development of the nonlinear spectroscopy using harmonic generation in the extreme ultraviolet range

    Science.gov (United States)

    Ganeev, R. A.

    2017-08-01

    Nonlinear spectroscopy using harmonic generation in the extreme ultraviolet range has become a versatile tool for the analysis of the optical, structural and morphological properties of matter. Carbon-containing materials have shown advanced properties among the studied species, which has allowed both the definition of the role of structural properties in the nonlinear optical response and the analysis of the fundamental features of carbon as an attractive material for the generation of coherent short-wavelength radiation. We review studies of high-order harmonic generation by focusing ultrashort pulses into the plasmas produced during laser ablation of various organic compounds. We discuss the role of ionic transitions of ablated carbon-containing molecules in the harmonic yield. We also show the similarities and distinctions of the harmonic and plasma spectra of organic compounds and graphite. We discuss studies of the generation of harmonics up to the 27th order (λ = 29.9 nm) of 806 nm radiation in the boron carbide plasma and analyze the advantages and disadvantages of this target compared with the ingredients comprising B4C (solid boron and graphite) by comparing plasma emission and harmonic spectra from the three species. We also show that the coincidence of harmonic and plasma emission wavelengths in most cases does not cause an enhancement or decrease of the conversion efficiency of the corresponding harmonic.

  20. The cross-cut statistic and its sensitivity to bias in observational studies with ordered doses of treatment.

    Science.gov (United States)

    Rosenbaum, Paul R

    2016-03-01

    A common practice with ordered doses of treatment and ordered responses, perhaps recorded in a contingency table with ordered rows and columns, is to cut or remove a cross from the table, leaving the outer corners--that is, the high-versus-low dose, high-versus-low response corners--and from these corners to compute a risk or odds ratio. This little-remarked but common practice seems to be motivated by the oldest and most familiar method of sensitivity analysis in observational studies, proposed by Cornfield et al. (1959), which says that to explain a population risk ratio purely as bias from an unobserved binary covariate, the prevalence ratio of the covariate must exceed the risk ratio. Quite often, the largest risk ratio, hence the one least sensitive to bias by this standard, is derived from the corners of the ordered table with the central cross removed. Obviously, the corners use only a portion of the data, so a focus on the corners has consequences for the standard error as well as for bias, but sampling variability was not a consideration in this early and familiar form of sensitivity analysis, where point estimates replaced population parameters. Here, this cross-cut analysis is examined with the aid of design sensitivity and the power of a sensitivity analysis.
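
    The cross-cut computation itself is elementary; a short Python sketch with a made-up 3x3 dose-response table removes the central cross and forms the corner odds ratio.

        import numpy as np

        # Ordered doses (rows) x ordered responses (columns); counts are made up.
        table = np.array([[30, 22, 11],
                          [18, 25, 17],
                          [ 9, 20, 28]])

        def cross_cut_odds_ratio(t):
            # Remove the central cross (middle rows and columns) and form the
            # odds ratio from the four outer corners: high-vs-low dose by
            # high-vs-low response.
            a, b = t[0, 0], t[0, -1]      # low dose: low response, high response
            c, d = t[-1, 0], t[-1, -1]    # high dose: low response, high response
            return (a * d) / (b * c)

        print(cross_cut_odds_ratio(table))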

  1. Improvements on Particle Tracking Velocimetry: model-free calibration and noiseless measurement of second order statistics of the velocity field

    CERN Document Server

    Machicoane, Nathanael; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain

    2016-01-01

    This article describes two independent developments aimed at improving the particle tracking method for measurements of flow or particle velocities. First, a stereoscopic multicamera calibration method that does not require any optical model is described and evaluated. We show that this new calibration method gives better results than the most commonly used technique, based on the Tsai camera/optics model. Additionally, the method uses a simple interpolant to compute the transformation matrix and is trivial to apply for any experimental fluid dynamics visualization setup. The second contribution proposes a solution to remove noise from Eulerian measurements of velocity statistics obtained from particle tracking velocimetry, without the need for filtering and/or windowing. The novel method presented here is based on recomputing particle displacement measurements from two consecutive frames for multiple different time-step values between frames. We show the successful application of this new technique to re...

  2. Higher-order organisation of extremely amplified, potentially functional and massively methylated 5S rDNA in European pikes (Esox sp.).

    Science.gov (United States)

    Symonová, Radka; Ocalewicz, Konrad; Kirtiklis, Lech; Delmastro, Giovanni Battista; Pelikánová, Šárka; Garcia, Sonia; Kovařík, Aleš

    2017-05-18

    Pikes represent an important genus (Esox) harbouring a pre-duplication karyotype (2n = 2x = 50) of economically important salmonid pseudopolyploids. Here, we have characterized the 5S ribosomal RNA genes (rDNA) in Esox lucius and its close relative E. cisalpinus using cytogenetic, molecular and genomic approaches. Intragenomic homogeneity and copy number estimation were carried out using Illumina reads. The higher-order structure of rDNA arrays was investigated by the analysis of long PacBio reads. The position of loci on chromosomes was determined by FISH. DNA methylation was analysed by methylation-sensitive restriction enzymes. The 5S rDNA loci occupy exclusively (peri)centromeric regions on 30-38 acrocentric chromosomes in both E. lucius and E. cisalpinus. The large number of loci is accompanied by extreme amplification of the genes (>20,000 copies), which is, to the best of our knowledge, one of the highest copy numbers of rRNA genes ever reported in animals. Conserved secondary structures of predicted 5S rRNAs indicate that most of the amplified genes are potentially functional. Only a few SNPs were found in genic regions, indicating their high homogeneity, while intergenic spacers were more heterogeneous and several families were identified. Analysis of 10-30 kb-long molecules sequenced by the PacBio technology (containing about 40% of the total 5S rDNA) revealed that the vast majority (96%) of genes are organised in large, several-kilobase-long blocks. Dispersed genes or short tandems were less common (4%). The adjacent 5S blocks were directly linked, separated by intervening DNA, or even inverted. The 5S units differing in the intergenic spacers formed both homogeneous and heterogeneous (mixed) blocks, indicating a variable degree of homogenisation between the loci. Both E. lucius and E. cisalpinus 5S rDNA was heavily methylated at CG dinucleotides. Extreme amplification of 5S rRNA genes in the Esox genome occurred in the absence of significant pseudogenisation.

  3. Statistics, distillation, and ordering emergence in a two-dimensional stochastic model of particles in counterflowing streams

    Science.gov (United States)

    Stock, Eduardo Velasco; da Silva, Roberto; Fernandes, H. A.

    2017-07-01

    In this paper, we propose a stochastic model which describes two species of particles moving in counterflow. The model generalizes the theoretical framework describing transport in random systems by taking into account two different scenarios: particles can work as mobile obstacles, whereas particles of one species move in the opposite direction to the particles of the other species, or particles of a given species work as fixed obstacles remaining in their places during the time evolution. We conduct a detailed study of the statistics of the crossing time of particles, as well as of the effects of lateral transitions on the time required for the system to reach a state of complete geographic separation of the species. The spatial effects of jamming are also studied by looking into the deformation of the concentration of particles in the two-dimensional corridor. Finally, we observe the formation of lane patterns which reach a steady state regardless of the initial conditions used for the evolution. A similar result is also observed in real experiments involving the motion of charged colloids and in simulations of pedestrian dynamics based on Langevin equations, when periodic boundary conditions are considered (particles counterflowing in a ring geometry). The results obtained through Monte Carlo simulations and numerical integrations are in good agreement with each other. However, differently from previous studies, the dynamics considered in this work is not Newton-based, and therefore even artificial situations of self-propelled objects can be studied in this first-principles modeling.

  4. New Closed-Form Results on Ordered Statistics of Partial Sums of Gamma Random Variables and its Application to Performance Evaluation in the Presence of Nakagami Fading

    KAUST Repository

    Nam, Sung Sik

    2017-06-19

    Complex wireless transmission systems require multi-dimensional joint statistical techniques for performance evaluation. Here, we first present exact closed-form results for the order statistics of arbitrary partial sums of Gamma random variables, together with closed-form results for the core functions specialized to independent and identically distributed Nakagami-m fading channels, based on a unified analytical framework built on moment generating functions. Both of these exact closed-form results have not previously been published in the literature. In addition, a feasible application example in which the newly derived closed-form results can be applied is presented. In particular, we analyze the outage performance of finger replacement schemes over Nakagami fading channels as an application of our method. Note that these analysis results are directly applicable to several applications, such as millimeter-wave communication systems in which an antenna diversity scheme operates with a finger-replacement-like combining scheme, and to other fading scenarios. Note also that the statistical results can provide potential solutions for ordered statistics in other research topics based on Gamma distributions, and in other advanced wireless communications research in the presence of Nakagami fading.
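
    Although the paper's closed-form expressions are not reproduced in this record, the quantity they describe is easy to check by Monte Carlo; the Python sketch below (all parameters are ours, hypothetical) estimates the outage probability of the sum of the largest Lc of L i.i.d. Gamma branch SNRs, the GSC-type statistic in question.

        import numpy as np

        rng = np.random.default_rng(3)
        m, L, Lc, trials = 2.0, 6, 3, 200000     # Nakagami-m <-> Gamma(shape=m)

        # Branch SNRs with unit mean: Gamma(shape=m, scale=1/m).
        snr = rng.gamma(shape=m, scale=1.0 / m, size=(trials, L))
        ordered = np.sort(snr, axis=1)[:, ::-1]  # descending order statistics
        partial_sum = ordered[:, :Lc].sum(axis=1)  # sum of the Lc largest

        gamma_th = 0.8                           # hypothetical outage threshold
        print("outage probability:", np.mean(partial_sum < gamma_th))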

  5. Statistical-thermodynamic description of the order-disorder transformation of the D0₁₉-type phase in Ti-Al alloy

    Energy Technology Data Exchange (ETDEWEB)

    Radchenko, T.M. [Department of Solid State Theory, Institute for Metal Physics, N.A.S. of Ukraine, 36 Acad. Vernadsky Blvd., 03680 Kyiv-142 (Ukraine)], E-mail: taras.radchenko@gmail.com; Tatarenko, V.A. [Department of Solid State Theory, Institute for Metal Physics, N.A.S. of Ukraine, 36 Acad. Vernadsky Blvd., 03680 Kyiv-142 (Ukraine); Zapolsky, H.; Blavette, D. [UMR 6634, CNRS, Faculte des Sciences et Techniques, Universite de Rouen, 76801 Saint-Etienne du Rouvray Cedex (France)

    2008-03-06

    Within the framework of the self-consistent field approximation and the static concentration waves approach, a statistical-thermodynamic description of the D0₁₉-type superstructure in Ti-Al alloy is developed. A model of the order-disorder phase transformation is applied to the non-stoichiometric intermetallic Ti₃Al phase. Interatomic-interaction parameters are estimated for both approximations. One model supposes temperature-independent interatomic-interaction parameters, while the other includes the temperature dependence of the mixing energies. The partial phase diagrams (equilibrium compositions for the coexisting ordered α₂-phase and disordered α-phase) are evaluated for both cases.

  6. An MGF-based unified framework to determine the joint statistics of partial sums of ordered i.n.d. random variables

    KAUST Repository

    Nam, Sungsik

    2014-08-01

    The joint statistics of partial sums of ordered random variables (RVs) are often needed for the accurate performance characterization of a wide variety of wireless communication systems. A unified analytical framework to determine the joint statistics of partial sums of ordered independent and identically distributed (i.i.d.) random variables was recently presented. However, the identical distribution assumption may not be valid in several real-world applications. With this motivation in mind, we consider in this paper the more general case in which the random variables are independent but not necessarily identically distributed (i.n.d.). More specifically, we extend the previous analysis and introduce a new, more general unified analytical framework to determine the joint statistics of partial sums of ordered i.n.d. RVs. Our mathematical formalism is illustrated with an application to the exact performance analysis of the capture probability of generalized selection combining (GSC)-based RAKE receivers operating over frequency-selective fading channels with a non-uniform power delay profile.

  7. Ultimate 100m World Records Through Extreme-Value Theory

    OpenAIRE

    Einmahl, J.H.J.; Smeets, S.G.W.R.

    2009-01-01

    We use extreme-value theory to estimate the ultimate world records for the 100m running, for both men and women. For this aim we collected the fastest personal best times set between January 1991 and June 2008. Estimators of the extreme-value index are based on a certain number of upper order statistics. To optimize this number of order statistics we minimize the asymptotic mean squared error of the moment estimator. Using the thus obtained estimate for the extreme-value index, the right endp...
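
    The moment estimator referred to here is the Dekkers-Einmahl-de Haan estimator built from the k upper order statistics; a minimal Python sketch (with synthetic Pareto data rather than the sprint times, and without the AMSE-optimal choice of k) illustrates it.

        import numpy as np

        def moment_estimator(x, k):
            # Dekkers-Einmahl-de Haan moment estimator of the extreme-value
            # index from the k upper order statistics (data must be positive).
            xs = np.sort(x)
            logs = np.log(xs[-k:]) - np.log(xs[-k - 1])
            m1 = logs.mean()
            m2 = (logs ** 2).mean()
            return m1 + 1.0 - 0.5 / (1.0 - m1 ** 2 / m2)

        rng = np.random.default_rng(5)
        sample = rng.pareto(a=4.0, size=5000) + 1.0   # true index is 1/4
        print([round(moment_estimator(sample, k), 3) for k in (50, 100, 200)])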

  8. Quantum statistical model of nuclear multifragmentation in the canonical ensemble method

    CERN Document Server

    Parvan, A S; Ploszajczak, M

    2000-01-01

    A quantum statistical model of nuclear multifragmentation is proposed. The recurrence equation method used within the canonical ensemble makes the model solvable and transparent to physical assumptions, and allows one to obtain results without resorting to Monte Carlo techniques. The model exhibits a first-order phase transition. Quantum statistics effects are clearly seen on the microscopic level of occupation numbers but are almost washed out for the global thermodynamic variables and the averaged observables studied. In the latter case, the recurrence relations for the multiplicity distributions of both intermediate-mass and all fragments are derived, and the specific changes in the shape of the multiplicity distributions in the narrow region of the transition temperature are stressed. The temperature domain favorable for searching for the HBT effect is noted.

  9. The statistical trade-off between word order and word structure - Large-scale evidence for the principle of least effort.

    Science.gov (United States)

    Koplenig, Alexander; Meyer, Peter; Wolfer, Sascha; Müller-Spitzer, Carolin

    2017-01-01

    Languages employ different strategies to transmit structural and grammatical information. While, for example, grammatical dependency relationships in sentences are mainly conveyed by the ordering of the words for languages like Mandarin Chinese or Vietnamese, the word ordering is much less restricted for languages such as Inupiatun or Quechua, as these languages (also) use the internal structure of words (e.g. inflectional morphology) to mark grammatical relationships in a sentence. Based on a quantitative analysis of more than 1,500 unique translations of different books of the Bible in almost 1,200 different languages that are spoken as a native language by approximately 6 billion people (more than 80% of the world population), we present large-scale evidence for a statistical trade-off between the amount of information conveyed by the ordering of words and the amount of information conveyed by internal word structure: languages that rely more strongly on word order information tend to rely less on word structure information and vice versa. Or, put differently, if less information is carried within the word, more information has to be spread among words in order to communicate successfully. In addition, we find that, despite differences in the way information is expressed, there is also evidence for a trade-off between different books of the biblical canon that recurs with little variation across languages: the more informative the word order of the book, the less informative its word structure, and vice versa. We argue that this might suggest that, on the one hand, languages encode information in very different (but efficient) ways. On the other hand, content-related and stylistic features are statistically encoded in very similar ways.

  10. Extremes of order statistics of self-similar processes

    Institute of Scientific and Technical Information of China (English)

    凌成秀; 彭作祥

    2016-01-01

    Let {Xi(t), t ≥ 0} (1 ≤ i ≤ n) be independent random processes having the same arbitrary finite-dimensional distributions as the process {X(t), t ≥ 0}. Given a threshold u > 0 and the r-th upper order statistic process of X1, ..., Xn, define the r-th set of associated points as Cr(u) := {t ∈ [0,1] : the r-th upper order statistic at t exceeds u}. Computing pr(u) = P{Cr(u) ≠ ∅} has wide applications in fields such as brain image processing and digital interactive systems. This paper considers real-valued self-similar processes X with probabilistically continuous sample paths that satisfy certain Albin conditions, and derives the asymptotics of pr(u) as u → ∞, together with the asymptotics of the mean sojourn time of the r-th upper order statistic above an increasing threshold. Finally, these theoretical results are applied to important self-similar processes such as generalized skew-Gaussian self-similar processes (including χ processes, bifractional Brownian motion and sub-fractional Brownian motion).

  11. Statistical Inference for Binomial-generalized Pareto Compound Extreme Value Distribution Model

    Institute of Scientific and Technical Information of China (English)

    张香云; 程维虎

    2012-01-01

    Extreme value theory mainly studies extreme events of small probability and major impact. At present, compound extreme value distributions are widely used in hydrology, meteorology, earthquake studies, insurance, finance and other fields. In this paper, we establish a binomial-generalized Pareto compound extreme value distribution model based on the extreme value type theorem and the PBDH theorem, derive parameter estimators for the compound model by probability-weighted moments, and obtain critical values of the Kolmogorov-Smirnov (KS) test statistic by computer simulation.
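
    The probability-weighted-moment estimators for the generalized Pareto component are classical (Hosking & Wallis, 1987); a short Python sketch applies them to synthetic threshold excesses (the data and all names are ours).

        import numpy as np
        from scipy import stats

        def gpd_pwm(excesses):
            # PWM/L-moment estimators for the generalized Pareto distribution:
            # with l1 = b0 and l2 = 2*b1 - b0, xi = 2 - l1/l2, sigma = l1*(1 - xi).
            x = np.sort(excesses)
            n = len(x)
            j = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((j - 1) * x) / (n * (n - 1))
            l1, l2 = b0, 2 * b1 - b0
            xi = 2.0 - l1 / l2          # shape (xi > 0: heavy tail; valid xi < 0.5)
            sigma = l1 * (1.0 - xi)     # scale
            return xi, sigma

        rng = np.random.default_rng(11)
        data = stats.genpareto.rvs(c=0.2, scale=1.5, size=2000, random_state=rng)
        print(gpd_pwm(data))            # should be close to (0.2, 1.5)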

  12. Statistical second-order two-scale analysis and computation for heat conduction problem with radiation boundary condition in porous materials

    Science.gov (United States)

    Yang, Zhi-Qiang; Liu, Shi-Wei; Sun, Yi

    2016-09-01

    This paper discusses a statistical second-order two-scale (SSOTS) analysis and computation for a heat conduction problem with a radiation boundary condition in random porous materials. Firstly, the microscopic configuration for the structure with random distribution is briefly characterized. Secondly, the SSOTS formulae for computing the heat transfer problem are derived successively by means of the construction for each cell. Then, the statistical prediction algorithm based on the proposed two-scale model is described in detail. Finally, some numerical experiments are presented, which show that the SSOTS method developed in this paper is effective for predicting the heat transfer performance of porous materials and demonstrate its significant applications in actual engineering computation. Project supported by the China Postdoctoral Science Foundation (Grant Nos. 2015M580256 and 2016T90276).

  13. Exactly soluble local bosonic cocycle models, statistical transmutation, and simplest time-reversal symmetric topological orders in 3+1 dimensions

    Science.gov (United States)

    Wen, Xiao-Gang

    2017-05-01

    We propose a generic construction of exactly soluble local bosonic models that realize various topological orders with gappable boundaries. In particular, we construct an exactly soluble bosonic model that realizes a (3+1)-dimensional [(3+1)D] Z2-gauge theory with an emergent fermionic Kramers doublet. We show that the emergence of such a fermion will cause the nucleation of certain topological excitations in space-time without pin+ structure. The exactly soluble model also leads to a statistical transmutation in (3+1)D. In addition, we construct exactly soluble bosonic models that realize two types of time-reversal symmetry-enriched Z2 topological orders in 2+1 dimensions, and 20 types of the simplest time-reversal symmetry-enriched topological (SET) orders, which have only one nontrivial pointlike and one stringlike topological excitation. Many physical properties of those topological states are calculated using the exactly soluble models. We find that some time-reversal SET orders have pointlike excitations that carry a Kramers doublet, a fractionalized time-reversal symmetry. We also find that some Z2 SET orders have stringlike excitations that carry an anomalous (non-on-site) Z2 symmetry, which can be viewed as a fractionalization of Z2 symmetry on strings. Our construction is based on cochains and cocycles in algebraic topology, which is very versatile. In principle, it can also realize emergent topological field theories beyond the twisted gauge theory.

  14. Comparing Chains of Order Statistics

    CERN Document Server

    Hoffman, Charles

    2012-01-01

    Fix $0\le k\le m\le n$, and let $X_1,\ldots,X_m,Y_1,\ldots,Y_n$ be continuous, independent, and identically distributed random variables. We derive a probability distribution that compares the performance of a $k$-out-of-$m$ system to a $k$-out-of-$n$ system. By virtue of uniformity, we may recast our method of comparison as enumerating lattice paths of a certain exceedance, invoking the Chung-Feller theorem and ballot numbers in our derivation. Another bijection shows that our probability distribution describes the proportion of the first $2k$ steps lying above $x=0$ for an $(m+n)$-step integer random walk starting at $x=0$ and terminating at $x=m-n$.

  15. Structure and mechanism in a second-order statistical state dynamics model of self-sustaining turbulence in plane Couette flow

    CERN Document Server

    Farrell, Brian F

    2016-01-01

    This paper describes a study of the self-sustaining process (SSP) that maintains turbulence in wall-bounded shear flow. The study uses Couette flow and is based on a statistical state dynamics (SSD) model closed at second order, with state variables the streamwise mean (first cumulant) and the covariance of perturbations (second cumulant). The SSD is closed by either neglecting or stochastically parameterizing the perturbation-perturbation nonlinearity in the perturbation covariance equation. This class of quasi-linear SSD models, referred to as RNL models, comprises second-order SSD systems and includes the stochastic structural stability theory (S3T, or equivalently RNL∞) model which is used in this study. Comparisons of turbulence maintained in DNS and RNL simulations have demonstrated that RNL systems self-sustain turbulence with a mean flow and perturbation structure consistent with DNS. The current results isolate the dynamical components sustaining turbulence in the S3T system concentrati...

  16. Statistical methods for transverse beam position diagnostics with higher order modes in third harmonic 3.9 GHz superconducting accelerating cavities at FLASH

    CERN Document Server

    Zhang, P; Jones, R M

    2014-01-01

    Beam-excited higher order modes (HOMs) can be used to provide beam diagnostics. Here we focus on 3.9 GHz superconducting accelerating cavities. In particular, we study dipole mode excitation and its application to beam position determination. In order to extract beam position information, linear regression can be used. Due to the large number of sampling points in the waveforms, statistical methods such as singular value decomposition (SVD) and k-means clustering are used to reduce the dimension of the system effectively. These are compared with direct linear regression (DLR) on the entire waveforms. A cross-validation technique is used to study the sample-independent precision of the position predictions given by these three methods. An RMS prediction error in the beam position of approximately 50 microns can be achieved by DLR and SVD, while k-means clustering suggests 70 microns.

  17. 0.8 /spl mu/m CMOS implementation of weighted-order statistic image filter based on cellular neural network architecture.

    Science.gov (United States)

    Kowalski, J

    2003-01-01

    In this paper, a very large scale integration chip of an analog image weighted-order-statistic (WOS) filter based on a cellular neural network (CNN) architecture for real-time applications is described. The chip has been implemented in the CMOS AMS 0.8 µm technology. The CNN-based filter consists of a feedforward nonlinear template B operating within a window of 3 × 3 pixels around the central pixel being filtered. The feedforward nonlinear CNN coefficients have been realized using programmable nonlinear coupler circuits. The WOS filter chip allows for processing of images with 300-pixel horizontal resolution. The resolution can be increased by cascading chips. Experimental results of measurements on the basic circuit building blocks are presented. Functional tests of the chip have been performed using a special test setup for PAL composite video signal processing. Using this setup, real images have been filtered by the WOS filter chip under test.
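
    In software, a weighted-order-statistic filter of the kind realized by this chip can be sketched in a few lines of Python (the window size, weights, and rank below are our illustrative choices): each neighbour is replicated according to its integer weight, and the requested order statistic of the replicated list is returned.

        import numpy as np

        def wos_filter(image, weights, rank):
            # 3x3 weighted-order-statistic filter: replicate each neighbour by
            # its weight, sort, and take the element of the given rank.
            h, w = image.shape
            out = image.copy()
            pad = np.pad(image, 1, mode='edge')
            for i in range(h):
                for j in range(w):
                    win = pad[i:i + 3, j:j + 3].ravel()
                    rep = np.repeat(win, weights.ravel())
                    out[i, j] = np.sort(rep)[rank]
            return out

        rng = np.random.default_rng(2)
        img = rng.integers(0, 256, size=(32, 32))
        wts = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]])      # centre-weighted
        print(wos_filter(img, wts, rank=wts.sum() // 2)[0, :5])  # ~weighted median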

  18. Application of the Second-Order Statistics for Estimation of the Pure Spectra of Individual Components from the Visible Hyperspectral Images of Their Mixture

    CERN Document Server

    Jong, Sung-Ho; Sin, Kye-Ryong

    2016-01-01

    Second-order statistics (SOS) can be applied to estimate the pure spectra of chemical components from the spectrum of their mixture; SOS is good at estimating the spectral patterns, but the peak directions come out inverted in some cases. In this paper, a method for judging the peak direction of the pure spectra is proposed, in which the baseline of each pure spectrum is drawn using its histogram and the peak direction is chosen so that all pure spectra lie above the baseline. Results of the SOS analysis on visible hyperspectral images of mixtures composed of two or three chemical components showed that the present method recovers reasonable shapes and directions for the pure spectra of the components.

  19. Bacterial genomes lacking long-range correlations may not be modeled by low-order Markov chains: the role of mixing statistics and frame shift of neighboring genes.

    Science.gov (United States)

    Cocho, Germinal; Miramontes, Pedro; Mansilla, Ricardo; Li, Wentian

    2014-12-01

    We examine in detail the relationship between exponential correlation functions and Markov models in a bacterial genome. Despite the well-known fact that Markov models generate sequences with correlation functions that decay exponentially, simply constructed Markov models based on nearest-neighbor dimer (first-order), trimer (second-order), up to hexamer (fifth-order) counts, treating the DNA sequence as homogeneous, all fail to predict the value of the exponential decay rate. Even reading-frame-specific Markov models (both first- and fifth-order) could not explain the fact that the exponential decay is very slow. Starting with the in-phase coding DNA sequence (CDS), we investigated correlation within a fixed-codon-position subsequence, and in artificially constructed sequences obtained by packing CDSs with out-of-phase spacers, as well as by altering the CDS length distribution through imposing an upper limit. From these targeted analyses, we conclude that the correlation in the bacterial genomic sequence is mainly due to a mixing of heterogeneous statistics at different codon positions, and the slow decay of correlation is due to the possible out-of-phase relation between neighboring CDSs. There are also small contributions to the correlation from bases at the same codon position, as well as from non-coding sequences. These results show that the seemingly simple exponential correlation functions in the bacterial genome hide a complexity in correlation structure which is not captured by a Markov chain model of a homogeneous sequence. Other results include the use of the second-largest eigenvalue (in absolute value) to represent the 16 correlation functions, and the prediction of a 10-11 base periodicity from the hexamer frequencies.

  20. Practical Statistics

    CERN Document Server

    Lyons, L

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  1. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    2005-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  2. Extreme Heat

    Science.gov (United States)


  3. Do intermediate- and higher-order principal components contain useful information to detect subtle changes in lower extremity biomechanics during running?

    Science.gov (United States)

    Phinyomark, Angkoon; Hettinga, Blayne A; Osis, Sean; Ferber, Reed

    2015-12-01

    Recently, a principal component analysis (PCA) approach has been used to provide insight into running pathomechanics. However, researchers often account for nearly all of the variance in the original data using only the first few, or lower-order, principal components (PCs), which are associated with the most dominant movement patterns. In contrast, intermediate- and higher-order PCs are generally associated with subtle movement patterns and may contain valuable information about between-group variation and specific test conditions. Few investigations have evaluated the utility of intermediate- and higher-order PCs based on observational cross-sectional analyses of different cohorts, and no prior studies have evaluated longitudinal changes in an intervention study. This study was designed to test the utility of intermediate- and higher-order PCs in identifying differences in running patterns between groups based on three-dimensional bilateral lower-limb kinematics. The results reveal that differences between sex- and age-groups of 128 runners were observed in the lower- and intermediate-order PC scores (p<0.05), while differences between baseline and follow-up after a 6-week muscle-strengthening program for 24 runners with patellofemoral pain were observed in the higher-order PC scores (p<0.05), which exhibited a moderate correlation with self-reported pain scores (r=-0.43; p<0.05).
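
    A hedged Python sketch of the analysis idea (entirely synthetic data, not the study's gait waveforms): inject a subtle group difference into a high-frequency component of simulated curves, then test every PC score, not just the lower-order ones, for group effects.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy import stats

        # Synthetic "gait" waveforms: 128 runners x 101 time-normalized samples.
        rng = np.random.default_rng(9)
        t = np.linspace(0, 1, 101)
        X = np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(128, 101))
        groups = np.repeat([0, 1], 64)
        X[groups == 1] += 0.02 * np.sin(12 * np.pi * t)   # subtle group pattern

        pca = PCA(n_components=20).fit(X)
        scores = pca.transform(X)

        # The dominant pattern sits in the lower-order PCs; the injected subtle
        # difference shows up in an intermediate/higher-order PC score.
        for pc in range(20):
            tval, pval = stats.ttest_ind(scores[groups == 0, pc],
                                         scores[groups == 1, pc])
            if pval < 0.05:
                print(f"PC{pc + 1}: p = {pval:.4f}")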

  4. Mandelbrot's Extremism

    NARCIS (Netherlands)

    Beirlant, J.; Schoutens, W.; Segers, J.J.J.

    2004-01-01

    In the sixties Mandelbrot already showed that extreme price swings are more likely than some of us think or incorporate in our models. A modern toolbox for analyzing such rare events can be found in the field of extreme value theory. At the core of extreme value theory lies the modelling of maxima

  5. Detection of Step-Structure Edge Based on Order Statistic Filter

    Institute of Scientific and Technical Information of China (English)

    马洪; 余勇; 马黎; 梅田三千雄

    2001-01-01

    In the theory of edge detection for binary images, the classical method is to use a fixed convolution kernel to construct an edge detection operator, such as the Sobel, Prewitt, Kirsch, and Roberts operators. These operators, however, lack algorithmic adaptability to the complex geometric structure of digital image edges. The authors introduce stochastic filtering into edge detection: using an order statistic filter to construct a stochastic convolution kernel, they obtain a new kind of stochastic edge detection operator, the OSF edge operator. Edge detection is carried out on radar images and document images, and the experimental results demonstrate the effectiveness of the OSF edge operator.
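
    The record does not give the OSF operator's exact construction, so the Python sketch below uses a generic order-statistic edge detector, the local range filter (local maximum minus local minimum), to show the general idea; the window size and the test image are arbitrary.

      import numpy as np
      from scipy import ndimage

      def osf_edge(image, size=3):
          """Order-statistic range filter: local max minus local min.
          A large range signals a step edge; flat regions give values near zero."""
          hi = ndimage.maximum_filter(image, size=size)
          lo = ndimage.minimum_filter(image, size=size)
          return hi - lo

      # Synthetic binary image with a vertical step edge at column 32.
      img = np.zeros((64, 64))
      img[:, 32:] = 1.0
      edges = osf_edge(img) > 0.5
      print(edges[:, 30:35].sum(axis=0))   # response concentrated at the step

    Because the maximum and minimum are order statistics of the local window, the detector adapts to the local grey level configuration rather than to a fixed kernel shape, which is the adaptivity argument made in the record.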

  6. Effect of Second-Order and Fully Nonlinear Wave Kinematics on a Tension-Leg-Platform Wind Turbine in Extreme Wave Conditions: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jonkman, Jason [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-02

    In this study, we assess the impact of different wave kinematics models on the dynamic response of a tension-leg-platform wind turbine. Aero-hydro-elastic simulations of the floating wind turbine are carried out employing linear, second-order, and fully nonlinear kinematics, using the Morison equation for the hydrodynamic forcing. The wave kinematics are computed from either theoretical or measured signals of free-surface elevation. The numerical results from each model are compared to results from wave basin tests on a scaled prototype. The comparison shows that sub- and superharmonic responses can be introduced by second-order and fully nonlinear wave kinematics. The response in the wave frequency range is better reproduced when the kinematics are generated from the measured surface elevation. In the future, the numerical response may be further improved by replacing the global, constant damping coefficients in the model with a more detailed, customizable definition of the user-defined numerical damping.
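
    For reference, the Morison force per unit length on a vertical cylindrical member has the standard inertia-plus-drag form; the Python sketch below evaluates it for invented parameter values with linear (sinusoidal) kinematics. The quadratic drag term u|u| is one route by which superharmonics enter the response.

      import numpy as np

      # Morison force per unit length on a vertical cylinder:
      #   f(t) = rho*Cm*(pi*D**2/4)*du/dt + 0.5*rho*Cd*D*u*|u|
      rho, D = 1025.0, 1.0        # sea water density [kg/m^3], diameter [m] (assumed)
      Cm, Cd = 2.0, 1.0           # hypothetical inertia and drag coefficients

      t = np.linspace(0.0, 20.0, 2001)
      omega, amp = 2 * np.pi / 10.0, 1.5        # 10 s wave, 1.5 m elevation amplitude
      u = amp * omega * np.cos(omega * t)       # horizontal particle velocity
      du = -amp * omega**2 * np.sin(omega * t)  # particle acceleration

      f = rho * Cm * (np.pi * D**2 / 4) * du + 0.5 * rho * Cd * D * u * np.abs(u)
      print(f"peak force per unit length: {np.abs(f).max():.0f} N/m")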

  7. Depth statistics

    OpenAIRE

    2012-01-01

    In 1975 John Tukey proposed a multivariate median as the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...

  8. Capturing rogue waves by multi-point statistics

    CERN Document Server

    Hadjihosseini, Ali; Hoffmann, Norbert P; Peinke, Joachim

    2015-01-01

    As an example of complex systems with extreme events we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which, for the first time, allows grasping extreme rogue wave events in a statistically highly satisfactory manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description, surrogate data sets can in turn be generated, allowing one to work out arbitrary statistical features of the complex sea state in general and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence pro...

  9. A dual CFT for a 4D extremal rotating regular black hole in Kerr/CFT correspondence by treating the regularization effect up to the first cubic-order expansion

    CERN Document Server

    Takeuchi, Shingo

    2016-01-01

    In this study, as an extension of Kerr/CFT correspondence to more realistic black holes, we carry out Kerr/CFT correspondence for a four-dimensional rotating regular black hole, in which there are no singularities at the extremal limit. The equation determining the horizon radii turns out to be of fifth order, owing to the regularization effect in the core of the black hole. In order to handle this situation, we treat the regularization effect perturbatively by expanding the regularized mass in terms of the regularization parameter up to first cubic order. We call this expansion the {\it first cubic-order expansion}. As a result, the equation becomes fourth-order, and we can obtain the NHEK geometry up to the first cubic-order expansion. Then, obtaining the Virasoro algebra with the central charge and the Frolov-Thorne temperature in the chiral half of the 2D CFT dual to the NHEK geometry with the ASG, we compute the entropy in the dual CFT using the Ca...

  10. Combined large field-of-view MRA and time-resolved MRA of the lower extremities: Impact of acquisition order on image quality

    Energy Technology Data Exchange (ETDEWEB)

    Riffel, Philipp, E-mail: Philipp.Riffel@umm.de [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Haneder, Stefan; Attenberger, Ulrike I. [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Brade, Joachim [Department of Medical Statistics and Biomathematics, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Schoenberg, Stefan O.; Michaely, Henrik J. [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany)

    2012-10-15

    Purpose: Different approaches exist for hybrid MRA of the calf station. So far, the order of acquisition of the focused calf MRA and the large field-of-view MRA has not been scientifically evaluated. Therefore, the aim of this study was to evaluate whether the quality of the combined large field-of-view MRA (CTM MR angiography) and time-resolved MRA with stochastic interleaved trajectories (TWIST MRA) depends on the order of acquisition of the two contrast-enhanced studies. Methods: In this retrospective study, 40 consecutive patients (mean age 68.1 ± 8.7 years, 29 male/11 female) who had undergone an MR angiographic protocol that consisted of CTM-MRA (TR/TE, 2.4/1.0 ms; 21° flip angle; isotropic resolution 1.2 mm; gadolinium dose, 0.07 mmol/kg) and TWIST-MRA (TR/TE 2.8/1.1; 20° flip angle; isotropic resolution 1.1 mm; temporal resolution 5.5 s; gadolinium dose, 0.03 mmol/kg) were included. In the first group (group 1) TWIST-MRA of the calf station was performed 1–2 min after CTM-MRA. In the second group (group 2) CTM-MRA was performed 1–2 min after TWIST-MRA of the calf station. The image quality of CTM-MRA and TWIST-MRA was evaluated by two independent radiologists in consensus according to a 4-point Likert-like rating scale assessing overall image quality on a segmental basis. Venous overlay was assessed per examination. Results: In the CTM-MRA, 1360 segments were included in the assessment of image quality. CTM-MRA was diagnostic in 95% (1289/1360) of segments. There was a significant difference (p < 0.0001) between both groups with regard to the number of segments rated as excellent and moderate. The image quality was rated as excellent in group 1 in 80% (514/640 segments) and in group 2 in 67% (432/649), respectively (p < 0.0001). In contrast, the image quality was rated as moderate in the first group in 5% (33/640) and in the second group in 19% (121/649), respectively (p < 0.0001). The venous overlay was disturbing in 10% in group 1 and 20% in group

  11. Stabilization of a high-order harmonic generation seeded extreme ultraviolet free electron laser by time-synchronization control with electro-optic sampling

    Institute of Scientific and Technical Information of China (English)

    H.Tomizawa; T.Sato; K.Ogawa; K.Togawa; T.Tanaka; T.Hara; M.Yabashi; H.Tanaka; T.Ishikawa; T.Togashi; S.Matsubara; Y.Okayasu; T.Watanabe; E.J.Takahashi; K.Midorikawa; M.Aoyama; K.Yamakawa; S.Owada; A.Iwasaki; K.Yamanouchi

    2015-01-01

    A fully coherent free electron laser (FEL) seeded with a higher-order harmonic (HH) pulse from high-order harmonic generation (HHG) is successfully operated for a sufficiently prolonged time in pilot user experiments by using a timing drift feedback. For HHG-seeded FELs, the seeding laser pulses have to be synchronized with the electron bunches. Although seeded FELs are non-chaotic light sources in principle, externally laser-seeded FELs are often unstable in practice because of timing jitter and drift between the seeding laser pulses and the accelerated electron bunches. Accordingly, we constructed a relative arrival-timing monitor based on non-invasive electro-optic sampling (EOS). The EOS monitor made uninterrupted shot-to-shot monitoring possible even during seeded FEL operation. The EOS system was then used for arrival-timing feedback with an adjustability of 100 fs for continual operation of the HHG-seeded FEL. Using the EOS-based beam drift controlling system, the HHG-seeded FEL was operated over half a day with an effective hit rate of 20%–30%. The output pulse energy was 20 μJ at the 61.2 nm wavelength. Towards seeded FELs in the water-window region, we investigated an upgrade plan to seed high-power FELs with HH photon energies of 30–100 eV and lase at shorter wavelengths of up to 2 nm through high-gain harmonic generation (HGHG) at the energy-upgraded SPring-8 Compact SASE Source (SCSS) accelerator. We studied the benefits as well as the feasibility of the next HHG-seeded FEL machine with single-stage HGHG and a tunable lasing wavelength.

  12. Statistical analysis on extreme wave height

    Digital Repository Service at National Institute of Oceanography (India)

    Teena, N.V.; SanilKumar, V.; Sudheesh, K.; Sajeev, R.

    The distributions fitted using the GEV with the annual-maximum approach and the GPD with the peaks-over-threshold approach indicate that both the GEV and GPD models give similar or comparable extreme wave heights for the study area, since there are no multiple storm events in a year...
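
    A minimal Python sketch of the annual-maximum side of such an analysis, using scipy and synthetic data in place of measured wave heights (the distribution parameters and the 40-year record length are invented):

      import numpy as np
      from scipy import stats

      # Synthetic 40-year record of annual-maximum significant wave height [m].
      annual_max = stats.genextreme.rvs(c=0.1, loc=4.0, scale=0.8,
                                        size=40, random_state=2)

      shape, loc, scale = stats.genextreme.fit(annual_max)   # GEV fit by MLE
      for T in (10, 50, 100):
          level = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
          print(f"{T}-year return wave height: {level:.2f} m")

    The T-year return level is simply the (1 - 1/T) quantile of the fitted GEV; a peaks-over-threshold analysis would instead fit a GPD to the excesses over a high threshold (a GPD-based sketch appears under the Huaihe River Basin record below).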

  13. Flooding hazards from sea extremes and subsidence

    DEFF Research Database (Denmark)

    Sørensen, Carlo; Vognsen, Karsten; Broge, Niels

    2015-01-01

    If we do not understand the effects of climate change and sea level rise (SLR), we cannot live in low-lying coastal areas in the future. Permanent inundation may become a prevalent issue, but more often floods related to extreme events have the largest damage potential, and the management of flooding... hazards needs to integrate the water loading from various sources. Furthermore, local subsidence must be accounted for in order to evaluate current and future flooding hazards and management options. We present the methodology (Figure) and preliminary results from the research project “Coastal Flooding... Hazards due to Storm Surges and Subsidence” (2014-2017), with the objective to develop and test a practice-oriented methodology for combining extreme water level statistics and land movement in coastal flooding hazard mapping and in climate change adaptation schemes in Denmark. From extreme value analysis

  14. Extreme cosmos

    CERN Document Server

    Gaensler, Bryan

    2011-01-01

    The universe is all about extremes. Space has a temperature 270°C below freezing. Stars die in catastrophic supernova explosions a billion times brighter than the Sun. A black hole can generate 10 million trillion volts of electricity. And hypergiants are stars 2 billion kilometres across, larger than the orbit of Jupiter. Extreme Cosmos provides a stunning new view of the way the Universe works, seen through the lens of extremes: the fastest, hottest, heaviest, brightest, oldest, densest and even the loudest. This is an astronomy book that not only offers amazing facts and figures but also re

  15. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is evident that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a small p-value may overrule the other p-values a
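
    For concreteness, a short Python sketch of Fisher's method itself (the record's proposed ordered-p-value statistic is not reproduced here): under the global null with k independent tests, -2 times the sum of log p-values follows a chi-square distribution with 2k degrees of freedom.

      import numpy as np
      from scipy import stats

      def fisher_combined(pvals):
          """Fisher's method: X = -2*sum(log p_i) ~ chi2 with 2k df under the
          global null, assuming the k p-values are independent."""
          pvals = np.asarray(pvals, dtype=float)
          X = -2.0 * np.log(pvals).sum()
          return stats.chi2.sf(X, df=2 * len(pvals))

      # One tiny p-value dominates the combination, illustrating the sensitivity
      # to small p-values discussed in the record:
      print(fisher_combined([1e-6, 0.5, 0.6, 0.7]))
      print(fisher_combined([0.04, 0.05, 0.06, 0.07]))

    scipy also provides scipy.stats.combine_pvalues, which implements Fisher's method among others.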

  16. Arc Statistics

    CERN Document Server

    Meneghetti, M; Dahle, H; Limousin, M

    2013-01-01

    The existence of an arc statistics problem was at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs produced by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out that helped to significantly improve our knowledge of strong-lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool to trace structure formation. However, given the limited size of observational and theoretical data sets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...

  17. Extremes in nature

    CERN Document Server

    Salvadori, Gianfausto; Kottegoda, Nathabandu T

    2007-01-01

    This book is about the theoretical and practical aspects of the statistics of Extreme Events in Nature. Most importantly, this is the first text in which Copulas are introduced and used in Geophysics. Several topics are fully original, and show how standard models and calculations can be improved by exploiting the opportunities offered by Copulas. In addition, new quantities useful for design and risk assessment are introduced.

  18. Statistics of fermions in the Randall-Wilkins model for kinetics of general order; Estadistica de fermiones en el modelo de Randall-Wilkins para cinetica de orden general

    Energy Technology Data Exchange (ETDEWEB)

    Nieto H, B.; Azorin N, J.; Vazquez C, G.A. [UAM-I, 09340 Mexico D.F. (Mexico)

    2004-07-01

    As a theoretical treatment of thermoluminescence (TL) phenomena, we study the behavior of systems formed by fermions related to this phenomenon, establishing a generalization of the Randall-Wilkins model both for first-order kinetics and for general-order kinetics (the equation of May and Partridge), in which we consider Fermi-Dirac statistics. As a consequence of this study a new variable appears, the chemical potential, and we establish its relationship with some of the other magnitudes already known in TL. (Author)

  19. Exploring connections between statistical mechanics and Green's functions for realistic systems. Temperature dependent entropy and internal energy from a self-consistent second-order Green's function

    CERN Document Server

    Welden, Alicia Rae; Zgid, Dominika

    2016-01-01

    Including finite-temperature effects in electronic structure calculations of semiconductors and metals is frequently necessary, but can become cumbersome using zero-temperature methods, which require an explicit evaluation of excited states to extend the approach to finite temperature. Using a Matsubara Green's function formalism, it is easy to include the effects of temperature and to connect dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. We evaluate the thermodynamic quantities with a self-consistent Green's function where the self-energy is approximated to second order (GF2). To validate our method, we benchmark it against finite-temperature full configuration interaction (FCI) calculations for a hydrogen fluoride (HF) molecule and find excellent agreement at high temperatures and very good agreement at low temperatures. Then, we proceed to evaluate thermodynamic quantities for a one-dimensional hydrogen solid at v...

  20. Exploring connections between statistical mechanics and Green's functions for realistic systems: Temperature dependent electronic entropy and internal energy from a self-consistent second-order Green's function.

    Science.gov (United States)

    Welden, Alicia Rae; Rusakov, Alexander A; Zgid, Dominika

    2016-11-28

    Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.

  1. Exploring connections between statistical mechanics and Green's functions for realistic systems: Temperature dependent electronic entropy and internal energy from a self-consistent second-order Green's function

    Science.gov (United States)

    Welden, Alicia Rae; Rusakov, Alexander A.; Zgid, Dominika

    2016-11-01

    Including finite-temperature effects from the electronic degrees of freedom in electronic structure calculations of semiconductors and metals is desired; however, in practice it remains exceedingly difficult when using zero-temperature methods, since these methods require an explicit evaluation of multiple excited states in order to account for any finite-temperature effects. Using a Matsubara Green's function formalism remains a viable alternative, since in this formalism it is easier to include thermal effects and to connect the dynamic quantities such as the self-energy with static thermodynamic quantities such as the Helmholtz energy, entropy, and internal energy. However, despite the promising properties of this formalism, little is known about the multiple solutions of the non-linear equations present in the self-consistent Matsubara formalism and only a few cases involving a full Coulomb Hamiltonian were investigated in the past. Here, to shed some light onto the iterative nature of the Green's function solutions, we self-consistently evaluate the thermodynamic quantities for a one-dimensional (1D) hydrogen solid at various interatomic separations and temperatures using the self-energy approximated to second-order (GF2). At many points in the phase diagram of this system, multiple phases such as a metal and an insulator exist, and we are able to determine the most stable phase from the analysis of Helmholtz energies. Additionally, we show the evolution of the spectrum of 1D boron nitride to demonstrate that GF2 is capable of qualitatively describing the temperature effects influencing the size of the band gap.

  2. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  3. Statistical analysis of airgap and wave run-up for a semi-submersible platform under extreme waves

    Institute of Scientific and Technical Information of China (English)

    闫发锁; 杨慧; 沈鹏飞; 赵九龙

    2015-01-01

    The airgap is one of the key parameters in the design of semi-submersible platforms. A series of model tests was performed to investigate the airgap responses of a deepwater semi-submersible platform under extreme sea conditions, deriving the severity of the airgap and the probability distribution of the airgap values at 11 locations on the deck. The experiments show that under oblique waves the severest airgap region lies in the vicinity of the aft columns of the platform. Comparison of the experimental results with numerical predictions by a three-dimensional potential flow method shows that the measured airgap minima are generally lower than the computed values, i.e., the linear potential flow method underestimates the severity of the airgap. Analysis of the signals' energy spectral density shows that low-frequency components account for a considerable proportion of the airgap response. Zero-crossing statistics and probability distribution fitting on the airgap time histories indicate that a Gaussian model reflects the general distribution of airgap values under extreme sea conditions, but the distribution of the extremal points needs to be fitted for

  4. Wind simulation for extreme and fatigue loads

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Larsen, G.C.; Mann, J.; Ott, S.; Hansen, K.S.; Pedersen, B.J.

    2004-01-01

    Measurements of atmospheric turbulence have been studied and found to deviate from a Gaussian process, in particular regarding the velocity increments over small time steps, where the tails of the pdf are exponential rather than Gaussian. Principles for extreme event counting and the occurrence of cascading events are presented. Empirical extreme statistics agree with Rice's exceedance theory when it is assumed that the velocity and its time derivative are independent. Prediction based on the assumption that the velocity is a Gaussian process underpredicts the rate of occurrence of extreme events by many orders of magnitude, mainly because the measured pdf is non-Gaussian. Methods for simulation of turbulent signals have been developed and their computational efficiency is considered. The methods are applicable to multiple processes with individual spectra and probability distributions. Non-Gaussian processes are simulated by the correlation-distortion method. Non-stationary processes are obtained by Bezier interpolation between a set of stationary simulations with identical random seeds. Simulation of systems with some signals available is enabled by conditional statistics. A versatile method for simulation of extreme events has been developed. This will generate gusts, velocity jumps, extreme velocity shears, and sudden changes of wind direction. Gusts may be prescribed with a specified ensemble-average shape, and it is possible to detect the critical gust shape for a given construction. The problem is formulated as the variational problem of finding the most probable adjustment of a standard simulation of a stationary Gaussian process subject to relevant event conditions, which are formulated as linear combinations of points in the realization. The method is generalized for multiple correlated series, multiple simultaneous conditions, and 3D fields of all velocity components. Generalizations are presented for a single non-Gaussian process subject to relatively
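
    A small Python check of the Rice exceedance rate mentioned in this record, on a synthetic stationary Gaussian process (the smoothing kernel and all parameters are invented): Rice's formula predicts a mean upcrossing rate of level u equal to (sigma_dot / (2*pi*sigma)) * exp(-u**2 / (2*sigma**2)).

      import numpy as np

      rng = np.random.default_rng(3)
      dt, n = 0.05, 200_000
      # Smooth stationary Gaussian process: Gaussian-filtered white noise.
      w = rng.normal(size=n)
      kernel = np.exp(-0.5 * (np.arange(-100, 101) * dt / 0.5) ** 2)
      x = np.convolve(w, kernel, mode="same")

      sigma = x.std()
      sigma_dot = np.gradient(x, dt).std()
      nu0 = sigma_dot / (2 * np.pi * sigma)         # mean zero-upcrossing rate

      for u in (1.0, 2.0, 3.0):                     # levels in units of sigma
          rice = nu0 * np.exp(-u**2 / 2) * n * dt   # predicted upcrossing count
          emp = np.sum((x[:-1] < u * sigma) & (x[1:] >= u * sigma))
          print(f"level {u:.0f} sigma: Rice {rice:.1f}, empirical {emp}")

    For a process with non-Gaussian, exponential tails, the empirical counts at high levels would exceed this Gaussian prediction by orders of magnitude, which is essentially the record's point.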

  5. Laboratory for Engineering Man/Machine Systems (LEMS): System identification, model reduction and deconvolution filtering using Fourier based modulating signals and high order statistics

    Science.gov (United States)

    Pan, Jianqiang

    1992-01-01

    Several important problems in the fields of signal processing and model identification are studied, such as system structure identification, frequency response determination, high-order model reduction, high-resolution frequency analysis, and deconvolution filtering. Each of these topics involves a wide range of applications and has received considerable attention. Using Fourier-based sinusoidal modulating signals, it is shown that a discrete autoregressive model can be constructed for the least squares identification of continuous systems. Identification algorithms are presented for frequency response determination of both SISO and MIMO systems using only transient data. Also, several new schemes for model reduction were developed. Based upon complex sinusoidal modulating signals, a parametric least squares algorithm for high-resolution frequency estimation is proposed. Numerical examples show that the proposed algorithm gives better performance than the usual methods. Also studied was the problem of deconvolution and parameter identification of a general noncausal nonminimum-phase ARMA system driven by non-Gaussian stationary random processes. Algorithms are introduced for inverse cumulant estimation, both in the frequency domain via FFT algorithms and in the time domain via the least squares algorithm.

  6. The genealogy of extremal particles of Branching Brownian Motion

    CERN Document Server

    Arguin, Louis-Pierre; Kistler, Nicola

    2010-01-01

    Branching Brownian Motion describes a system of particles which diffuse in space and split into offspring according to a certain random mechanism. By virtue of the groundbreaking work by M. Bramson on the convergence of solutions of the Fisher-KPP equation to traveling waves, the law of the rightmost particle in the limit of large times is rather well understood. In this work, we address the full statistics of the extremal particles (first-, second-, third- etc. largest). In particular, we prove that in the large $t$-limit, such particles descend with overwhelming probability from ancestors having split either within a distance of order one from time $0$, or within a distance of order one from time $t$. The approach relies on characterizing, up to a certain level of precision, the paths of the extremal particles. As a byproduct, a heuristic picture of Branching Brownian Motion ``at the edge'' emerges, which sheds light on the still unknown limiting extremal process.

  7. Precursors of extreme increments

    CERN Document Server

    Hallerberg, S; Holstein, D; Kantz, H; Hallerberg, Sarah; Altmann, Eduardo G.; Holstein, Detlef; Kantz, Holger

    2006-01-01

    We investigate precursors and predictability of extreme events in time series, where the events consist of large increments within successive time steps. In order to understand the predictability of this class of extreme events, we study analytically the prediction of extreme increments in AR(1) processes. The resulting strategies are then applied to predict sudden increases in wind speed recordings. In both cases we evaluate the success of predictions via receiver operating characteristic (ROC) plots. Surprisingly, we obtain better ROC plots for completely uncorrelated Gaussian random numbers than for AR(1)-correlated data. Furthermore, we observe an increase of predictability with increasing event size. Both effects can be understood by using the likelihood ratio as a summary index for smooth ROC curves.
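
    A compact Python sketch of the setting (the parameter values and event threshold are invented): in an AR(1) process the expected next increment is (a-1)*x_t, so a low current value is a natural precursor for a large upward increment, and the quality of that precursor can be summarized by an ROC curve.

      import numpy as np
      from scipy.signal import lfilter

      rng = np.random.default_rng(4)
      a, n = 0.7, 200_000
      x = lfilter([1.0], [1.0, -a], rng.normal(size=n))   # AR(1): x_t = a*x_{t-1} + eps

      inc = x[1:] - x[:-1]
      events = inc > 3.0           # "extreme increment" events
      score = -x[:-1]              # precursor: low current value -> large increment

      # ROC curve: sweep a decision threshold over the precursor score.
      order = np.argsort(-score)
      tpr = np.cumsum(events[order]) / events.sum()
      fpr = np.cumsum(~events[order]) / (~events).sum()
      print(f"events: {events.sum()}, AUC: {np.trapz(tpr, fpr):.3f}")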

  8. Temperature extremes in Western Europe and associated atmospheric anomalies

    Science.gov (United States)

    Carvalho, V. A.; Santos, J. A.

    2009-09-01

    This work's focal point is the analysis of temperature extremes over Western Europe in the period 1957-2007 and their relationship to large-scale anomalies in atmospheric circulation patterns. The study is based on daily temperature time series recorded at a set of meteorological stations covering the target area. The large-scale anomalies are analyzed using data from the National Centers for Environmental Prediction reanalysis project. Firstly, a preliminary statistical analysis was undertaken in order to identify data gaps and erroneous values and to check the homogeneity of the time series, using not only elementary statistical approaches (e.g., chronograms, box-plots, scatter-plots), but also a set of non-parametric statistical tests particularly suitable for the analysis of monthly and seasonal mean temperature time series (e.g., the Wald-Wolfowitz serial correlation test, the Spearman and Mann-Kendall trend tests). Secondly, based on previous results, a selection of the highest-quality time series was carried out. Aiming at identifying temperature extremes, we then proceed to the isolation of months with temperature values above or below pre-selected thresholds based on the empirical distribution of each time series. In particular, thresholds are based on percentiles specifically computed for each individual temperature record (data adaptive) and not on fixed values. As a result, a calendar of extremely high and extremely low monthly mean temperatures is obtained and the large-scale atmospheric conditions during each extreme are analyzed. Several atmospheric fields are considered in this study (e.g., 2-m maximum and minimum air temperature, sea level pressure, geopotential height, zonal and meridional wind components, vorticity, relative humidity) at different isobaric levels. Results show remarkably different synoptic conditions for temperature extremes in different parts of Western Europe, highlighting the different dynamical mechanisms underlying their

  9. Analogies for Understanding Statistics

    Science.gov (United States)

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  10. Adaptive approximation of higher order posterior statistics

    KAUST Repository

    Lee, Wonjung

    2014-02-01

    Filtering is an approach for incorporating observed data into time-evolving systems. Instead of a family of Dirac delta masses that is widely used in Monte Carlo methods, we here use the Wiener chaos expansion for the parametrization of the conditioned probability distribution to solve the nonlinear filtering problem. The Wiener chaos expansion is not the best method for uncertainty propagation without observations. Nevertheless, the projection of the system variables onto a fixed polynomial basis spanning the probability space might be a competitive representation in the presence of relatively frequent observations, because the Wiener chaos approach not only leads to an accurate and efficient prediction for short-time uncertainty quantification, but also allows one to apply several data assimilation methods that can be used to yield a better approximate filtering solution. The aim of the present paper is to investigate this hypothesis. We answer in the affirmative for the (stochastic) Lorenz-63 system, based on numerical simulations in which the uncertainty quantification method and the data assimilation method are adaptively selected by whether the dynamics is driven by Brownian motion and by the near-Gaussianity of the measure to be updated, respectively. © 2013 Elsevier Inc.

  11. Statistics and Progress Tracking of the Purchase Order Based on SAP

    Institute of Scientific and Technical Information of China (English)

    仇博

    2016-01-01

    The modules and characteristics of the SAP system are introduced first. Then, by exploiting the powerful functions of the SAP system, three programs are developed: a purchase order query sheet, purchase order arrival plan tracking, and purchase order arrival progress tracking and querying. Together these neatly realize statistics and progress tracking for purchase orders.

  12. Algebraic Statistics

    OpenAIRE

    Norén, Patrik

    2013-01-01

    Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges, and therefore new algebraic results often need to be developed. This interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...

  13. Extreme wind turbine response during operation

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Nielsen, S.R.K.

    2007-01-01

    Estimation of extreme response values is very important for the structural design of wind turbines. Due to the influence of the control system and nonlinear structural behavior, the extreme response is usually assessed based on simulation of turbulence time series. In this paper the problem of statistical... provides a tool to obtain consistent estimates including the statistical uncertainty. An illustrative example indicates that the statistical uncertainty is important compared to the coefficient of variation of the extreme response when the number of 10-minute simulations at each mean wind speed is limited...

  14. The Edgeworth expansion for distributions of extreme values

    Institute of Scientific and Technical Information of China (English)

    CHENG, Shihong

    2001-01-01

    [1] Gnedenko, B. V., Sur la distribution limite du terme maximum d'une serie aleatoire, Annals of Math., 1943, 44: 423.[2] de Haan, L., On regular variation and its application to the weak convergence of sample extremes, Math. Centre Tract 32, Amsterdam: Math. Centrum, 1970.[3] Hall, P., On the rate of convergence of normal extremes, J. Appl. Probab., 1979, 16: 433.[4] Hall, W. J., Wellner, J. A., The rate of convergence in law of the maximum of an exponential sample, Statist. Neerlandica, 1979, 33: 151.[5] Davis, R., The rate of convergence in distribution of the maxima, Statist. Neerlandica, 1982, 36: 31.[6] Smith, R., Uniform rates of convergence in extreme value theory, Adv. in Appl. Probab., 1982, 14: 543.[7] Resnick, S. I., Uniform rates of convergence to extreme value distribution, Probability and Statistics: Essays in Honor of Franklin Graybill, Amsterdam: North-Holland, 1986.[8] Resnick, S. I., Extreme Values, Regular Variation, and Point Processes, New York: Springer, 1987.[9] de Haan, L., Resnick, S. I., Second order regular variation and rates of convergence in extreme value theory, Annals of Probab., 1996, 24: 97.[10] Geluk, J. L., de Haan, L., Regular variation, extensions and Tauberian theorems, CWI Tract 40, Amsterdam: Center for Mathematics and Computer Science, 1987.[11] de Haan, L., Stadtmuller, U., Generalized regular variation of second order, J. Austral. Math. Soc. Ser. A, 1998, 61: 381.[12] Omey, E., Willekens, E., Π-variation with remainder, J. London Math. Soc., 1988, 37: 105.[13] Vervaat, W., Functional central limit theorems for processes with positive drift and their inverses, Z. Wahrsch. Verw. Gebiete, 1972, 23: 249.

  15. Statistical Neurodynamics.

    Science.gov (United States)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  16. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.

  17. Spatio-temporal trend and statistical distribution of extreme precipitation events in the Huaihe River Basin during 1960-2009

    Institute of Scientific and Technical Information of China (English)

    XIA Jun; SHE Dunxian; ZHANG Yongyong; DU Hong

    2012-01-01

    Based on the daily precipitation data of 27 meteorological stations from 1960 to 2009 in the Huaihe River Basin, the spatio-temporal trend and statistical distribution of extreme precipitation events in this area are analyzed. Annual maximum (AM) series and peaks-over-threshold (POT) series are selected to model the probability distribution of extreme precipitation. The results show that a positive trend of annual maximum precipitation is detected at most of the stations used; only a small number of stations depict a negative trend during the past five decades, and neither the positive nor the negative trends are significant. The maximum precipitation events almost always occurred in the flooding period during the 1960s and 1970s. By the L-moments method, the parameters of three extreme value distributions, i.e., the generalized extreme value (GEV) distribution, the generalized Pareto (GP) distribution, and the Gamma distribution, are estimated. From the results of the goodness-of-fit and Kolmogorov-Smirnov (K-S) tests, AM series are better fitted by the GEV model and POT series by the GP model. Comparing precipitation amounts at different return levels, the values obtained from the POT series are slightly larger than those from the AM series and better reproduce the observed values in the Huaihe River Basin.
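
    A minimal Python sketch of the peaks-over-threshold side of such an analysis (synthetic data; the threshold choice and record length are invented, and the GP distribution is fitted by maximum likelihood rather than by the L-moments method used in the record):

      import numpy as np
      from scipy import stats

      # Synthetic 50-year daily precipitation record [mm].
      rng = np.random.default_rng(5)
      daily = rng.gamma(shape=0.4, scale=8.0, size=50 * 365)

      u = np.quantile(daily, 0.98)           # POT threshold: 98th percentile
      excess = daily[daily > u] - u          # excesses over the threshold
      rate = excess.size / 50.0              # mean number of exceedances per year

      c, loc, scale = stats.genpareto.fit(excess, floc=0.0)   # GP fit to excesses
      for T in (10, 50, 100):
          p = 1.0 / (T * rate)               # exceedance probability per event
          level = u + stats.genpareto.ppf(1 - p, c, loc=0.0, scale=scale)
          print(f"{T}-year return precipitation: {level:.1f} mm")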

  18. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  19. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  20. Finite Temperature Field Theory of "Extreme Black Holes"

    OpenAIRE

    Degura, Yoshitaka; Shiraishi, Kiyoshi

    2000-01-01

    We treat a model which describes slowly moving "extreme black holes". We derive a low-energy effective Lagrangian for this model and then investigate the statistical behavior of "extreme black holes" at finite temperature.

  1. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
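
    A short Python sketch of the harmonic Poisson process described above (the window and intensity constant are arbitrary): on a window [a, b] the intensity c/x turns into a constant intensity c after the change of variable y = ln(x), which also makes the scale invariance easy to check.

      import numpy as np

      rng = np.random.default_rng(6)
      c, a, b = 2.0, 1e-3, 1e3       # intensity c/x on the window [a, b]

      # Under y = ln(x) the process is homogeneous Poisson with rate c, so the
      # number of points is Poisson(c*ln(b/a)) and the points are log-uniform in x.
      n = rng.poisson(c * np.log(b / a))
      x = a * (b / a) ** rng.uniform(size=n)

      # Scale invariance: the count in (s, 10*s] has mean c*ln(10) for every s.
      for s in (0.01, 0.1, 1.0, 10.0):
          print(s, np.sum((x > s) & (x <= 10 * s)))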

  2. Measuring extremal dependencies in web graphs

    NARCIS (Netherlands)

    Volkovich, Y.; Litvak, Nelli; Zwart, B.

    We analyze dependencies in power law graph data (Web sample, Wikipedia sample and a preferential attachment graph) using statistical inference for multivariate regular variation. The well developed theory of regular variation is widely applied in extreme value theory, telecommunications and

  3. Observed variability and trends in extreme rainfall indices and Peaks-Over-Threshold series

    Directory of Open Access Journals (Sweden)

    H. Saidi

    2013-05-01

    Intensification of heavy precipitation as discussed in climate change studies has become a public concern, but it has not yet been examined well with observed data, particularly with data at short temporal scales like hourly and sub-hourly data. In this research we digitalized sub-hourly precipitation recorded at the stations of Vercelli (since 1927), Bra (since 1933), Lombriasco (since 1939) and Pallanza (since 1950) in order to investigate historical change in extreme short-duration precipitation. These stations are located in the northwest of Italy. Besides seasonal and yearly maxima of precipitation we adopted two indices of extreme rainfall: the number of events above an extreme threshold (extreme frequency), and the average intensity of rainfall from extreme events (extreme intensity). The results showed a statistically significant increase of the extreme frequency index and of spring maximum precipitation for Bra and Lombriasco. The extreme intensity index, computed as the mean of events above the 95th percentile, is decreasing for Bra for hourly precipitation and increasing for Lombriasco for 20-min extreme events. In Pallanza, we noticed only a positive trend of the extreme frequency and extreme intensity indices for events with a duration of 30 min. For the analyses presented in this paper, a peaks-over-threshold approach was chosen. The investigation presented here shows that extreme events have risen in the last 20 years only for short durations. It cannot be said that in our study area recent sub-hourly and hourly precipitation has become unprecedentedly strong or frequent for all the stations and for all extreme event durations.
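
    The two indices are easy to state in code. A Python sketch on synthetic hourly data (the gamma rain model, wet-hour cutoff, and percentile are invented; the record's trend assessment would typically use a formal trend test, for which a least-squares slope stands in here):

      import numpy as np

      # Synthetic hourly precipitation [mm], one row per year.
      rng = np.random.default_rng(7)
      years, hours = 30, 365 * 24
      rain = rng.gamma(0.05, 6.0, size=(years, hours))

      wet = rain[rain > 0.1]                      # wet hours only
      thr = np.quantile(wet, 0.95)                # "extreme" threshold

      freq = (rain > thr).sum(axis=1)             # extreme frequency index per year
      inten = np.array([rain[y][rain[y] > thr].mean() for y in range(years)])

      slope = np.polyfit(np.arange(years), freq, 1)[0]
      print(f"threshold {thr:.1f} mm, mean intensity {inten.mean():.1f} mm, "
            f"frequency trend {slope:+.2f} events/yr")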

  4. Extreme lattices: symmetries and decorrelation

    Science.gov (United States)

    Andreanov, A.; Scardicchio, A.; Torquato, S.

    2016-11-01

    We study statistical and structural properties of extreme lattices, which are the local minima in the density landscape of lattice sphere packings in d-dimensional Euclidean space R^d. Specifically, we ascertain statistics of the densities and kissing numbers, as well as the numbers of distinct symmetries of the packings, for dimensions 8 through 13 using the stochastic Voronoi algorithm. The extreme lattices in a fixed dimension of space d (d ≥ 8) are dominated by typical lattices that have similar packing properties, such as packing densities and kissing numbers, while the best and the worst packers are in the long tails of the distribution of the extreme lattices. We also study the validity of the recently proposed decorrelation principle, which has important implications for sphere packings in general. The degree to which extreme-lattice packings decorrelate, as well as how decorrelation is related to the packing density and symmetry of the lattices as the space dimension increases, is also investigated. We find that the extreme lattices decorrelate with increasing dimension, while the least symmetric lattices decorrelate faster.

  5. Extremely Preterm Birth

    Science.gov (United States)

    Extremely Preterm Birth FAQ (FAQ173, June 2016). When is a baby considered “preterm” or “ ...

  6. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  7. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  8. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  9. Histoplasmosis Statistics

    Science.gov (United States)

    How common is histoplasmosis? In the United States, an estimated 60% to ...

  10. Statistical distributions

    CERN Document Server

    Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.

    2010-01-01

    A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

  11. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  12. Improved Statistics Handling

    OpenAIRE

    2009-01-01

    Ericsson is a global provider of telecommunications systems equipment and related services for mobile and fixed network operators. 3Gsim is a tool used by Ericsson in tests of the 3G RNC node. In order to validate the tests, statistics are constantly gathered within 3Gsim and users can use telnet to access the statistics using some system-specific 3Gsim commands. The statistics can be retrieved but are unstructured for the human eye and need parsing and arranging to be readable. The statist...

  13. Record Statistics and Dynamics

    DEFF Research Database (Denmark)

    Sibani, Paolo; Jensen, Henrik J.

    2009-01-01

    The term record statistics covers the statistical properties of records within an ordered series of numerical data obtained from observations or measurements. A record within such a series is simply a value larger (or smaller) than all preceding values. The mathematical properties of records strongly... fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations.
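
    A well-known baseline result in record statistics, checked in a few lines of Python: for an i.i.d. series of length n, the expected number of records is the harmonic number H_n = 1 + 1/2 + ... + 1/n, which grows like ln(n) + 0.577.

      import numpy as np

      rng = np.random.default_rng(8)
      n, trials = 1000, 2000

      # A record occurs wherever the running maximum equals the current value.
      counts = [np.sum(x == np.maximum.accumulate(x))
                for x in rng.normal(size=(trials, n))]

      print("simulated mean number of records:", np.mean(counts))
      print("harmonic number approximation:   ", np.log(n) + 0.5772)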

  14. Methodology for assessing probability of extreme hydrologic events coincidence

    Directory of Open Access Journals (Sweden)

    Prohaska Stevan

    2010-01-01

    The aim of the presented research is to improve the methodology for calculating the probability that historic floods and droughts coincide in the same year. The original procedure was developed in order to determine the occurrence probability of such an extreme historic event. There are two phases in the calculation procedure for assessing the probability of both an extreme drought and a flood occurring in the same year. In the first phase, outliers are detected as indicators of extreme events, their return periods are calculated, and the series' statistics are adjusted. In the second phase, conditional probabilities are calculated: empirical points are plotted, and the probability of both an extreme drought and a flood occurring in the same year is assessed based on the plot. Outlier detection is performed for the territory of Serbia. Results are shown as maps of regions (basins) prone to floods, hydrologic drought, or both. A step-by-step numerical example is given for assessing the conditional probability of occurrence of flood and drought for GS Raska on the river Raska. Results of the assessment of conditional probability in two more cases are given for the combination of extreme flood and 30-day minimum flow.

  15. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  16. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  17. Cyclic changes in endometrial echotexture of cows using a computer-assisted program for the analysis of first- and second-order grey level statistics of B-Mode ultrasound images.

    Science.gov (United States)

    Schmauder, Sandra; Weber, Frank; Kiossis, Evangelos; Bollwein, Heinrich

    2008-06-01

    The aim of this study was to test the suitability of a computer-assisted echotexture analysis program, analysing first- and second-order grey level statistics of B-Mode ultrasound images, for examining morphologic changes in the endometrium during the oestrous cycle in cows. Four Simmental cows were examined for two consecutive oestrous cycles. Echotexture of the endometrium was assessed by the mean grey level (MGL) and homogeneity (HOM) of digitised B-Mode images of the uterine body and both uterine horns. As there were no differences (P>0.05) in MGL and HOM, respectively, between the images of the uterine body and the uterine horns, the mean values of all endometrial images were used for subsequent analyses of MGL and HOM. The factor 'day of oestrous cycle' had a highly significant effect, while no differences in either echotexture parameter were measured between oestrous cycles within cows. MGL was negatively related to HOM (r=-0.66). MGL was low between Days -3 and -1, with significant changes of HOM reaching maximum levels on Day -2, and high between Days 4 and 13, while HOM was consistently (P>0.05) low between Days 2 and 13. From Day -3 to Day -1 (r=0.48), ... levels were correlated with HOM, but not with MGL (P<0.05). The results of this study show that changes in endometrial morphology of cows can be measured using a computer-assisted texture analysis program.
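
    For readers unfamiliar with the terminology, a small Python sketch (not the program used in the study): the mean grey level is a first-order statistic of the image histogram, while homogeneity is a second-order statistic derived from a grey level co-occurrence matrix (GLCM) of neighboring pixel pairs.

      import numpy as np

      def glcm_homogeneity(img, levels=32):
          """Homogeneity of the co-occurrence matrix of horizontally adjacent
          pixels: sum of p(i, j) / (1 + |i - j|)."""
          q = np.floor(img / img.max() * (levels - 1)).astype(int)
          glcm = np.zeros((levels, levels))
          np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1.0)
          glcm /= glcm.sum()
          i, j = np.indices(glcm.shape)
          return np.sum(glcm / (1.0 + np.abs(i - j)))

      rng = np.random.default_rng(9)
      img = rng.random((128, 128)) * 255      # stand-in for an ultrasound image
      print("MGL:", img.mean())               # first-order statistic
      print("HOM:", glcm_homogeneity(img))    # second-order statistic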

  18. Multidimensional extremal dependence coefficients

    OpenAIRE

    2017-01-01

Extreme value modeling has been attracting the attention of researchers in diverse areas such as the environment, engineering, and finance. Multivariate extreme value distributions are particularly suitable for modeling the tails of multidimensional phenomena. The analysis of the dependence among multivariate maxima is useful for evaluating risk. Here we present new multivariate extreme value models, as well as coefficients to assess multivariate extremal dependence.

  19. An extreme value model for maximum wave heights based on weather types

    Science.gov (United States)

    Rueda, Ana; Camus, Paula; Méndez, Fernando J.; Tomás, Antonio; Luceño, Alberto

    2016-02-01

Extreme wave heights are climate-related events. Therefore, special attention should be given to the large-scale weather patterns responsible for wave generation in order to properly understand wave climate variability. We propose a classification of weather patterns to statistically downscale daily significant wave height maxima to a local area of interest. The time-dependent statistical model obtained here is based on the convolution of the stationary extreme value model associated with each weather type. The interdaily dependence is treated by a climate-related extremal index. The model's ability to reproduce different time scales (daily, seasonal, and interannual) is presented by means of its application to three locations in the North Atlantic: Mayo (Ireland), La Palma Island, and Coruña (Spain).

  20. The European Extreme Right and Religious Extremism

    Directory of Open Access Journals (Sweden)

    Jean-Yves Camus

    2007-12-01

The ideology of the Extreme Right in Western Europe is rooted in Catholic fundamentalism and Counter-Revolutionary ideas. However, the Extreme Right, like all other political families, has had to adjust to an increasingly secular society. The old link between religion and the Extreme Right has thus been broken and in fact already was when Fascism overtook Europe: Fascism was secular, sometimes even anti-religious, in its essence. Although Catholic fundamentalists still retain strong positions within the apparatus of several Extreme Right parties (Front National), the vote for the Extreme Right is generally weak among regular churchgoers and strong among non-believers. In several countries, the vote for the Extreme Right is stronger among Protestant voters than among Catholics, since while Catholics may support Christian-Democratic parties, there are very few political parties linked to Protestant churches. Presently, it also seems that Paganism is becoming the dominant religious creed within the Extreme Right. In a multicultural Europe, non-Christian forms of religious fundamentalism such as Islamism also exist with ideological similarities to the Extreme Right, but this is not sufficient to categorize Islamism as a form of Fascism. Some Islamist groups seek alliances with the Extreme Right on the basis of their common dislike for Israel and the West, globalization and individual freedom of thought.

  1. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2005-01-01

In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples.

  2. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2010-01-01

In this revised third edition, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this…

  3. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  4. Statistical physics

    CERN Document Server

    Wannier, Gregory H

    2010-01-01

Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for…

  5. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium"…

  6. Extreme conditions (p, T, H)

    Energy Technology Data Exchange (ETDEWEB)

    Mesot, J. [Lab. for Neutron Scattering ETH Zurich, Zurich (Switzerland) and Paul Scherrer Institute, Villigen (Switzerland)

    1996-11-01

    The aim of this paper is to summarize the sample environment which will be accessible at the SINQ. In order to illustrate the type of experiments which will be feasible under extreme conditions of temperature, magnetic field and pressure at the SINQ a few selected examples are also given. (author) 7 figs., 14 refs.

  7. SEER Statistics

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  8. Cancer Statistics

    Science.gov (United States)


  9. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  10. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

The study's aim is to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work… within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit… in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved…

  11. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  12. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  13. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to solve central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. …

  14. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  15. Book review: Extreme ocean waves

    Science.gov (United States)

    Geist, Eric L.

    2017-01-01

“Extreme Ocean Waves”, edited by E. Pelinovsky and C. Kharif, second edition, Springer International Publishing, 2016; ISBN: 978-3-319-21574-7, ISBN (eBook): 978-3-319-21575-4. The second edition of “Extreme Ocean Waves” published by Springer is an update of a collection of 12 papers edited by Efim Pelinovsky and Christian Kharif following the April 2007 meeting of the General Assembly of the European Geosciences Union. In this edition, three new papers have been added and three more have been substantially revised. Color figures are now included, which greatly aids in reading several of the papers, and is especially helpful in visualizing graphs, as in the paper on symbolic computation of nonlinear wave resonance (Tobisch et al.). A note on terminology: extreme waves in this volume broadly encompass different types of waves, including deep-water and shallow-water rogue waves (which are alternatively termed freak waves), and internal waves. One new paper on tsunamis (Viroulet et al.) is now included in the second edition of this volume. Throughout the book, the reader will find a combination of laboratory, theoretical, and statistical/empirical treatment necessary for the complete examination of this subject. In the Introduction, the editors underscore the importance of studying extreme waves, documenting a dramatic instance of damaging extreme waves that recently occurred in 2014.

  16. Charged Matter Tests of Cosmic Censorship for Extremal and Nearly-Extremal Black Holes

    Science.gov (United States)

    Sorce, Jonathan; Wald, Robert

    2017-01-01

    We investigate scenarios in which adding electrically charged matter to a black hole may cause it to become over-extremal, violating cosmic censorship. It has previously been shown that when the matter is localized as a point particle, no violation occurs for extremal black holes to lowest nonvanishing order in the particle's charge and mass. However, recent work has suggested that violations may be possible when the black hole deviates from extremality. We show that these potential violations always occur above lowest nonvanishing order, and conclude that no lowest-order violation can occur in the nearly-extremal case unless a violation also occurs in the extremal case. We also extend the previous results on point particles to show that no violations occur to second order in charge when an arbitrary charged matter configuration is added to an extremal Kerr black hole, provided only that the matter satisfies the null energy condition.

  17. Impact of temperature and precipitation extremes on the flowering dates of four German wildlife shrub species

    Science.gov (United States)

    Siegmund, Jonatan F.; Wiedermann, Marc; Donges, Jonathan F.; Donner, Reik V.

    2016-10-01

    Ongoing climate change is known to cause an increase in the frequency and amplitude of local temperature and precipitation extremes in many regions of the Earth. While gradual changes in the climatological conditions have already been shown to strongly influence plant flowering dates, the question arises if and how extremes specifically impact the timing of this important phenological phase. Studying this question calls for the application of statistical methods that are tailored to the specific properties of event time series. Here, we employ event coincidence analysis, a novel statistical tool that allows assessing whether or not two types of events exhibit similar sequences of occurrences in order to systematically quantify simultaneities between meteorological extremes and the timing of the flowering of four shrub species across Germany. Our study confirms previous findings of experimental studies by highlighting the impact of early spring temperatures on the flowering of the investigated plants. However, previous studies solely based on correlation analysis do not allow deriving explicit estimates of the strength of such interdependencies without further assumptions, a gap that is closed by our analysis. In addition to direct impacts of extremely warm and cold spring temperatures, our analysis reveals statistically significant indications of an influence of temperature extremes in the autumn preceding the flowering.
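
    Event coincidence analysis reduces, at its core, to counting how often events of one series fall within a tolerance window of events in another. A minimal sketch follows, with hypothetical boolean event series and an arbitrary window `delta`; it illustrates the counting step only, not the significance testing used in the study:

```python
import numpy as np

def coincidence_rate(events_a, events_b, delta=2):
    """Fraction of A-events followed within `delta` time steps by a B-event."""
    idx_a = np.flatnonzero(events_a)
    idx_b = np.flatnonzero(events_b)
    if idx_a.size == 0:
        return float("nan")
    hits = sum(np.any((idx_b >= i) & (idx_b <= i + delta)) for i in idx_a)
    return hits / idx_a.size

# Hypothetical daily event series: warm-spring extremes and flowering onsets.
rng = np.random.default_rng(0)
temperature_extreme = rng.random(365) < 0.05
flowering_onset = rng.random(365) < 0.05

rate = coincidence_rate(temperature_extreme, flowering_onset, delta=3)
print(f"coincidence rate: {rate:.3f}")
```

    In practice the observed rate would be compared against a null model (for example, shuffled or Poisson event series) before claiming a statistically significant link, as the study does.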

  18. New advances in statistical modeling and applications

    CERN Document Server

    Santos, Rui; Oliveira, Maria; Paulino, Carlos

    2014-01-01

    This volume presents selected papers from the XIXth Congress of the Portuguese Statistical Society, held in the town of Nazaré, Portugal, from September 28 to October 1, 2011. All contributions were selected after a thorough peer-review process. It covers a broad range of papers in the areas of statistical science, probability and stochastic processes, extremes and statistical applications.

  19. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually developed in a thorough manner. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems are…

  20. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechanics…

  1. Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania

    Science.gov (United States)

    Alzbutas, Robertas; Šeputytė, Ilona

    2015-04-01

Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. It is therefore important to perform probabilistic and statistical analyses of possible extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects sensitive to extreme temperatures are foreseen to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, Dūkštas Meteorological Station, the Lithuanian Energy Institute and the Ignalina NPP Environmental Protection Department of Environmental Monitoring Service. The main objective of the performed work was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity in the whole of Lithuania and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is closed for decommissioning. A further purpose of this work was to analyze changes in extreme temperatures. The probabilistic analysis of extreme temperature increases in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment focused on the application and comparison of the Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution with the best fit to the extreme temperature data. In order to assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return periods and temperature probabilities, the return period for 30…
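
    As a sketch of the distribution-comparison step, the snippet below fits Gumbel, Weibull and GEV distributions to a hypothetical annual-maximum series with scipy and compares them by AIC; the data, the candidate set and the 100-year level shown are illustrative assumptions, not the study's values:

```python
import numpy as np
from scipy import stats

# Hypothetical annual-maximum temperature series (deg C), ~50 years.
rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=30.0, scale=2.5, size=50)

candidates = {
    "Gumbel": stats.gumbel_r,
    "Weibull (max)": stats.weibull_max,
    "GEV": stats.genextreme,
}

for name, dist in candidates.items():
    params = dist.fit(annual_max)                   # maximum likelihood fit
    loglik = np.sum(dist.logpdf(annual_max, *params))
    aic = 2 * len(params) - 2 * loglik              # lower AIC = better fit
    rl100 = dist.ppf(1 - 1 / 100, *params)          # 100-year return level
    print(f"{name:13s} AIC = {aic:7.2f}   100-yr level = {rl100:5.2f} C")
```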

  2. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

As sensor and computer technology continues to improve, confronting high-dimensional data sets has become a normal occurrence. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim… is to identify an "out-of-control" state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling's T2. For high-dimensional data with excessive…
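
    For concreteness, here is a minimal sketch of a Hotelling-style T2 check of a single new observation against an in-control reference sample; the data are simulated and the alpha level is an arbitrary assumption:

```python
import numpy as np
from scipy import stats

# Hypothetical in-control reference sample: n observations of p variables.
rng = np.random.default_rng(7)
n, p = 200, 4
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)

xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def hotelling_t2(x):
    """T2 distance of one observation from the reference mean."""
    d = x - xbar
    return d @ S_inv @ d

# Phase-II upper control limit for a single new observation, from the
# standard F-distribution relationship.
alpha = 0.01
ucl = (p * (n + 1) * (n - 1)) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)

new_obs = rng.multivariate_normal(np.zeros(p), np.eye(p))
t2 = hotelling_t2(new_obs)
print(f"T2 = {t2:.2f}, UCL = {ucl:.2f}, signal: {t2 > ucl}")
```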

  3. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields…

  4. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth…

  5. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient…

  6. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  7. Statistical methods

    CERN Document Server

    Freund, Rudolf J; Wilson, William J

    2010-01-01

Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises…

  8. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermodynamic…

  9. Extreme storm surges: a comparative study of frequency analysis approaches

    Science.gov (United States)

    Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.

    2014-08-01

    In France, nuclear facilities were designed around very low probabilities of failure. Nevertheless, some extreme climatic events have given rise to exceptional observed surges (outliers) much larger than other observations, and have clearly illustrated the potential to underestimate the extreme water levels calculated with the current statistical methods. The objective of the present work is to conduct a comparative study of three approaches to extreme value analysis, including the annual maxima (AM), the peaks-over-threshold (POT) and the r-largest order statistics (r-LOS). These methods are illustrated in a real analysis case study. All data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS method, respectively. The comparison of methods was based on (i) the uncertainty degrees, (ii) the adequacy criteria and tests, and (iii) the visual inspection. It was found that the r-LOS and POT methods have reduced the uncertainty on the distribution parameters and return level estimates and have systematically shown values of the 100 and 500-year return levels smaller than those estimated with the AM method. Results have also shown that none of the compared methods has allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativeness of outliers in data sets. Findings are of practical relevance, not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.
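
    The POT approach compared in this study is straightforward to prototype. Below is a minimal sketch with a simulated surge record, an arbitrary 98th-percentile threshold and a scipy generalized Pareto fit; the return-level formula is the standard one for threshold exceedances occurring at a rate of lam events per year:

```python
import numpy as np
from scipy import stats

# Hypothetical record of daily maximum surge levels (metres) over 40 years.
rng = np.random.default_rng(3)
years = 40
surge = rng.exponential(scale=0.3, size=years * 365)

# Peaks-over-threshold: keep exceedances above a high empirical quantile.
u = np.quantile(surge, 0.98)
excess = surge[surge > u] - u

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
shape, _, scale = stats.genpareto.fit(excess, floc=0)

# T-year return level: the level exceeded on average once every T years,
# with lam exceedances of the threshold per year.
lam = excess.size / years
T = 100
rl = u + stats.genpareto.ppf(1 - 1 / (lam * T), shape, loc=0, scale=scale)
print(f"threshold u = {u:.2f} m, {T}-yr return level = {rl:.2f} m")
```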

  10. Extreme Precipitation and High-Impact Landslides

    Science.gov (United States)

    Kirschbaum, Dalia; Adler, Robert; Huffman, George; Peters-Lidard, Christa

    2012-01-01

It is well known that extreme or prolonged rainfall is the dominant trigger of landslides; however, there remain large uncertainties in characterizing the distribution of these hazards and meteorological triggers at the global scale. Researchers have evaluated the spatiotemporal distribution of extreme rainfall and landslides at local and regional scales primarily using in situ data, yet few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This research uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from Tropical Rainfall Measuring Mission (TRMM) data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and rainfall-triggered landslides globally. The GLC, available from 2007 to the present, contains information on reported rainfall-triggered landslide events around the world using online media reports, disaster databases, etc. When evaluating this database, we observed that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This research also considers the sources for this extreme rainfall, citing…

  11. Seasonal Climate Extremes : Mechanism, Predictability and Responses to Global Warming

    NARCIS (Netherlands)

    Shongwe, M.E.

    2010-01-01

Climate extremes are rarely occurring natural phenomena in the climate system. They often pose one of the greatest environmental threats to human and natural systems. Statistical methods are commonly used to investigate characteristics of climate extremes. The fitted statistical properties are often…

  12. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  13. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  14. Statistical Mechanics

    CERN Document Server

    Gallavotti, Giovanni

    2011-01-01

    C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.

  15. Legacy to the extreme

    NARCIS (Netherlands)

    A. van Deursen (Arie); T. Kuipers (Tobias); L.M.F. Moonen (Leon)

    2000-01-01

We explore the differences between developing a system using extreme programming techniques, and maintaining a legacy system. We investigate whether applying extreme programming techniques to legacy maintenance is useful and feasible.

  16. Legacy to the extreme

    NARCIS (Netherlands)

    Deursen, A. van; Kuipers, T.; Moonen, L.M.F.

    2000-01-01

    We explore the differences between developing a system using extreme programming techniques, and maintaining a legacy system. We investigate whether applying extreme programming techniques to legacy maintenance is useful and feasible.

  17. Changes in extreme dry and wet precipitation spell

    Science.gov (United States)

    Papalexiou, Simon Michael; Foufoula-Georgiou, Efi; Onof, Chris

    2016-04-01

Global warming is expected to alter the behavior of hydroclimatic variables in various ways. Therefore, it is of great importance not only to identify which hydroclimatic variables are going through changes, but also which of their specific characteristics change and in what way. For example, the major focus regarding precipitation has been on changes or trends in extreme events or in annual totals, obviously not without reason. Yet one aspect of precipitation we believe is of equal importance, and has not been extensively studied, is extreme dry and wet spells. Changes in dry and wet spells can severely impact all aspects of human lives, ranging from infrastructure planning and water resources management to agriculture and infectious disease spread. In this study we perform an extensive analysis of extreme dry and wet precipitation spells using tens of thousands of daily precipitation records in order to identify trends or variability changes in the maximum number of consecutive dry or wet days of each year. Our final goal is to evaluate the percentage of stations globally with positive/negative trends either in the mean value or in the variability of extreme dry and wet spells, and to assess whether this percentage is statistically justifiable.
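
    The basic quantity analyzed here, the longest dry or wet run per year, is easy to compute. A minimal sketch with simulated daily rainfall and the common (but here assumed) 1 mm wet-day threshold:

```python
import numpy as np

def longest_run(mask):
    """Length of the longest run of True values in a boolean sequence."""
    best = run = 0
    for m in mask:
        run = run + 1 if m else 0
        best = max(best, run)
    return best

# Hypothetical daily rainfall (mm/day) for 10 years of 365 days; a wet day
# is taken here as rainfall >= 1 mm (a common, but assumed, threshold).
rng = np.random.default_rng(5)
n_years, wet_frac = 10, 0.4
n_days = n_years * 365
daily = rng.exponential(2.0, size=n_days) * (rng.random(n_days) < wet_frac)
year = np.repeat(np.arange(n_years), 365)

max_dry = [longest_run(daily[year == y] < 1.0) for y in range(n_years)]
max_wet = [longest_run(daily[year == y] >= 1.0) for y in range(n_years)]
print("annual maximum dry spells:", max_dry)
print("annual maximum wet spells:", max_wet)
```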

  18. Extreme environment electronics

    CERN Document Server

    Cressler, John D

    2012-01-01

Unfriendly to conventional electronic devices, circuits, and systems, extreme environments represent a serious challenge to designers and mission architects. The first truly comprehensive guide to this specialized field, Extreme Environment Electronics explains the essential aspects of designing and using devices, circuits, and electronic systems intended to operate in extreme environments, including across wide temperature ranges and in radiation-intense scenarios such as space. The Definitive Guide to Extreme Environment Electronics Featuring contributions by some of the world's foremost experts…

  19. Deficiently Extremal Gorenstein Algebras

    Indian Academy of Sciences (India)

    Pavinder Singh

    2011-08-01

The aim of this article is to study the homological properties of deficiently extremal Gorenstein algebras. We prove that if an odd deficiently extremal Gorenstein algebra has a pure minimal free resolution, then its codimension must be odd. As an application, the structure of the pure minimal free resolution of a nearly extremal Gorenstein algebra is obtained.

  20. Visuanimation in statistics

    KAUST Repository

    Genton, Marc G.

    2015-04-14

    This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Technology of planetary extreme environment simulation

    Science.gov (United States)

    Wakefield, M. E.; Apodaca, L. E.; Hall, C. A.

    1972-01-01

Four test chamber systems were developed to simulate the extreme atmospheric environments of Venus and Jupiter, in order to assure satisfactory performance of scientific entry probes and their experiments.

  2. Extreme value distributions

    CERN Document Server

    Ahsanullah, Mohammad

    2016-01-01

The aim of the book is to give a thorough account of the basic theory of extreme value distributions. The book covers a wide range of material available to date, presenting the central ideas and results of extreme value distributions in a self-contained treatment of both theory and applications. The book will be useful to applied statisticians as well as statisticians interested in working in the area of extreme value distributions.

  3. Extremal distributions for 5-convex stochastic orderings with arbitrary discrete support and applications in actuarial sciences

    Institute of Scientific and Technical Information of China (English)

    田有功; 刘转玲

    2014-01-01

In the sense of the discrete 5-convex stochastic ordering, the extremal distributions for random variables valued in an arbitrary discrete subset of the positive half-line R+ are studied. As an actuarial application, bounds on Lundberg's adjustment coefficient are estimated.

  4. Rainfall statistics changes in Sicily

    Directory of Open Access Journals (Sweden)

    E. Arnone

    2013-02-01

Changes in rainfall characteristics are one of the most relevant signs of current climate alterations. Many studies have demonstrated an increase in rainfall intensity and a reduction of frequency in several areas of the world, including Mediterranean areas. Rainfall characteristics may be crucial for vegetation pattern formation and evolution in Mediterranean ecosystems, with important implications, for example, in vegetation water stress or coexistence and competition dynamics. At the same time, characteristics of extreme rainfall events are fundamental for the estimation of flood peaks and quantiles which can be used in many hydrological applications, such as design of the most common hydraulic structures, or planning and management of flood-prone areas.

In the past, Sicily has been screened for several signals of possible climate change. Annual, seasonal and monthly rainfall data in the entire Sicilian region have been analyzed, showing a global reduction of total annual rainfall. Moreover, annual maximum rainfall series for different durations have rarely been analyzed in order to detect the presence of trends. Results indicated that for short durations, historical series generally exhibit increasing trends, while for longer durations the trends are mainly negative.

Starting from these premises, the aim of this study is to investigate and quantify changes in rainfall statistics in Sicily during the second half of the last century. Time series from about 60 stations over the region have been processed and screened using the nonparametric Mann–Kendall test.

In particular, extreme events have been analyzed using annual maximum rainfall series at 1, 3, 6, 12 and 24 h durations, while daily rainfall properties have been analyzed in terms of frequency and intensity, also characterizing seasonal rainfall features. Results of the extreme events analysis confirmed an increasing trend for rainfall of short durations…

  5. Rainfall statistics changes in Sicily

    Directory of Open Access Journals (Sweden)

    E. Arnone

    2013-07-01

Changes in rainfall characteristics are one of the most relevant signs of current climate alterations. Many studies have demonstrated an increase in rainfall intensity and a reduction of frequency in several areas of the world, including Mediterranean areas. Rainfall characteristics may be crucial for vegetation pattern formation and evolution in Mediterranean ecosystems, with important implications, for example, in vegetation water stress or coexistence and competition dynamics. At the same time, characteristics of extreme rainfall events are fundamental for the estimation of flood peaks and quantiles that can be used in many hydrological applications, such as design of the most common hydraulic structures, or planning and management of flood-prone areas. In the past, Sicily has been screened for several signals of possible climate change. Annual, seasonal and monthly rainfall data in the entire Sicilian region have been analyzed, showing a global reduction of total annual rainfall. Moreover, annual maximum rainfall series for different durations have rarely been analyzed in order to detect the presence of trends. Results indicated that for short durations, historical series generally exhibit increasing trends, while for longer durations the trends are mainly negative. Starting from these premises, the aim of this study is to investigate and quantify changes in rainfall statistics in Sicily during the second half of the last century. Time series from about 60 stations over the region have been processed and screened by using the nonparametric Mann–Kendall test. In particular, extreme events have been analyzed using annual maximum rainfall series at 1, 3, 6, 12 and 24 h duration, while daily rainfall properties have been analyzed in terms of frequency and intensity, also characterizing seasonal rainfall features. Results of the extreme events analysis confirmed an increasing trend for rainfall of short durations, especially for 1 h rainfall…
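
    Both versions of the study rely on the Mann–Kendall test. A minimal sketch of that test (normal approximation, no tie correction) applied to a hypothetical annual-maximum series with a mild upward drift:

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall S statistic and two-sided p-value via the normal
    approximation (simplified: no correction for tied values)."""
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity corr.
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return s, p

# Hypothetical 60-year annual-maximum 1 h rainfall series (mm) with a trend.
rng = np.random.default_rng(11)
series = rng.gumbel(loc=20.0, scale=5.0, size=60) + 0.1 * np.arange(60)

s, p = mann_kendall(series)
print(f"S = {s:.0f}, two-sided p = {p:.3f}")
```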

  6. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  7. Teaching Statistics in Integration with Psychology

    Science.gov (United States)

    Wiberg, Marie

    2009-01-01

    The aim was to revise a statistics course in order to get the students motivated to learn statistics and to integrate statistics more throughout a psychology course. Further, we wish to make students become more interested in statistics and to help them see the importance of using statistics in psychology research. To achieve this goal, several…

  8. Extreme Wave Statistics within the Mouth of the Columbia River

    Science.gov (United States)

    2014-12-01

…the Suez Canal closure from the Israel-Arab War (1967-1975): a spike in maritime mishaps was noted in the vicinity of the Agulhas, as more ships were… (University Corporation for Atmospheric Research, COMET, cited 2014: Wave Life Cycle II: Propagation & Dispersion, available online at http://www.meted.ucar.edu/marine)

  9. Instantons and Extreme Value Statistics of Random Matrices

    CERN Document Server

    Atkin, Max R

    2014-01-01

We discuss the distribution of the largest eigenvalue of a random N x N Hermitian matrix. Utilising results from the quantum gravity and string theory literature, it is seen that the orthogonal polynomials approach, first introduced by Majumdar and Nadal, can be extended to calculate both the left and right tail large deviations of the maximum eigenvalue. This framework not only provides computational advantages when considering the left and right tail large deviations for general potentials, as is done explicitly for the first multi-critical potential, but it also offers an interesting interpretation of the results. In particular, it is seen that the left tail large deviations follow from a standard perturbative large N expansion of the free energy, while the right tail large deviations are related to the non-perturbative expansion and thus to instanton corrections. Considering the standard interpretation of instantons as tunnelling of eigenvalues, we see that the right tail rate function can be identified…
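
    The typical (small) fluctuations of the largest eigenvalue, as opposed to the large deviations discussed in the abstract, are easy to explore numerically. A minimal sketch sampling GUE matrices and rescaling the top eigenvalue toward its Tracy-Widom limit (matrix size and trial count are arbitrary choices):

```python
import numpy as np

# Sample the largest eigenvalue of N x N GUE matrices; centred at 2*sqrt(N)
# and scaled by N^(1/6), its fluctuations follow the Tracy-Widom (beta=2) law.
rng = np.random.default_rng(13)
N, trials = 100, 500
top = np.empty(trials)
for t in range(trials):
    A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    H = (A + A.conj().T) / 2          # Hermitian; GUE in this normalization
    top[t] = np.linalg.eigvalsh(H).max()

scaled = (top - 2 * np.sqrt(N)) * N ** (1 / 6)
print(f"mean of scaled maximum: {scaled.mean():.3f} "
      "(Tracy-Widom beta=2 mean is about -1.77)")
```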

  10. Mapping of extreme wind speed for landscape modelling of the Bohemian Forest, Czech Republic

    Directory of Open Access Journals (Sweden)

    L. Pop

    2014-01-01

Extreme wind events are among the most damaging weather-related hazards in the Czech Republic, and forestry is heavily affected. In order to successfully run a landscape model dealing with such effects, the spatial distribution of extreme wind speed statistics is needed. The presented method suggests using sector-wise wind field calculations together with extreme value statistics fitted at a reference station. A special algorithm is proposed to provide the data in the form expected by the landscape model, i.e. raster data of annual wind speed maxima. The method is demonstrated on the area of the Bohemian Forest, which represents one of the largest and most compact forested mountain ranges in Central Europe. The reference meteorological station Churáňov is located within the selected domain. Numerical calculations were based on the linear model of the WAsP Engineering methodology. Observations were cleaned of inhomogeneities and classified into convective and non-convective cases using the index CAPE. Due to disjunct sampling of synoptic data, appropriate corrections were applied to the observed extremes. Finally, they were fitted with a Gumbel distribution. The output of the numerical simulation is presented for the windiest direction sector. Another map shows the probability that the annual extreme exceeds a required threshold. The method offers a tool for generating spatially variable annual maxima of wind speed. It assumes a small, limited model domain containing a reliable wind measurement. We believe that this is a typical setup for applications similar to the one presented in the paper.
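
    The station-level building block, fitting a Gumbel distribution to annual wind-speed maxima and converting it to an exceedance probability, can be sketched as follows (simulated data and an assumed 30 m/s threshold, not the study's values):

```python
import numpy as np
from scipy import stats

# Hypothetical annual wind-speed maxima (m/s) at a reference station.
rng = np.random.default_rng(17)
annual_max = rng.gumbel(loc=24.0, scale=3.0, size=40)

# Maximum likelihood Gumbel fit.
loc, scale = stats.gumbel_r.fit(annual_max)

# Probability that the annual maximum exceeds a given threshold, and the
# corresponding return period (reciprocal of the exceedance probability).
threshold = 30.0
p_exceed = stats.gumbel_r.sf(threshold, loc, scale)
print(f"P(annual max > {threshold} m/s) = {p_exceed:.3f}; "
      f"return period ~ {1 / p_exceed:.0f} years")
```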

  11. Dynamic Prediction of Environmental Pollution Contribution Weights in Complex Seas Based on Extreme Analysis of Gray-Order Sequences

    Institute of Scientific and Technical Information of China (English)

    赵海华

    2014-01-01

A dynamic prediction algorithm for environmental pollution contribution weights, based on dynamic hierarchical gray-order sequence analysis with gray target association, is proposed for the dynamic prediction of marine pollution in complex environments, together with related monitoring experiments and simulation studies. Environmental monitoring data are classified and matched by association, and the pollution contribution weights of pollutants are computed without manual intervention; the state of the overall sea area is then simulated and analyzed from the computed model. Experiments and simulations with monitoring data show that the model can accurately predict the pollution status of the sea, locate pollution sources within contaminated regions, and assess the overall pollution of the sea, providing solid data-analysis support and a scientific basis for decisions on marine environmental protection.

  12. Extreme Velocity Wind Sensor

    Science.gov (United States)

    Perotti, Jose; Voska, Ned (Technical Monitor)

    2002-01-01

This presentation provides an overview of the development of a new hurricane wind sensor (Extreme Velocity Wind Sensor) for the Kennedy Space Center (KSC), designed to withstand winds of up to three hundred miles an hour. The proposed Extreme Velocity Wind Sensor contains no moveable components that would be exposed to extreme wind conditions. Topics covered include: the need for a new hurricane wind sensor, conceptual design, software applications, computational fluid dynamic simulations of the design concept, preliminary performance tests, and project status.

  13. How extreme is extreme hourly precipitation?

    Science.gov (United States)

    Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos

    2016-04-01

The importance of accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial in hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behavior of extreme events. In general, and loosely speaking, tails can be categorized into two families: the subexponential and the hyperexponential family, with the first generating more intense and more frequent extremes compared to the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe the observed hourly rainfall extremes better than lighter tails. Traditional representations of the marginal distribution of hourly rainfall may significantly deviate from the observed behaviour of extremes, with direct implications for hydroclimatic variable modelling and engineering design.
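
    A minimal sketch of the tail-comparison idea: fit the three candidate families to simulated positive hourly amounts and compare their exceedance probabilities at the same high level. Data, bin choices and the pinned zero location are assumptions for stability, not the study's fitting protocol:

```python
import numpy as np
from scipy import stats

# Hypothetical positive hourly rainfall amounts (mm); heavy-tailed by design.
rng = np.random.default_rng(19)
rain = stats.lognorm.rvs(s=1.2, scale=2.0, size=5000, random_state=rng)

level = np.quantile(rain, 0.99)  # a high level at which to compare tails
print(f"empirical P(X > level) = {(rain > level).mean():.4f}")

# Fit three tail families and compare fitted exceedance probabilities at
# the same level; the location parameter is fixed at 0 for stability.
for name, dist in [("Pareto", stats.pareto),
                   ("Lognormal", stats.lognorm),
                   ("Weibull", stats.weibull_min)]:
    params = dist.fit(rain, floc=0)
    print(f"{name:9s} fitted P(X > level) = {dist.sf(level, *params):.4f}")
```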

  14. Elementary statistical physics

    CERN Document Server

    Kittel, Charles

    2004-01-01

Noteworthy for the philosophical subtlety of its foundations and the elegance of its problem-solving methods, statistical mechanics can be employed in a broad range of applications - among them, astrophysics, biology, chemistry, nuclear and solid state physics, communications engineering, metallurgy, and mathematics. Geared toward graduate students in physics, this text covers such important topics as stochastic processes and transport theory in order to provide students with a working knowledge of statistical mechanics. To explain the fundamentals of his subject, the author uses the method of…

  15. Statistics on the distribution of contracts and purchase orders by country for the LHC Project based on payments and outstanding commitments for the period 1 January 1995 to 31 March 2000 (including adjudications by the Finance Committee up to March 2000).

    CERN Document Server

    2000-01-01

    Statistics on the distribution of contracts and purchase orders by country for the LHC Project based on payments and outstanding commitments for the period 1 January 1995 to 31 March 2000 (including adjudications by the Finance Committee up to March 2000).

  16. Extreme winds in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Kristensen, L.; Rathmann, O.; Hansen, S.O.

    1999-02-01

Wind-speed data from four sites in Denmark have been analyzed in order to obtain estimates of the basic wind velocity, which is defined as the 50-year wind speed under standard conditions, i.e. ten-minute averages at a height of 10 m over a uniform terrain with a roughness length of 0.05 m. The sites are, from west, Skjern (15 years), Kegnaes (7 years), Sprogoe (20 years), and Tystofte (15 years). The data are ten-minute averages of wind speed, wind direction, temperature and pressure. The last two quantities are used to determine the air density ρ. The data are cleaned for terrain effects by means of a slightly modified WASP technique, where the sector speed-up factors and roughness lengths are linearly smoothed with a direction resolution of one degree. Assuming geostrophic balance, all the wind-velocity data are transformed to friction velocity u* and direction at standard conditions by means of the geostrophic drag law for neutral stratification. The basic wind velocity in 30 deg. sectors is obtained through ranking of the largest values of the friction velocity pressure ½ρu*², taken both once every two months and once every year. The main conclusion is that the basic wind velocity is significantly larger at Skjern, close to the west coast of Jutland, than at any of the other sites. Irrespective of direction, the present standard estimates of the 50-year wind are 25 ± 1 m/s at Skjern and 22 ± 1 m/s at the other three sites. These results are in agreement with those obtained by Jensen and Franck (1970) and Abild (1994) and support the conclusion that the wind climate at the west coast of Jutland is more extreme than in any other part of the country. Simple procedures to translate, in a particular direction sector, the standard basic wind velocity to conditions with a different roughness length and height are presented. It is shown that a simple scheme makes it possible to calculate the total 50-year extreme load on a general structure without…

  17. Word Order

    DEFF Research Database (Denmark)

    Rijkhoff, Jan

    2015-01-01

    The way constituents are ordered in a linguistic expression is determined by general principles and language specific rules. This article is mostly concerned with general ordering principles and the three main linguistic categories that are relevant for constituent order research: formal, functio...

  18. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: One, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme tha...

  19. The impact of climate extremes on US agricultural production and the buffering impacts of irrigation

    Science.gov (United States)

    Troy, Tara J.; Kipgen, Chinpihoi; Pal, Indrani

    2014-05-01

    In recent years, droughts and floods have occurred over many of the major growing regions of the world, resulting in decreased agricultural production and increased global food prices. Many climate projections call for more frequent extreme events, which could have significant impacts on agricultural yields and water resources in irrigated agricultural regions. In order to better understand the potential impact of climate extremes and the spatial heterogeneity of those impacts, we examine the associations between climate and irrigated and rain-fed crop yields, focusing on four main staple crops: wheat, rice, soy, and maize. Because the United States has high-spatial-resolution data for both yields and weather variables, the analysis focuses on the impact of multiple extremes on these four crops in the US, using statistical methods that do not require any assumptions about functional relationships between yields and weather variables. Irrigated and rain-fed agricultural yields are analyzed separately to understand the role irrigation plays, either as a buffer against climate variability and extremes such as drought, heat waves, and extended dry spells, or as a mechanism that leads to varied relationships between climate extremes and yield fluctuations. The results demonstrate that irrigation has varying effects depending on the region, growing season timing, crop type, and type of climate extreme. This work has important implications for future planning of the coupled water-food system and its vulnerabilities to climate.

  20. Non-Gaussian Information-Theoretical Analytics for Diagnostic and Inference of Hydroclimatic Extremes

    Science.gov (United States)

    Pires, Carlos A. L.; Perdigão, Rui A. P.

    2016-04-01

    Hydroclimatic spatiotemporal distributions exhibit significant non-Gaussianity, with particular emphasis on overweight extremes, rendering their diagnosis and inference suboptimal with traditional statistical techniques. In order to overcome that limitation, we introduce and discuss a set of information-theoretic methodologies for statistical diagnosis and inference based on explanatory variables of the general atmospheric and oceanic circulation in the case of non-Gaussian joint probability distributions. Moreover, the nonlinear information among various large-scale ocean-atmospheric processes is explored, bringing added predictability to elusive weather and hydrologic extremes relative to the current state of the art in nonlinear geophysics. The methodologies are illustrated with the analysis and prediction of resonant ocean-atmospheric thermodynamic anomaly spells underlying high-profile floods and droughts.

  1. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at the country level as well as at the international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market of oil -- which is the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  2. Statistical distributions of air pollution concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Georgopoulos, P.G.; Seinfeld, J.H.

    1982-07-01

    Methodologies and limitations in describing air quality through statistical distributions of pollutant concentrations are discussed, and the use of extreme statistics in the evaluation of different forms of air quality standards is explained. In addition, the interpretation of rollback calculations with regard to air quality standards is discussed. (JMT)

  3. Extreme winds in the Western North Pacific

    DEFF Research Database (Denmark)

    Ott, Søren

    2006-01-01

    A statistical model for extreme winds in the western North Pacific is developed, the region on the planet where tropical cyclones are most common. The model is based on best track data derived mostly from satellite images of tropical cyclones. The methods used to estimate surface wind speeds from...

  4. Measuring extremal dependencies in web graphs

    NARCIS (Netherlands)

    Volkovich, Y.; Litvak, Nelli; Zwart, B.

    2008-01-01

    We analyze dependencies in power law graph data (Web sample, Wikipedia sample and a preferential attachment graph) using statistical inference for multivariate regular variation. The well developed theory of regular variation is widely applied in extreme value theory, telecommunications and mathemat

  5. Statistical physics ""Beyond equilibrium

    Energy Technology Data Exchange (ETDEWEB)

    Ecke, Robert E [Los Alamos National Laboratory

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  6. How does public opinion become extreme?

    CERN Document Server

    Ramos, Marlon; Reis, Saulo D S; Anteneodo, Celia; Andrade, José S; Havlin, Shlomo; Makse, Hernán A

    2014-01-01

    We investigate the emergence of extreme opinion trends in society by employing statistical physics modeling and analysis on polls that inquire about a wide range of issues such as religion, economics, politics, abortion, extramarital sex, books, movies, and electoral vote. The surveys lay out a clear indicator of the rise of extreme views. The precursor is a nonlinear relation between the fraction of individuals holding a certain extreme view and the fraction of individuals that also includes moderates, e.g., in politics, those who are "very conservative" versus "moderate to very conservative" ones. We propose an activation model of opinion dynamics with interaction rules based on the existence of individual "stubbornness" that mimics the empirical observations. According to our modeling, the onset of nonlinearity can be associated with an abrupt bootstrap-percolation transition with cascades of extreme views through society. Therefore, it represents an early-warning signal to forecast the transition from moderate ...

  7. Extreme Value distribution for singular measures

    CERN Document Server

    Faranda, Davide; Turchetti, Giorgio; Vaienti, Sandro

    2011-01-01

    In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block maxima approach described in Faranda et al. [2011] we show that, numerically, the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, where the parameters scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors like the Lozi and Hénon maps a slower convergence to the Generalised Extreme Value distribution is observed. Even in the presence of large statistics the observed convergence is slower when compared with maps which have an absolutely continuous invariant measure. Nevertheless, and within the computed uncertainty range, the results are in good agree...
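
    A minimal block-maxima experiment in the spirit of the approach described above, assuming the standard Hénon map and the common distance observable g = -log||x_n - x*||; the reference point, orbit length and block size are illustrative choices, not those of the paper.

        # Sketch: block-maxima extreme value analysis for a chaotic map, with the
        # observable g = -log ||x_n - x*|| for a reference point x* (assumption).
        import numpy as np
        from scipy import stats

        def henon(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
            xs = np.empty((n, 2))
            x, y = x0, y0
            for i in range(n):
                x, y = 1.0 - a * x * x + y, b * x
                xs[i] = (x, y)
            return xs

        orbit = henon(500_000)[1000:]                  # drop the transient
        x_star = henon(2_000, x0=0.2, y0=0.05)[-1]     # reference point from a separate run
        g = -np.log(np.linalg.norm(orbit - x_star, axis=1))

        block = 5_000                                  # block size (illustrative)
        m = len(g) // block
        maxima = g[:m * block].reshape(m, block).max(axis=1)

        c, loc, scale = stats.genextreme.fit(maxima)
        print(f"fitted GEV shape (scipy convention c = -xi): {c:.3f}")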

  8. Second order Standard Model

    Directory of Open Access Journals (Sweden)

    Johnny Espin

    2015-06-01

    Full Text Available It is known, though not commonly, that one can describe fermions using a Lagrangian that is second order in derivatives instead of the first-order Dirac one. In this description the propagator is scalar, and the complexity is shifted to the vertex, which contains a derivative operator. In this paper we rewrite the Lagrangian of the fermionic sector of the Standard Model in such second-order form. The new Lagrangian is extremely compact, and is obtained from the usual first-order Lagrangian by integrating out all primed (or dotted) 2-component spinors. It thus contains just half of the 2-component spinors that appear in the usual Lagrangian, which suggests a new perspective on unification. We sketch a unified theory based on SU(2)×SU(4)⊂SO(9) that is natural in this framework.

  9. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  10. Quasinormal modes of extremal BTZ black hole

    Energy Technology Data Exchange (ETDEWEB)

    Crisostomo, Juan; Lepe, Samuel; Saavedra, Joel [Instituto de FIsica, Facultad de Ciencias Basicas y Matematicas, Pontificia Universidad Catolica de ValparaIso, Avenida Brasil 2950, ValparaIso (Chile)

    2004-06-21

    Motivated by several pieces of evidence that extremal black holes cannot be obtained as limits of non-extremal black holes, in this paper we explicitly calculate the quasinormal modes for the Banados, Teitelboim and Zanelli (BTZ) extremal black hole and show that the imaginary part of the frequency is zero. We obtain exact results for the scalar and fermionic perturbations. We also show that the frequency is bounded from below for the existence of the normal modes (non-dissipative modes).

  11. Moving in extreme environments

    DEFF Research Database (Denmark)

    Lucas, Samuel J E; Helge, Jørn W; Schütz, Uwe H W;

    2016-01-01

    This review addresses human capacity for movement in the context of extreme loading and with it the combined effects of metabolic, biomechanical and gravitational stress on the human body. This topic encompasses extreme duration, as occurs in ultra-endurance competitions (e.g. adventure racing...... and transcontinental races) and expeditions (e.g. polar crossings), to the more gravitationally limited load carriage (e.g. in the military context). Juxtaposed to these circumstances is the extreme metabolic and mechanical unloading associated with space travel, prolonged bedrest and sedentary lifestyle, which may...

  12. Extremal surface barriers

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, Netta; Wall, Aron C. [Department of Physics, University of California,Santa Barbara, CA 93106 (United States)

    2014-03-13

    We present a generic condition for Lorentzian manifolds to have a barrier that limits the reach of boundary-anchored extremal surfaces of arbitrary dimension. We show that any surface with nonpositive extrinsic curvature is a barrier, in the sense that extremal surfaces cannot be continuously deformed past it. Furthermore, the outermost barrier surface has nonnegative extrinsic curvature. Under certain conditions, we show that the existence of trapped surfaces implies a barrier, and conversely. In the context of AdS/CFT, these barriers imply that it is impossible to reconstruct the entire bulk using extremal surfaces. We comment on the implications for the firewall controversy.

  13. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ... (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  14. Extreme environments and exobiology.

    Science.gov (United States)

    Friedmann, E I

    1993-01-01

    Ecological research on extreme environments can be applied to exobiological problems such as the question of life on Mars. If life forms (fossil or extant) are found on Mars, their study will help to solve fundamental questions about the nature of life on Earth. Extreme environments that are beyond the range of adaptability of their inhabitants are defined as "absolute extreme". Such environments can serve as terrestrial models for the last stages of life in the history of Mars, when the surface cooled down and atmosphere and water disappeared. The cryptoendolithic microbial community in porous rocks of the Ross Desert in Antarctica and the microbial mats at the bottom of frozen Antarctic lakes are such examples. The microbial communities of Siberian permafrost show that, in frozen but stable communities, long-term survival is possible. In the context of terraforming Mars, selected microorganisms isolated from absolute extreme environments are considered for use in creation of a biological carbon cycle.

  15. Venous Ultrasound (Extremities)

    Science.gov (United States)

    Venous ultrasound uses sound waves to ... limitations of Venous Ultrasound Imaging? What is Venous Ultrasound Imaging? Ultrasound is safe and painless, and produces ...

  16. Transforming Elementary Statistics To Enhance Student Learning.

    Science.gov (United States)

    Lane, Jill L.; Aleksic, Maja

    Undergraduate students often leave statistics courses not fully understanding how to apply statistical concepts (M. Bonsangue, 1994). In order to enhance student learning and improve the understanding and application of statistical concepts, an elementary statistics course was transformed from a lecture-based course into one that integrates…

  17. Extreme Geomagnetic Storms - 1868 - 2010

    Science.gov (United States)

    Vennerstrom, S.; Lefevre, L.; Dumbović, M.; Crosby, N.; Malandraki, O.; Patsou, I.; Clette, F.; Veronig, A.; Vršnak, B.; Leer, K.; Moretto, T.

    2016-05-01

    We present the first large statistical study of extreme geomagnetic storms based on historical data from the time period 1868 - 2010. This article is the first of two companion papers. Here we describe how the storms were selected and focus on their near-Earth characteristics. The second article presents our investigation of the corresponding solar events and their characteristics. The storms were selected based on their intensity in the aa index, which constitutes the longest existing continuous series of geomagnetic activity. They are analyzed statistically in the context of more well-known geomagnetic indices, such as the Kp and Dcx/Dst index. This reveals that neither Kp nor Dcx/Dst provide a comprehensive geomagnetic measure of the extreme storms. We rank the storms by including long series of single magnetic observatory data. The top storms on the rank list are the New York Railroad storm occurring in May 1921 and the Quebec storm from March 1989. We identify key characteristics of the storms by combining several different available data sources, lists of storm sudden commencements (SSCs) signifying occurrence of interplanetary shocks, solar wind in-situ measurements, neutron monitor data, and associated identifications of Forbush decreases as well as satellite measurements of energetic proton fluxes in the near-Earth space environment. From this we find, among other results, that the extreme storms are very strongly correlated with the occurrence of interplanetary shocks (91 - 100 %), Forbush decreases (100 %), and energetic solar proton events (70 %). A quantitative comparison of these associations relative to less intense storms is also presented. Most notably, we find that most often the extreme storms are characterized by a complexity that is associated with multiple, often interacting, solar wind disturbances and that they frequently occur when the geomagnetic activity is already elevated. We also investigate the semiannual variation in storm occurrence

  18. Rainfall Variability and the Recent Climate Extremes in Nigeria ...

    African Journals Online (AJOL)

    Weather patterns affecting the country are driven by the northward and southward ... Climatic and statistical analyses were employed to investigate two extreme ... of Nigeria have suffered from inter-annual to seasonal climatic variabilities and ...

  19. Analysis of multivariate extreme intakes of food chemicals

    NARCIS (Netherlands)

    Paulo, M.J.; Voet, van der H.; Wood, J.C.; Marion, G.R.; Klaveren, van J.D.

    2006-01-01

    A recently published multivariate Extreme Value Theory (EVT) model [Heffernan, J.E., Tawn, J.A., 2004. A conditional approach for multivariate extreme values (with discussion). Journal of the Royal Statistical Society Series B 66 (3), 497-546] is applied to the estimation of population risks associa

  20. Extreme wind conditions for a Danish offshore site

    DEFF Research Database (Denmark)

    Hansen, Kurt S.

    2000-01-01

    This paper presents an analysis of extreme wind speed gust values measured at a shallow water offshore site and at a coastal onshore site in Denmark. An estimate of 50-year extreme values has been evaluated using a new statistical method. In addition a mean gust shape is determined, based on a la...

  1. Extreme and superextreme events in a loss-modulated CO2 laser: Nonlinear resonance route and precursors

    Science.gov (United States)

    Bonatto, Cristian; Endler, Antonio

    2017-07-01

    We investigate the occurrence of extreme and rare events, i.e., giant and rare light pulses, in a periodically modulated CO2 laser model. Due to nonlinear resonant processes, we show a scenario of interaction between chaotic bands of different orders, which may lead to the formation of extreme and rare events. We identify a crisis line in the modulation parameter space, and we show that, when the modulation amplitude increases, remaining in the vicinity of the crisis, some statistical properties of the laser pulses, such as the average and dispersion of amplitudes, do not change much, whereas the amplitude of extreme events grows enormously, giving rise to extreme events with much larger deviations than usually reported, with a significant probability of occurrence, i.e., with a long-tailed non-Gaussian distribution. We identify recurrent regular patterns, i.e., precursors, that anticipate the emergence of extreme and rare events, and we associate these regular patterns with unstable periodic orbits embedded in a chaotic attractor. We show that the precursors may or may not lead to the emergence of extreme events. Thus, we compute the probability of success or failure (false alarm) in the prediction of the extreme events, once a precursor is identified in the deterministic time series. We show that this probability depends on the accuracy with which the precursor is identified in the laser intensity time series.

  2. Impact of climate extremes on flowering dates of four shrub species

    Science.gov (United States)

    Siegmund, Jonatan; Wiedermann, Marc; Donges, Jonathan; Donner, Reik

    2016-04-01

    Ongoing climate change is known to cause an increase in the frequency and amplitude of local temperature and precipitation extremes in central Europe. While gradual changes in climatological conditions are known to strongly influence plant flowering dates, the question arises if and how extremes specifically impact the timing of this important phenological phase. In this study, we systematically quantify simultaneities between meteorological extremes and the timing of flowering of four shrub species across Germany by means of event coincidence analysis, a novel statistical tool that allows assessing whether or not two types of events exhibit similar sequences of occurrences. Additionally, we perform a superposed epoch analysis in order to investigate the impact of different magnitudes of extremes and to assess possible long-term influences. Our systematic investigation supports previous findings of experimental studies by highlighting the impact of early spring temperatures on the flowering of wild plants. In addition, we find statistically significant indications of some long-term relations reaching back to the previous year.
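
    A toy sketch of the basic quantity in event coincidence analysis, the fraction of flowering events preceded by a meteorological extreme within a given window; the event series and window length are synthetic assumptions.

        # Sketch: basic event coincidence analysis -- the fraction of events in
        # series B preceded by at least one event in series A within a window dT.
        # Day indices and the window length are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        days = 50 * 365
        a = np.sort(rng.choice(days, size=60, replace=False))   # e.g. warm extremes
        b = np.sort(rng.choice(days, size=40, replace=False))   # e.g. early flowering
        dT = 30                                                  # days

        def precursor_rate(a, b, dT):
            # share of b-events with an a-event in the interval (b - dT, b]
            hits = sum(np.any((a <= eb) & (a > eb - dT)) for eb in b)
            return hits / len(b)

        print(f"precursor coincidence rate: {precursor_rate(a, b, dT):.2f}")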

  3. Magnetic storms and solar flares: can be analysed within similar mathematical framework with other extreme events?

    Science.gov (United States)

    Balasis, Georgios; Potirakis, Stelios M.; Papadimitriou, Constantinos; Zitis, Pavlos I.; Eftaxias, Konstantinos

    2015-04-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We apply concepts of nonextensive statistical physics to time-series data of observable manifestations of the underlying complex processes ending up in different extreme events, in order to support the suggestion that a dynamical analogy characterizes the generation of a single magnetic storm, solar flare, earthquake (in terms of pre-seismic electromagnetic signals), epileptic seizure, and economic crisis. The analysis reveals that all the above mentioned different extreme events can be analyzed within a similar mathematical framework. More precisely, we show that the populations of magnitudes of fluctuations included in all the above mentioned pulse-like time series follow the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar nonextensive q-parameter values. Moreover, based on a multidisciplinary statistical analysis, we show that the extreme events are characterized by crucial common symptoms, namely: (i) high organization, high compressibility, low complexity, high information content; (ii) strong persistency; and (iii) the existence of a clear preferred direction of the emerging activities. These symptoms clearly discriminate the appearance of the extreme events under study from the corresponding background noise.

  4. Extreme Programming: Maestro Style

    Science.gov (United States)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2009-01-01

    "Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme

  5. Distribution of extreme rainfall events over Ebro River basin

    Science.gov (United States)

    Saa, Antonio; Tarquis, Ana Maria; Valencia, Jose Luis; Gascó, Jose Maria

    2010-05-01

    The purpose of this work is to provide a statistical description of the heavy rainfall phenomenon for a Spanish region. We want to quantify the effect of climate change by verifying the speed of its evolution through the variation of the probability distributions. Our conclusions are of special interest for agrarian insurance, where they may make cost estimates more realistic. The analysis mainly focuses on: (i) The distribution of consecutive days without rain for each gauge station and season. We estimate kernel density functions and the Generalized Pareto Distribution (GPD) for a network of stations from the Ebro River basin above a threshold value u, and we can establish a relation between distributional parameters and regional characteristics. Moreover, we analyze in particular the tail of the probability distribution. These tails are governed by a power law, meaning that the number of events n can be expressed as a power of another quantity x: n(x) = x^(-α), where α can be estimated as the slope of a log-log plot of the number of events against size. The most convenient way to analyze n(x) is through the empirical probability distribution, Pr(X > x) ∝ x^(-α). (ii) The distribution of rainfall above the 0.95 percentile of wet days, at the seasonal and yearly scales, with the same treatment of the tails as in the previous part. (iii) The evolution of the distribution over the second half of the XXth century and its impact on extreme value models. The analyses reveal no appreciable difference in the distributions over time, which suggests that this region shows no increase in extreme values, either in the number of consecutive dry days or in rainfall amounts. References: Coles, S. (2001). An Introduction to Statistical Modeling of Extreme Values. Springer-Verlag. Krishnamoorthy, K. (2006). Handbook of Statistical Distributions with Applications. Chapman & Hall/CRC. Bodini, A., Cossu, A. (2010). Vulnerability assessment
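
    A small sketch of the tail-exponent estimate described above: the slope of the empirical survival function on log-log axes recovers α for a synthetic Pareto sample. This is a rough estimator, shown only to illustrate the idea.

        # Sketch: estimating a power-law tail exponent alpha from the slope of
        # the empirical survival function on log-log axes, Pr(X > x) ~ x**(-alpha).
        import numpy as np

        rng = np.random.default_rng(3)
        x = (1.0 - rng.random(5000)) ** (-1.0 / 2.5)   # Pareto sample, alpha = 2.5

        xs = np.sort(x)
        surv = 1.0 - np.arange(1, len(xs) + 1) / (len(xs) + 1.0)   # empirical P(X > x)

        tail = xs > np.quantile(xs, 0.90)              # fit on the upper 10 % only
        slope, intercept = np.polyfit(np.log(xs[tail]), np.log(surv[tail]), 1)
        print(f"estimated alpha: {-slope:.2f}")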

  6. Projection and simulation of climate extremes over the Yangtze and Huaihe River Basins based on a Statistical Downscaling Model

    Institute of Scientific and Technical Information of China (English)

    陈威霖; 江志红; 黄强

    2012-01-01

    Based on observed daily maximum and minimum temperatures and daily precipitation data at 29 meteorological stations in the Yangtze and Huaihe River Basins, together with daily NCEP reanalysis data, the statistical downscaling model (SDSM), a combination of multiple linear regression and a stochastic weather generator, has been calibrated at each station. Subsequently, the statistical downscaling model is applied to construct scenarios of climate extremes at the end of the 21st century, using predictor sets simulated by two GCMs (HadCM3 and CGCM3; hereafter SDSM-HadCM3 and SDSM-CGCM3, respectively) forced by the Special Report on Emissions Scenarios (SRES) A2 scenario. Future scenarios of climate extremes such as heat waves and intense precipitation in the 21st century at the 29 meteorological stations are constructed. The evaluation of the simulated extreme indices of temperature and precipitation for the current climate shows that the downscaled temperature-related indices match the observations well, and SDSM can correct the systematic cold biases of the AOGCMs. For example, compared with the raw CGCM3, SDSM reduces the "cold bias" of the winter maximum and minimum temperatures by 3 ℃ and 4.5 ℃, respectively. For indices of precipitation extremes, most AOGCMs tend to underestimate the intensity, but SDSM improves this significantly. The bias of the simple daily intensity index (ISDI) in summer for SDSM is -6%, while that of CGCM3 is as high as -60.6%. Overall, compared with the AOGCMs, the downscaling model really has "added value". Scenario results using A2 emissions show that in all seasons there is a significant increase of mean daily maximum and minimum temperature at the 29 meteorological stations, accompanied by a decrease in the number of frost days and an increase in heat wave duration. For instance, the number of frost days in winter is projected to decrease significantly
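
    A stripped-down sketch of the regression core of an SDSM-type model: multiple linear regression from large-scale predictors plus a resampled-residual stochastic component. The predictors, the station series and the "future" fields are synthetic stand-ins; the real SDSM calibration is considerably richer.

        # Sketch: regression-based statistical downscaling with a simple
        # stochastic component (resampled residuals). All data are synthetic.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 3650                                   # days
        X = rng.normal(size=(n, 3))                # e.g. SLP, T850, specific humidity
        beta_true = np.array([1.2, -0.5, 0.8])
        t_station = X @ beta_true + rng.normal(scale=1.5, size=n)

        A = np.column_stack([np.ones(n), X])       # add intercept
        coef, *_ = np.linalg.lstsq(A, t_station, rcond=None)
        resid = t_station - A @ coef

        X_future = rng.normal(loc=0.3, size=(n, 3))          # GCM-simulated predictors
        A_future = np.column_stack([np.ones(n), X_future])
        t_downscaled = A_future @ coef + rng.choice(resid, size=n)  # stochastic part
        print("downscaled mean shift:", t_downscaled.mean() - t_station.mean())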

  7. Extreme Weather Events and Climate Change Attribution

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Katherine [National Academy of Sciences, Washington, DC (United States)

    2016-03-31

    A report from the National Academies of Sciences, Engineering, and Medicine concludes that it is now possible to estimate the influence of climate change on some types of extreme events. The science of extreme event attribution has advanced rapidly in recent years, giving new insight into the ways that human-caused climate change can influence the magnitude or frequency of some extreme weather events. This report examines the current state of the science of extreme weather attribution, and identifies ways to move the science forward to improve attribution capabilities. Confidence is strongest in attributing types of extreme events that are influenced by climate change through a well-understood physical mechanism, such as the more frequent heat waves that are closely connected to human-caused global temperature increases, the report finds. Confidence is lower for other types of events, such as hurricanes, whose relationship to climate change is more complex and less understood at present. For any extreme event, the results of attribution studies hinge on how questions about the event's causes are posed, and on the data, modeling approaches, and statistical tools chosen for the analysis.

  8. Narcissism and birth order.

    Science.gov (United States)

    Eyring, W E; Sobelman, S

    1996-04-01

    The purpose of this investigation was to clarify the relationship between birth-order position and the development of narcissism, while refining research and theory. The relationship between birth-order status and narcissism was examined with a sample of 79 undergraduate students (55 women and 24 men). These subjects were placed in one of the four following birth-order categories of firstborn, second-born, last-born, and only children. These categories were chosen given their significance in Adlerian theory. Each subject completed the Narcissistic Personality Inventory and a demographic inventory. Based on psychodynamic theory, it was hypothesized that firstborn children were expected to score highest, but statistical significance was not found for an association between narcissism and birth order. Further research is urged to investigate personality theory as it relates to parenting style and birth order.

  9. Risk assessment of precipitation extremes in northern Xinjiang, China

    Science.gov (United States)

    Yang, Jun; Pei, Ying; Zhang, Yanwei; Ge, Quansheng

    2017-04-01

    This study was conducted using daily precipitation records gathered at 37 meteorological stations in northern Xinjiang, China, from 1961 to 2010. We used extreme value theory models, the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), to fit precipitation extremes for different return periods, in order to estimate the risks of precipitation extremes and to diagnose aridity-humidity environmental variation and the corresponding spatial patterns in northern Xinjiang. Spatiotemporal patterns of daily maximum precipitation showed that the aridity-humidity conditions of northern Xinjiang could be well represented by the return periods of the precipitation data. Indices of daily maximum precipitation were effective in the prediction of floods in the study area. By analyzing future projections of daily maximum precipitation (2, 5, 10, 30, 50, and 100 years), we conclude that the flood risk will gradually increase in northern Xinjiang. GEV modeling yielded the best results, proving extremely valuable: in the example analysis of extreme precipitation models, the GEV statistical model was superior in reproducing extreme precipitation. The GPD model calculation results reflect annual precipitation; for the 2- and 5-year return periods at most of the estimated sites, GPD results were slightly greater than GEV results. The study found that extreme precipitation reaching a certain limit will cause a flood disaster. Therefore, predicting future extreme precipitation may aid warnings of flood disaster. A suitable policy concerning effective water resource management is thus urgently required.
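
    A minimal sketch of the GEV side of such an analysis: fit annual maxima with scipy.stats.genextreme (whose shape parameter c equals -ξ in the usual GEV notation) and read off the return levels used above. The annual maxima here are synthetic.

        # Sketch: GEV-based return levels for annual maximum daily precipitation.
        # Synthetic annual maxima stand in for the 37-station records.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        ann_max = stats.genextreme.rvs(c=-0.1, loc=25, scale=8, size=50,
                                       random_state=rng)   # mm, stand-in data

        c, loc, scale = stats.genextreme.fit(ann_max)
        for T in (2, 5, 10, 30, 50, 100):
            level = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
            print(f"{T:4d}-yr return level: {level:6.1f} mm")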

  10. EXTREMAL CONTROL FOR PHOTOVOLTAIC PANELS

    Directory of Open Access Journals (Sweden)

    Genevieve DAPHIN TANGUY

    2016-11-01

    Full Text Available In this paper a methodology for the extremal control of photovoltaic panels has been designed, based on an embedded polynomial controller and robust design approaches and algorithms. A framework for testing solar trackers in a hardware-in-the-loop (HIL) configuration has also been established. Efficient gradient-based optimization methods were put in place to determine the parameters of the photovoltaic panel, as well as to compute the Maximum Power Point (MPP). Further, a numerical RST controller has been computed in order to allow the panel to follow the movement of the sun and obtain maximum energetic efficiency. A robustness analysis and correction procedure has been applied to the RST polynomial algorithm. The hardware-in-the-loop configuration provides a test and development platform which can be used to improve the current design and to test different control approaches. For this, a microcontroller-based solution was chosen. The achieved performances of the closed-loop photovoltaic panel (PP) system are validated in simulation using the MATLAB/SIMULINK environment and the WinPim & WinReg dedicated software. As will be seen further in this paper, the extremal control in this design resides in a sequential set of computations used to obtain the new Maximum Power Point after each change in the system.
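
    The paper's RST polynomial controller and HIL platform are not reproduced here; the sketch below only illustrates the general idea of a gradient-based search for the Maximum Power Point, on an assumed, much simplified P(V) curve.

        # Sketch: gradient-based MPP search on an assumed power-voltage curve
        # (the RST controller and HIL setup of the paper are not shown).
        import numpy as np

        def panel_power(v, v_oc=40.0, i_sc=8.0):
            # crude illustrative P(V) curve: current collapses near open circuit
            i = i_sc * (1.0 - np.exp((v - v_oc) / 3.0))
            return max(v * i, 0.0)

        v, step, eps = 20.0, 0.05, 1e-3
        for _ in range(300):
            grad = (panel_power(v + eps) - panel_power(v - eps)) / (2.0 * eps)
            v += step * grad                     # hill-climb toward dP/dV = 0
        print(f"estimated MPP: V = {v:.2f} V, P = {panel_power(v):.1f} W")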

  11. Extreme weather events and infectious disease outbreaks.

    Science.gov (United States)

    McMichael, Anthony J

    2015-01-01

    Human-driven climatic changes will fundamentally influence patterns of human health, including infectious disease clusters and epidemics following extreme weather events. Extreme weather events are projected to increase further with the advance of human-driven climate change. Both recent and historical experiences indicate that infectious disease outbreaks very often follow extreme weather events, as microbes, vectors and reservoir animal hosts exploit the disrupted social and environmental conditions of extreme weather events. This review article examines infectious disease risks associated with extreme weather events; it draws on recent experiences including Hurricane Katrina in 2005 and the 2010 Pakistan mega-floods, and historical examples from previous centuries of epidemics and 'pestilence' associated with extreme weather disasters and climatic changes. A fuller understanding of climatic change, the precursors and triggers of extreme weather events and health consequences is needed in order to anticipate and respond to the infectious disease risks associated with human-driven climate change. Post-event risks to human health can be constrained, nonetheless, by reducing background rates of persistent infection, preparatory action such as coordinated disease surveillance and vaccination coverage, and strengthened disaster response. In the face of changing climate and weather conditions, it is critically important to think in ecological terms about the determinants of health, disease and death in human populations.

  12. Time series requirements and trends of temperature and precipitation extremes over Italy

    Science.gov (United States)

    Fioravanti, Guido; Desiato, Franco; Fraschetti, Piero; Perconti, Walter; Piervitali, Emanuela

    2013-04-01

    Extreme climate events have strong impacts on society and the economy; accordingly, knowledge of their long-period trends is crucial for the definition and implementation of a national adaptation strategy to climate change. The Research Programme on Climate Variability and Predictability (CLIVAR) identified a set of temperature and precipitation indices suited to investigating variability and trends of climate extremes. It is well known that the calculation of extreme indices is more demanding than that of first- and second-order statistics: daily temperature and precipitation data are required, and strict constraints in terms of continuity and completeness must be met. In addition, possible inhomogeneities affecting the time series must be identified and adjusted before the indices are calculated. When metadata are not available, statistical methods can provide scientists relevant support for homogeneity checking; however, ad-hoc (sometimes subjective) decision criteria must be applied whenever different statistical homogeneity tests give contradictory results. In this work, a set of daily (minimum and maximum) temperature and precipitation time series for the period 1961-2011 was selected so as to guarantee a fairly uniform spatial distribution of the stations over the Italian territory, according to the aforesaid continuity and completeness criteria. Following the method described by Vincent, the homogeneity check of the temperature time series was run at the annual level. Two well-documented tests were employed (F-test and T-test), both implemented in the free R package RHtestV3. The Vincent method was also used for a further investigation of time series homogeneity. Inhomogeneous temperature series were discarded. For the precipitation series, no homogeneity check was run. The selected series were used at the daily level to calculate a reliable set of extreme indices. For each station, a linear model was employed for the estimation of index trends. Finally, single station results were

  13. Frequency Analysis of High Flow Extremes in the Yingluoxia Watershed in Northwest China

    Directory of Open Access Journals (Sweden)

    Zhanling Li

    2016-05-01

    Full Text Available Statistical modeling of hydrological extremes is significant to the construction of hydraulic engineering works. This paper, taking the Yingluoxia watershed as the study area, compares the annual maximum (AM) series and the peaks-over-threshold (POT) series in order to study the hydrological extremes, examines the stationarity and independence assumptions for the two series, and discusses the estimates and uncertainties of return levels from the two series using the Generalized Extreme Value (GEV) and Generalized Pareto distribution (GPD) models. For comparison, the return levels from all threshold excesses with the extremal index taken into account are also estimated. For the POT series, the threshold is selected by examining the mean excess plot and the stability of the parameter estimates, and by common sense. The serial correlation is reduced by filtering out a set of dependent threshold excesses. Results show that both series are approximately stationary and independent. The GEV model fits the AM series well and the GPD model fits the POT series well. The estimated return levels are fairly comparable for the AM series, the POT series, and all threshold excesses with the extremal index considered, with differences of less than 10% for return periods longer than 10 years. The uncertainties of the estimated return levels are highest for the AM series, followed by the POT series and then the all-threshold-excesses series.
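
    A compact sketch of the POT side of the comparison: fit a GPD to threshold excesses and convert to T-year return levels via x_T = u + (σ/ξ)((λT)^ξ - 1), with λ the mean annual number of exceedances. The data, threshold and the assumption ξ ≠ 0 are illustrative.

        # Sketch: return levels from a peaks-over-threshold / GPD fit.
        # Formula assumes the fitted shape xi (scipy's c) is nonzero.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        years = 40
        q = rng.lognormal(mean=3.0, sigma=0.6, size=years * 365)   # daily discharge

        u = np.quantile(q, 0.98)
        exc = q[q > u] - u
        lam = len(exc) / years                    # exceedances per year

        c, _, scale = stats.genpareto.fit(exc, floc=0)   # scipy c = xi
        for T in (10, 50, 100):
            x_T = u + (scale / c) * ((lam * T) ** c - 1.0)
            print(f"{T:4d}-yr level: {x_T:7.1f}")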

  14. Extremely deformable structures

    CERN Document Server

    2015-01-01

    Recently, a new research stimulus has derived from the observation that soft structures, such as biological systems, but also rubber and gel, may work in a post critical regime, where elastic elements are subject to extreme deformations, though still exhibiting excellent mechanical performances. This is the realm of ‘extreme mechanics’, to which this book is addressed. The possibility of exploiting highly deformable structures opens new and unexpected technological possibilities. In particular, the challenge is the design of deformable and bi-stable mechanisms which can reach superior mechanical performances and can have a strong impact on several high-tech applications, including stretchable electronics, nanotube serpentines, deployable structures for aerospace engineering, cable deployment in the ocean, but also sensors and flexible actuators and vibration absorbers. Readers are introduced to a variety of interrelated topics involving the mechanics of extremely deformable structures, with emphasis on ...

  15. Weather and Climate Extremes.

    Science.gov (United States)

    1997-09-01

    Excerpts from a compilation of record weather and climate extremes: Antarctica's highest temperature (New Zealand Antarctic Society, 1974), which exceeded the record of 58°F (14.4°C) set on 20 October 1956 at Esperanza (also known as Bahia Esperanza, Hope Bay), a station in operation from 1945 through the early 1960s. World's greatest 24-hour rainfall: 72 in (182.5 cm) at Grand Ilet, La Réunion Island [21°00'S, 55°30'E], on 26 January 1980.

  16. Adventure and Extreme Sports.

    Science.gov (United States)

    Gomez, Andrew Thomas; Rao, Ashwin

    2016-03-01

    Adventure and extreme sports often involve unpredictable and inhospitable environments, high velocities, and stunts. These activities vary widely and include sports like BASE jumping, snowboarding, kayaking, and surfing. Increasing interest and participation in adventure and extreme sports warrants understanding by clinicians to facilitate prevention, identification, and treatment of injuries unique to each sport. This article covers alpine skiing and snowboarding, skateboarding, surfing, bungee jumping, BASE jumping, and whitewater sports with emphasis on epidemiology, demographics, general injury mechanisms, specific injuries, chronic injuries, fatality data, and prevention. Overall, most injuries are related to overuse, trauma, and environmental or microbial exposure.

  17. Extremal graph theory

    CERN Document Server

    Bollobas, Bela

    2004-01-01

    The ever-expanding field of extremal graph theory encompasses a diverse array of problem-solving methods, including applications to economics, computer science, and optimization theory. This volume, based on a series of lectures delivered to graduate students at the University of Cambridge, presents a concise yet comprehensive treatment of extremal graph theory.Unlike most graph theory treatises, this text features complete proofs for almost all of its results. Further insights into theory are provided by the numerous exercises of varying degrees of difficulty that accompany each chapter. A

  18. Ultracold Ordered Electron Beam

    Science.gov (United States)

    Habs, D.; Kramp, J.; Krause, P.; Matl, K.; Neumann, R.; Schwalm, D.

    1988-01-01

    We have started an experimental program to develop an ultracold electron beam, which can be used together with a standard electron cooling device in the Heidelberg Test Storage Ring TSR. In contrast to the standard-type design using electron beam extraction from a heated cathode, the ultracold beam is produced by photoemission of electrons from a cooled semiconductor crystal irradiated with an intense near-infrared laser light beam. Adiabatic acceleration is expected to provide ordering of the electron beam itself. Besides the cooling of ion beams to extremely low temperatures, with the aim of obtaining crystallization, the ultracold beam will constitute an excellent target for atomic physics experiments.

  19. Ultracold ordered electron beam

    Energy Technology Data Exchange (ETDEWEB)

    Habs, D.; Kramp, J.; Krause, P.; Matl, K.; Neumann, R.; Schwalm, D.

    1988-01-01

    We have started an experimental program to develop an ultracold electron beam, which can be used together with a standard electron cooling device in the Heidelberg Test Storage Ring TSR. In contrast to the standard-type design using electron beam extraction from a heated cathode, the ultracold beam is produced by photoemission of electrons from a cooled semiconductor crystal irradiated with an intense near-infrared laser light beam. Adiabatic acceleration is expected to provide ordering of the electron beam itself. Besides the cooling of ion beams to extremely low temperatures, with the aim of obtaining crystallization, the ultracold beam will constitute an excellent target for atomic physics experiments.

  20. Extreme storm surges: a comparative study of frequency analysis approaches

    Directory of Open Access Journals (Sweden)

    Y. Hamdi

    2013-11-01

    Full Text Available In France, nuclear facilities were designed for very low probabilities of failure. Nevertheless, exceptional climatic events have given rise to surges much larger than the observations (outliers) and have clearly illustrated the potential to underestimate extreme water levels calculated with the current statistical methods. The objective of the present work is to conduct a comparative study of three approaches: the Annual Maxima (AM), the Peaks-Over-Threshold (POT) and the r-Largest Order Statistics (r-LOS). These methods are illustrated in a real case study. All the data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of the time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on: (i) the degrees of uncertainty, (ii) the adequacy criteria and tests and (iii) visual inspection. It was found that the r-LOS and POT methods reduce the uncertainty on the distribution parameters and return level estimates, and systematically yield values of the 100 and 500 yr return levels smaller than those estimated with the AM method. Results also show that none of the compared methods allows a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information is proposed in order to increase the representativity of outliers in the data sets. Findings are of practical relevance not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.
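
    A small sketch of how the three samples compared above can be built from one daily surge series: annual maxima (AM), peaks over a threshold (POT, here without the declustering step the paper applies) and the r largest values per year (r-LOS). The data, u and r are illustrative.

        # Sketch: constructing AM, POT and r-LOS samples from a daily series.
        import numpy as np

        rng = np.random.default_rng(9)
        years, r = 30, 4
        surge = rng.gumbel(loc=0.5, scale=0.2, size=(years, 365))   # m, stand-in

        am   = surge.max(axis=1)                                # one value per year
        u    = np.quantile(surge, 0.995)
        pot  = surge[surge > u]                                 # all exceedances
        rlos = np.sort(surge, axis=1)[:, -r:]                   # r largest per year

        print(len(am), len(pot), rlos.shape)   # 30 values vs ~55 vs (30, 4)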

  1. Extremes in random fields a theory and its applications

    CERN Document Server

    Yakir, Benjamin

    2013-01-01

    Presents a useful new technique for analyzing the extreme-value behaviour of random fields Modern science typically involves the analysis of increasingly complex data. The extreme values that emerge in the statistical analysis of complex data are often of particular interest. This book focuses on the analytical approximations of the statistical significance of extreme values. Several relatively complex applications of the technique to problems that emerge in practical situations are presented.  All the examples are difficult to analyze using classical methods, and as a result, the author pr

  2. Extreme wave analysis in the space-time domain: from observations to applications

    Science.gov (United States)

    Barbariol, Francesco; Alves, Jose-Henrique; Benetazzo, Alvise; Bergamasco, Filippo; Carniel, Sandro; Chao, Yung Y.; Chawla, Arun; Ricchi, Antonio; Sclavo, Mauro

    2016-04-01

    The occurrence of extreme waves is one of the most dangerous marine hazards and one of the most challenging sea surface phenomena to understand. Many severe accidents and casualties at sea are ascribed to the occurrence of abnormally high waves. Despite significant efforts to investigate their occurrence, research has not yet provided exhaustive experimental and theoretical frameworks able to fully explain the development of extremely large waves (i.e. waves that are outliers with respect to standard wave statistics). Recently, relying on the stereo-photogrammetric instrumentation known as the "Wave Acquisition Stereo System", it was observed that the number of waves that can be labeled as "freak" increases significantly if the domain of observation is extended from time (i.e. the classical point time series) to space-time (i.e. a time sequence of sea surface snapshots covering an area). The empirical statistics of such extremely high waves gathered during a sea state over an area, outlying standard linear and nonlinear extreme value models, have been found in fair agreement with a statistical model accounting for the probability of a maximum crest height occurring in a space-time domain of given size. This model, developed by Fedele (2012) and extended to second-order nonlinear waves by Benetazzo et al. (2015), relies upon the Euler Characteristics approach of Adler and Taylor (2007), and upon the knowledge of kinematic and geometric properties of the sea state that can be obtained from the directional spectrum of the sea surface. Therefore, new efforts have been put into applying this approach to provide an interpretation of the occurrence of extreme crest heights in sea states, observed via stereo photography. Results have allowed the development of applications in ocean engineering and weather forecasting. In the former, the statistical model of Fedele has been used to investigate the role of metocean forcings on the space-time extremes of sea states. To

  3. A Statistical Model for the Prediction of Wind-Speed Probabilities in the Atmospheric Surface Layer

    Science.gov (United States)

    Efthimiou, G. C.; Hertwig, D.; Andronopoulos, S.; Bartzis, J. G.; Coceal, O.

    2016-11-01

    Wind fields in the atmospheric surface layer (ASL) are highly three-dimensional and characterized by strong spatial and temporal variability. For various applications such as wind-comfort assessments and structural design, an understanding of potentially hazardous wind extremes is important. Statistical models are designed to facilitate conclusions about the occurrence probability of wind speeds based on the knowledge of low-order flow statistics. Being particularly interested in the upper tail regions we show that the statistical behaviour of near-surface wind speeds is adequately represented by the Beta distribution. By using the properties of the Beta probability density function in combination with a model for estimating extreme values based on readily available turbulence statistics, it is demonstrated that this novel modelling approach reliably predicts the upper margins of encountered wind speeds. The model's basic parameter is derived from three substantially different calibrating datasets of flow in the ASL originating from boundary-layer wind-tunnel measurements and direct numerical simulation. Evaluating the model based on independent field observations of near-surface wind speeds shows a high level of agreement between the statistically modelled horizontal wind speeds and measurements. The results show that, based on knowledge of only a few simple flow statistics (mean wind speed, wind-speed fluctuations and integral time scales), the occurrence probability of velocity magnitudes at arbitrary flow locations in the ASL can be estimated with a high degree of confidence.
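
    A minimal sketch of the Beta-distribution step described above: normalise wind speeds into (0, 1) by an assumed upper bound, fit scipy.stats.beta with location and scale fixed, and read off a high quantile. The bound and the data are illustrative, not the paper's calibrated model.

        # Sketch: Beta fit to wind speeds mapped into (0, 1) by an assumed bound.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(13)
        u = rng.weibull(2.0, size=10000) * 4.0        # stand-in wind speeds, m/s
        u_max = 1.05 * u.max()                        # assumed upper bound
        z = u / u_max                                 # map into (0, 1)

        a, b, _, _ = stats.beta.fit(z, floc=0, fscale=1)
        u999 = u_max * stats.beta.ppf(0.999, a, b)
        print(f"Beta(a={a:.2f}, b={b:.2f});  99.9th percentile ~ {u999:.1f} m/s")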

  4. How historical information can improve extreme coastal water levels probability prediction: application to the Xynthia event at La Rochelle (France

    Directory of Open Access Journals (Sweden)

    T. Bulteau

    2014-11-01

    Full Text Available The knowledge of extreme coastal water levels is useful for coastal flooding studies and the design of coastal defences. When deriving such extremes with standard analyses using tide gauge measurements, one often has to deal with a limited effective duration of observation, which can result in large statistical uncertainties. This is even truer when one faces the issue of outliers, those particularly extreme values distant from the others, which increase the uncertainty on the results. In this study, we investigate how historical information, even partial, on past events reported in archives can reduce statistical uncertainties and put such outlying observations into perspective. A Bayesian Markov chain Monte Carlo method is developed to tackle this issue. We apply this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events, the analysis shows that: (1) integrating historical information in the analysis greatly reduces the statistical uncertainties on return levels; (2) Xynthia's water level no longer appears as an outlier; (3) we could have reasonably predicted the annual exceedance probability of that level beforehand (the predictive probability for 2010 based on data up to the end of 2009 is of the same order of magnitude as the standard estimated probability using data up to the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method for integrating heterogeneous data in such analyses.
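
    A toy Metropolis sketch of the idea of mixing a systematic record with partial historical information: here a Gumbel model for annual maxima, three known historical event levels, and the knowledge that no other year of a 150-year historical era exceeded a perception threshold z0. All numbers, the flat priors and the tuning constants are illustrative assumptions, not the paper's model.

        # Sketch: Metropolis sampling of Gumbel parameters, combining a
        # systematic record with censored historical information (assumptions).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(21)
        sys_max = stats.gumbel_r.rvs(loc=2.0, scale=0.4, size=30, random_state=rng)
        hist_levels = np.array([3.4, 3.1, 3.8])    # known historical event levels, m
        z0, n_hist = 3.0, 150                      # threshold and historical era length

        def log_post(loc, scale):
            if scale <= 0:
                return -np.inf
            lp  = stats.gumbel_r.logpdf(sys_max, loc, scale).sum()
            lp += stats.gumbel_r.logpdf(hist_levels, loc, scale).sum()
            # remaining historical years are known NOT to have exceeded z0
            lp += (n_hist - len(hist_levels)) * stats.gumbel_r.logcdf(z0, loc, scale)
            return lp                              # flat (improper) priors

        theta = np.array([2.0, 0.5])
        lp = log_post(*theta)
        samples = []
        for _ in range(8000):
            prop = theta + rng.normal(scale=[0.05, 0.02])
            lp_prop = log_post(*prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta.copy())
        post = np.array(samples[2000:])            # drop burn-in
        print("posterior mean (loc, scale):", post.mean(axis=0))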

  5. An atmospheric-to-marine synoptic classification for statistical downscaling marine climate

    Science.gov (United States)

    Camus, Paula; Rueda, Ana; Méndez, Fernando J.; Losada, Iñigo J.

    2016-12-01

    A regression-guided classification is implemented in statistical downscaling models based on weather types for downscaling multivariate wave climate and modelling extreme events. The semi-supervised method classifies the atmospheric circulation conditions (the predictor) together with the estimations from a regression model linking the circulation with local marine climate (the filtered predictand). A weighting factor controls the relative influence of predictor and predictand in the weather patterns, improving the ability of the classification to reflect local marine climate characteristics. In addition to the analysis of the variance explained by the predictor and the predictand, the selection of the optimal value of the weighting factor is based on the predictand response, in order to avoid subjectivity in the solution. The statistical models using the guided classification are applied in the North Atlantic. The new technique reduces the dispersion of the multivariate predictand within weather types and improves the model's skill in downscaling waves and reproducing extremes.
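
    A rough sketch of the weighted, regression-guided idea: cluster on the concatenation of the predictor with an alpha-weighted regression estimate of the predictand, so that the weather types also reflect local marine climate. The fields, the regression step, alpha and the number of types are illustrative assumptions.

        # Sketch: regression-guided weather typing via weighted k-means.
        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(17)
        n, p = 2000, 20
        slp = rng.normal(size=(n, p))                     # circulation predictor
        B = rng.normal(size=(p, 2))
        waves = slp @ B + rng.normal(scale=0.5, size=(n, 2))   # local Hs, Tm

        coef, *_ = np.linalg.lstsq(slp, waves, rcond=None)
        waves_hat = slp @ coef                            # regression guess

        alpha = 0.6                                       # predictand weight
        features = np.hstack([(1 - alpha) * slp, alpha * waves_hat])
        centroids, labels = kmeans2(features, k=10, seed=3, minit="++")
        print("weather-type sizes:", np.bincount(labels, minlength=10))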

  6. Plastic Surgery Statistics

    Science.gov (United States)

    Plastic surgery procedural statistics from the American Society of Plastic Surgeons, published by year.

  7. MQSA National Statistics

    Science.gov (United States)

    National scorecard statistics from the Mammography Quality Standards Act (MQSA) program, published by year (2015-2017 and archived editions).

  8. Extremity perfusion for sarcoma

    NARCIS (Netherlands)

    Hoekstra, Harald Joan

    2008-01-01

    For more than 50 years, the technique of extremity perfusion has been explored in the limb salvage treatment of local, recurrent, and multifocal sarcomas. The "discovery" of tumor necrosis factor-α in combination with melphalan was a real breakthrough in the treatment of primarily irresectable extremity sarcomas.

  9. Hydrological extremes and security

    Science.gov (United States)

    Kundzewicz, Z. W.; Matczak, P.

    2015-04-01

    Economic losses caused by hydrological extremes - floods and droughts - have been on the rise. Hydrological extremes jeopardize human security and affect societal livelihood and welfare. Security can be generally understood as freedom from threat and the ability of societies to maintain their independent identity and their functional integrity against forces of change. Several dimensions of security are reviewed in the context of hydrological extremes. The traditional interpretation of security, focused on state military capabilities, has been replaced by a wider understanding that includes economic, societal and environmental aspects, which receive increasing attention. Floods and droughts pose a burden and serious challenges to the state, which is responsible for sustaining economic development as well as societal and environmental security. The latter can be regarded as the maintenance of ecosystem services on which a society depends. An important part of it is water security, which can be defined as the availability of an adequate quantity and quality of water for health, livelihoods, ecosystems and production, coupled with an acceptable level of water-related risks to people, environments and economies. Security concerns arise because, over large areas, hydrological extremes - floods and droughts - are becoming more frequent and more severe. In terms of dealing with water-related risks, climate change can increase uncertainties, which makes the state's task of delivering security more difficult and more expensive. At the same time, changes in population size, development and level of protection drive exposure to hydrological hazards.

  10. Acute lower extremity ischaemia

    African Journals Online (AJOL)

    tend to impact at arterial bifurcations, the commonest site being the ... Other ominous signs of advanced ischaemia include bluish ... Recommended standards for lower extremity ischaemia*. Doppler signals ... of the embolectomy procedure. An ... in a cath-lab or angio-suite under local ... We serially measure the aPTT and.

  12. de Sitter Extremal Surfaces

    CERN Document Server

    Narayan, K

    2015-01-01

    We study extremal surfaces in de Sitter space in the Poincaré slicing in the upper patch, anchored on spatial subregions at the future boundary $\mathcal{I}^+$, restricted to constant boundary Euclidean time slices (focussing on strip subregions). We find real extremal surfaces of minimal area as the boundaries of past lightcone wedges of the subregions in question: these are null surfaces with vanishing area. We also find complex extremal surfaces as complex extrema of the area functional, and the area is not always real-valued. In $dS_4$ the area is real and has some structural resemblance with entanglement entropy in a dual $CFT_3$. There are parallels with analytic continuation from the Ryu-Takayanagi expressions for holographic entanglement entropy in $AdS$. We also discuss extremal surfaces in the $dS$ black brane and the de Sitter "bluewall" studied previously. The $dS_4$ black brane complex surfaces exhibit a real, finite, cutoff-independent extensive piece. In the bluewall geometry, there are real surface...

  13. Moving in extreme environments

    DEFF Research Database (Denmark)

    Lucas, Samuel J E; Helge, Jørn W; Schütz, Uwe H W

    2016-01-01

    and transcontinental races) and expeditions (e.g. polar crossings), to the more gravitationally limited load carriage (e.g. in the military context). Juxtaposed to these circumstances is the extreme metabolic and mechanical unloading associated with space travel, prolonged bedrest and sedentary lifestyle, which may...

  14. Extreme gust wind estimation using mesoscale modeling

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kruger, Andries

    2014-01-01

    Currently, the existing estimation of the extreme gust wind, e.g. the 50-year winds of 3 s values, in the IEC standard is based on a statistical model to convert the 1:50-year wind values from the 10 min resolution. This statistical model assumes a Gaussian process that satisfies the classical ... through turbulent eddies. This process is modeled using the mesoscale Weather Research and Forecasting (WRF) model. The gust at the surface is calculated as the largest winds over a layer where the averaged turbulence kinetic energy is greater than the averaged buoyancy force. The experiments have been done for Denmark and two areas in South Africa. For South Africa, the extreme gust atlases were created from the output of the mesoscale modelling using Climate Forecasting System Reanalysis (CFSR) forcing for the period 1998-2010. The extensive measurements including turbulence ...
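
    The gust diagnostic described above reduces to a simple rule on model-level profiles. A minimal sketch follows (profile values are made up; the variable names are not WRF's):

        import numpy as np

        def surface_gust(wind_speed, tke_mean, buoyancy, sfc_wind):
            """Largest wind over the layer where mean TKE exceeds the buoyancy force."""
            mixing = tke_mean > buoyancy        # levels whose eddies can be brought down
            if not np.any(mixing):
                return sfc_wind                 # no deflection: gust equals the surface wind
            return max(sfc_wind, float(np.max(wind_speed[mixing])))

        # Illustrative model-level profile
        ws  = np.array([8.0, 10.5, 12.0, 14.0])   # wind speed per level (m/s)
        tke = np.array([1.2, 0.9, 0.5, 0.2])      # layer-averaged TKE (m^2/s^2)
        buo = np.array([0.3, 0.4, 0.6, 0.9])      # layer-averaged buoyancy term (m^2/s^2)
        print(surface_gust(ws, tke, buo, sfc_wind=8.0))   # -> 10.5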

  15. Nursing Aides, Orderlies, and Attendants

    Science.gov (United States)

    Employment projections for nursing assistants and orderlies (2014-24 change, by industry, percent and numeric), from the U.S. Bureau of Labor Statistics Employment Projections program.

  16. Effects of extreme spring temperatures on phenology: a case study from Munich and Ingolstadt

    Science.gov (United States)

    Jochner, Susanne; Menzel, Annette

    2010-05-01

    Extreme events - e.g. warm spells or heavy precipitation events - are likely to increase in the future both in frequency and intensity. Research on extreme events therefore gains new importance, also for plant development, which is largely triggered by temperature. One arising question is how plants respond to an extreme warm spell that follows an extremely cold winter season. This situation could be studied in spring 2009 in the greater area of Munich and Ingolstadt by phenological observations of flowering and leaf unfolding of birch (Betula pendula L.) and flowering of horse chestnut (Aesculus hippocastanum L.). The long chilling period of winter 2008 and spring 2009 was followed by an immediate strong forcing of flowering and leaf unfolding, especially for birch. This extreme weather situation diminished the difference between urban and rural onset dates. Another important observation in the preceding period of December 2008 to April 2009 was the reduced temperature difference between urban and rural sites (urban heat island effect). Long-term observations (1951-2008) of the phenological network of the German Meteorological Service (DWD) were used to identify years with reduced urban-rural differences in onset times in the greater area of Munich in the past. Statistical analyses were conducted to answer the question of whether the sequence of extreme warm and cold events leads to a decreased difference in phenological onset times, or whether this behaviour can be attributed to extreme warm springs themselves or to a decreased urban heat island effect, which is mostly governed by general atmospheric circulation patterns.

  17. Extreme wave and wind response predictions

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher; Olsen, Anders S.; Mansour, Alaa E.

    2011-01-01

    The aim of the paper is to advocate effective stochastic procedures, based on the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), for extreme value predictions related to wave and wind-induced loads. Due to the efficient optimization procedures implemented in standard FORM ...
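
    For context, a minimal FORM sketch (with an illustrative limit state, not the paper's load models): the design point is the point on the limit-state surface g(u) = 0 nearest the origin in standard-normal space, and the exceedance probability is approximated by Phi(-beta):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            # Illustrative limit state: failure when a response built from two
            # standard-normal variables exceeds a threshold of 7.0
            return 7.0 - (3.0 + 1.5 * u[0] + 0.8 * u[1] + 0.1 * u[0] * u[1])

        # Design point: nearest point to the origin on the surface g(u) = 0
        res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})
        beta_idx = np.sqrt(res.fun)              # reliability index
        print(f"beta = {beta_idx:.3f}, P(failure) ~ {norm.cdf(-beta_idx):.2e}")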

  18. Mediatized Extreme Right Activism and Discourse

    DEFF Research Database (Denmark)

    Peters, Rikke Alberg

    2015-01-01

    and mediatized activism. In order to analyse the protest form, the visual aesthetics and the discourse of 'The Immortals', the paper mobilises two concepts from media and communication studies: mediation and mediatization. It will be argued that the current transformation of the extreme right: that is...

  19. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles"), are derived by the process, but the return of that can only be a Scene Variable, not an Object Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90-percentile of the Hue of each object's pixels as an item in the shape attribute table. This procedure uses a sub-level single-pixel chessboard segmentation, loops over each of the objects of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...
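
    Outside the eCognition rule set, the same per-object statistic is a few lines of array code; a sketch with synthetic hue and label images:

        import numpy as np

        rng = np.random.default_rng(0)
        hue = rng.uniform(0.0, 360.0, size=(100, 100))    # per-pixel hue values
        labels = rng.integers(1, 6, size=(100, 100))      # object id of each pixel

        # 90th percentile of hue, object by object
        p90 = {obj: np.percentile(hue[labels == obj], 90) for obj in np.unique(labels)}
        print(p90)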

  20. Birth Order and Psychopathology

    Directory of Open Access Journals (Sweden)

    Ajay Risal

    2012-01-01

    Context: Ordinal position the child holds within the sibling ranking of a family is related to intellectual functioning, personality, behavior, and development of psychopathology. Aim: To study the association between birth order and the development of psychopathology in patients attending psychiatry services in a teaching hospital. Settings and Design: Hospital-based cross-sectional study. Materials and Methods: A retrospective file review of three groups of patients was carried out. Patient-related variables such as age of onset, birth order, family type, and family history of mental illness were compared with the psychiatric diagnosis (ICD-10) generated. Statistical Analysis: SPSS 13; descriptive statistics and one-way analysis of variance (ANOVA) were used. Results: The mean age of onset of mental illness among the adult general psychiatry patients (group I, n = 527) was found to be 33.01 ± 15.073, while it was 11.68 ± 4.764 among the child cases (group II, n = 47) and 26.74 ± 7.529 among substance abuse cases (group III, n = 110). Among group I patients, the commonest diagnosis was depression, followed by anxiety and somatoform disorders, irrespective of birth order. Dissociative disorders were most prevalent in the first-born child (36.7%) among group II patients. Among group III patients, alcohol dependence was the commonest diagnosis in all birth orders. Conclusions: Depression and alcohol dependence were the commonest diagnoses in the adult groups irrespective of birth order.
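
    For illustration, the reported group comparison corresponds to a one-way ANOVA such as the following sketch (synthetic draws matching the reported means and SDs, not the study data):

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)
        adult     = rng.normal(33.01, 15.073, size=527)   # group I: adult general psychiatry
        child     = rng.normal(11.68, 4.764, size=47)     # group II: child cases
        substance = rng.normal(26.74, 7.529, size=110)    # group III: substance abuse cases

        F, p = f_oneway(adult, child, substance)
        print(f"F = {F:.1f}, p = {p:.3g}")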