WorldWideScience

Sample records for bivariate normal distribution

  1. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the
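
    To make the quantity being approximated concrete, the bivariate normal CDF with a large correlation coefficient can be evaluated numerically, for example with SciPy's standard multivariate normal routines. This is a minimal sketch of the target quantity, not the authors' closed-form approximation:

        # Evaluate the standard bivariate normal CDF P(X <= h, Y <= k) for large rho.
        from scipy.stats import multivariate_normal

        def bvn_cdf(h, k, rho):
            """P(X <= h, Y <= k) for a standard bivariate normal with correlation rho."""
            cov = [[1.0, rho], [rho, 1.0]]
            return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([h, k])

        if __name__ == "__main__":
            for rho in (0.9, 0.99, 0.999):
                print(rho, bvn_cdf(1.0, 0.5, rho))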

  2. On bivariate geometric distribution

    Directory of Open Access Journals (Sweden)

    K. Jayakumar

    2013-05-01

    Characterizations of the bivariate geometric distribution using univariate and bivariate geometric compounding are obtained. Autoregressive models with bivariate geometric marginal distributions are developed. Various bivariate geometric distributions analogous to important bivariate exponential distributions, such as Marshall-Olkin's bivariate exponential, Downton's bivariate exponential and Hawkes' bivariate exponential, are presented.

  3. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  4. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
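
    The standard construction behind such generators is easy to sketch: draw two independent standard normals and mix them to impose the desired correlation. A minimal NumPy version of the general technique (not the paper's FORTRAN routine):

        import numpy as np

        def bivariate_normal_pairs(n, mu1, mu2, sd1, sd2, rho, rng=None):
            """Generate n pairs from a bivariate normal with given means, SDs, rho."""
            rng = np.random.default_rng() if rng is None else rng
            z1 = rng.standard_normal(n)
            z2 = rng.standard_normal(n)
            x = mu1 + sd1 * z1
            # Mixing z1 with an independent z2 yields correlation rho with x.
            y = mu2 + sd2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
            return x, y

        x, y = bivariate_normal_pairs(100_000, 1.0, -2.0, 2.0, 0.5, 0.8)
        print(np.corrcoef(x, y)[0, 1])  # should be close to 0.8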

  5. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Office of Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGVs) are used to control methane emissions in longwall mines by capturing the gas within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to the top of the coal bed should be designed based on the potential gas sources and their locations, as well as on the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from the coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
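
    The conditional (tail) probabilities used in such an approach follow from the textbook conditioning formula for the bivariate normal: given X = x, Y is normal with mean muY + rho*sdY*(x - muX)/sdX and SD sdY*sqrt(1 - rho^2). A hedged sketch with illustrative, hypothetical values (not the NIOSH data):

        from math import sqrt
        from scipy.stats import norm

        def conditional_tail_prob(x0, y0, mu_x, mu_y, sd_x, sd_y, rho):
            """P(Y > y0 | X = x0) under a bivariate normal model."""
            cond_mean = mu_y + rho * sd_y * (x0 - mu_x) / sd_x
            cond_sd = sd_y * sqrt(1.0 - rho**2)
            return norm.sf(y0, loc=cond_mean, scale=cond_sd)

        # Hypothetical example: probability that displacement exceeds 0.5 given
        # a depth of 100 (numbers and units are illustrative only).
        print(conditional_tail_prob(100.0, 0.5, mu_x=80.0, mu_y=0.4,
                                    sd_x=30.0, sd_y=0.2, rho=-0.6))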

  6. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    An expression for the stress-strength reliability R = P(X1 < X2) is obtained when (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and of the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  7. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, the type I error of variance-components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem for univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  8. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave lengths, wave-induced pitch, and wave and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the "significant waves" (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which its products must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of ships. Nevertheless, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the "random walk" frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with Rayleigh marginal distribution functions and discuss its fundamental properties.
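
    The derivation mentioned last is easy to check by simulation: if X and Y are independent N(0, sigma^2), then sqrt(X^2 + Y^2) is Rayleigh(sigma). A short sketch:

        import numpy as np
        from scipy.stats import rayleigh, kstest

        rng = np.random.default_rng(0)
        sigma = 2.0
        x = rng.normal(0.0, sigma, 200_000)
        y = rng.normal(0.0, sigma, 200_000)
        r = np.hypot(x, y)  # sqrt(x**2 + y**2)

        # Kolmogorov-Smirnov test against Rayleigh(scale=sigma): a large p-value
        # is consistent with the Rayleigh derivation from two independent normals.
        print(kstest(r, rayleigh(scale=sigma).cdf))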

  9. A comparative study of the Bivariate Marginal Distribution Algorithm and the Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

    The Bivariate Marginal Distribution Algorithm (BMDA) is an extension of the Estimation of Distribution Algorithm. This heuristic introduces a new recombination approach for generating new individuals that does not use the crossover and mutation operators of a genetic algorithm: it exploits pairwise dependencies between variables, discovered during the optimization process, to construct new individuals. In this research, the performance of a genetic algorithm with one-point crossover is compared with that of BMDA on the Onemax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For small Onemax instances, the genetic algorithm performs better, reaching the optimum faster and in fewer iterations; for large Onemax instances, BMDA yields better optimization results. On the De Jong F2 function, the genetic algorithm outperforms BMDA in both iteration count and time. On the Traveling Salesman Problem, BMDA yields better optimization results than the genetic algorithm.
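
    For concreteness, the genetic-algorithm baseline used in such comparisons can be sketched in a few lines; this is a generic one-point-crossover GA on Onemax (counting ones in a bit string), not the authors' implementation:

        import numpy as np

        rng = np.random.default_rng(1)
        N_BITS, POP, GENS, MUT = 50, 60, 200, 0.01

        def onemax(pop):
            return pop.sum(axis=1)  # fitness = number of ones

        pop = rng.integers(0, 2, size=(POP, N_BITS))
        for gen in range(GENS):
            fit = onemax(pop)
            if fit.max() == N_BITS:
                break
            # Tournament selection of parent indices.
            a, b = rng.integers(0, POP, (2, POP))
            parents = pop[np.where(fit[a] >= fit[b], a, b)]
            # One-point crossover between consecutive parent pairs.
            cut = rng.integers(1, N_BITS, POP // 2)
            children = parents.copy()
            for i, c in enumerate(cut):
                children[2*i, c:] = parents[2*i + 1, c:]
                children[2*i + 1, c:] = parents[2*i, c:]
            # Bit-flip mutation.
            flip = rng.random(children.shape) < MUT
            pop = np.where(flip, 1 - children, children)
        print("best fitness:", onemax(pop).max(), "after", gen + 1, "generations")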

  10. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    In the area of stress-strength models there has been a large amount of work on estimation of the reliability R = Pr(X < Y), whose algebraic form has been worked out for the majority of well-known distributions when X and Y are independent random variables belonging to the same univariate family. In this paper, we consider forms of R when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.
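
    While such papers derive explicit expressions via special functions, the quantity R = Pr(X < Y) under dependence is easy to approximate by Monte Carlo. The sketch below uses a Gaussian copula with beta marginals as one convenient way to induce dependence; this is an illustration, not the bivariate beta family studied in the paper:

        import numpy as np
        from scipy.stats import norm, beta

        def reliability_mc(n, a1, b1, a2, b2, rho, seed=0):
            """Monte Carlo estimate of R = Pr(X < Y) for dependent beta marginals,
            with dependence induced by a Gaussian copula with correlation rho."""
            rng = np.random.default_rng(seed)
            z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
            u = norm.cdf(z)                    # copula sample in [0,1]^2
            x = beta.ppf(u[:, 0], a1, b1)      # stress
            y = beta.ppf(u[:, 1], a2, b2)      # strength
            return np.mean(x < y)

        print(reliability_mc(200_000, a1=2, b1=5, a2=5, b2=2, rho=0.5))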

  11. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    In the area of stress-strength models, there has been a large amount of work on estimation of the reliability R = Pr(X < Y), whose algebraic form has been worked out for the majority of well-known distributions when X and Y are independent random variables belonging to the same univariate family. In this paper, we consider forms of R when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.

  12. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970.

  13. Two new bivariate zero-inflated generalized Poisson distributions with a flexible correlation structure

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2015-05-01

    To model correlated bivariate count data with extra zero observations, this paper proposes two new bivariate zero-inflated generalized Poisson (ZIGP) distributions by incorporating a multiplicative factor (or dependency parameter) λ, named Type I and Type II bivariate ZIGP distributions, respectively. The proposed distributions possess a flexible correlation structure and can be used to fit either positively or negatively correlated and either over- or under-dispersed count data, in contrast to existing models that can only fit positively correlated count data with over-dispersion. The two marginal distributions of the Type I bivariate ZIGP share a common zero-inflation parameter, while the two marginal distributions of the Type II bivariate ZIGP have their own zero-inflation parameters, resulting in a much wider range of applications. The important distributional properties are explored and some useful statistical inference methods, including maximum likelihood estimation of parameters, standard error estimation, bootstrap confidence intervals and related hypothesis tests, are developed for the two distributions. A real data set is thoroughly analyzed using the proposed distributions and statistical methods. Several simulation studies are conducted to evaluate the performance of the proposed methods.

  14. Comparison between two bivariate Poisson distributions through the ...

    African Journals Online (AJOL)

    These two models express themselves by their probability mass function. ... To remedy this problem, Berkhout and Plug proposed a bivariate Poisson distribution that accepts correlations which may be negative, zero, or positive.

  15. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)

  16. Ventilation-perfusion distribution in normal subjects.

    Science.gov (United States)

    Beck, Kenneth C; Johnson, Bruce D; Olson, Thomas P; Wilson, Theodore A

    2012-09-01

    Functional values of LogSD of the ventilation distribution (σ(V)) have been reported previously, but functional values of LogSD of the perfusion distribution (σ(q)) and the coefficient of correlation between ventilation and perfusion (ρ) have not been measured in humans. Here, we report values for σ(V), σ(q), and ρ obtained from wash-in data for three gases, helium and two soluble gases, acetylene and dimethyl ether. Normal subjects inspired gas containing the test gases, and the concentrations of the gases at end-expiration during the first 10 breaths were measured with the subjects at rest and at increasing levels of exercise. The regional distribution of ventilation and perfusion was described by a bivariate log-normal distribution with parameters σ(V), σ(q), and ρ, and these parameters were evaluated by matching the values of expired gas concentrations calculated for this distribution to the measured values. Values of cardiac output and LogSD ventilation/perfusion (Va/Q) were obtained. At rest, σ(q) is high (1.08 ± 0.12). With the onset of ventilation, σ(q) decreases to 0.85 ± 0.09 but remains higher than σ(V) (0.43 ± 0.09) at all exercise levels. Rho increases to 0.87 ± 0.07, and the value of LogSD Va/Q for light and moderate exercise is primarily the result of the difference between the magnitudes of σ(q) and σ(V). With known values for the parameters, the bivariate distribution describes the comprehensive distribution of ventilation and perfusion that underlies the distribution of the Va/Q ratio.
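
    Under the bivariate log-normal description, the LogSD of the Va/Q ratio follows directly from the reported parameters, since log(V/Q) = log V - log Q has standard deviation sqrt(sigma_v^2 + sigma_q^2 - 2*rho*sigma_v*sigma_q); the identity holds whether logs are natural or base-10. A quick numerical check using the exercise values reported above:

        import numpy as np

        def logsd_va_q(sigma_v, sigma_q, rho):
            """SD of log(V/Q) when (log V, log Q) is bivariate normal."""
            return np.sqrt(sigma_v**2 + sigma_q**2 - 2 * rho * sigma_v * sigma_q)

        # Values reported for exercise: sigma_v = 0.43, sigma_q = 0.85, rho = 0.87.
        print(logsd_va_q(0.43, 0.85, 0.87))   # analytic LogSD Va/Q

        # The same quantity by sampling the bivariate log-normal V/Q distribution:
        rng = np.random.default_rng(0)
        cov = 0.87 * 0.43 * 0.85
        s = rng.multivariate_normal([0, 0], [[0.43**2, cov], [cov, 0.85**2]], 500_000)
        print(np.std(s[:, 0] - s[:, 1]))      # should agree with the analytic value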

  17. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    In this paper we extend the concept of Value-at-risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset that take into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (β-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided using Italian stock market data.

  18. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    High particulate matter (PM10) levels are a prominent issue, causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation between extreme PM10 data from two nearby air quality monitoring stations. The series of daily PM10 maxima for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike Information Criterion and quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.
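
    The marginal step of such an analysis - choosing a quantile threshold and fitting a generalized Pareto distribution to the exceedances - can be sketched with SciPy; synthetic data stand in for the PM10 series:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        pm10 = rng.lognormal(mean=3.5, sigma=0.4, size=3650)  # synthetic daily maxima

        u = np.quantile(pm10, 0.95)          # 95% marginal quantile as threshold
        exceedances = pm10[pm10 > u] - u     # peaks over threshold

        # Fit the GPD to the exceedances; location fixed at 0 by construction.
        shape, loc, scale = genpareto.fit(exceedances, floc=0)
        print(f"threshold={u:.1f}, shape={shape:.3f}, scale={scale:.3f}")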

  19. Selection effects in the bivariate brightness distribution for spiral galaxies

    International Nuclear Information System (INIS)

    Phillipps, S.; Disney, M.

    1986-01-01

    The joint distribution of total luminosity and characteristic surface brightness (the bivariate brightness distribution) is investigated for a complete sample of spiral galaxies in the Virgo cluster. The influence of selection and physical limits of various kinds on the apparent distribution are detailed. While the distribution of surface brightness for bright galaxies may be genuinely fairly narrow, faint galaxies exist right across the (quite small) range of accessible surface brightnesses so no statement can be made about the true extent of the distribution. The lack of high surface brightness bright galaxies in the Virgo sample relative to an overall RC2 sample (mostly field galaxies) supports the contention that the star-formation rate is reduced in the inner region of the cluster for environmental reasons. (author)

  20. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide the total classification accuracy of 86.67% and the area (Az) of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
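
    The decision rule described - kernel density estimates for each class combined with a maximal posterior probability decision - can be sketched with scipy.stats.gaussian_kde on synthetic two-feature data (illustrative only; not the VAG features of the paper):

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Synthetic bivariate features for "normal" and "abnormal" groups.
        normal = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], 100).T
        abnormal = rng.multivariate_normal([1.5, 1.2], [[1, -0.2], [-0.2, 1]], 50).T

        kde_n, kde_a = gaussian_kde(normal), gaussian_kde(abnormal)
        prior_n, prior_a = 100 / 150, 50 / 150   # empirical class priors

        def classify(x):
            """Maximal posterior probability decision between the two classes."""
            post_n = prior_n * kde_n(x)
            post_a = prior_a * kde_a(x)
            return np.where(post_a > post_n, "abnormal", "normal")

        print(classify(np.array([[0.2], [0.1]])), classify(np.array([[1.4], [1.3]])))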

  1. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use the concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation coefficient (including negative values). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is potentially important for the development of hypothesis testing in linear models. Thirdly, it is very easy to simulate...

  2. On minimum divergence adaptation of discrete bivariate distributions to given marginals

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; van der Meulen, E. C.

    2005-01-01

    Vol. 51, No. 1 (2005), pp. 313-320. ISSN 0018-9448. R&D Projects: GA ČR GA201/02/1391; GA MŠk 1M0572. Institutional research plan: CEZ:AV0Z10750506. Keywords: approximation of contingency tables; bivariate discrete distributions; minimization of divergences. Subject RIV: BD - Theory of Information. Impact factor: 2.183, year: 2005

  3. Carbon and oxygen isotopic ratio bi-variate distribution for marble artifacts quarry assignment

    International Nuclear Information System (INIS)

    Pentia, M.

    1995-01-01

    Statistical description, by a Gaussian bi-variate probability distribution, of the 13C/12C and 18O/16O isotopic ratios in the ancient marble quarries has been done, and a new method for obtaining the confidence-level quarry assignment for marble artifacts has been presented. (author) 8 figs., 3 tabs., 4 refs.
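
    The confidence-level assignment for a Gaussian bi-variate model reduces to a Mahalanobis-distance test: a sample falls inside the p-level ellipse of a quarry's distribution if and only if its squared distance is below the corresponding chi-squared quantile with 2 degrees of freedom. A hedged sketch with made-up isotope numbers (not the paper's quarry data):

        import numpy as np
        from scipy.stats import chi2

        def inside_ellipse(point, mean, cov, level=0.95):
            """True if point lies inside the `level` probability ellipse of a
            bivariate Gaussian with the given mean and covariance."""
            d = point - mean
            m2 = d @ np.linalg.solve(cov, d)   # squared Mahalanobis distance
            return m2 <= chi2.ppf(level, df=2)

        # Hypothetical quarry statistics for (delta13C, delta18O) - illustrative only.
        quarry_mean = np.array([2.1, -1.8])
        quarry_cov = np.array([[0.10, 0.03], [0.03, 0.20]])
        artifact = np.array([2.3, -1.5])
        print(inside_ellipse(artifact, quarry_mean, quarry_cov))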

  4. Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2017-11-01

    A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie-Gumbel-Morgenstern) bivariate copulas for constructing several different bivariate Kumaraswamy-type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman's correlation coefficient ρ and Kendall's τ.
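
    Sampling from an FGM copula is simple because the conditional distribution inverts in closed form. The sketch below implements the standard conditional-inversion algorithm for the ordinary FGM copula C(u,v) = uv[1 + theta(1-u)(1-v)] - the unmodified baseline, not the paper's modified class:

        import numpy as np

        def sample_fgm(n, theta, rng=None):
            """Sample (U, V) from the FGM copula C(u,v) = uv(1 + theta(1-u)(1-v)),
            |theta| <= 1, by conditional inversion."""
            rng = np.random.default_rng() if rng is None else rng
            u = rng.random(n)
            w = rng.random(n)                  # target conditional CDF value
            a = 1.0 + theta * (1.0 - 2.0 * u)  # conditional CDF: w = a*v - (a-1)*v^2
            b = np.sqrt(a**2 - 4.0 * (a - 1.0) * w)
            v = 2.0 * w / (a + b)              # root of the quadratic, lies in [0, 1]
            return u, v

        u, v = sample_fgm(100_000, theta=0.8, rng=np.random.default_rng(0))
        print(np.corrcoef(u, v)[0, 1])  # about theta/3; FGM gives weak dependence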

  5. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range under the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.

  6. An Affine Invariant Bivariate Version of the Sign Test.

    Science.gov (United States)

    1987-06-01

    Key words: affine invariance, bivariate quantile, bivariate symmetry model, generalized median, influence function, permutation test, normal efficiency... calculate a bivariate version of the influence function, and the resulting form is bounded, as is the case for the univariate sign test, and shows the... terms of a bivariate analogue of Hampel's (1974) influence function. The latter, though usually defined as a von Mises derivative of certain

  7. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.

  8. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying; Hering, Amanda S.; Browning, Joshua M.

    2017-01-01

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.
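
    As a baseline for comparison, the distance-based flagging idea is easy to sketch under a bivariate normal fit - the comparison method mentioned in the abstract, not the skew-t itself, which needs a specialized fitting routine:

        import numpy as np
        from scipy.stats import chi2

        def flag_outliers_bvn(uv, level=0.999):
            """Flag bivariate observations (n x 2 array, e.g. wind components)
            whose Mahalanobis distance exceeds the chi-squared `level` quantile
            under a fitted bivariate normal. A non-robust baseline sketch."""
            mean = uv.mean(axis=0)
            cov = np.cov(uv, rowvar=False)
            d = uv - mean
            m2 = np.einsum("ij,ij->i", d @ np.linalg.inv(cov), d)
            return m2 > chi2.ppf(level, df=2)

        rng = np.random.default_rng(0)
        winds = rng.multivariate_normal([5, 1], [[9, 3], [3, 4]], 10_000)
        winds[:5] += 40                        # inject a few gross errors
        print(flag_outliers_bvn(winds)[:10])   # the first five should be flagged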

  9. Bivariate copula in fitting rainfall data

    Science.gov (United States)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The usage of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is more suitable than standard bivariate modelling, as copulas are believed to overcome some of its limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations. The copula models are Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study are selected from rain gauge stations located in the southern part of Peninsular Malaysia, covering the period from 1980 to 2011. The goodness-of-fit test in this study is based on the Akaike information criterion (AIC).

  10. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    Science.gov (United States)

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse but frequently perform better than the standard model. We use an example from a meta-analysis to judge the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer for illustration. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  12. Bivariate-t distribution for transition matrix elements in Breit-Wigner to Gaussian domains of interacting particle systems.

    Science.gov (United States)

    Kota, V K B; Chavda, N D; Sahu, R

    2006-04-01

    Interacting many-particle systems with a mean-field one-body part plus a chaos-generating random two-body interaction of strength λ exhibit Poisson to Gaussian orthogonal ensemble and Breit-Wigner (BW) to Gaussian transitions in level fluctuations and strength functions, with transition points marked by λ = λ_c and λ = λ_F, respectively (λ_F > λ_c). For these systems a theory for the matrix elements of one-body transition operators is available, valid in the Gaussian domain with λ > λ_F, in terms of orbital occupation numbers, level densities, and an integral involving a bivariate Gaussian in the initial and final energies. Here we show that, using a bivariate-t distribution, the theory extends below the Gaussian regime to the BW regime, down to λ = λ_c. This is well tested in numerical calculations for 6 spinless fermions in 12 single-particle states.

  13. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  14. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    A bivariate copula approach combined with frequency analysis is used to derive depth-duration-frequency (DDF) curves for extreme rainfall at a station situated near Copenhagen in Denmark. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration; (3) ... For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. For rainfall extracted using method 3, the volume was fit with a generalized Pareto distribution and the duration with a Pearson type III distribution. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3, and DDF curves relating depth to duration for a given return period were derived using the Clayton copula...
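
    A Clayton copula like the one selected here can be sampled by conditional inversion, after which the fitted marginals (e.g. generalized Pareto for depth, gamma for duration) are imposed through their quantile functions. A sketch with illustrative parameter values, not the fitted Copenhagen-area parameters:

        import numpy as np
        from scipy.stats import genpareto, gamma

        def sample_clayton(n, theta, rng=None):
            """Sample (U, V) from a Clayton copula (theta > 0) by conditional inversion."""
            rng = np.random.default_rng() if rng is None else rng
            u = rng.random(n)
            w = rng.random(n)
            v = (u**(-theta) * (w**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)
            return u, v

        rng = np.random.default_rng(0)
        u, v = sample_clayton(50_000, theta=1.5, rng=rng)
        # Impose illustrative marginals: GPD for depth (mm), gamma for duration (h).
        depth = genpareto.ppf(u, c=0.1, loc=5.0, scale=4.0)
        duration = gamma.ppf(v, a=2.0, scale=3.0)
        print(np.corrcoef(depth, duration)[0, 1])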

  15. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such a function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.
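
    The key modelling step - local Gaussians whose (μ, σ) are themselves bivariate-Gaussian distributed - can be mimicked directly by two-stage sampling. The sketch below draws pairwise velocities this way with purely illustrative hyperparameters:

        import numpy as np

        def sample_pairwise_velocities(n, mean, cov, rng=None):
            """Two-stage draw: (mu, sigma) ~ bivariate Gaussian, then v ~ N(mu, sigma).
            Illustrative hyperparameters; sigma draws are clipped to stay positive."""
            rng = np.random.default_rng() if rng is None else rng
            mu, sigma = rng.multivariate_normal(mean, cov, size=n).T
            sigma = np.clip(sigma, 1e-3, None)   # enforce positive dispersions
            return rng.normal(mu, sigma)

        rng = np.random.default_rng(0)
        v = sample_pairwise_velocities(200_000, mean=[-1.0, 3.0],
                                       cov=[[1.0, 0.5], [0.5, 0.8]], rng=rng)
        # The superposition is generally skewed/heavy-tailed, unlike a single Gaussian.
        print(v.mean(), v.std())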

  16. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z ~ 4-5

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Kuang-Han; Su, Jian [Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Ferguson, Henry C. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Ravindranath, Swara, E-mail: kuanghan@pha.jhu.edu [The Inter-University Center for Astronomy and Astrophysics, Pune University Campus, Pune 411007, Maharashtra (India)

    2013-03-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in the GOODS and HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution into the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to those found for local disk galaxies, but considerably shallower than for local early-type galaxies.

  17. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z ∼ 4-5

    International Nuclear Information System (INIS)

    Huang, Kuang-Han; Su, Jian; Ferguson, Henry C.; Ravindranath, Swara

    2013-01-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in the GOODS and HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution into the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to those found for local disk galaxies, but considerably shallower than for local early-type galaxies.

  18. A bivariate model for analyzing recurrent multi-type automobile failures

    Science.gov (United States)

    Sunethra, A. A.; Sooriyarachchi, M. R.

    2017-09-01

    The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures, where failures can arise from various failure modes and are repetitive in that more than one failure can occur from each mode. In analysing such automobile failures, both the time and the type of failure serve as response variables. However, these two response variables are highly correlated with each other, since the timing of failures is associated with the mode of failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other, while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models, for joining the responses and for representing the correlated data respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions that were suitable for the two responses, time to failure and type of failure, were the Weibull distribution and the multinomial distribution respectively. The proposed bivariate model was programmed in the SAS procedure PROC NLMIXED by user-programming appropriate likelihood functions. The performance of the bivariate model was compared with separate univariate models fitted for the two responses, and it was identified that better performance is secured by

  19. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...

  20. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi; Gao, Xin; Huang, Jianhua Z.

    2012-01-01

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  1. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  2. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)

    Copulas are applied to overcome the restriction of traditional bivariate frequency ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...

  3. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x1, x2, ..., xI for X and y1, y2, ..., yJ for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y given values of X, are available. We…

  4. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
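
    A minimal version of such correlated sampling uses a Cholesky factor of the covariance matrix; log-normal samples are then obtained by exponentiating the underlying normals. Note that the correlation of the log-normal variables differs from that of the underlying normals. This is a generic sketch of the technique, not the authors' method:

        import numpy as np

        def correlated_samples(n, mean, cov, lognormal=False, rng=None):
            """Draw n correlated samples from N(mean, cov); if lognormal=True,
            return exp of the normal draws (i.e., multivariate log-normal)."""
            rng = np.random.default_rng() if rng is None else rng
            L = np.linalg.cholesky(np.asarray(cov))
            z = rng.standard_normal((n, len(mean)))
            x = np.asarray(mean) + z @ L.T
            return np.exp(x) if lognormal else x

        cov = [[0.04, 0.01], [0.01, 0.09]]
        x = correlated_samples(100_000, [1.0, 2.0], cov)
        y = correlated_samples(100_000, [1.0, 2.0], cov, lognormal=True)
        print(np.corrcoef(x.T)[0, 1], np.corrcoef(y.T)[0, 1])  # note they differ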

  5. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

    The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  6. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with a single parameter that defines both the mean and the variance. Poisson regression assumes that mean and variance are equal (equidispersion). Nonetheless, some count data violate this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion causes underestimated standard errors and, consequently, incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution. If there is over-dispersion, modeling paired count data with simple bivariate Poisson regression is not sufficient. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion. The BPIGR model produces a global model for all locations. On the other hand, each location has different geographic, social, cultural and economic conditions, so that Geographically Weighted Regression (GWR) is needed. The weighting function for each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimates of the GWBPIGR model are obtained by the Maximum Likelihood Estimation (MLE) method. Meanwhile, hypothesis testing of the GWBPIGR model is carried out by the Maximum Likelihood Ratio Test (MLRT) method.

  7. A New Distribution: Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  8. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the current shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.

  9. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on bivariate extremes of renormalized componentwise maxima with the generalized extreme value distribution as the marginal function. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation, and the best model is selected based on the Akaike Information Criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily PM10 maxima for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data, from 2001 to 2010, are considered for both stations. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly maxima componentwise series. However, the dependence parameters show that the variables in the weekly maxima series are more dependent on each other than those in the monthly maxima.

  10. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    Science.gov (United States)

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that previous studies have found significant discrepancies between them. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of the bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model can not only model paired data with correlation, but also handle under- or over-dispersed data sets. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ1, λ2 and λ3). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVCs and carcass removals. It is found that an increase in some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.
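
    The three-parameter bivariate Poisson mentioned above can be built by trivariate reduction: X1 = Y1 + Y3 and X2 = Y2 + Y3 with independent Yi ~ Poisson(λi), so λ3 is the shared component and equals cov(X1, X2). The simulation below illustrates that construction with arbitrary rates; the paper's diagonal inflation is not reproduced.

```python
# Bivariate Poisson via trivariate reduction (rates are illustrative).
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, lam3 = 2.0, 1.5, 1.0
n = 100_000

y1, y2, y3 = (rng.poisson(l, n) for l in (lam1, lam2, lam3))
x1, x2 = y1 + y3, y2 + y3          # y3 is the common shock shared by both counts

print("means:", x1.mean().round(2), x2.mean().round(2))  # ~ lam1+lam3, lam2+lam3
print("cov:  ", np.cov(x1, x2)[0, 1].round(2))           # ~ lam3
```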

  11. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
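
    As a minimal illustration of the question this note poses, the sketch below applies SciPy's Shapiro-Wilk test to a simulated normal sample and a simulated skewed one; the data are invented stand-ins.

```python
# Shapiro-Wilk normality check on two illustrative samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
samples = {"normal": rng.normal(size=200), "skewed": rng.exponential(size=200)}

for name, x in samples.items():
    W, p = stats.shapiro(x)
    print(f"{name}: W={W:.3f}, p={p:.4f}")  # small p suggests departure from normality
```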

  12. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    Full Text Available In this paper we discuss the problem of parametric and nonparametric estimation of the distributions generated by the Marshall-Olkin copula. This copula arises from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model through the copula and different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. The procedure can be used to estimate the parameters of complex survival functions for which it is difficult to find an explicit expression of the mixed moments. Moreover, it is preferable to maximum likelihood for its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.

  13. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    A matched pairs sign test based on bivariate ranked set sampling (BVRSS) is introduced and investigated. We show that this test is asymptotically more efficient than its counterpart sign test based on a bivariate simple random sample (BVSRS). The asymptotic null distribution and the efficiency of the test are derived.

  14. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg

  15. Bivariate discrete beta Kernel graduation of mortality data.

    Science.gov (United States)

    Mazza, Angelo; Punzo, Antonio

    2015-07-01

    Various parametric/nonparametric techniques have been proposed in the literature to graduate mortality data as a function of age. Nonparametric approaches, as for example kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidth selection. Using simulation studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make simulations realistic, a bivariate dataset, based on probabilities of dying recorded for US males, is used. Simulations have confirmed the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.

  16. Genetics of Obesity Traits: A Bivariate Genome-Wide Association Analysis

    DEFF Research Database (Denmark)

    Wu, Yili; Duan, Haiping; Tian, Xiaocao

    2018-01-01

    Previous genome-wide association studies on anthropometric measurements have identified more than 100 related loci, but only a small portion of heritability in obesity was explained. Here we present a bivariate twin study to look for the genetic variants associated with body mass index and waist-hip ratio, and to explore the obesity-related pathways in Northern Han Chinese. A Cholesky decomposition model for 242 monozygotic and 140 dizygotic twin pairs indicated a moderate genetic correlation (r = 0.53, 95% CI: 0.42–0.64) between body mass index and waist-hip ratio. Bivariate genome-wide association … 0.05. Expression quantitative trait loci analysis identified rs2242044 as a significant cis-eQTL in both the normal adipose-subcutaneous (P = 1.7 × 10^-9) and adipose-visceral (P = 4.4 × 10^-15) tissue. These findings may provide an important entry point to unravel genetic pleiotropy in obesity traits.

  17. The exp-normal distribution is infinitely divisible

    OpenAIRE

    Pinelis, Iosif

    2018-01-01

    Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.

  18. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with the existing models such as bivariate generalized linear mixed model (Chu and Cole, 2006) and Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having closed-form expression of likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.

  19. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and from marginal distributions. This study provides, for the first time, a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  20. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.

  1. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    Science.gov (United States)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. Conventional bivariate distributions are often unable to model rainfall-runoff variables because they either constrain the range of the dependence or impose a fixed form on the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the log-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities, which further…

  2. Approximating Multivariate Normal Orthant Probabilities. ONR Technical Report. [Biometric Lab Report No. 90-1.

    Science.gov (United States)

    Gibbons, Robert D.; And Others

    The probability integral of the multivariate normal distribution (ND) has received considerable attention since W. F. Sheppard's (1900) and K. Pearson's (1901) seminal work on the bivariate ND. This paper evaluates the formula that represents the "n × n" correlation matrix of the χ_i and the standardized multivariate…
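
    For the bivariate case that started this literature, the positive-orthant probability has the classical closed form P(X > 0, Y > 0) = 1/4 + arcsin(ρ)/(2π) for standardized margins; the sketch below checks it against SciPy's numerical bivariate normal CDF for an arbitrary ρ.

```python
# Bivariate normal orthant probability: closed form vs. numerical CDF.
import numpy as np
from scipy.stats import multivariate_normal

rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
# By central symmetry P(X > 0, Y > 0) = P(X < 0, Y < 0), i.e. the CDF at the origin.
numeric = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([0.0, 0.0])
closed_form = 0.25 + np.arcsin(rho) / (2.0 * np.pi)
print(f"numeric = {numeric:.6f}, closed form = {closed_form:.6f}")
```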

  3. Technical note: Towards a continuous classification of climate using bivariate colour mapping

    NARCIS (Netherlands)

    Teuling, A.J.

    2011-01-01

    Climate is often defined in terms of discrete classes. Here I use bivariate colour mapping to show that the global distribution of Köppen-Geiger climate classes can largely be reproduced by combining the simple means of two key states of the climate system (i.e., air temperature and relative…

  4. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
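
    One standard way to compute such quantiles (not necessarily the paper's algorithm) is to invert the mixture CDF numerically, since it is a monotone weighted sum of normal CDFs. Note this treats the mixture of densities, not the distribution of a linear combination of normal variables, which is itself normal. The weights and component parameters below are invented for illustration.

```python
# Quantiles of a two-component normal mixture by numerical CDF inversion.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

w, mu, sd = [0.4, 0.6], [-1.0, 2.0], [1.0, 0.5]  # illustrative mixture

def mixture_cdf(x):
    return sum(wi * norm.cdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))

def mixture_quantile(p):
    # The mixture CDF is monotone, so a bracketing root-finder suffices.
    return brentq(lambda x: mixture_cdf(x) - p, -10.0, 10.0)

print("median:", round(mixture_quantile(0.5), 4))
print("95th percentile:", round(mixture_quantile(0.95), 4))
```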

  5. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping in the targeting of schistosomiasis control. However, not all schools return questionnaires and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modeled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to the nearest river, and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10% prevalence) schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but is unable to distinguish between moderate and high prevalence schools.

  6. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon

    2011-08-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions. In particular, we describe the characteristic function of skew-normal, skew-t, and other related distributions. © 2011 Elsevier Inc.

  7. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses the nature of concentration distributions of trace elements in biomedical samples, which were measured using X-ray fluorescence techniques (XRF, TXRF). Our earlier observation that the lognormal distribution describes the measured concentration distributions well is explained here on more general grounds. In particular, the role of the random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, which appears when the multiplicative process is driven by a normal distribution, can be generalized to the so-called log-stable distribution. Such a distribution describes a random multiplicative process that is driven, instead of by the normal distribution, by a more general stable distribution, known as Lévy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained using the conventional (XRF) and total-reflection (TXRF) X-ray fluorescence methods. In particular, the first observation of a log-stable concentration distribution of trace elements is reported and discussed here in detail.
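
    The random multiplicative process invoked above can be demonstrated in a few lines: the product of many i.i.d. positive factors is approximately lognormal because its logarithm is a sum to which the central limit theorem applies. The uniform shocks below are an arbitrary illustrative choice of driving factors.

```python
# Product of many positive random factors -> approximately lognormal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_samples, n_steps = 10_000, 200
factors = 1.0 + rng.uniform(-0.1, 0.1, size=(n_samples, n_steps))
concentrations = factors.prod(axis=1)

# The log of the product should look normal (D'Agostino-Pearson test).
_, p = stats.normaltest(np.log(concentrations))
print(f"normality of log-concentrations: p = {p:.3f}")
```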

  8. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  9. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi; Zhou, Lan; Najibi, Seyed Morteza; Gao, Xin; Huang, Jianhua Z.

    2015-01-01

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  10. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model-based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from … be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, on the other hand, the bivariate blending method was lighter than the single-step method.

  11. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals in ways that might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subject has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variations in ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.

  12. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance-rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms.
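
    A textbook acceptance-rejection scheme in the same spirit (though not the report's FORTRAN subregion method) samples the half-normal from an Exp(1) envelope, accepting a proposal x with probability exp(-(x-1)^2/2) and then attaching a random sign:

```python
# Acceptance-rejection sampling of the standard normal from an Exp(1) envelope.
import numpy as np

rng = np.random.default_rng(3)

def normal_by_rejection(n):
    out = np.empty(n)
    i = 0
    while i < n:
        x = rng.exponential()  # proposal from the exponential envelope
        if rng.random() < np.exp(-0.5 * (x - 1.0) ** 2):  # accept with this probability
            out[i] = x if rng.random() < 0.5 else -x      # random sign -> full normal
            i += 1
    return out

z = normal_by_rejection(50_000)
print("mean ~ 0:", z.mean().round(3), " std ~ 1:", z.std().round(3))
```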

  13. A non-Gaussian multivariate distribution with all lower-dimensional Gaussians and related families

    KAUST Repository

    Dutta, Subhajit

    2014-07-28

    Several fascinating examples of non-Gaussian bivariate distributions whose marginal distribution functions are Gaussian have been proposed in the literature. These examples often clarify several properties associated with the normal distribution. In this paper, we generalize this result in the sense that we construct a p-dimensional distribution for which any proper subset of its components has the Gaussian distribution. However, the joint p-dimensional distribution is inconsistent with the distribution of these subsets because it is not Gaussian. We study the probabilistic properties of this non-Gaussian multivariate distribution in detail. Interestingly, several popular tests of multivariate normality fail to identify this p-dimensional distribution as non-Gaussian. We further extend our construction to a class of elliptically contoured distributions as well as skewed distributions arising from selections, for instance the multivariate skew-normal distribution.

  14. A non-Gaussian multivariate distribution with all lower-dimensional Gaussians and related families

    KAUST Repository

    Dutta, Subhajit; Genton, Marc G.

    2014-01-01

    Several fascinating examples of non-Gaussian bivariate distributions whose marginal distribution functions are Gaussian have been proposed in the literature. These examples often clarify several properties associated with the normal distribution. In this paper, we generalize this result in the sense that we construct a p-dimensional distribution for which any proper subset of its components has the Gaussian distribution. However, the joint p-dimensional distribution is inconsistent with the distribution of these subsets because it is not Gaussian. We study the probabilistic properties of this non-Gaussian multivariate distribution in detail. Interestingly, several popular tests of multivariate normality fail to identify this p-dimensional distribution as non-Gaussian. We further extend our construction to a class of elliptically contoured distributions as well as skewed distributions arising from selections, for instance the multivariate skew-normal distribution.

  15. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependence and non-normality in residual terms. The models are estimated using 2013 Virginia police-reported two-vehicle head-on collision data, where exactly one driver is at fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking, the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects the injury severity of the at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on the at-fault injury outcome. Conversely, and importantly, the effect of at-fault vehicle speed on the injury severity of the not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on the injury outcome of the not-at-fault driver. Compared to traditional ordered probability…

  16. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  17. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution on 16 sets of failure time data collected in real software projects.

  18. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  19. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research

    International Nuclear Information System (INIS)

    Currie, L.A.

    2001-01-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Mueller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test, for very interesting reasons linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom. (orig.)

  20. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research.

    Science.gov (United States)

    Currie, L A

    2001-07-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Muller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test, for very interesting reasons linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom.

  1. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density … entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models…

  2. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  3. A dynamic bivariate Poisson model for analysing and forecasting match results in the English Premier League

    NARCIS (Netherlands)

    Koopman, S.J.; Lit, R.

    2015-01-01

    Summary: We develop a statistical model for the analysis and forecasting of football match results which assumes a bivariate Poisson distribution with intensity coefficients that change stochastically over time. The dynamic model is a novelty in the statistical time series analysis of match results

  4. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Roč. 17, č. 5 (2016), s. 1-20 ISSN 2231-0851 Institutional support: RVO:67985556 Keywords : Transfer entropy * time series * Kullback-Leibler divergence * causality * generalized normal distribution Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf

  5. Modified Normal Demand Distributions in (R,S)-Inventory Models

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    2003-01-01

    To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R, S)…

  6. Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.

    Science.gov (United States)

    Vetter, Thomas R; Mascha, Edward J

    2018-01-01

    Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal, thereby accepting the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing dependent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) intragroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate. The…
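
    For concreteness, the sketch below runs the two workhorse tests discussed above with SciPy on simulated stand-in data; the group means, sizes, and the contingency table are invented for illustration.

```python
# Unpaired t test and Pearson chi-square test on illustrative data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
group_a = rng.normal(10.0, 2.0, 40)
group_b = rng.normal(11.0, 2.0, 40)

# Unpaired (independent samples) t test for a difference in means
t, p_t = stats.ttest_ind(group_a, group_b)
print(f"unpaired t test: t={t:.2f}, p={p_t:.4f}")

# Pearson chi-square test of independence for two binary categorical variables
table = np.array([[30, 10],    # e.g., group 1: outcome yes / no
                  [18, 22]])   # group 2: outcome yes / no
chi2, p_c, dof, _ = stats.chi2_contingency(table)
print(f"chi-square: chi2={chi2:.2f}, dof={dof}, p={p_c:.4f}")
```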

  7. A bivariate contaminated binormal model for robust fitting of proper ROC curves to a pair of correlated, possibly degenerate, ROC datasets.

    Science.gov (United States)

    Zhai, Xuetong; Chakraborty, Dev P

    2017-06-01

    The objective was to design and implement a bivariate extension to the contaminated binormal model (CBM) to fit paired receiver operating characteristic (ROC) datasets (possibly degenerate) with proper ROC curves. Paired datasets yield two correlated ratings per case. Degenerate datasets have no interior operating points, and proper ROC curves do not inappropriately cross the chance diagonal. The existing method, developed more than three decades ago, utilizes a bivariate extension to the binormal model, implemented in the CORROC2 software, which yields improper ROC curves and cannot fit degenerate datasets. CBM can fit proper ROC curves to unpaired (i.e., yielding one rating per case) and degenerate datasets, and there is a clear scientific need to extend it to handle paired datasets. In CBM, nondiseased cases are modeled by a probability density function (pdf) consisting of a unit-variance peak centered at zero. Diseased cases are modeled with a mixture distribution whose pdf consists of two unit-variance peaks, one centered at positive μ with integrated probability α, the mixing fraction parameter, corresponding to the fraction of diseased cases where the disease was visible to the radiologist, and one centered at zero, with integrated probability (1-α), corresponding to disease that was not visible. It is shown that: (a) for nondiseased cases the bivariate extension is a unit-variance bivariate normal distribution centered at (0,0) with a specified correlation ρ1; (b) for diseased cases the bivariate extension is a mixture distribution with four peaks, corresponding to disease not visible in either condition, disease visible in only one condition (contributing two peaks), and disease visible in both conditions. An expression for the likelihood function is derived. A maximum likelihood estimation (MLE) algorithm, CORCBM, was implemented in the R programming language that yields parameter estimates and the covariance matrix of the parameters, and other statistics…

  8. Non-Normality and Testing that a Correlation Equals Zero

    Science.gov (United States)

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)

  9. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of a time-frequency technique to high-resolution ECG records are briefly described, along with their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, EN_QRS, are used as variables, and a bivariable index combining these parameters is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial-infarction patients. The results show that the bivariable index discriminates between the population of patients with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, as a diagnostic test, the bivariable technique is superior to the time-domain method and to the time-frequency technique evaluated individually.

  10. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon; Genton, Marc G.

    2011-01-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions…

  11. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at a regional and probabilistic evaluation of bivariate drought characteristics to assess both past and future drought duration and severity in Bangladesh. The procedure involves applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. This suggests that, compared to the bivariate drought frequency analysis, standard univariate frequency analysis under- or overestimates the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed in the western part of the country. Future drought trends based on four climate models and two scenarios show the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
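
    The standardized precipitation index step underlying the drought definition can be sketched as fitting a gamma distribution to aggregated precipitation and mapping its CDF through the standard normal quantile function. The code below is a simplified illustration on simulated data; zero-rainfall handling and the paper's regional and copula stages are omitted.

```python
# Simplified SPI: gamma fit to precipitation, then probit transform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
precip = rng.gamma(shape=2.0, scale=40.0, size=600)  # illustrative monthly totals (mm)

a, loc, scale = stats.gamma.fit(precip, floc=0.0)    # fix the location at zero
spi = stats.norm.ppf(stats.gamma.cdf(precip, a, loc=loc, scale=scale))

print("SPI mean ~ 0:", spi.mean().round(2), " std ~ 1:", spi.std().round(2))
print("drought months (SPI <= -1):", int((spi <= -1.0).sum()))
```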

  12. Building Bivariate Tables: The compareGroups Package for R

    Directory of Open Access Journals (Sweden)

    Isaac Subirana

    2014-05-01

    Full Text Available The R package compareGroups provides functions meant to facilitate the construction of bivariate tables (descriptives of several variables for comparison between groups) and generates reports in several formats (LaTeX, HTML or plain-text CSV). Moreover, bivariate tables can be viewed directly on the R console in a nice format. A graphical user interface (GUI) has been implemented to build the bivariate tables more easily for those users who are not familiar with the R software. Some new functions and methods have been incorporated in the newest version of the compareGroups package (version 1.x) to deal with time-to-event variables, stratifying tables, merging several tables, and revising the statistical methods used. The GUI interface also has been improved, making it much easier and more intuitive to set the inputs for building the bivariate tables. The first version (version 0.x) and this version were presented at the 2010 useR! conference (Sanz, Subirana, and Vila 2010) and the 2011 useR! conference (Sanz, Subirana, and Vila 2011), respectively. Package compareGroups is available from the Comprehensive R Archive Network at http://CRAN.R-project.org/package=compareGroups.

  13. Computation of distribution of minimum resolution for log-normal distribution of chromatographic peak heights.

    Science.gov (United States)

    Davis, Joe M

    2011-10-28

    General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. The relative performance of bivariate causality tests in small samples

    NARCIS (Netherlands)

    Bult, J..R.; Leeflang, P.S.H.; Wittink, D.R.

    1997-01-01

    Causality tests have been applied to establish directional effects and to reduce the set of potential predictors. For the latter type of application only bivariate tests can be used. In this study we compare bivariate causality tests. Although the problem addressed is general and could benefit…

  15. The bivariate probit model of uncomplicated control of tumor: a heuristic exposition of the methodology

    International Nuclear Information System (INIS)

    Herbert, Donald

    1997-01-01

    Purpose: To describe the concept, models, and methods for the construction of estimates of the joint probability of uncomplicated control of tumors in radiation oncology. Interpolations using this model can lead to the identification of more efficient treatment regimens for an individual patient. The requirement to find the treatment regimen that will maximize the joint probability of uncomplicated control of tumors suggests a new class of evolutionary experimental designs, Response Surface Methods, for clinical trials in radiation oncology. Methods and Materials: The software developed by Lesaffre and Molenberghs is used to construct bivariate probit models of the joint probability of uncomplicated control of cancer of the oropharynx from a set of 45 patients, for each of whom the presence/absence of recurrent tumor (the binary event Ē1/E1) and the presence/absence of necrosis (the binary event E2/Ē2) of the normal tissues of the target volume is recorded, together with the treatment variables dose, time, and fractionation. Results: The bivariate probit model can be used to select a treatment regime that will give a specified probability, say P(S) = 0.60, of uncomplicated control of tumor by interpolation within a set of treatment regimes with known outcomes of recurrence and necrosis. The bivariate probit model can also be used to guide a sequence of clinical trials to find the maximum probability of uncomplicated control of tumor for patients in a given prognostic stratum, using Response Surface Methods, by extrapolation from an initial set of treatment regimens. Conclusions: The design of treatments for individual patients and the design of clinical trials might be improved by the use of a bivariate probit model and Response Surface Methods.

  16. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in a reliability definition follows a normal distribution, the conjugate prior of its distribution parameters (μ, h) is the normal-gamma distribution. With the help of the maximum entropy and moment-equivalence principles, subjective information about the parameter and the sampling data of its independent variables are transformed into a Bayesian prior of (μ, h). The desired estimates are obtained either from the prior or from the posterior formed by combining the prior and the sampling data. Computing methods are described and examples are presented as demonstrations.

  17. Normal and Student's t distributions and their applications

    CERN Document Server

    Ahsanullah, Mohammad; Shakil, Mohammad

    2014-01-01

    The most important properties of the normal and Student t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of independent normal and Student t variables are presented. The material will be useful to advanced undergraduate and graduate students and practitioners in the various fields of science and engineering.

  18. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Ferreira, Clé cio S.; Genton, Marc G.

    2018-01-01

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down…

  19. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)

    2017-01-20

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.

  20. Normal distribution of standing balance for healthy Danish children

    DEFF Research Database (Denmark)

    Pedersen, Line Kjeldgaard; Ghasemi, Habib; Rahbek, Ole

    2013-01-01

    Title: Normal distribution of standing balance for healthy Danish children – Reproducibility of parameters of balance. Authors: Line Kjeldgaard Pedersen, Habib Ghasemi, Ole Rahbek, Bjarne Møller-Madsen. Background: Pedobarographic measurements are increasingly used in children…

  1. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables a wide spectrum of continuous distributions to be approximated by a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for the random variable obtained by the backward transformation of the standard normal ...

  2. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
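
    The three normality tests named in this record map directly onto scipy (shapiro for Shapiro-Wilk, skewtest for D'Agostino's skewness test, kurtosistest for Anscombe-Glynn); the MUAC values below are simulated for illustration only.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      muac = rng.normal(loc=145, scale=12, size=600)    # simulated MUAC measurements (mm)

      print("Shapiro-Wilk:     p = %.3f" % stats.shapiro(muac).pvalue)
      print("D'Agostino skew:  p = %.3f" % stats.skewtest(muac).pvalue)
      print("Anscombe-Glynn:   p = %.3f" % stats.kurtosistest(muac).pvalue)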

  3. A Bivariate return period for levee failure monitoring

    Science.gov (United States)

    Isola, M.; Caporali, E.

    2017-12-01

    Levee breaches are strongly linked to the interaction processes among water, soil and structure, so many factors affect breach development. One of the main factors is the hydraulic load, characterized by intensity and duration, i.e. by the flood event hydrograph. Levee design is generally based on the magnitude of the hydraulic load, without considering fatigue failure due to load duration. Moreover, there are many cases in which levee breaches are caused by floods of lower magnitude than the design flood. In order to improve flood risk management strategies, we build here a procedure based on a multivariate statistical analysis of flood peak and volume together with an analysis of past levee failure events. In particular, to define the probability of occurrence of the hydraulic load on a levee, a bivariate copula model is used to obtain the joint distribution of flood peak and volume. The flood peak expresses the magnitude of the load, while the volume expresses the stress over time. We consider the annual flood peak and the corresponding volume, given by the hydrograph area between the beginning and the end of the event. The beginning of the event is identified as an abrupt rise of the discharge by more than 20%. The end is identified as the point from which the receding limb is characterized by baseflow, using a nonlinear reservoir algorithm as the baseflow separation technique. On this basis, with the aim of defining warning thresholds, we consider past levee failure events and the corresponding bivariate return period (BTr), compared with the estimate from a traditional univariate model. Discharge data from 30 hydrometric stations on the Arno River in Tuscany, Italy, for the period 1995-2016 are analysed. A database of levee failure events, including for each event the location as well as the failure mode, is also created. The events were registered in the period 2000-2014 by EEA
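
    A minimal sketch of a copula-based joint ("AND") return period of the kind described; the Gumbel copula family, its parameter and the marginal levels are assumptions for illustration, since the record does not specify them.

      import math

      def gumbel_copula(u, v, theta):
          """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
          return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

      def bivariate_and_return_period(u, v, theta, mu=1.0):
          """'AND' joint return period: both peak and volume exceed their levels.

          u, v: marginal non-exceedance probabilities F(peak), F(volume);
          mu:   mean interarrival time in years (1.0 for annual maxima).
          """
          p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
          return mu / p_exceed_both

      # Illustrative values: both marginals at their 50-year level, theta assumed 2.5
      u = v = 1.0 - 1.0 / 50
      print(f"univariate: 50 y, bivariate AND: {bivariate_and_return_period(u, v, theta=2.5):.1f} y")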

  4. Evidence for bivariate linkage of obesity and HDL-C levels in the Framingham Heart Study.

    Science.gov (United States)

    Arya, Rector; Lehman, Donna; Hunt, Kelly J; Schneider, Jennifer; Almasy, Laura; Blangero, John; Stern, Michael P; Duggirala, Ravindranath

    2003-12-31

    Epidemiological studies have indicated that obesity and low high-density lipoprotein (HDL) levels are strong cardiovascular risk factors, and that these traits are inversely correlated. Despite the belief that these traits are correlated in part due to pleiotropy, knowledge of specific genes commonly affecting obesity and dyslipidemia is very limited. To address this issue, we first conducted univariate multipoint linkage analysis for body mass index (BMI) and HDL-C to identify loci influencing variation in these phenotypes, using Framingham Heart Study data on 1702 subjects distributed across 330 pedigrees. Subsequently, we performed bivariate multipoint linkage analysis to detect common loci influencing covariation between these two traits. We scanned the genome and identified a major locus near marker D6S1009 influencing variation in BMI (LOD = 3.9) using the program SOLAR. We also identified a major locus for HDL-C near marker D2S1334 on chromosome 2 (LOD = 3.5) and another region near marker D6S1009 on chromosome 6 with suggestive evidence for linkage (LOD = 2.7). Since these two phenotypes were independently mapped to the same region on chromosome 6q, we used the bivariate multipoint linkage approach in SOLAR. The bivariate linkage analysis of BMI and HDL-C implicated the genetic region near marker D6S1009 as harboring a major gene commonly influencing these phenotypes (bivariate LOD = 6.2; LODeq = 5.5) and appears to improve the power to map correlated traits to a region more precisely. We found substantial evidence for a quantitative trait locus with pleiotropic effects, which appears to influence both the BMI and HDL-C phenotypes in the Framingham data.

  5. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
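
    A rough parametric-bootstrap sketch in Python for limits on a ratio of bivariate normal variables; it uses simple percentile limits and omits the paper's nonparametric construction and bootstrap calibration, and all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical paired measurements (e.g., cost and effectiveness)
      mean = np.array([10.0, 4.0])
      cov = np.array([[4.0, 1.2], [1.2, 1.0]])
      data = rng.multivariate_normal(mean, cov, size=40)

      m_hat = data.mean(axis=0)
      S_hat = np.cov(data, rowvar=False)

      # Parametric bootstrap: resample from the fitted bivariate normal and
      # collect upper quantiles of the ratio X/Y as a one-sided limit.
      B, n, p = 2000, len(data), 0.95
      limits = np.empty(B)
      for b in range(B):
          boot = rng.multivariate_normal(m_hat, S_hat, size=n)
          limits[b] = np.quantile(boot[:, 0] / boot[:, 1], p)

      print(f"approx. 95% upper tolerance limit for X/Y: {np.quantile(limits, 0.95):.2f}")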

  6. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

    AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  7. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly, percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by (1) assuming normality when the data are really from a lognormal distribution, and (2) assuming lognormality when the data are really from a normal distribution.

  8. A power study of bivariate LOD score analysis of a complex trait and fear/discomfort with strangers.

    Science.gov (United States)

    Ji, Fei; Lee, Dayoung; Mendell, Nancy Role

    2005-12-30

    Complex diseases are often reported along with disease-related traits (DRT). Sometimes investigators consider both disease and DRT phenotypes separately and sometimes they consider individuals as affected if they have either the disease or the DRT, or both. We propose instead to consider the joint distribution of the disease and the DRT and do a linkage analysis assuming a pleiotropic model. We evaluated our results through analysis of the simulated datasets provided by Genetic Analysis Workshop 14. We first conducted univariate linkage analysis of the simulated disease, Kofendrerd Personality Disorder and one of its simulated associated traits, phenotype b (fear/discomfort with strangers). Subsequently, we considered the bivariate phenotype, which combined the information on Kofendrerd Personality Disorder and fear/discomfort with strangers. We developed a program to perform bivariate linkage analysis using an extension to the Elston-Stewart peeling method of likelihood calculation. Using this program we considered the microsatellites within 30 cM of the gene pleiotropic for this simulated disease and DRT. Based on 100 simulations of 300 families we observed excellent power to detect linkage within 10 cM of the disease locus using the DRT and the bivariate trait.

  9. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by (1) an intuitively straightforward genesis; (2) closed forms for the pdf, cdf, mode, quantile, and hazard functions; and (3) accessibility to non-statisticians, based on its close relationship to the normal distribution.

  10. Bioelectrical impedance vector distribution in the first year of life.

    Science.gov (United States)

    Savino, Francesco; Grasso, Giulia; Cresi, Francesco; Oggero, Roberto; Silvestro, Leandra

    2003-06-01

    We assessed the bioelectrical impedance vector distribution in a sample of healthy infants in the first year of life, for which reference data are not available in the literature. The study was conducted as a cross-sectional study in 153 healthy Caucasian infants (90 male and 63 female) younger than 1 y, born at full term, adequate for gestational age, free from chronic diseases or growth problems, and not feverish. Z scores for weight, length, cranial circumference, and body mass index for the study population were within the range of ±1.5 standard deviations according to the Euro-Growth Study references. Concurrent anthropometric (weight, length, and cranial circumference), body mass index, and bioelectrical impedance (resistance and reactance) measurements were made by the same operator. Whole-body (hand to foot) tetrapolar measurements were performed with a single-frequency (50 kHz), phase-sensitive impedance analyzer. The study population was subdivided into three age classes for statistical analysis: 0 to 3.99 mo, 4 to 7.99 mo, and 8 to 11.99 mo. Using the bivariate normal distribution of the resistance and reactance components standardized by the infant's length, the bivariate 95% confidence limits for the mean impedance vector, separated by sex and age group, were calculated and plotted. Further, the bivariate 95%, 75%, and 50% tolerance intervals for individual vector measurements in the first year of life were plotted. Resistance and reactance values often fluctuated during the first year of life, particularly as raw measurements (without normalization by the subject's length). However, the 95% confidence ellipses of the mean vectors from the three age groups overlapped each other, as did the confidence ellipses by sex for each age class, indicating no significant vector migration during the first year of life. We obtained an estimate of the mean impedance vector in a sample of healthy infants in the first year of life and calculated the bivariate values for an individual vector (95%, 75%, and 50
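
    A minimal sketch of the bivariate 95% confidence ellipse computation for a mean vector, via Hotelling's T-squared region; the length-normalized impedance data below are synthetic stand-ins, not the study data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      # Synthetic length-normalized impedance components: R/H and Xc/H (ohm/m)
      Z = rng.multivariate_normal([380.0, 40.0], [[900.0, 90.0], [90.0, 25.0]], size=60)

      n, p = Z.shape
      zbar, S = Z.mean(axis=0), np.cov(Z, rowvar=False)

      # Hotelling's T^2 95% confidence region for the mean vector:
      # (zbar - mu)' S^{-1} (zbar - mu) <= p (n-1) / (n (n-p)) * F_{p, n-p, 0.95}
      c2 = p * (n - 1) / (n * (n - p)) * stats.f.ppf(0.95, p, n - p)
      eigval, eigvec = np.linalg.eigh(S)
      half_axes = np.sqrt(eigval * c2)

      print("mean vector:", np.round(zbar, 1))
      print("ellipse half-axes:", np.round(half_axes, 2))
      print("axis directions:\n", np.round(eigvec, 3))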

  11. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or under-dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  12. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…

  13. Vector wind and vector wind shear models 0 to 27 km altitude for Cape Kennedy, Florida, and Vandenberg AFB, California

    Science.gov (United States)

    Smith, O. E.

    1976-01-01

    The techniques to derive several statistical wind models are presented. The techniques follow from the properties of the multivariate normal probability function. Assuming that the winds are bivariate normally distributed, then (1) the wind components and conditional wind components are univariate normally distributed, (2) the wind speed is Rayleigh distributed, (3) the conditional distribution of wind speed given a wind direction is Rayleigh distributed, and (4) the frequency of wind direction can be derived. All of these distributions are derived from the five sample parameters of the bivariate normal distribution of wind. By further assuming that the winds at two altitudes are quadravariate normally distributed, the vector wind shear is bivariate normally distributed and the modulus of the vector wind shear is Rayleigh distributed. The conditional probability of wind component shears given a wind component is normally distributed. Examples of these and other properties of the multivariate normal probability distribution function as applied to wind data samples from Cape Kennedy, Florida, and Vandenberg AFB, California, are given. A technique to develop a synthetic vector wind profile model of interest for aerospace vehicle applications is presented.
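
    Property (2) above holds when the component winds are zero-mean and isotropic; the following simulation illustrates that assumption by recovering the Rayleigh scale from bivariate normal components (the component standard deviation is an invented value).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      sigma = 6.0                                # common component std dev (m/s), assumed isotropic
      u = rng.normal(0.0, sigma, 100_000)        # zonal component
      v = rng.normal(0.0, sigma, 100_000)        # meridional component
      speed = np.hypot(u, v)

      # Under zero-mean isotropy the speed is exactly Rayleigh(sigma);
      # a fit with the location pinned at 0 should recover sigma.
      loc, scale = stats.rayleigh.fit(speed, floc=0)
      print(f"fitted Rayleigh scale: {scale:.3f} (true sigma: {sigma})")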

  14. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.

  15. A general approach to double-moment normalization of drop size distributions

    NARCIS (Netherlands)

    Lee, G.W.; Zawadzki, I.; Szyrmer, W.; Sempere Torres, D.; Uijlenhoet, R.

    2004-01-01

    Normalization of drop size distributions (DSDs) is reexamined here. First, an extension of the scaling normalization that uses one moment of the DSD as a scaling parameter to a more general scaling normalization that uses two moments as scaling parameters of the normalization is presented. In

  16. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Full Text Available Abstract This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML. The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular in the case when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  17. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain.

    Science.gov (United States)

    Rabbani, Hossein; Sonka, Milan; Abramoff, Michael D

    2013-01-01

    In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the distribution proposed for the noise-free data plays a key role in the performance of the MMSE estimator, an a priori distribution for the pdf of the noise-free 3D complex wavelet coefficients is proposed that is able to model the main statistical properties of the wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters which are able to capture the heavy-tailed property and the inter- and intrascale dependencies of the coefficients. In addition, based on the special structure of OCT images, we use an anisotropic windowing procedure for local parameter estimation, which results in visual quality improvement. On this basis, several OCT despeckling algorithms are obtained, based on using a Gaussian/two-sided Rayleigh noise distribution and a homomorphic/nonhomomorphic model. In order to evaluate the performance of the proposed algorithm, we use 156 selected ROIs from a 650 × 512 × 128 OCT dataset in the presence of wet AMD pathology. Our simulations show that the best MMSE estimator using the local bivariate mixture prior is for the nonhomomorphic model in the presence of Gaussian noise, which results in an improvement of 7.8 ± 1.7 in CNR.

  18. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain

    Directory of Open Access Journals (Sweden)

    Hossein Rabbani

    2013-01-01

    Full Text Available In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the distribution proposed for the noise-free data plays a key role in the performance of the MMSE estimator, an a priori distribution for the pdf of the noise-free 3D complex wavelet coefficients is proposed that is able to model the main statistical properties of the wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters which are able to capture the heavy-tailed property and the inter- and intrascale dependencies of coefficients. In addition, based on the special structure of OCT images, we use an anisotropic windowing procedure for local parameter estimation, which results in visual quality improvement. On this basis, several OCT despeckling algorithms are obtained, based on using a Gaussian/two-sided Rayleigh noise distribution and a homomorphic/nonhomomorphic model. In order to evaluate the performance of the proposed algorithm, we use 156 selected ROIs from a 650 × 512 × 128 OCT dataset in the presence of wet AMD pathology. Our simulations show that the best MMSE estimator using the local bivariate mixture prior is for the nonhomomorphic model in the presence of Gaussian noise, which results in an improvement of 7.8 ± 1.7 in CNR.

  19. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.

  20. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present the application of the normal mixture distributions model in risk analysis, where we apply it to evaluate value at risk (VaR) and conditional value at risk (CVaR), with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating value at risk (VaR) and conditional value at risk (CVaR), as it can capture the stylized facts of non-normality and leptokurtosis in the returns distribution.
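
    A sketch of VaR and CVaR for a two-component normal mixture, assuming the mixture parameters are already estimated (the weights, means and standard deviations below are invented); VaR inverts the mixture CDF numerically and CVaR uses the normal partial-expectation identity E[X; X <= q] = mu*Phi(z) - sigma*phi(z).

      import numpy as np
      from scipy import stats, optimize

      # Illustrative two-component normal mixture for monthly returns
      w   = np.array([0.8, 0.2])        # component weights
      mu  = np.array([0.01, -0.03])     # component means
      sig = np.array([0.04, 0.10])      # component std devs

      def mix_cdf(x):
          return np.sum(w * stats.norm.cdf(x, mu, sig))

      alpha = 0.05
      # VaR: the alpha-quantile of the mixture, found by root-finding on the CDF
      var = optimize.brentq(lambda x: mix_cdf(x) - alpha, -1.0, 1.0)

      # CVaR: E[X | X <= VaR], summing the per-component partial expectations
      z = (var - mu) / sig
      cvar = np.sum(w * (mu * stats.norm.cdf(z) - sig * stats.norm.pdf(z))) / alpha

      print(f"5% VaR: {var:.4f}, 5% CVaR: {cvar:.4f}")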

  1. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad-hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients that underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients that underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
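
    The λ-search described in this record reduces to a small grid scan in Python; the SUV sample below is synthetic, not the study data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      suv = rng.lognormal(mean=1.0, sigma=0.6, size=80)   # synthetic skewed SUVmax values

      # Scan the Box-Cox parameter and keep the lambda maximizing the Shapiro-Wilk p-value
      lambdas = np.linspace(-2, 2, 401)
      pvals = [stats.shapiro(stats.boxcox(suv, lmbda=lam)).pvalue for lam in lambdas]
      best = lambdas[int(np.argmax(pvals))]
      print(f"optimal lambda: {best:.2f}, Shapiro-Wilk p: {max(pvals):.3f}")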

  2. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  3. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain

    OpenAIRE

    Rabbani, Hossein; Sonka, Milan; Abramoff, Michael D.

    2013-01-01

    In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the distribution proposed for the noise-free data plays a key role in the performance of the MMSE estimator, an a priori distribution for the pdf of the noise-free 3D complex wavelet coefficients is proposed that is able to model the main statistical properties of the wavelets. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters which are able to capture th...

  4. Generating log-normally distributed random numbers by using the Ziggurat algorithm

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2016-01-01

    Uncertainty analyses are usually based on the Monte Carlo method. Using an efficient random number generator (RNG) is a key element in the success of Monte Carlo simulations. Log-normally distributed variates are very typical in NPP PSAs. This paper proposes an approach to generating log-normally distributed variates based on the Ziggurat algorithm and evaluates the efficiency of the proposed Ziggurat RNG. The proposed RNG can help improve the uncertainty analysis of NPP PSAs. This paper focuses on evaluating the efficiency of the Ziggurat algorithm from an NPP PSA point of view. From this study, we can draw the following conclusions. - The Ziggurat algorithm is an excellent random number generator for producing normally distributed variates. - The Ziggurat algorithm is computationally much faster than the most commonly used method, the Marsaglia polar method.
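
    NumPy's newer Generator is documented to draw standard normals with a ziggurat-based method, so log-normal variates can be obtained by exponentiation; this is a stand-in for the paper's dedicated Ziggurat RNG, not its implementation.

      import numpy as np

      rng = np.random.default_rng(6)

      # Generator.standard_normal uses a ziggurat-based method; exponentiating
      # its output gives log-normally distributed variates.
      mu, sigma = 0.0, 0.5                      # parameters of the underlying normal
      x = np.exp(mu + sigma * rng.standard_normal(1_000_000))

      print(f"sample median: {np.median(x):.4f} (theory: {np.exp(mu):.4f})")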

  5. The retest distribution of the visual field summary index mean deviation is close to normal.

    Science.gov (United States)

    Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz

    2016-09-01

    When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilk test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilk normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.

  6. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z.

    2011-01-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  7. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  8. Idealized models of the joint probability distribution of wind speeds

    Science.gov (United States)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
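
    A sketch of the power-law construction described above: speeds of correlated, isotropic, zero-mean Gaussian component pairs are Rayleigh, and a power transform of each speed yields Weibull margins; the cross-correlation and target shape are illustrative assumptions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n, rho = 50_000, 0.6                  # sample size; cross-correlation of like components

      # Correlated, isotropic, zero-mean components at two locations:
      # (u1, u2) and (v1, v2) each bivariate normal with correlation rho.
      cov = [[1.0, rho], [rho, 1.0]]
      u = rng.multivariate_normal([0, 0], cov, size=n)
      v = rng.multivariate_normal([0, 0], cov, size=n)
      w = np.hypot(u, v)                    # speeds at the two locations: bivariate Rayleigh

      # Power-law transform of a Rayleigh speed gives a Weibull margin:
      # if P(W > w) = exp(-w^2 / 2), then W^(2/k) is Weibull with shape k.
      k = 1.4                               # target Weibull shape (illustrative)
      y = w ** (2.0 / k)

      shape, loc, scale = stats.weibull_min.fit(y[:, 0], floc=0)
      print(f"fitted Weibull shape: {shape:.3f} (target: {k})")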

  9. Austenite Grain Size Estimation from Chord Lengths of Logarithmic-Normal Distribution

    Directory of Open Access Journals (Sweden)

    Adrian H.

    2017-12-01

    Full Text Available A linear section of the grains in a polyhedral material microstructure is a system of chords. The mean chord length is the linear grain size of the microstructure. For the prior austenite grains of low-alloy structural steels, the chord length is a random variable with a gamma or logarithmic-normal distribution. Statistical grain size estimation belongs to the problems of quantitative metallography. The so-called point estimation is a well-known procedure. The interval estimation (grain size confidence interval) for the gamma distribution was given elsewhere; for the logarithmic-normal distribution it is the subject of the present contribution. The statistical analysis is analogous to that for the gamma distribution.

  10. Kullback–Leibler Divergence of the γ–ordered Normal over t–distribution

    OpenAIRE

    Toulias, T-L.; Kitsos, C-P.

    2012-01-01

    The aim of this paper is to evaluate and study the Kullback–Leibler divergence of the γ-ordered normal distribution, a generalization of the normal distribution that emerged from the generalized Fisher information measure, over the scaled t-distribution. We investigate this evaluation through a series of bounds and approximations, while the asymptotic behavior of the divergence is also studied. Moreover, we obtain a generalization of the known Kullback–Leibler information measure betwe...

  11. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2018-02-26

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model. In particular, we study linear transformations, marginal distributions, selection representations, stochastic representations and hierarchical representations. We also describe an EM-type algorithm for maximum likelihood estimation of the parameters of the model and demonstrate its implementation on a wind dataset. Our family of multivariate distributions unifies and extends many existing models of the literature that can be seen as submodels of our proposal.

  12. Improved Root Normal Size Distributions for Liquid Atomization

    Science.gov (United States)

    2015-11-01

  13. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    The statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimation. The age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
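
    A minimal illustration of a truncated normal time-to-failure model using scipy's truncnorm; the mean life and spread are invented parameters, and the truncation at zero enforces non-negative lifetimes.

      import numpy as np
      from scipy import stats

      # Truncated normal time-to-failure: underlying N(mu, sigma) truncated to [0, inf)
      mu, sigma = 1000.0, 300.0             # illustrative mean life and spread (hours)
      a, b = (0 - mu) / sigma, np.inf       # scipy's truncnorm takes standardized bounds

      ttf = stats.truncnorm(a, b, loc=mu, scale=sigma)
      print(f"mean life: {ttf.mean():.1f} h")
      print(f"P(failure before 500 h): {ttf.cdf(500.0):.4f}")
      print(f"hazard at 1200 h: {ttf.pdf(1200.0) / ttf.sf(1200.0):.6f} per hour")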

  14. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.

  15. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    Science.gov (United States)

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  16. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that the random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
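
    A common way to realize such correlated draws is via the Cholesky factor of the covariance matrix assembled from subjectively assessed standard deviations and correlations; the numbers below are hypothetical, not from the report.

      import numpy as np

      rng = np.random.default_rng(8)

      # Subjectively assessed means, std devs and correlations (illustrative)
      mean = np.array([100.0, 50.0, 8.0])
      sd   = np.array([15.0, 10.0, 2.0])
      corr = np.array([[1.0, 0.7, 0.2],
                       [0.7, 1.0, 0.4],
                       [0.2, 0.4, 1.0]])
      cov = np.outer(sd, sd) * corr

      # Correlated draws via the Cholesky factor: x = mean + L z with z ~ N(0, I)
      L = np.linalg.cholesky(cov)
      z = rng.standard_normal((10_000, 3))
      x = mean + z @ L.T

      print("sample correlations:\n", np.round(np.corrcoef(x, rowvar=False), 2))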

  17. A general approach to double-moment normalization of drop size distributions

    Science.gov (United States)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. This provides a unified view of the question of DSD normalization and a good model representation of DSDs. Data analysis shows that, from the point of view of moment estimation, least-squares regression is slightly more effective than moment estimation from the normalized average DSD.

  18. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    24

    This study shows the potency of two GIS-based data driven bivariate techniques namely ... In the view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-...

  19. Asymptotics of bivariate generating functions with algebraic singularities

    Science.gov (United States)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple again, we compute the explicit locations of peak amplitudes. In a scaling window of size the square root of n near the peaks, each amplitude is asymptotic to an Airy function.

  20. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management field. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  1. Distributive justice and cognitive enhancement in lower, normal intelligence.

    Science.gov (United States)

    Dunlop, Mikael; Savulescu, Julian

    2014-01-01

    There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at

  2. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
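
    The asymmetric confidence interval based on the distribution of the product is easy to approximate by Monte Carlo; the path estimates and standard errors below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(9)

      # Estimated paths and standard errors (illustrative): a = X -> M, b = M -> Y
      a_hat, se_a = 0.40, 0.15
      b_hat, se_b = 0.35, 0.12

      # Monte Carlo approximation of the distribution of the product a*b
      draws = rng.normal(a_hat, se_a, 1_000_000) * rng.normal(b_hat, se_b, 1_000_000)
      lo, hi = np.percentile(draws, [2.5, 97.5])
      print(f"95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")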

  3. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular family of bivariate spline functions, built and algorithmically implemented in previous papers. The properties of this family of splines have an impact on the field of computer graphics, in particular on reverse engineering.

  4. A New Measure Of Bivariate Asymmetry And Its Evaluation

    International Nuclear Information System (INIS)

    Ferreira, Flavio Henn; Kolev, Nikolai Valtchev

    2008-01-01

    In this paper we propose a new measure of bivariate asymmetry, based on conditional correlation coefficients. A decomposition of the Pearson correlation coefficient in terms of its conditional versions is studied and an example of application of the proposed measure is given.

  5. Spectrum-based estimators of the bivariate Hurst exponent

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 90, č. 6 (2014), art. 062802 ISSN 1539-3755 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: bivariate Hurst exponent * power-law cross-correlations * estimation Subject RIV: AH - Economics Impact factor: 2.288, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0436818.pdf

  6. Radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo; Tsujimura, Norio

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of the position sensing is based on a time-of-flight technique. The characteristics of this monitor for beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (90Sr-90Y), gamma rays (137Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and the D-T neutrons were 0.11%, 1.6×10^-5% and 5.4×10^-4%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)

  7. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types-melanoma, breast, squamous cell carcinoma and nodes-were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets
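
    A sketch of a population TCP curve with log-normal interpatient radiosensitivity, using the Poisson TCP model exp(-N e^(-αD)) averaged over the α distribution; the clonogen number and log-normal parameters are illustrative, not the paper's fitted values.

      import numpy as np
      from scipy import stats

      # Poisson TCP for a single radiosensitivity alpha: TCP = exp(-N * exp(-alpha * D))
      N = 1e7                                   # clonogen number (illustrative)
      D = np.linspace(0, 100, 201)              # total dose (Gy)

      # Interpatient heterogeneity: alpha ~ log-normal (median 0.25 Gy^-1, illustrative)
      alpha_med, sigma_ln = 0.25, 0.4
      alphas = np.linspace(1e-3, 1.0, 2000)
      f = stats.lognorm.pdf(alphas, s=sigma_ln, scale=alpha_med)
      dalpha = alphas[1] - alphas[0]

      # Population TCP: average the individual TCP curves over the alpha distribution
      tcp_ind = np.exp(-N * np.exp(-np.outer(D, alphas)))   # shape (len(D), len(alphas))
      tcp_pop = (tcp_ind * f).sum(axis=1) * dalpha

      d50 = D[np.argmin(np.abs(tcp_pop - 0.5))]
      print(f"population D50: {d50:.1f} Gy")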

  8. Cost-offsets of prescription drug expenditures: data analysis via a copula-based bivariate dynamic hurdle model.

    Science.gov (United States)

    Deb, Partha; Trivedi, Pravin K; Zimmer, David M

    2014-10-01

    In this paper, we estimate a copula-based bivariate dynamic hurdle model of prescription drug and nondrug expenditures to test the cost-offset hypothesis, which posits that increased expenditures on prescription drugs are offset by reductions in other nondrug expenditures. We apply the proposed methodology to data from the Medical Expenditure Panel Survey, which have the following features: (i) the observed bivariate outcomes are a mixture of zeros and continuously measured positives; (ii) both the zero and positive outcomes show state dependence and inter-temporal interdependence; and (iii) the zeros and the positives display contemporaneous association. The point mass at zero is accommodated using a hurdle or a two-part approach. The copula-based approach to generating joint distributions is appealing because the contemporaneous association involves asymmetric dependence. The paper studies samples categorized by four health conditions: arthritis, diabetes, heart disease, and mental illness. There is evidence of greater than dollar-for-dollar cost-offsets of expenditures on prescribed drugs for relatively low levels of spending on drugs and less than dollar-for-dollar cost-offsets at higher levels of drug expenditures. Copyright © 2013 John Wiley & Sons, Ltd.
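
As a rough illustration of how a copula can tie together two semicontinuous (hurdle) outcomes, the sketch below simulates a pair of expenditures whose positive parts are linked through a Gaussian copula. This is a static simplification: the paper's model is dynamic and considers copulas that allow asymmetric dependence, and every parameter value here is hypothetical.

```python
# Sketch: simulate two semicontinuous outcomes with hurdle (two-part)
# margins, contemporaneously associated via a Gaussian copula.
# All parameter values are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, rho = 10_000, 0.4
p_zero = np.array([0.3, 0.2])          # P(outcome == 0) per margin
lognorm_mu = np.array([5.0, 6.5])      # log-scale means of positive parts
lognorm_sd = np.array([1.0, 0.8])

# Gaussian copula: correlated uniforms
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)

# Two-part margin: the lowest u-mass maps to zero, the rest to a
# log-normal quantile rescaled over the positive part.
y = np.zeros((n, 2))
for j in range(2):
    pos = u[:, j] > p_zero[j]
    u_pos = (u[pos, j] - p_zero[j]) / (1 - p_zero[j])
    y[pos, j] = stats.lognorm.ppf(u_pos, s=lognorm_sd[j],
                                  scale=np.exp(lognorm_mu[j]))

print("share of (0,0) pairs:", np.mean((y == 0).all(axis=1)))
```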

  9. Computer program determines exact two-sided tolerance limits for normal distributions

    Science.gov (United States)

    Friedman, H. A.; Webb, S. R.

    1968-01-01

    Computer program determines by numerical integration the exact statistical two-sided tolerance limits, such that the proportion of the population between the limits is at least a specified value. The program is limited to situations in which the underlying probability distribution for the population sampled is the normal distribution with unknown mean and variance.
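
The program computes exact tolerance factors by numerical integration; as a rough stand-in, the sketch below uses Howe's closed-form approximation to the two-sided tolerance factor, which is usually close to the exact value. The coverage and confidence levels are example choices.

```python
# Sketch: approximate two-sided normal tolerance limits xbar +/- k*s
# using Howe's closed-form factor (an approximation to the exact
# numerically integrated factor described in the record).
import numpy as np
from scipy import stats

def howe_k(n, coverage=0.90, confidence=0.95):
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, df=n - 1)  # lower chi-square quantile
    return np.sqrt((n - 1) * (1 + 1 / n) * z**2 / chi2)

x = stats.norm.rvs(loc=10, scale=2, size=30, random_state=1)
k = howe_k(len(x))
print("tolerance interval:",
      x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1))
```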

  10. Distribution of Different Sized Ocular Surface Vessels in Diabetics and Normal Individuals.

    Science.gov (United States)

    Banaee, Touka; Pourreza, Hamidreza; Doosti, Hassan; Abrishami, Mojtaba; Ehsaei, Asieh; Basiry, Mohsen; Pourreza, Reza

    2017-01-01

    To compare the distribution of different sized vessels using digital photographs of the ocular surface of diabetic and normal individuals. In this cross-sectional study, red-free conjunctival photographs of diabetic and normal individuals, aged 30-60 years, were taken under defined conditions and analyzed using a Radon transform-based algorithm for vascular segmentation. The image areas occupied by vessels (AOV) of different diameters were calculated. The main outcome measure was the distribution curve of mean AOV of different sized vessels. Secondary outcome measures included total AOV and standard deviation (SD) of AOV of different sized vessels. Two hundred and sixty-eight diabetic patients and 297 normal (control) individuals were included, differing in age (45.50 ± 5.19 vs. 40.38 ± 6.19 years). The distribution curves of mean AOV differed between patients and controls, with smaller AOV for larger vessels in patients and an altered distribution curve of vessels compared to controls. Presence of diabetes mellitus is associated with contraction of larger vessels in the conjunctiva, while smaller vessels dilate with diabetic retinopathy. These findings may be useful in the photographic screening of diabetes mellitus and retinopathy.

  11. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.
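
A brute-force complement to such closed-form sample-size methods is simulation: generate data at a candidate n, fit the logistic model, and estimate power as the rejection rate. The sketch below does this for simple logistic regression with assumed effect sizes; it is not the authors' transformation-based method, and it assumes statsmodels is available.

```python
# Sketch: simulation-based power check for simple logistic regression
# (hypothetical effect sizes; a generic alternative to closed forms).
import numpy as np
import statsmodels.api as sm

def power(n, beta0=-1.0, beta1=0.4, n_sim=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.standard_normal(n)
        p = 1 / (1 + np.exp(-(beta0 + beta1 * x)))
        y = rng.binomial(1, p)
        fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        hits += fit.pvalues[1] < alpha     # reject H0: beta1 = 0?
    return hits / n_sim

for n in (100, 200, 400):
    print(n, power(n))
```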

  12. Bivariational calculations for radiation transfer in an inhomogeneous participating media

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Machali, H.M.; Haggag, M.H.; Attia, M.T.

    1986-07-01

    Equations for radiation transfer are obtained for dispersive media with space-dependent albedo. A bivariational bound principle is used to calculate the reflection and transmission coefficients for such media. Numerical results are given and compared. (author)

  13. Distribution of normal superficial ocular vessels in digital images.

    Science.gov (United States)

    Banaee, Touka; Ehsaei, Asieh; Pourreza, Hamidreza; Khajedaluee, Mohammad; Abrishami, Mojtaba; Basiri, Mohsen; Daneshvar Kakhki, Ramin; Pourreza, Reza

    2014-02-01

    To investigate the distribution of different-sized vessels in the digital images of the ocular surface, an endeavor which may provide useful information for future studies. This study included 295 healthy individuals. From each participant, four digital photographs of the superior and inferior conjunctivae of both eyes, with a fixed succession of photography (right upper, right lower, left upper, left lower), were taken with a slit lamp mounted camera. Photographs were then analyzed by a previously described algorithm for vessel detection in the digital images. The area (of the image) occupied by vessels (AOV) of different sizes was measured. Height, weight, fasting blood sugar (FBS) and hemoglobin levels were also measured and the relationship between these parameters and the AOV was investigated. These findings indicated a statistically significant difference in the distribution of the AOV among the four conjunctival areas. No significant correlations were noted between the AOV of each conjunctival area and the different demographic and biometric factors. Medium-sized vessels were the most abundant vessels in the photographs of the four investigated conjunctival areas. The AOV of the different sizes of vessels follows a normal distribution curve in the four areas of the conjunctiva. The distribution of the vessels in successive photographs changes in a specific manner, with the mean AOV becoming larger as the photos were taken from the right upper to the left lower area. The AOV of vessel sizes has a normal distribution curve and medium-sized vessels occupy the largest area of the photograph. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  14. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits and hence can hardly detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10⁻⁷ and 1.47×10⁻⁶, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  15. Powerful Bivariate Genome-Wide Association Analyses Suggest the SOX6 Gene Influencing Both Obesity and Osteoporosis Phenotypes in Males

    Science.gov (United States)

    Liu, Yao-Zhong; Pei, Yu-Fang; Liu, Jian-Feng; Yang, Fang; Guo, Yan; Zhang, Lei; Liu, Xiao-Gang; Yan, Han; Wang, Liang; Zhang, Yin-Ping; Levy, Shawn; Recker, Robert R.; Deng, Hong-Wen

    2009-01-01

    Background Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits and hence can hardly detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. Principal Findings To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning ∼380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10−7 and 1.47×10−6, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the ∼380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Conclusions Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis. PMID:19714249

  16. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2016-01-01

    … a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady state) should be estimated from a set of previous samples, but, in practice, decisions based on the reference change value are often based on only two consecutive results. The original reference change value … false-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of the reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of an estimated set point) performed worst on both normally and ln-normally distributed data.
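
For orientation, the classical reference change value and one common ln-normal variant can be computed in a few lines. The sketch below uses example CVs; the five published methods compared in the study differ in further details not reproduced here.

```python
# Sketch: classical RCV and an asymmetric ln-normal variant
# (example CVs; the compared methods differ in further details).
import numpy as np
from scipy import stats

cv_a, cv_i = 0.03, 0.06          # analytical and within-subject CVs
z = stats.norm.ppf(0.975)        # two-sided, 5% false positives

rcv = z * np.sqrt(2) * np.sqrt(cv_a**2 + cv_i**2)
print(f"classical RCV: +/-{100 * rcv:.1f}%")

# ln-normal version: symmetric on the log scale, asymmetric on the
# original scale (upward and downward limits differ).
sigma_ln = np.sqrt(np.log(1 + cv_a**2) + np.log(1 + cv_i**2))
up = np.exp(z * np.sqrt(2) * sigma_ln) - 1
down = np.exp(-z * np.sqrt(2) * sigma_ln) - 1
print(f"ln-normal RCV: +{100 * up:.1f}% / {100 * down:.1f}%")
```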

  17. Radiation distribution sensing with normal optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo [Nagoya Univ., Dept. of Nuclear Engineering, Nagoya, Aichi (Japan); Tsujimura, Norio [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan)

    2002-12-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new position-sensing principle based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be recovered from the spectrum by a mathematical deconvolution technique. (author)

  18. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher…

  19. Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution

    NARCIS (Netherlands)

    Belitser, E.; Ghosal, S.

    2003-01-01

    We consider the problem of estimating the mean of an infinite-dimensional normal distribution from the Bayesian perspective. Under the assumption that the unknown true mean satisfies a "smoothness condition," we first derive the convergence rate of the posterior distribution for a prior that…

  20. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  1. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion services. There is a positive correlation between the variables "number of blood donations" and "number of blood deferrals": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors, in the presence and absence of covariates. The parameters of the model, that is, the correlation, the zero-inflation parameter and the regression coefficients, were estimated through MCMC simulation. Finally, the double-Poisson, bivariate Poisson and bivariate zero-inflated Poisson models were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
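
The bivariate zero-inflated Poisson structure is easy to simulate with the common-shock (trivariate reduction) construction, one standard way to build the correlated Poisson part. The sketch below uses hypothetical parameters; the study estimates such parameters by MCMC rather than fixing them.

```python
# Sketch: simulate a bivariate zero-inflated Poisson via the
# common-shock construction (hypothetical parameter values).
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
p_zero = 0.4                      # extra mass at (0, 0): donors who never return
lam1, lam2, lam0 = 2.0, 1.0, 0.5  # lam0 induces positive correlation

shared = rng.poisson(lam0, n)
y1 = rng.poisson(lam1, n) + shared
y2 = rng.poisson(lam2, n) + shared
inflate = rng.random(n) < p_zero  # zero-inflation component
y1[inflate] = 0
y2[inflate] = 0

print("corr(y1, y2):", np.corrcoef(y1, y2)[0, 1])
print("share of (0,0):", np.mean((y1 == 0) & (y2 == 0)))
```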

  2. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  3. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  4. Preparation and bivariate analysis of suspensions of human chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    van den Engh, G.J.; Trask, B.J.; Gray, J.W.; Langlois, R.G.; Yu, L.C.

    1985-01-01

    Chromosomes were isolated from a variety of human cell types using a HEPES-buffered hypotonic solution (pH 8.0) containing KCl, MgSO₄, dithioerythritol, and RNase. The chromosomes isolated by this procedure could be stained with a variety of fluorescent stains including propidium iodide, chromomycin A3, and Hoechst 33258. Addition of sodium citrate to the stained chromosomes was found to improve the total fluorescence resolution. High-quality bivariate Hoechst vs. chromomycin fluorescence distributions were obtained for chromosomes isolated from a human fibroblast cell strain, a human colon carcinoma cell line, and human peripheral blood lymphocyte cultures. Good flow karyotypes were also obtained from primary amniotic cell cultures. The Hoechst vs. chromomycin flow karyotypes of a given cell line, made at different times and at dye concentrations varying over fourfold ranges, show little variation in the relative peak positions of the chromosomes. The size of the DNA in chromosomes isolated using this procedure ranges from 20 to 50 kilobases. The described isolation procedure is simple, it yields high-quality flow karyotypes, and it can be used to prepare chromosomes from clinical samples. 22 references, 7 figures, 1 table.

  5. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD, named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center-of-pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators to identify differences in standing posture between groups.

  6. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.

  7. An assessment on the use of bivariate, multivariate and soft computing techniques for collapse susceptibility in GIS environ

    Science.gov (United States)

    Yilmaz, Işik; Marschalko, Marian; Bednarik, Martin

    2013-04-01

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for collapse susceptibility modelling. Conditional probability (CP), logistic regression (LR) and artificial neural network (ANN) models, representing the bivariate, multivariate and soft computing techniques, were used in GIS-based collapse susceptibility mapping in an area of the Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the models and then compared by means of their validations. Although the Area Under Curve (AUC) values showed that the map obtained from the soft computing (ANN) model was the most accurate, the accuracies of all three models can be considered broadly similar. The results also showed that conditional probability is a useful method for preparing collapse susceptibility maps and is highly compatible with GIS operations.

  8. Size distribution of interstellar particles. III. Peculiar extinctions and normal infrared extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.; Wallenhorst, S.G.

    1981-01-01

    The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^(-q) is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a_-, roughly equal to 0.005 or 0.01 μm, the large-size cutoff, a_+, about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars σ Sco, ρ Oph, and θ¹ Ori C, with values of R_V (≡ A_V/E(B-V)) of 3.4, 4.4, and 5.5 compared to the normal 3.1. Two (σ Sco, ρ Oph) are in a neutral dense cloud; θ¹ Ori C is in the Orion Nebula. We find that σ Sco has a normal graphite distribution but has had its small silicate particles removed, so that a_-(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In ρ Oph, the graphite is still normal, but both a_-(sil) and a_+(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In θ¹ Ori, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in ρ Oph. The calculated λ2175 bump is broader than the observed, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for ρ Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to 'hide' the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists.

  9. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  10. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    Science.gov (United States)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single-tree detection from airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) to per-tree aboveground biomass. Since ALS cannot measure DBH directly, DBH must be predicted from other ALS-measured tree-level structural parameters. A copula-based method is proposed in this study to predict DBH from ALS-measured tree height and crown diameter, using a dataset measured in the Lassen National Forest in California. Instead of seeking an explicit mathematical equation for the underlying relationship between DBH and the other structural parameters, the copula-based prediction method uses the dependency between the cumulative distributions of these variables and solves for DBH under the assumption that, for a single tree, the cumulative probability of each structural parameter is identical. Results show that, compared with the benchmark least-squares linear regression and the k-MSN imputation, the copula-based method obtains better DBH accuracy for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters and contributes to reducing prediction uncertainty.
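
The identical-cumulative-probability assumption reduces, for one predictor, to quantile mapping: pass the tree's height through its marginal CDF and invert the DBH CDF at the same probability. The sketch below illustrates this with assumed log-normal margins and comonotone toy data, not the field data or fitted copulas of the paper.

```python
# Sketch: predict DBH by matching cumulative probabilities across
# margins (assumed log-normal; toy comonotone training data).
import numpy as np
from scipy import stats

# hypothetical training data: identical seeds make height and DBH
# perfectly comonotone, the limiting case of the copula assumption
h_train = stats.lognorm.rvs(0.3, scale=20, size=500, random_state=1)
d_train = stats.lognorm.rvs(0.4, scale=30, size=500, random_state=1)

f_h = stats.lognorm(*stats.lognorm.fit(h_train, floc=0))
f_d = stats.lognorm(*stats.lognorm.fit(d_train, floc=0))

def predict_dbh(height):
    # identical cumulative probability: F_D(dbh) = F_H(height)
    return f_d.ppf(f_h.cdf(height))

print(predict_dbh(np.array([15.0, 20.0, 30.0])))
```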

  11. The PDF of fluid particle acceleration in turbulent flow with underlying normal distribution of velocity fluctuations

    International Nuclear Information System (INIS)

    Aringazin, A.K.; Mazhitov, M.I.

    2003-01-01

    We describe a formal procedure to obtain and specify the general form of a marginal distribution for the Lagrangian acceleration of a fluid particle in developed turbulent flow, using a Langevin-type equation and the assumption that the velocity fluctuation u follows a normal distribution with zero mean, in accord with the Heisenberg-Yaglom picture. For a particular representation, β = exp[u], of the fluctuating parameter β, we reproduce the underlying log-normal distribution and the associated marginal distribution, which was found to be in very good agreement with the new experimental data by Crawford, Mordant, and Bodenschatz on the acceleration statistics. We discuss arising possibilities for refinements of the log-normal model.

  12. Confidence Intervals for True Scores Using the Skew-Normal Distribution

    Science.gov (United States)

    Garcia-Perez, Miguel A.

    2010-01-01

    A recent comparative analysis of alternative interval estimation approaches and procedures has shown that confidence intervals (CIs) for true raw scores determined with the Score method, which uses the normal approximation to the binomial distribution, have actual coverage probabilities that are closest to their nominal level. It has also recently…
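
For reference, the Score (Wilson) interval mentioned above is straightforward to compute. The sketch below applies it to a raw score of x correct items out of n; the skew-normal construction studied in the paper is not reproduced.

```python
# Sketch: the Score (Wilson) interval for a binomial proportion,
# applied to a raw score x out of n items.
from scipy import stats

def wilson_interval(x, n, conf=0.95):
    z = stats.norm.ppf(1 - (1 - conf) / 2)
    p = x / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z / (1 + z**2 / n)) * ((p * (1 - p) / n + z**2 / (4 * n**2)) ** 0.5)
    return centre - half, centre + half

print(wilson_interval(32, 40))
```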

  13. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...
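
One standard construction of the exact approach inverts the noncentral t distribution of T = sqrt(n)·x̄/s, whose noncentrality parameter is sqrt(n) divided by the population CV. The sketch below implements that inversion; the web-based programs described in the paper may differ in detail.

```python
# Sketch: "exact" CI for a normal coefficient of variation by
# inverting the noncentral t distribution of T = sqrt(n)*xbar/s.
import numpy as np
from scipy import stats, optimize

def cv_ci(x, conf=0.95):
    n = len(x)
    t0 = np.sqrt(n) * x.mean() / x.std(ddof=1)
    lo_p, hi_p = (1 - conf) / 2, (1 + conf) / 2

    def solve(target):
        # find noncentrality delta with nct.cdf(t0; n-1, delta) = target
        f = lambda d: stats.nct.cdf(t0, n - 1, d) - target
        return optimize.brentq(f, 1e-6, 10 * t0)

    d_hi, d_lo = solve(lo_p), solve(hi_p)   # cdf decreases in delta
    return np.sqrt(n) / d_hi, np.sqrt(n) / d_lo

x = stats.norm.rvs(loc=10, scale=2, size=25, random_state=0)
print("CV:", x.std(ddof=1) / x.mean(), "CI:", cv_ci(x))
```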

  14. Symmetric co-movement between Malaysia and Japan stock markets

    Science.gov (United States)

    Razak, Ruzanna Ab; Ismail, Noriszura

    2017-04-01

    The copula approach is a flexible tool known to capture linear, nonlinear, symmetric and asymmetric dependence between two or more random variables, and it is often used as a co-movement measure between stock market returns. Information obtained from copulas, such as the level of association between financial markets during normal, bullish and bearish phases, is useful for investment strategies and risk management. However, studies of the co-movement between the Malaysia and Japan markets are limited, especially using copulas. Hence, we investigate the dependence structure between the Malaysia and Japan capital markets for the period from 2000 to 2012. In this study, we show that the bivariate normal distribution is not suitable as the joint distribution or as a representation of the dependence between the Malaysia and Japan markets; instead, a Gaussian (normal) copula was found to fit well. From our findings, it can be concluded that simple distribution fitting such as the bivariate normal does not suit financial time series data, whose characteristics are often leptokurtic. The nature of the data is treated by ARMA-GARCH models with heavy-tailed distributions, which can then be combined with copula functions. Regarding the dependence structure between the Malaysia and Japan markets, the findings suggest that both markets co-move concurrently during normal periods.
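
A common first step when fitting a Gaussian (normal) copula to filtered returns is the moment estimator based on Kendall's tau, ρ = sin(πτ/2), valid for elliptical copulas. The sketch below applies it to simulated stand-ins for the ARMA-GARCH residuals of the two markets.

```python
# Sketch: Gaussian-copula correlation from Kendall's tau
# (simulated placeholder series, not the actual market data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=1000)
r_my, r_jp = z[:, 0], z[:, 1]      # stand-ins for filtered residuals

tau, _ = stats.kendalltau(r_my, r_jp)
rho = np.sin(np.pi * tau / 2)      # moment estimator for the normal copula
print(f"Kendall tau = {tau:.3f}, implied copula rho = {rho:.3f}")
```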

  15. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Process capability indices are very important process quality assessment tools in the automotive industries. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than Clements' method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness ≤ 1.5).
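
The percentile idea behind the Clements and Burr methods replaces the 6σ spread with the 0.135th-99.865th percentile range of a fitted non-normal distribution. The sketch below does this with a fitted log-normal and hypothetical specification limits; the paper's comparison covers further variants.

```python
# Sketch: percentile-based capability indices for non-normal data
# (fitted log-normal percentiles; hypothetical spec limits).
import numpy as np
from scipy import stats

x = stats.lognorm.rvs(0.5, scale=10, size=500, random_state=2)
lsl, usl = 3.0, 35.0

dist = stats.lognorm(*stats.lognorm.fit(x, floc=0))
p_lo, med, p_hi = dist.ppf([0.00135, 0.5, 0.99865])

cp = (usl - lsl) / (p_hi - p_lo)
cpk = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```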

  16. A study of the up-and-down method for non-normal distribution functions

    DEFF Research Database (Denmark)

    Vibholm, Svend; Thyregod, Poul

    1988-01-01

    The assessment of breakdown probabilities is examined by the up-and-down method. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimates…

  17. The approximation of the normal distribution by means of chaotic expression

    International Nuclear Information System (INIS)

    Lawnik, M

    2014-01-01

    The approximation of the normal distribution by means of a chaotic expression is achieved using the Weierstrass function; for a certain set of parameters, the density of the derived recurrence gives a good approximation of the bell curve.

  18. Dobinski-type relations and the log-normal distribution

    International Nuclear Information System (INIS)

    Blasiak, P; Penson, K A; Solomon, A I

    2003-01-01

    We consider sequences of generalized Bell numbers B(n), n = 1, 2, …, which can be represented by Dobinski-type summation formulae, i.e. B(n) = (1/C) Σ_{k=0}^{∞} [P(k)]^n / D(k), with P(k) a polynomial, D(k) a function of k and C = const. They include the standard Bell numbers (P(k) = k, D(k) = k!, C = e), their generalizations B_{r,r}(n), r = 2, 3, …, appearing in the normal ordering of powers of boson monomials (P(k) = (k+r)!/k!, D(k) = k!, C = e), variants of 'ordered' Bell numbers B_o^{(p)}(n) (P(k) = k, D(k) = ((p+1)/p)^k, C = 1 + p, p = 1, 2, …), etc. We demonstrate that for α, β, γ, t positive integers (α, t ≠ 0), [B(αn² + βn + γ)]^t is the n-th moment of a positive function on (0, ∞) which is a weighted infinite sum of log-normal distributions. (letter to the editor)

  19. The law of distribution of light beam direction fluctuations in telescopes. [normal density functions

    Science.gov (United States)

    Divinskiy, M. L.; Kolchinskiy, I. G.

    1974-01-01

    The distribution of deviations from mean star trail directions was studied on the basis of 105 star trails. It was found that about 93% of the trails yield a distribution in agreement with the normal law. About 4% of the star trails agree with the Charlier distribution.

  20. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…

  1. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  2. MR imaging of the bone marrow using short TI IR, 1. Normal and pathological intensity distribution of the bone marrow

    Energy Technology Data Exchange (ETDEWEB)

    Ishizaka, Hiroshi; Kurihara, Mikiko; Tomioka, Kuniaki; Kobayashi, Kanako; Sato, Noriko; Nagai, Teruo; Heshiki, Atsuko; Amanuma, Makoto; Mizuno, Hitomi.

    1989-02-01

    Normal vertebral bone marrow intensity distribution and its alteration in various anemias were evaluated on short TI IR sequences. The material consisted of 73 individuals: 48 normal subjects and 25 anemic patients, excluding neoplastic conditions. All normal and reactive hypercellular bone marrow revealed a characteristic intensity distribution, with marginal high intensity and central low intensity, corresponding well to the normal distribution of red and yellow marrow and the physiological or reactive conversion between red and yellow marrow. Aplastic anemia did not reveal the normal intensity distribution, presumably due to its autonomous condition.

  3. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs.

  4. Radiation distribution sensing with normal optical fiber

    CERN Document Server

    Kawarabayashi, J; Naka, R; Uritani, A; Watanabe, K I; Iguchi, T; Tsujimura, N

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new position-sensing principle based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be recovered from the spectrum by a mathematical deconvolution technique.

  5. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    Science.gov (United States)

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do so, we propose a score-based modification of the Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic with expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and an application to the AIDS study. We compare our method to alternative approaches, such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
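
A crude baseline for such data counts only pairs whose ordering is unambiguous in both coordinates, i.e. pairs of non-overlapping intervals. The sketch below implements that conservative tau; the paper's modification instead uses expected numbers of concordant and discordant pairs rather than discarding ambiguous ones.

```python
# Sketch: a conservative Kendall-type statistic for bivariate
# interval-censored data, using only unambiguously ordered pairs.
import numpy as np

def definite_tau(l1, r1, l2, r2):
    """Intervals [l1, r1] x [l2, r2] per subject; tau over pairs
    whose order is determined in both coordinates."""
    n = len(l1)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s1 = 1 if l1[j] > r1[i] else (-1 if r1[j] < l1[i] else 0)
            s2 = 1 if l2[j] > r2[i] else (-1 if r2[j] < l2[i] else 0)
            if s1 and s2:
                conc += s1 == s2
                disc += s1 != s2
    return (conc - disc) / (conc + disc) if conc + disc else np.nan

rng = np.random.default_rng(0)
t = rng.exponential(1, (50, 2)).cumsum(axis=1)   # correlated event times
l, r = t - rng.random((50, 2)), t + rng.random((50, 2))
print(definite_tau(l[:, 0], r[:, 0], l[:, 1], r[:, 1]))
```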

  6. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
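
For a single constraint, the log-normal chance constraint has a simple deterministic equivalent: a·x ≤ b holds with probability at least α whenever a·x stays below the (1 − α) quantile of b. The sketch below computes that bound with assumed parameters; the paper embeds such constraints in a much larger interval-parameter watershed model.

```python
# Sketch: deterministic equivalent of one chance constraint
# a*x <= b with b ~ LogNormal(mu, sigma) and P(a*x <= b) >= alpha
# (hypothetical parameter values).
import numpy as np
from scipy import stats

a, mu, sigma, alpha = 2.0, np.log(100.0), 0.5, 0.95

# a*x must stay below the (1 - alpha) quantile of b
b_quantile = stats.lognorm.ppf(1 - alpha, s=sigma, scale=np.exp(mu))
x_max = b_quantile / a
print(f"x <= {x_max:.2f} guarantees the constraint with prob {alpha}")
```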

  7. Chain Plot: A Tool for Exploiting Bivariate Temporal Structures

    OpenAIRE

    Taylor, CC; Zempeni, A

    2004-01-01

    In this paper we present a graphical tool useful for visualizing the cyclic behaviour of bivariate time series. We investigate its properties and link it to the asymmetry of the two variables concerned. We also suggest adding approximate confidence bounds to the points on the plot and investigate the effect of lagging to the chain plot. We conclude our paper by some standard Fourier analysis, relating and comparing this to the chain plot.

  8. Bivariate tensor product (p, q)-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators

    Directory of Open Access Journals (Sweden)

    Qing-Bo Cai

    2017-11-01

    Full Text Available Abstract In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of ( p , q $(p, q$ -integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case and estimate a convergence theorem for the Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  9. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. This method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples. [Spanish abstract, translated: The principle of maximum entropy, known as POME, is used to derive an alternative procedure for estimating the parameters of the bivariate extreme-value distribution with Gumbel marginals. The model is applied to the analysis of maximum 24-hour precipitation in a region of Mexico, and the design events obtained are compared with those provided by the maximum likelihood technique. According to the results obtained, it is concluded that the proposed technique represents a good option, especially for small samples.]

  10. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  11. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
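
The hazard-rate twist has a convenient closed form: the twisted CDF is F_θ(x) = 1 − (1 − F(x))^(1−θ), which can be sampled by inversion, and the likelihood ratio over n independent samples is Π_i (1 − F(X_i))^θ / (1 − θ)^n. The sketch below estimates the CCDF of an i.i.d. log-normal sum this way; the twisting parameter is a heuristic choice rather than the optimized value derived in these papers, and the i.i.d. case stands in for the general not-identically-distributed setting.

```python
# Sketch: hazard-rate-twisting importance sampling for
# P(sum of log-normals > gamma), i.i.d. case for brevity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, gamma, theta, n_sim = 5, 200.0, 0.9, 100_000
margin = stats.lognorm(s=1.0, scale=np.exp(2.0))

# sample from the twisted CDF F_theta(x) = 1 - (1 - F(x))**(1 - theta)
u = rng.random((n_sim, n))
x = margin.ppf(1 - (1 - u) ** (1 / (1 - theta)))

# likelihood ratio: prod (1 - F(X_i))**theta / (1 - theta)**n
log_lr = theta * np.sum(np.log(margin.sf(x)), axis=1) - n * np.log(1 - theta)
est = np.mean((x.sum(axis=1) > gamma) * np.exp(log_lr))
print("IS estimate of CCDF:", est)
```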

  12. Spatial arrangement and size distribution of normal faults, Buckskin detachment upper plate, Western Arizona

    Science.gov (United States)

    Laubach, S. E.; Hundley, T. H.; Hooker, J. N.; Marrett, R. A.

    2018-03-01

    Fault arrays typically include a wide range of fault sizes, and those faults may be randomly located, clustered together, or regularly or periodically located in a rock volume. Here, we investigate the size distribution and spatial arrangement of normal faults using rigorous size-scaling methods and normalized correlation count (NCC). Outcrop data from Miocene sedimentary rocks in the immediate upper plate of the Buckskin detachment, a regional low-angle normal fault, have differing patterns of spatial arrangement as a function of displacement (offset). Using lower size-thresholds of 1, 0.1, 0.01, and 0.001 m, displacements range over five orders of magnitude and have power-law frequency distributions spanning about four orders of magnitude, from less than 0.001 m to more than 100 m, with exponents of -0.6 and -0.9. The largest faults, with >1 m displacement, have a shallower size-distribution slope and regular spacing of about 20 m. In contrast, smaller faults have steep size-distribution slopes and irregular spacing, with NCC plateau patterns indicating imposed clustering. Cluster widths are 15 m for the 0.1-m threshold, 14 m for the 0.01-m threshold, and 1 m for the 0.001-m displacement threshold faults. The results demonstrate that normalized correlation count effectively characterizes the spatial arrangement patterns of these faults. Our example from a high-strain fault pattern above a detachment is compatible with size and spatial organization that was influenced primarily by boundary conditions such as fault shape, mechanical unit thickness and internal stratigraphy on a range of scales, rather than purely by interaction among faults during their propagation.

  13. Stochastic Frontier Models with Dependent Errors based on Normal and Exponential Margins || Modelos de frontera estocástica con errores dependientes basados en márgenes normal y exponencial

    Directory of Open Access Journals (Sweden)

    Gómez-Déniz, Emilio

    2017-06-01

    Following the recent work of Gómez-Déniz and Pérez-Rodríguez (2014), this paper extends the results obtained there to the normal-exponential distribution with dependence. Accordingly, the main aim of the present paper is to enhance stochastic production frontier and stochastic cost frontier modelling by proposing a bivariate distribution for dependent errors which allows us to nest the classical models. Closed-form expressions for the error term and technical efficiency are provided. An illustration using real data from the econometric literature is provided to show the applicability of the proposed model. || [Spanish abstract, translated: Continuing the recent work of Gómez-Déniz and Pérez-Rodríguez (2014), this article extends the results obtained there to the normal-exponential distribution with dependence. Accordingly, the main purpose of this article is to improve both stochastic production and cost frontier modelling by proposing a bivariate distribution for dependent errors that allows us to nest the classical models. Closed-form expressions for the error term and technical efficiency are obtained. The applicability of the proposed model is illustrated using real data from the econometric literature.]

  14. Even-odd charged multiplicity distributions and energy dependence of normalized multiplicity moments in different rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

    The even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows are calculated, starting from a simple picture of charge correlation with non-zero correlation length. The coincidence and separation of these distributions are explained. The calculated window- and energy-dependence of the normalized moments reproduces the behaviour found in experiments. A new definition of normalized moments is proposed, especially suitable for narrow rapidity windows.

  15. Evolution of association between renal and liver functions while awaiting heart transplant: An application using a bivariate multiphase nonlinear mixed effects model.

    Science.gov (United States)

    Rajeswaran, Jeevanantham; Blackstone, Eugene H; Barnard, John

    2018-07-01

    In many longitudinal follow-up studies, we observe more than one longitudinal outcome. Impaired renal and liver functions are indicators of poor clinical outcomes for patients who are on mechanical circulatory support and awaiting heart transplant. Hence, monitoring organ functions while waiting for heart transplant is an integral part of patient management. Longitudinal measurements of bilirubin can be used as a marker for liver function and glomerular filtration rate for renal function. We derive an approximation to the evolution of the association between these two organ functions using a bivariate nonlinear mixed effects model for continuous longitudinal measurements, where the two submodels are linked by a common distribution of time-dependent latent variables and a common distribution of measurement errors.

  16. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...

  17. Normal cranial bone marrow MR imaging pattern with age-related ADC value distribution

    International Nuclear Information System (INIS)

    Li Qi; Pan Shinong; Yin Yuming; Li Wei; Chen Zhian; Liu Yunhui; Wu Zhenhua; Guo Qiyong

    2011-01-01

    Objective: To determine the MRI appearance of normal age-related cranial bone marrow and the relationship between MRI patterns and apparent diffusion coefficient (ADC) values. Methods: Five hundred subjects were divided into seven groups based on age. Cranial bone marrow MRI patterns were defined based on different thicknesses of the diploe and signal intensity distribution characteristics. ADC values of the frontal, parietal, occipital and temporal bones on DWI were measured and calculated. Correlations between age and ADC values, between patterns and ADC values, as well as the distribution of ADC values, were analyzed. Results: Normal cranial bone marrow was divided into four types and six subtypes; types I, II, III and IV correlated positively with increasing age (χ² = 266.36, P < 0.05). In addition, there was a significant negative correlation between the ADC values and MRI patterns in the normal parietal and occipital bones (r = -0.691 and -0.750, P < 0.01). Conclusion: The combination of MRI features and ADC value changes in different cranial bones showed significant correlation with increasing age. Familiarity with the MRI appearance of the normal bone marrow conversion pattern in different age groups and the corresponding ADC values will aid the diagnosis and differentiation of cranial bone pathology.

  18. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Abstract Background: The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results: Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers also occurred for the other two types of geocoding errors but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that could not be fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion: Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
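
    As a rough illustration of the final modeling step, the sketch below fits a single bivariate t distribution to synthetic positional-error vectors by direct likelihood maximization in scipy; the paper's two- and three-component mixtures would wrap a fit like this in an EM loop. Data and parameter values are hypothetical.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
# Synthetic positional errors (metres), heavy-tailed and anisotropic
errors = stats.multivariate_t.rvs(loc=[0, 0], shape=[[900, 200], [200, 400]],
                                  df=3, size=500, random_state=rng)

def neg_loglik(params):
    """Negative log-likelihood of a bivariate t; scale via a Cholesky factor."""
    mx, my, l11, l21, l22, log_df = params
    if l11 <= 0 or l22 <= 0:
        return np.inf
    L = np.array([[l11, 0.0], [l21, l22]])
    shape = L @ L.T                 # positive-definite scale matrix
    df = np.exp(log_df)             # keeps df > 0
    return -stats.multivariate_t.logpdf(errors, loc=[mx, my],
                                        shape=shape, df=df).sum()

x0 = [0, 0, errors[:, 0].std(), 0.0, errors[:, 1].std(), np.log(5.0)]
res = optimize.minimize(neg_loglik, x0, method="Nelder-Mead",
                        options={"maxiter": 20000, "xatol": 1e-6})
mx, my, l11, l21, l22, log_df = res.x
print("centre:", (round(mx, 2), round(my, 2)), "df:", round(np.exp(log_df), 2))
```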

  19. A bivariate space-time downscaler under space and time misalignment.

    Science.gov (United States)

    Berrocal, Veronica J; Gelfand, Alan E; Holland, David M

    2010-12-01

    Ozone and particulate matter PM(2.5) are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM(2.5) is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM(2.5) for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain.

  20. Breast cancer subtype distribution is different in normal weight, overweight, and obese women.

    Science.gov (United States)

    Gershuni, Victoria; Li, Yun R; Williams, Austin D; So, Alycia; Steel, Laura; Carrigan, Elena; Tchou, Julia

    2017-06-01

    Obesity is associated with tumor-promoting pathways related to insulin resistance and chronic low-grade inflammation, which have been linked to various disease states, including cancer. Many studies have focused on the relationship between obesity and increased estrogen production, which contributes to the pathogenesis of estrogen receptor-positive breast cancers. The link between obesity and other breast cancer subtypes, such as triple-negative breast cancer (TNBC) and Her2/neu+ (Her2+) breast cancer, is less clear. We hypothesize that obesity may be associated with the pathogenesis of specific breast cancer subtypes, resulting in a subtype distribution different from that of normal weight women. A single-institution, retrospective analysis of tumor characteristics of 848 patients diagnosed with primary operable breast cancer between 2000 and 2013 was performed to evaluate the association between BMI and clinical outcome. Patients were grouped based on their BMI at time of diagnosis, stratified into three subgroups: normal weight (BMI = 18-24.9), overweight (BMI = 25-29.9), and obese (BMI > 30). The distribution of breast cancer subtypes across the three BMI subgroups was compared. Obese and overweight women were more likely to present with TNBC, and normal weight women with Her2+ breast cancer (p = 0.008). We demonstrated, for the first time, that breast cancer subtype distribution varied significantly according to BMI status. Our results suggested that obesity might activate molecular pathways other than the well-known obesity/estrogen circuit in the pathogenesis of breast cancer. Future studies are needed to understand the molecular mechanisms that drive the variation in subtype distribution across BMI subgroups.

  1. Carbon K-shell photoionization of CO: Molecular frame angular distributions of normal and conjugate shakeup satellites

    International Nuclear Information System (INIS)

    Jahnke, T.; Titze, J.; Foucar, L.; Wallauer, R.; Osipov, T.; Benis, E.P.; Jagutzki, O.; Arnold, W.; Czasch, A.; Staudte, A.; Schoeffler, M.; Alnaser, A.; Weber, T.; Prior, M.H.; Schmidt-Boecking, H.; Doerner, R.

    2011-01-01

    We have measured the molecular frame angular distributions of photoelectrons emitted from the carbon K-shell of fixed-in-space CO molecules for the case of simultaneous excitation of the remaining molecular ion. Normal and conjugate shakeup states are observed. Photoelectrons belonging to normal Σ-satellite lines show an angular distribution resembling that observed for the main photoline at the same electron energy. Surprisingly, a similar shape is found for conjugate shakeup states with Π-symmetry. In our data we identify shake, rather than electron scattering (PEVE), as the mechanism producing the conjugate lines. The angular distributions clearly show the presence of a Σ shape resonance for all of the satellite lines.

  2. Distribution of the anticancer drugs doxorubicin, mitoxantrone and topotecan in tumors and normal tissues.

    Science.gov (United States)

    Patel, Krupa J; Trédan, Olivier; Tannock, Ian F

    2013-07-01

    Pharmacokinetic analyses estimate the mean concentration of drug within a given tissue as a function of time, but do not give information about the spatial distribution of drugs within that tissue. Here, we compare the time-dependent spatial distribution of three anticancer drugs within tumors, heart, kidney, liver and brain. Mice bearing various xenografts were treated with doxorubicin, mitoxantrone or topotecan. At various times after injection, tumors and samples of heart, kidney, liver and brain were excised. Within solid tumors, the distribution of doxorubicin, mitoxantrone and topotecan was limited to perivascular regions at 10 min after administration and the distance from blood vessels at which drug intensity fell to half was ~25-75 μm. Although drug distribution improved after 3 and 24 h, there remained a significant decrease in drug fluorescence with increasing distance from tumor blood vessels. Drug distribution was relatively uniform in the heart, kidney and liver with substantially greater perivascular drug uptake than in tumors. There was significantly higher total drug fluorescence in the liver than in tumors after 10 min, 3 and 24 h. Little to no drug fluorescence was observed in the brain. There are marked differences in the spatial distributions of three anticancer drugs within tumor tissue and normal tissues over time, with greater exposure to most normal tissues and limited drug distribution to many cells in tumors. Studies of the spatial distribution of drugs are required to complement pharmacokinetic data in order to better understand and predict drug effects and toxicities.

  3. REGRES: A FORTRAN-77 program to calculate nonparametric and ``structural'' parametric solutions to bivariate regression equations

    Science.gov (United States)

    Rock, N. M. S.; Duffy, T. R.

    REGRES allows a range of regression equations to be calculated for paired sets of data values in which both variables are subject to error (i.e. neither is the "independent" variable). Nonparametric regressions, based on medians of all possible pairwise slopes and intercepts, are treated in detail. Estimated slopes and intercepts are output, along with confidence limits, Spearman and Kendall rank correlation coefficients. Outliers can be rejected with user-determined stringency. Parametric regressions can be calculated for any value of λ (the ratio of the variances of the random errors for y and x), including: (1) major axis (λ = 1); (2) reduced major axis (λ = variance of y/variance of x); (3) Y on X (λ = infinity); or (4) X on Y (λ = 0) solutions. Pearson linear correlation coefficients also are output. REGRES provides an alternative to conventional isochron assessment techniques where bivariate normal errors cannot be assumed, or weighting methods are inappropriate.
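
    The core nonparametric step (median of all pairwise slopes, with a median-based intercept) is straightforward to reproduce without the original FORTRAN-77 source; a minimal Python sketch, cross-checked against scipy's built-in Theil-Sen routine:

```python
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 25)
y = 2.0 * x + 1.0 + rng.normal(0, 1, x.size)   # both variables noisy in practice

# Median of all pairwise slopes; intercept from the medians
slopes = [(y[j] - y[i]) / (x[j] - x[i])
          for i, j in itertools.combinations(range(x.size), 2)
          if x[j] != x[i]]
slope = np.median(slopes)
intercept = np.median(y) - slope * np.median(x)

# Cross-check against scipy (also returns a confidence interval on the slope)
res = stats.theilslopes(y, x, alpha=0.95)
print(f"manual: slope={slope:.3f}, intercept={intercept:.3f}")
print(f"scipy : slope={res[0]:.3f}, intercept={res[1]:.3f}, "
      f"slope CI=({res[2]:.3f}, {res[3]:.3f})")
```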

  4. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1 h). This stems from accounting for the occurrence probability of each rainstorm event and from taking a different angle on rainfall frequency analysis; the approach offers an alternative way of describing extreme rainfall properties and could potentially help improve the hydrologic design of stormwater management facilities in urban areas.
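
    The joint-probability step can be sketched with a Gaussian copula; the paper does not commit to this particular family, so treat the following as a generic illustration with hypothetical storm data and an assumed mean number of events per year.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical storm-event sample: intensity (mm/h) and duration (h), correlated
n = 400
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
intensity = stats.gamma(2.0, scale=4.0).ppf(stats.norm.cdf(z[:, 0]))
duration = stats.gamma(1.5, scale=3.0).ppf(stats.norm.cdf(z[:, 1]))

# Empirical marginals -> normal scores; fit the copula correlation
u_i = stats.rankdata(intensity) / (n + 1)
u_d = stats.rankdata(duration) / (n + 1)
rho = np.corrcoef(stats.norm.ppf(u_i), stats.norm.ppf(u_d))[0, 1]

def joint_exceedance(i0, d0):
    """P(I > i0 and D > d0) under the fitted Gaussian copula."""
    zi = stats.norm.ppf(np.mean(intensity <= i0))
    zd = stats.norm.ppf(np.mean(duration <= d0))
    biv = stats.multivariate_normal([0, 0], [[1, rho], [rho, 1]])
    return 1 - stats.norm.cdf(zi) - stats.norm.cdf(zd) + biv.cdf([zi, zd])

p = joint_exceedance(20.0, 6.0)
events_per_year = 30        # assumed mean number of storm events per year
print(f"joint exceedance p = {p:.4f}, "
      f"return period ≈ {1/(events_per_year*p):.1f} years")
```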

  5. Neutron importance and the generalized Green function for the conditionally critical reactor with normalized neutron distribution

    International Nuclear Information System (INIS)

    Khromov, V.V.

    1978-01-01

    The notion of neutron importance as applied to nuclear reactor statics problems described by time-independent homogeneous equations of neutron transport, with provision for normalization of the neutron distribution, is considered. An equation has been obtained for the neutron importance function in a conditionally critical reactor with respect to an arbitrary nonlinear functional determined for the normalized neutron distribution. The relation between this function and the generalized Green function of the self-conjugate operator of the reactor equation is established, and the small-perturbation formula for the functionals of a conditionally critical reactor is derived.

  6. Elastin distribution in the normal uterus, uterine leiomyomas, adenomyosis and adenomyomas: a comparison.

    Science.gov (United States)

    Zheng, Wei-Qiang; Ma, Rong; Zheng, Jian-Ming; Gong, Zhi-Jing

    2006-04-01

    To describe the histologic distribution of elastin in the nonpregnant human uterus, uterine leiomyomas, adenomyosis and adenomyomas. Uteri were obtained from women undergoing hysterectomy for benign conditions, including 26 cases of uterine leiomyomas, 24 cases of adenomyosis, 18 adenomyomas and 6 autopsy specimens. Specific histochemical staining techniques were employed to demonstrate the distribution of elastin. The distribution of elastin components in the uterus was markedly uneven and showed a decreasing gradient from outer to inner myometrium. No elastin was present within leiomyomas, adenomyomas or adenomyosis. The distribution of elastin may help explain the normal function of the myometrium in labor. This implies that the uneven distribution of elastin components and the absence of elastin within leiomyomas, adenomyomas and adenomyosis could be of some clinical significance. The altered elastin distribution in disease states may help explain symptoms such as dysmenorrhea in uterine endometriosis.

  7. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  8. The normal distribution of thoracoabdominal aorta small branch artery ostia

    International Nuclear Information System (INIS)

    Cronin, Paul; Williams, David M.; Vellody, Ranjith; Kelly, Aine Marie; Kazerooni, Ella A.; Carlos, Ruth C.

    2011-01-01

    The purpose of this study was to determine the normal distribution of aortic branch artery ostia. CT scans of 100 subjects were retrospectively reviewed. The angular distributions of the aorta with respect to the center of the T3 to L4 vertebral bodies, and of branch artery origins with respect to the center of the aorta, were measured. At each vertebral body level the distribution of intercostal/lumbar arteries and other branch arteries was calculated. The proximal descending aorta is posteriorly placed, becoming a midline structure at the thoracolumbar junction, and remains anterior to the vertebral bodies within the abdomen. The intercostal and lumbar artery ostia have a distinct distribution. At each vertebral level from T3 caudally, one intercostal artery originates from the posterior wall of the aorta throughout the thoracic aorta, while the other intercostal artery originates from the medial wall of the descending thoracic aorta high in the chest, posteromedially from the mid-thoracic aorta, and from the posterior wall of the aorta low in the chest. Mediastinal branches of the thoracic aorta originate from the medial and anterior wall. Lumbar branches originate only from the posterior wall of the abdominal aorta. Aortic branch artery origins arise with a bimodal distribution and have a characteristic location. Knowing the location of aortic branch artery ostia may help distinguish branch artery pseudoaneurysms from penetrating ulcers.

  9. Probability distribution of atmospheric pollutants: comparison among four methods for the determination of the log-normal distribution parameters

    Energy Technology Data Exchange (ETDEWEB)

    Bellasio, R [Enviroware s.r.l., Agrate Brianza, Milan (Italy). Centro Direzionale Colleoni; Lanzani, G; Ripamonti, M; Valore, M [Amministrazione Provinciale, Como (Italy)

    1998-04-01

    This work illustrates the possibility of interpolating the concentrations of CO, NO, NO2, O3 and SO2 measured during one year (1995) at the 13 stations of the air quality monitoring network of the Provinces of Como and Lecco (Italy) by means of a log-normal distribution. Particular attention was given to the choice of the method for determining the log-normal distribution parameters among four possible methods: I natural, II percentiles, III moments, IV maximum likelihood. In order to evaluate the goodness of fit, a ranking procedure was carried out over the values of four indices: absolute deviation, weighted absolute deviation, the Kolmogorov-Smirnov index and the Cramer-von Mises-Smirnov index. The capability of the log-normal distribution to fit the measured data is then discussed as a function of the pollutant and of the monitoring station. Finally, an example of application is given: the effect of an emission reduction strategy in the Lombardy Region (the so-called 'bollino blu') is evaluated using a log-normal distribution.
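
    Two of the four estimation methods are easy to sketch. Assuming a single vector of hourly concentrations, the method of moments inverts the log-normal mean/variance formulas, while maximum likelihood simply uses the mean and variance of the log data (synthetic data, not the Como/Lecco measurements):

```python
import numpy as np

rng = np.random.default_rng(4)
conc = rng.lognormal(mean=1.2, sigma=0.7, size=8760)   # synthetic hourly data

# Method of moments: invert E[X] = exp(mu + s^2/2), Var[X] = (exp(s^2) - 1) E[X]^2
m, v = conc.mean(), conc.var(ddof=1)
sigma2_mom = np.log(1 + v / m**2)
mu_mom = np.log(m) - sigma2_mom / 2

# Maximum likelihood: sample mean / variance of the log data
logc = np.log(conc)
mu_mle, sigma2_mle = logc.mean(), logc.var(ddof=0)

print(f"moments: mu={mu_mom:.3f}, sigma={np.sqrt(sigma2_mom):.3f}")
print(f"MLE    : mu={mu_mle:.3f}, sigma={np.sqrt(sigma2_mle):.3f}")
```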

  10. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming; Jin, Ick-Hoon

    2013-01-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or is very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
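
    To make the idea concrete, here is a toy transcription of the MCMH scheme (not the authors' code): an exponential-family density on the circle whose normalizing constant we pretend is intractable, with the unknown constant ratio in the MH acceptance probability replaced by a Monte Carlo estimate from auxiliary draws at the proposed parameter value.

```python
import numpy as np

rng = np.random.default_rng(5)

def q(x, theta):
    """Unnormalized density on [0, 2*pi): q(x; theta) = exp(theta * cos x)."""
    return np.exp(theta * np.cos(x))

def sample_model(theta, m):
    """Rejection-sample m points from q(.; theta) / Z(theta), for theta >= 0."""
    out = np.empty(m); k = 0
    while k < m:
        x = rng.uniform(0, 2 * np.pi, size=2 * m)
        keep = x[rng.random(2 * m) < np.exp(theta * (np.cos(x) - 1.0))]
        take = min(m - k, keep.size)
        out[k:k + take] = keep[:take]; k += take
    return out

# Observed data generated at theta_true
theta_true = 1.5
data = sample_model(theta_true, 200)

# MCMH: flat prior on theta > 0, random-walk proposal
theta, chain, m_aux = 1.0, [], 500
for _ in range(3000):
    prop = theta + rng.normal(0, 0.2)
    if prop > 0:
        # Estimate Z(theta)/Z(prop) by importance sampling at prop:
        # E_{y ~ p(.; prop)}[ q(y, theta) / q(y, prop) ] = Z(theta) / Z(prop)
        y = sample_model(prop, m_aux)
        z_ratio = np.mean(q(y, theta) / q(y, prop))
        log_alpha = ((prop - theta) * np.cos(data).sum()
                     + data.size * np.log(z_ratio))
        if np.log(rng.random()) < log_alpha:
            theta = prop
    chain.append(theta)
print("posterior mean of theta ≈", np.mean(chain[1000:]))
```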

  12. Externally studentized normal midrange distribution

    Directory of Open Access Journals (Sweden)

    Ben Dêivide de Oliveira Batista

    ABSTRACT The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The wide use of the externally studentized range in multiple comparisons was also a motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions, and to generate random values. This is a new distribution for which the authors could find no report in the literature. A second objective was to build an R package that obtains the probability density, cumulative distribution and quantile functions numerically, and to make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, verified by Monte Carlo simulations and by comparing results obtained with different numbers of quadrature points. Regarding the precision of the quantiles in cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.
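
    The quantity itself is simple to simulate, which provides an independent check on quadrature-based routines such as those in the SMR package. A minimal Monte Carlo sketch (the sample size n and external degrees of freedom nu are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)

def studentized_midrange_sample(n, nu, reps=200_000):
    """Draws of W = midrange(X_1..X_n) / S, with X_i ~ N(0, 1) and an
    externally estimated S based on nu degrees of freedom."""
    x = rng.standard_normal((reps, n))
    midrange = 0.5 * (x.min(axis=1) + x.max(axis=1))
    s = np.sqrt(rng.chisquare(nu, size=reps) / nu)   # independent of x
    return midrange / s

w = studentized_midrange_sample(n=5, nu=10)
for p in (0.025, 0.5, 0.975):
    print(f"quantile {p}: {np.quantile(w, p): .4f}")
```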

  13. Generalization of the normal-exponential model: exploration of a more accurate parametrisation for the signal distribution on Illumina BeadArrays.

    Science.gov (United States)

    Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv

    2012-12-11

    Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method modeling the observed intensities as the sum of an exponentially distributed signal and normally distributed noise has been developed. Nevertheless, Wang and Ye (2012) display a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would represent a better modeling of the signal density. Hence, the normal-exponential modeling may not be appropriate for Illumina data, and background corrections derived from this model may lead to wrong estimation. We propose a more flexible modeling based on a gamma distributed signal and a normally distributed background noise and develop the associated background correction, implemented in the R package NormalGamma. Our model proves to be markedly more accurate for Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a more correct fit of the observed intensities. On the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validation of the normal-gamma modeling. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures which represent various experimental designs. Surprisingly, we observe that the implementation of a more accurate parametrisation in the model-based background correction does not increase the sensitivity. These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement...
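
    For reference, the normal-exponential ('normexp') correction that this paper generalizes has a closed-form posterior mean, shown below with assumed rather than estimated noise parameters; the normal-gamma analogue replaces the exponential signal with a gamma one and no longer has this simple form.

```python
import numpy as np
from scipy.stats import norm

def normexp_signal(y, mu, sigma, alpha):
    """E[S | Y = y] for Y = S + B, with S ~ Exponential(mean alpha)
    and B ~ N(mu, sigma^2); the classical array background correction."""
    a = y - mu - sigma**2 / alpha
    return a + sigma * norm.pdf(a / sigma) / norm.cdf(a / sigma)

# Hypothetical observed intensities, with assumed noise parameters
y = np.array([80.0, 120.0, 300.0, 1500.0])
print(normexp_signal(y, mu=100.0, sigma=30.0, alpha=500.0))
```

    The corrected values stay strictly positive, which is the point of the model-based approach compared with plain background subtraction.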

  14. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait, whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone... ...as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass. Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total...

  15. Energy dependence of angular distributions of sputtered particles by ion-beam bombardment at normal incidence

    International Nuclear Information System (INIS)

    Matsuda, Yoshinobu; Ueda, Yasutoshi; Uchino, Kiichiro; Muraoka, Katsunori; Maeda, Mitsuo; Akazaki, Masanori; Yamamura, Yasunori.

    1986-01-01

    The angular distributions of sputtered Fe atoms were measured using the laser fluorescence technique during Ar-ion bombardment at energies of 0.6, 1, 2 and 3 keV at normal incidence. The cosine distribution measured at 0.6 keV progressively deviated towards an over-cosine distribution at higher energies, and at 3 keV the angular distribution was over-cosine by about 20%. The experimental results agree qualitatively with calculations by a recent computer simulation code, ACAT. The results are explained by the competition between surface scattering and the effects of primary knock-on atoms, which tend to make the angular distributions over-cosine and under-cosine, respectively. (author)

  16. Normal distribution of 111In chloride on scintigram

    Energy Technology Data Exchange (ETDEWEB)

    Oyama, K; Machida, K; Hayashi, S; Watari, T; Akaike, A

    1977-05-01

    Indium-111 chloride (111InCl3) was used as a bone marrow imaging and tumor-localizing agent in 38 patients (46 scintigrams) who were suspected of, or diagnosed as, having malignant disease, or who were irradiated for malignant disease. Regions of suspected malignant disease, regions of abnormal accumulation on scintigrams, and irradiated target regions were excluded in order to estimate the normal distribution of 111InCl3. Scintigrams were taken 48 hrs after intravenous injection of 1 to 3 mCi of 111InCl3. The percent and score distributions of 111InCl3 were noted in 23 regions. As the liver showed the highest accumulation of 111In on all scintigrams, the liver was designated 2+. Compared with the radioactivity in the liver, other regions had similar (2+), moderately decreased (+), or severely decreased (-) accumulation on the scintigram. The score given is one for 2+, 0.5 for +, and 0 for -. The score and percentage distributions were: liver 100 (100%), lumbar vertebra 58.5 (100%), mediastinum 55 (100%), nasopharynx 50 (100%), testis 47.5 (59%), heart 44.5 (89%), and pelvis 43.5 (78%). Comparing this study with a previous study of 111In-BLM, the score distributions in the lumbar vertebra, pelvis, and skull were similar. 111In-BLM is excreted rapidly after injection, but little 111InCl3 is excreted. Accumulation of 111In in bone marrow depends upon the amount of 111In-transferrin in blood. High accumulation in the lumbar vertebra and pelvis shows that 111InCl3 would be effective as a bone marrow imaging agent.

  17. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is particularly important for the development of hypothesis testing in linear models. Finally, it is easy to simulate...

  18. Computational approach to Thornley's problem by bivariate operational calculus

    Science.gov (United States)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for a linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  19. Impact of foot progression angle on the distribution of plantar pressure in normal children.

    Science.gov (United States)

    Lai, Yu-Cheng; Lin, Huey-Shyan; Pan, Hui-Fen; Chang, Wei-Ning; Hsu, Chien-Jen; Renn, Jenn-Huei

    2014-02-01

    Plantar pressure distribution during walking is affected by several gait factors, most especially the foot progression angle, which has been studied in children with neuromuscular diseases. However, this relationship in normal children has only been reported in limited studies. The purpose of this study is to clarify the correlation between foot progression angle and plantar pressure distribution in normal children, as well as the impacts of age and sex on this correlation. This study retrospectively reviewed dynamic pedobarographic data included in the gait laboratory database of our institution. In total, 77 normally developed children aged 5-16 years who were treated between 2004 and 2009 were included. Each child's footprint was divided into 5 segments: lateral forefoot, medial forefoot, lateral midfoot, medial midfoot, and heel. The percentages of impulse exerted at the medial foot, forefoot, midfoot, and heel were calculated. The average foot progression angle was 5.03° toe-out. Most of the total impulse was exerted on the forefoot (52.0%). Toe-out gait was positively correlated with high medial (r = 0.274; P [...] plantar pressure as part of the treatment of various foot pathologies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. An empirical multivariate log-normal distribution representing uncertainty of biokinetic parameters for 137Cs

    International Nuclear Information System (INIS)

    Miller, G.; Martz, H.; Bertelli, L.; Melo, D.

    2008-01-01

    A simplified biokinetic model for 137Cs has six parameters representing transfer of material to and from various compartments. Using a Bayesian analysis, the joint probability distribution of these six parameters is determined empirically for two cases with quite a lot of bioassay data. The distribution is found to be multivariate log-normal. Correlations between different parameters are obtained. The method utilises a fairly large number of pre-determined forward biokinetic calculations, whose results are stored in interpolation tables. Four different methods to sample the multidimensional parameter space with a limited number of samples are investigated: random, stratified, Latin Hypercube sampling with a uniform distribution of parameters, and importance sampling using a log-normal distribution that approximates the posterior distribution. The importance sampling method gives much smaller sampling uncertainty. No sampling-method-dependent differences are perceptible for the uniform distribution methods. (authors)
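
    The sampling comparison is easy to reproduce in outline. Below, a Latin Hypercube sample is pushed through a correlated multivariate log-normal; the log-scale means and covariance are made up, not the paper's 137Cs model, and note that inducing correlation via a Cholesky factor partially degrades the per-dimension stratification.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

rng = np.random.default_rng(7)

# Made-up log-scale means and covariance for three biokinetic parameters
mu = np.array([-2.0, -0.5, 0.8])
cov = np.array([[0.30, 0.05, 0.00],
                [0.05, 0.20, 0.08],
                [0.00, 0.08, 0.25]])
L = np.linalg.cholesky(cov)

n = 1000
u = qmc.LatinHypercube(d=3, seed=rng).random(n)  # stratified uniforms in [0,1)^3
z = stats.norm.ppf(u)                            # independent normal scores
x = np.exp(mu + z @ L.T)                         # correlated multivariate log-normal

print("sample medians:", np.median(x, axis=0))
print("target medians:", np.exp(mu))
```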

  1. On the Use of the Log-Normal Particle Size Distribution to Characterize Global Rain

    Science.gov (United States)

    Meneghini, Robert; Rincon, Rafael; Liao, Liang

    2003-01-01

    Although most parameterizations of the drop size distribution (DSD) use the gamma function, there are several advantages to the log-normal form, particularly if we want to characterize the large-scale space-time variability of the DSD and rain rate. The advantages of the distribution are twofold: the logarithm of any moment can be expressed as a linear combination of the individual parameters of the distribution; and the parameters of the distribution are approximately normally distributed. Since all radar and rainfall-related parameters can be written approximately as a moment of the DSD, the first property allows us to express the logarithm of any radar/rainfall variable as a linear combination of the individual DSD parameters. Another consequence is that any power law relationship between rain rate, reflectivity factor, specific attenuation or water content can be expressed in terms of the covariance matrix of the DSD parameters. The joint-normal property of the DSD parameters has applications to the description of the space-time variation of rainfall in the sense that any radar-rainfall quantity can be specified by the covariance matrix associated with the DSD parameters at two arbitrary space-time points. As such, the parameterization provides a means by which we can use the spaceborne radar-derived DSD parameters to specify in part the covariance matrices globally. However, since satellite observations have coarse temporal sampling, the specification of the temporal covariance must be derived from ancillary measurements and models. Work is presently underway to determine whether the use of instantaneous rain rate data from the TRMM Precipitation Radar can provide good estimates of the spatial correlation in rain rate from data collected in 5° x 5° x 1 month space-time boxes. To characterize the temporal characteristics of the DSD parameters, disdrometer data are being used from the Wallops Flight Facility site, where as many as 4 disdrometers have been...
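
    The first advantage quoted above is worth spelling out. For a log-normal DSD with total number concentration N_T, log-scale median mu and width sigma, standard log-normal algebra (not taken from the paper) gives every moment in log-linear form:

```latex
\[
  n(D) = \frac{N_T}{\sqrt{2\pi}\,\sigma D}
         \exp\!\left[-\frac{(\ln D - \mu)^2}{2\sigma^2}\right],
  \qquad
  M_k = \int_0^\infty D^k\, n(D)\, \mathrm{d}D
      = N_T \exp\!\left(k\mu + \tfrac{1}{2}k^2\sigma^2\right),
\]
\[
  \ln M_k = \ln N_T + k\,\mu + \tfrac{1}{2}k^2\,\sigma^2 .
\]
```

    So the logarithm of any radar or rainfall quantity expressible as a moment M_k is a linear combination of (ln N_T, mu, sigma^2), and approximate joint normality of these parameters carries over to any such quantity.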

  2. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    BACKGROUND: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-normal) approach. Fortunately, quite similarly to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, x/ ("times-divide"), and the corresponding notation. Analogously to mean ± SD, it connects the multiplicative (or geometric) mean mean* and the multiplicative standard deviation s* in the form mean* x/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
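
    A quick numerical illustration of the recommended summary (a generic sketch on synthetic data): the geometric mean mean* and multiplicative standard deviation s* come from the log data, and the approximate 95% range is mean* times-divided by (s*)^2 rather than mean ± 2 SD.

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.lognormal(mean=2.0, sigma=0.5, size=1000)   # skewed, multiplicative data

logx = np.log(x)
gmean = np.exp(logx.mean())          # multiplicative (geometric) mean, mean*
s_star = np.exp(logx.std(ddof=1))    # multiplicative standard deviation, s*

lo, hi = gmean / s_star**2, gmean * s_star**2       # ~95% range: mean* x/ (s*)^2
print(f"mean* = {gmean:.2f}, s* = {s_star:.2f}")
print(f"95% range: {lo:.2f} to {hi:.2f}")
print(f"empirical 2.5/97.5 percentiles: {np.percentile(x, [2.5, 97.5])}")
```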

  3. Drug binding affinities and potencies are best described by a log-normal distribution and use of geometric means

    International Nuclear Information System (INIS)

    Stanisic, D.; Hancock, A.A.; Kyncl, J.J.; Lin, C.T.; Bush, E.N.

    1986-01-01

    (-)-Norepinephrine (NE) is used as an internal standard in the authors' in vitro adrenergic assays, and the concentration of NE which produces half-maximal inhibition of specific radioligand binding (affinity; KI), or a half-maximal contractile response (potency; ED50), has been measured numerous times. The goodness-of-fit test for normality was performed on both normal (Gaussian) and log10-normal frequency histograms of these data using the SAS Univariate procedure. Specific binding of 3H-prazosin to rat liver (α1-), 3H-rauwolscine to rat cortex (α2-) and 3H-dihydroalprenolol to rat ventricle (β1-) or rat lung (β2-receptors) was inhibited by NE; the distributions of NE KI values at all these sites were skewed to the right, with highly significant (p [...] ED50 values of NE in isolated rabbit aorta (α1), phenoxybenzamine-treated dog saphenous vein (α2) and guinea pig atrium (β1). The vasorelaxant potency of atrial natriuretic hormone in histamine-contracted rabbit aorta was also better described by a log-normal distribution, indicating that log-normality is probably a general phenomenon of drug-receptor interactions. Because data of this type appear to be log-normally distributed, geometric means should be used in parametric statistical analyses.

  4. Bivariate Gaussian bridges: directional factorization of diffusion in Brownian bridge models.

    Science.gov (United States)

    Kranstauber, Bart; Safi, Kamran; Bartumeus, Frederic

    2014-01-01

    In recent years, high-resolution animal tracking data has become the standard in movement ecology. The Brownian Bridge Movement Model (BBMM) is a widely adopted approach to describe animal space use from such high-resolution tracks. One of the underlying assumptions of the BBMM is isotropic diffusive motion between consecutive locations, i.e. invariant with respect to direction. Here we propose to relax this often unrealistic assumption by separating the Brownian motion variance into two directional components, one parallel and one orthogonal to the direction of the motion. Our new model, the Bivariate Gaussian bridge (BGB), tracks movement heterogeneity across time. Using the BGB and identifying directed and non-directed movement within a trajectory resulted in more accurate utilisation distributions compared to dynamic Brownian bridges, especially for trajectories with non-isotropic diffusion, such as directed movement or Lévy-like movements. We evaluated our model with simulated trajectories and observed tracks, demonstrating that the improvement of our model scales with the directional correlation of a correlated random walk. We find that many animal trajectories do not adhere to the assumptions of the BBMM. The proposed model improves accuracy when describing space use both in simulated correlated random walks and in observed animal tracks. Our novel approach is implemented and available within the "move" package for R.
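
    A rough sketch of the directional-factorization idea, not the implementation in the move package: each location's deviation from the line joining its neighbours is projected onto directions parallel and orthogonal to the motion, and the two empirical variances are compared (synthetic anisotropic track).

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical track: directed motion plus anisotropic noise
t = np.arange(200)
track = np.c_[t * 1.0, np.zeros_like(t, dtype=float)]
track += rng.normal(0, [0.8, 0.3], size=track.shape)   # more scatter along motion

# Deviation of each location from the midpoint of its neighbours
mid = 0.5 * (track[:-2] + track[2:])
dev = track[1:-1] - mid
d = track[2:] - track[:-2]
d /= np.linalg.norm(d, axis=1, keepdims=True)          # unit direction of motion
d_perp = np.c_[-d[:, 1], d[:, 0]]                      # orthogonal direction

var_par = np.mean(np.sum(dev * d, axis=1) ** 2)
var_orth = np.mean(np.sum(dev * d_perp, axis=1) ** 2)
print(f"parallel variance ≈ {var_par:.3f}, orthogonal ≈ {var_orth:.3f}")
```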

  5. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced only by technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, while allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
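
    For orientation, plain quantile normalization, the special case that qsmooth generalizes, can be written in a few lines; qsmooth instead shrinks each group's reference quantiles toward the overall reference with a data-driven weight. The sketch below is generic, not the qsmooth implementation.

```python
import numpy as np

def quantile_normalize(X):
    """Plain quantile normalization: columns are samples, rows are features.
    Every sample is forced onto the same (average) distribution."""
    order = np.argsort(X, axis=0)
    ranks = np.argsort(order, axis=0)                  # rank of each value per column
    reference = np.sort(X, axis=0).mean(axis=1)        # mean of order statistics
    return reference[ranks]

rng = np.random.default_rng(10)
X = rng.lognormal(0, 1, (2000, 6)) * rng.uniform(0.5, 2.0, 6)  # per-sample scale bias
Xn = quantile_normalize(X)
print("column medians before:", np.round(np.median(X, axis=0), 2))
print("column medians after :", np.round(np.median(Xn, axis=0), 2))
```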

  6. Statistical properties of the normalized ice particle size distribution

    Science.gov (United States)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation implies that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N*0 and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N*0 and Dm have then been evaluated in order to reduce again the number of unknowns. It has been shown that a parameterization of N*0 and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean maximum dimension diameter -T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than the Kristjánsson et al. (2000

  7. Fitting statistical distributions the generalized lambda distribution and generalized bootstrap methods

    CERN Document Server

    Karian, Zaven A

    2000-01-01

    Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from-all with their own formulas, tables, diagrams, and general properties-continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well?Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances-covering bivariate as well as univariate distributions, and including situations where moments do...

  8. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
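
    As a toy illustration of a bivariate return period (hypothetical annual data; the paper works along full temperature-precipitation gradients and with fitted joint distributions rather than raw empirical frequencies):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 60                            # hypothetical years of summer climate data
temp = rng.normal(20, 2, n)       # summer mean temperature (deg C)
precip = rng.gamma(4, 20, n)      # summer precipitation (mm)

def hot_dry_return_period(t0, p0):
    """Empirical joint return period of 'hotter than t0 AND drier than p0'."""
    p_joint = np.mean((temp > t0) & (precip < p0))
    return np.inf if p_joint == 0 else 1.0 / p_joint

print(f"T(hot > 23, dry < 60) ≈ {hot_dry_return_period(23.0, 60.0):.1f} years")
```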

  9. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    Science.gov (United States)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is yielded. Namely, the distribution increases the tail for small (when q < 1) or large (when q > 1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q = 1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
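
    The construction is easy to simulate. Below, the Borges q-product is iterated over random positive factors, mimicking a Kapteyn-type process; for q near 1 the ordinary product, and hence the usual log-normal, is recovered (parameter choices are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(12)

def q_product(x, y, q):
    """Borges q-product: x (x)_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)), q != 1."""
    base = np.maximum(x**(1 - q) + y**(1 - q) - 1.0, 0.0)
    return base**(1.0 / (1 - q))

def kapteyn_q(n_factors, q, size, scale=0.1):
    """Iterated q-product of positive random factors (Kapteyn-type process)."""
    out = np.ones(size)
    for _ in range(n_factors):
        out = q_product(out, np.exp(rng.normal(0, scale, size)), q)
    return out

for q in (0.9, 0.99, 1.1):        # q = 0.99 approximates the usual log-normal
    s = kapteyn_q(50, q, 100_000)
    print(f"q={q:.2f}: mean={s.mean():.3f}, 99.9th pct={np.quantile(s, 0.999):.3f}")
```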

  10. Comparing of Normal Stress Distribution in Static and Dynamic Soil-Structure Interaction Analyses

    International Nuclear Information System (INIS)

    Kholdebarin, Alireza; Massumi, Ali; Davoodi, Mohammad; Tabatabaiefar, Hamid Reza

    2008-01-01

    It is important to consider the vertical component of earthquake loading and inertia forces in soil-structure interaction analyses. In most circumstances, design engineers are primarily concerned with the analysis of the behavior of foundations subjected to earthquake-induced forces transmitted from the bedrock. In this research, a single rigid foundation with designated geometrical parameters located on sandy-clay soil has been modeled in the FLAC software with the Finite Difference Method and subjected to three different vertical components of earthquake records. In these cases, it is important to evaluate the effect of the footing on the underlying soil and to consider the normal stress in the soil with and without the footing. The distribution of normal stress under the footing in the static and dynamic states has been studied and compared. This comparison indicated that the increase in normal stress under the footing caused by the vertical component of ground excitation decreased the dynamic vertical settlement in comparison with the static state.

  11. Elastic microfibril distribution in the cornea: Differences between normal and keratoconic stroma.

    Science.gov (United States)

    White, Tomas L; Lewis, Philip N; Young, Robert D; Kitazawa, Koji; Inatomi, Tsutomu; Kinoshita, Shigeru; Meek, Keith M

    2017-06-01

    The optical and biomechanical properties of the cornea are largely governed by the collagen-rich stroma, a layer that represents approximately 90% of the total thickness. Within the stroma, the specific arrangement of superimposed lamellae provides the tissue with tensile strength, whilst the spatial arrangement of individual collagen fibrils within the lamellae confers transparency. In keratoconus, this precise stromal arrangement is lost, resulting in ectasia and visual impairment. In the normal cornea, we previously characterised the three-dimensional arrangement of an elastic fiber network spanning the posterior stroma from limbus-to-limbus. In the peripheral cornea/limbus there are elastin-containing sheets or broad fibers, most of which become microfibril bundles (MBs) with little or no elastin component when reaching the central cornea. The purpose of the current study was to compare this network with the elastic fiber distribution in post-surgical keratoconic corneal buttons, using serial block face scanning electron microscopy and transmission electron microscopy. We have demonstrated that the MB distribution is very different in keratoconus. MBs are absent from a region of stroma anterior to Descemet's membrane, an area that is densely populated in normal cornea, whilst being concentrated below the epithelium, an area in which they are absent in normal cornea. We contend that these latter microfibrils are produced as a biomechanical response to provide additional strength to the anterior stroma in order to prevent tissue rupture at the apex of the cone. A lack of MBs anterior to Descemet's membrane in keratoconus would alter the biomechanical properties of the tissue, potentially contributing to the pathogenesis of the disease. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Regression analysis for bivariate gap time with missing first gap time data.

    Science.gov (United States)

    Huang, Chia-Hui; Chen, Yi-Hau

    2017-01-01

    We consider ordered bivariate gap times while data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and for the gap time between HIV and AIDS diagnoses. In addition, the association between the two gap times is of interest. Accordingly, in the data analysis we are faced with a two-fold complexity, namely that data on the first gap time are completely missing and that the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap times under this complexity of the data. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large-sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application to data from the HIV and AIDS study mentioned above is reported for illustration.

  13. Thermal modelling of normal distributed nanoparticles through thickness in an inorganic material matrix

    Science.gov (United States)

    Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.

    2017-10-01

    Nanoscale materials showing superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion so as to reduce or even eliminate thermal bridges, which permits an industrially relevant process that balances the high material cost against the superior thermal insulation properties. Therefore, a methodology is required to measure and describe these nanoscale distributions within the inorganic matrix material. These distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy will generate a thermal model allowing prediction of the thermal behavior of the nanoscale particles and their distributions. This thermal model will be validated by the hot-wire technique. At present, a good correlation is found between the numerical results and experimental data for nanoparticles randomly distributed in all directions.

  14. Distribution and elimination of intravenously administered atrial natriuretic hormone(ANH) to normal and nephrectomized rats

    International Nuclear Information System (INIS)

    Devine, E.; Artman, L.; Budzik, G.; Bush, E.; Holleman, W.

    1986-01-01

    The 24-amino-acid peptide ANH(5-28) was N-terminally labeled with I-125 Bolton-Hunter reagent, iodo-N-succinimidyl 3-(4-hydroxyphenyl)propionate. The I-125 peptide plus 1 μg/kg of the I-127 Bolton-Hunter peptide was injected into normal and nephrectomized anesthetized (Nembutal) rats. Blood samples were drawn into a cocktail developed to inhibit plasma-induced degradation. Radiolabeled peptides were analyzed by HPLC. A biphasic curve of I-125 ANH(5-28) elimination was obtained, the first phase (t1/2 = 15 sec) representing in vivo distribution and the second phase (t1/2 = 7-10 min) a measurement of elimination. This biphasic elimination curve was similar in normal and nephrectomized rats. The apparent volumes of distribution were 15-20 ml for the first phase and > 300 ml for the second phase. In order to examine the tissue distribution of the peptide, animals were sacrificed at 2 minutes and the I-125 tissue contents were quantitated. The majority of the label was located in the liver (50%), kidneys (21%) and lungs (5%). The degradative peptides appearing in the plasma and urine of normal rats were identical. No intact radiolabeled ANH(5-28) was found in the urine. In conclusion, iodinated Bolton-Hunter-labeled ANH(5-28) is rapidly removed from the circulation by the liver and to a lesser extent by the kidney, but the rate of elimination is not decreased by nephrectomy.
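
    The biphasic curve described here is the classical two-compartment form C(t) = A*exp(-alpha*t) + B*exp(-beta*t). A generic curve-fitting sketch, with synthetic data chosen to mimic the reported half-lives (15 s distribution phase, minutes-scale elimination):

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, A, alpha, B, beta):
    """Two-compartment disposition: distribution + elimination phases."""
    return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

rng = np.random.default_rng(13)
# Synthetic plasma activity over time (minutes); rate constant k = ln2 / t_half
t = np.array([0.5, 1, 2, 3, 5, 8, 12, 20, 30, 45])
k_fast, k_slow = np.log(2) / 0.25, np.log(2) / 8.0   # t1/2 = 15 s and 8 min
c = biexp(t, 80.0, k_fast, 20.0, k_slow) * rng.normal(1, 0.03, t.size)

popt, _ = curve_fit(biexp, t, c, p0=[50, 1.0, 10, 0.05], maxfev=10000)
A, alpha, B, beta = popt
print(f"t1/2 fast = {np.log(2)/alpha*60:.0f} s, "
      f"t1/2 slow = {np.log(2)/beta:.1f} min")
```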

  15. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu - Shawiesh

    2014-06-01

    This paper proposes and compares several bivariate control charts for monitoring individual observations in statistical process control. Usual control charts based on the classical mean and variance-covariance estimators are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T²: T²MedMAD, T²MCD and T²MVE. A simulation study was conducted to compare the performance of these control charts, and two real-life datasets are analyzed to illustrate the application of the robust alternatives.
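
    As a hedged illustration of one of the robust alternatives named above, the following minimal Python sketch contrasts classical Hotelling's T² scores with a T²MCD-style version built on the Minimum Covariance Determinant estimator from scikit-learn. The data, the contamination and the omission of control limits are our assumptions, not the paper's setup.

        # Sketch: classical vs. MCD-based robust Hotelling's T^2 scores for
        # bivariate individual observations; control limits are omitted.
        import numpy as np
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(0)
        X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=100)
        X[:5] += 6.0  # a few outliers that distort the classical estimates

        def t2_scores(X, loc, cov):
            d = X - loc
            # row-wise Mahalanobis-type quadratic form d' * inv(cov) * d
            return np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)

        # Classical estimates (sensitive to the outliers)
        t2_classical = t2_scores(X, X.mean(axis=0), np.cov(X, rowvar=False))

        # Robust location/scatter via the Minimum Covariance Determinant
        mcd = MinCovDet(random_state=0).fit(X)
        t2_mcd = t2_scores(X, mcd.location_, mcd.covariance_)

        print(t2_classical[:5].round(1), t2_mcd[:5].round(1))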

  16. A comparison between multivariate and bivariate analysis used in marketing research

    Directory of Open Access Journals (Sweden)

    Constantin, C.

    2012-01-01

    This paper describes an instrumental study conducted to compare the information given by two multivariate data analyses with that given by the usual bivariate analysis. The outcomes of the research reveal that the multivariate methods sometimes use more information from a certain variable, but sometimes use only the part of the information considered most important for certain associations. For this reason, a researcher should use both categories of data analysis in order to obtain fully useful information.

  17. Effects of adipose tissue distribution on maximum lipid oxidation rate during exercise in normal-weight women.

    Science.gov (United States)

    Isacco, L; Thivel, D; Duclos, M; Aucouturier, J; Boisseau, N

    2014-06-01

    Fat mass localization affects lipid metabolism differently at rest and during exercise in overweight and normal-weight subjects. The aim of this study was to investigate the impact of a low vs high ratio of abdominal to lower-body fat mass (an index of adipose tissue distribution) on the exercise intensity (Lipox(max)) that elicits the maximum lipid oxidation rate in normal-weight women. Twenty-one normal-weight women (22.0 ± 0.6 years, 22.3 ± 0.1 kg·m⁻²) were separated into two groups with either a low or a high abdominal to lower-body fat mass ratio [L-A/LB (n = 11) or H-A/LB (n = 10), respectively]. Lipox(max) and maximum lipid oxidation rate (MLOR) were determined during a submaximum incremental exercise test. Abdominal and lower-body fat mass were determined from DXA scans. The two groups did not differ in aerobic fitness, total fat mass, or total and localized fat-free mass. Lipox(max) and MLOR were significantly lower in H-A/LB vs L-A/LB women (43 ± 3% VO₂max vs 54 ± 4% VO₂max, and 4.8 ± 0.6 mg·min⁻¹·kg FFM⁻¹ vs 8.4 ± 0.9 mg·min⁻¹·kg FFM⁻¹, respectively; P < 0.05). In normal-weight women, a predominantly abdominal fat mass distribution compared with a predominantly peripheral fat mass distribution is associated with a lower capacity to maximize lipid oxidation during exercise, as evidenced by their lower Lipox(max) and MLOR. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  18. Using an APOS Framework to Understand Teachers' Responses to Questions on the Normal Distribution

    Science.gov (United States)

    Bansilal, Sarah

    2014-01-01

    This study is an exploration of teachers' engagement with concepts embedded in the normal distribution. The participants were a group of 290 in-service teachers enrolled in a teacher development program. The research instrument was an assessment task that can be described as an "unknown percentage" problem, which required the application…

  19. On the possible ''normalization'' of experimental curves of 230Th vertical distribution in abyssal oceanic sediments

    International Nuclear Information System (INIS)

    Kuznetsov, Yu.V.; Al'terman, Eh.I.; Lisitsyn, A.P.; AN SSSR, Moscow. Inst. Okeanologii)

    1981-01-01

    The possibilities of the method of normalization of experimental ionium curves with reference to the dating of abyssal sediments and the determination of their accumulation rates are studied. The method is based on the correlation between extrema of the ionium curves and variations of the Fe, Mn, organic C and P contents in abyssal oceanic sediments. It was found that the method can be successfully applied to correct 230Th vertical distribution data obtained by low-background gamma-spectrometry. The method gives the most reliable results when the vertical distribution curves of the elements that concentrate 230Th vary in parallel with one another. In many cases the normalization of the experimental ionium curves makes it possible to carry out age stratification of the sediment.

  20. One-dimensional time-dependent conduction states and temperature distribution along a normal zone during a quench

    International Nuclear Information System (INIS)

    Lopez, G.

    1991-01-01

    Quench simulation of a superconducting (s.c.) magnet requires some assumptions about the evolution of the normal zone and its temperature profile. The axial evolution of the normal zone is described through the longitudinal quench velocity, while the transversal quench propagation may be described through the transversal quench velocity or the turn-to-turn quench propagation time delay. The temperature distribution has been assumed adiabatic-like or cosine-like in two different computer programs. Although the two profiles differ, they produce more or less the same qualitative quench results, differing only by about 8%. Unfortunately, no experimental data exist for the temperature profile along the conductor during a quench event that would allow a realistic comparison. The temperature profile has received little attention, mainly because it is not a very critical parameter in quench analysis. Nonetheless, a reliable quench analysis requires that the temperature distribution along the normal zone be taken into account with good approximation. In this paper, an analytical study of the temperature profile is made.

  1. Distribution of CD163-positive cell and MHC class II-positive cell in the normal equine uveal tract.

    Science.gov (United States)

    Sano, Yuto; Matsuda, Kazuya; Okamoto, Minoru; Takehana, Kazushige; Hirayama, Kazuko; Taniyama, Hiroyuki

    2016-02-01

    Antigen-presenting cells (APCs) in the uveal tract participate in ocular immunity, including immune homeostasis and the pathogenesis of uveitis. In horses, although uveitis is the most common ocular disorder, little is known about ocular immunity, such as the distribution of APCs. In this study, we investigated the distribution of CD163-positive and MHC II-positive cells in the normal equine uveal tract using an immunofluorescence technique. Eleven eyes from 10 Thoroughbred horses aged 1 to 24 years were used. Indirect immunofluorescence was performed using primary antibodies against CD163, MHC class II (MHC II) and CD20. To identify the site of their greatest distribution, positive cells were counted manually in 3 different parts of the uveal tract (ciliary body, iris and choroid), and their average numbers were compared statistically. Pleomorphic CD163- and MHC II-expressing cells were detected throughout the equine uveal tract, but no CD20-expressing cells were detected. The statistical analysis showed that the distribution of CD163- and MHC II-positive cells was concentrated in the ciliary body. These results demonstrate that the ciliary body is the largest site of APC distribution in the normal equine uveal tract and is considered to play an important role in uveal and/or ocular immune homeostasis. The data provided in this study will further the understanding of equine ocular immunity in the normal state and might be beneficial for understanding the mechanisms of ocular disorders such as equine uveitis.

  2. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature of and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that a "thicker-tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. The mixture of normal distributions is also more effective in bridging the gap between management theory and practice, without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests that firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional option pricing models assume that commercial returns from these assets are described by a normal random walk, but a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to describing the unusually large risks and opportunities inherently associated with oil and gas production and exploration. A significance-testing study of 554 oil and gas exploration and production firms empirically supports using such a mixture.
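
    The two-state mixture idea lends itself to a compact illustration. The following minimal Python sketch fits a two-component normal mixture to synthetic returns with a low-volatility and a high-volatility regime; all numbers are illustrative assumptions, not the paper's data.

        # Sketch: fitting a two-component normal mixture to synthetic returns,
        # a stand-in for the exploration/exploitation states described above.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # Two latent states: low-uncertainty "exploitation" and
        # high-uncertainty "exploration" regimes (illustrative values)
        returns = np.concatenate([rng.normal(0.0, 0.01, 800),
                                  rng.normal(0.0, 0.05, 200)]).reshape(-1, 1)

        gm = GaussianMixture(n_components=2, random_state=0).fit(returns)
        print("weights:", gm.weights_.round(2))
        print("means:  ", gm.means_.ravel().round(4))
        print("sigmas: ", np.sqrt(gm.covariances_.ravel()).round(4))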

  3. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate, and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42%) than models relying directly on temperature and precipitation as predictors (36%). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
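
    For orientation, a common definition of a bivariate return period in the copula literature is the joint "AND" exceedance form T = μ / (1 − u − v + C(u, v)), with μ the mean interarrival time (1 year for annual values). The Python sketch below evaluates it with a Gaussian copula as an illustrative choice of C; whether this matches the paper's exact construction is an assumption on our part.

        # Sketch: joint "AND" return period T = mu / P(U > u, V > v)
        # = mu / (1 - u - v + C(u, v)), with a Gaussian copula for C.
        import numpy as np
        from scipy.stats import multivariate_normal, norm

        def and_return_period(u, v, rho, mu=1.0):
            # C(u, v) for a Gaussian copula with correlation rho
            c = multivariate_normal(mean=[0, 0],
                                    cov=[[1, rho], [rho, 1]]).cdf(
                [norm.ppf(u), norm.ppf(v)])
            return mu / (1.0 - u - v + c)

        # e.g. hot AND dry: 90th-percentile heat and 90th-percentile dryness
        print(and_return_period(0.9, 0.9, rho=0.4))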

  4. Optimization of b-value distribution for biexponential diffusion-weighted MR imaging of normal prostate.

    Science.gov (United States)

    Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko

    2014-05-01

    To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of the normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. At low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm²), mid-range (650-1200 s/mm²), and high b-values (1700-2000 s/mm²). Higher noise levels resulted in less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in the Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid and high ranges and was shown to improve the estimation quality of the biexponential DWI parameters in the in vivo experiments. Copyright © 2013 Wiley Periodicals, Inc.
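
    The biexponential signal model named in the record, S(b) = S0·(f·exp(−b·Df) + (1 − f)·exp(−b·Ds)), can be fitted by non-linear least squares. A minimal Python sketch with illustrative b-values and parameters (not the study's data):

        # Sketch: fitting the biexponential DWI signal model by least squares.
        import numpy as np
        from scipy.optimize import curve_fit

        def biexp(b, s0, f, df, ds):
            return s0 * (f * np.exp(-b * df) + (1 - f) * np.exp(-b * ds))

        b = np.array([0, 100, 200, 400, 650, 900, 1200, 1700, 2000.0])  # s/mm^2
        true = dict(s0=1.0, f=0.25, df=3e-3, ds=5e-4)                    # mm^2/s
        rng = np.random.default_rng(2)
        signal = biexp(b, **true) + rng.normal(0, 0.005, b.size)

        popt, _ = curve_fit(biexp, b, signal, p0=[1.0, 0.3, 2e-3, 1e-3],
                            bounds=([0, 0, 0, 0], [2, 1, 1e-1, 1e-2]))
        print(dict(zip(["s0", "f", "Df", "Ds"], popt.round(5))))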

  5. Basic study on radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Naka, R.; Kawarabayashi, J.; Uritani, A.; Iguchi, T.; Kaneko, J.; Takeuchi, H.; Kakuta, T.

    2000-01-01

    Recently, some methods of radiation distribution sensing with optical fibers have been proposed. These methods employ scintillating fibers or scintillators with wavelength-shifting fibers. The positions of radiation interactions are detected by applying a time-of-flight (TOF) technique to the scintillation photon propagation. In the former method, the attenuation length for the scintillation photons in the scintillating fiber is relatively short, so that the operating length of the sensor is limited to several meters. In the latter method, a radiation distribution cannot be obtained continuously but only discretely. To improve on these shortcomings, a normal optical fiber made of polymethyl methacrylate (PMMA) is used in this study. Although the scintillation efficiency of PMMA is very low, several photons are emitted through interaction with radiation. The fiber is transparent to the emitted photons, giving a relatively long operating length, and a radiation distribution can be obtained continuously. This paper describes the principle of the position sensing method based on the time-of-flight technique and preliminary results obtained for 90Sr-90Y beta rays, 137Cs gamma rays, and 14 MeV neutrons. The spatial resolutions for the above three kinds of radiation are 0.30 m, 0.37 m and 0.13 m, and the detection efficiencies are 1.1 × 10⁻³, 1.6 × 10⁻⁷ and 5.4 × 10⁻⁶, respectively, for a 10 m operating length. The results of a spectroscopic study on the optical properties of the fiber are also described. (author)

  6. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 in osteoarthritic (n=9) and macroscopically normal (n=5) human articular cartilage, collected from 12 pre-selected areas of the femoral head, to discover a potential role for YKL-40 in cartilage remodelling in osteoarthritis. Immunohistochemical analysis showed that YKL-40 staining was found in chondrocytes of osteoarthritic cartilage, mainly in the superficial and middle zones of the cartilage rather than the deep zone. There was a tendency towards a high number of YKL-40-positive chondrocytes in areas of the femoral head under considerable biomechanical load. The number of chondrocytes with a positive…

  7. Validation of MCDS by comparison of predicted with experimental velocity distribution functions in rarefied normal shocks

    Science.gov (United States)

    Pham-Van-diep, Gerald C.; Erwin, Daniel A.

    1989-01-01

    Velocity distribution functions in normal shock waves in argon and helium are calculated using Monte Carlo direct simulation. These are compared with experimental results for argon at M = 7.18 and for helium at M = 1.59 and 20. For both argon and helium, the variable-hard-sphere (VHS) model is used for the elastic scattering cross section, with the velocity dependence derived from a viscosity-temperature power-law relationship in the way normally used by Bird (1976).

  8. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    International Nuclear Information System (INIS)

    Goodarzi, Samereh; Pazirandeh, Ali; Jameie, Seyed Behnamedin; Baghban Khojasteh, Nasrin

    2012-01-01

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g boric acid + 0.005 g borax + 10 ml distilled water, pH 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of male and female animal groups, with the highest value four hours after boron compound injection. Highlights: (1) Boron distribution in male and female rats' normal brain was studied in this research. (2) Coronal sections of animal tissue samples were irradiated with thermal neutrons. (3) Alpha and lithium tracks were counted using alpha autoradiography. (4) Different boron concentrations were seen in brain sections of male and female rats. (5) The highest boron concentration was seen 4 h after boron compound injection.

  9. Analysis of a hundred-years series of magnetic activity indices. III. Is the frequency distribution logarithmo-normal

    International Nuclear Information System (INIS)

    Mayaud, P.N.

    1976-01-01

    Because of the various components of positive conservation existing in the series of aa indices, their frequency distribution is necessarily distorted with respect to any random distribution. However, when these various components are taken into account, the observed distribution can be considered a log-normal distribution. This implies that geomagnetic activity satisfies the conditions of the central limit theorem, according to which a phenomenon exhibiting such a distribution is due to independent causes whose effects are multiplicative. Furthermore, the distortion of the frequency distribution caused by the 11-year and 90-year cycles corresponds to a pure attenuation effect; an interpretation in terms of solar 'coronal holes' is proposed.

  10. Distribution pattern of urine albumin creatinine ratio and the prevalence of high-normal levels in untreated asymptomatic non-diabetic hypertensive patients.

    Science.gov (United States)

    Ohmaru, Natsuki; Nakatsu, Takaaki; Izumi, Reishi; Mashima, Keiichi; Toki, Misako; Kobayashi, Asako; Ogawa, Hiroko; Hirohata, Satoshi; Ikeda, Satoru; Kusachi, Shozo

    2011-01-01

    Even high-normal albuminuria is reportedly associated with cardiovascular events. We determined the urine albumin creatinine ratio (UACR) in spot urine samples and analyzed the UACR distribution and the prevalence of high-normal levels. The UACR was determined using immunoturbidimetry in 332 untreated asymptomatic non-diabetic Japanese patients with hypertension and in 69 control subjects. Microalbuminuria and macroalbuminuria were defined as a UACR ≥30 and <300 µg/mg·creatinine and a UACR ≥300 µg/mg·creatinine, respectively. The distribution patterns showed a highly skewed distribution for the lower levels, and a common logarithmic transformation produced a close fit to a Gaussian distribution with median, 25th and 75th percentile values of 22.6, 13.5 and 48.2 µg/mg·creatinine, respectively. When a high-normal UACR was defined as >20 to <30 µg/mg·creatinine, 19.9% (66/332) of the hypertensive patients exhibited a high-normal UACR. Microalbuminuria and macroalbuminuria were observed in 36.1% (120/332) and 2.1% (7/332) of the patients, respectively. The UACR was significantly correlated with the systolic and diastolic blood pressures and the pulse pressure. A stepwise multivariate analysis revealed that these pressures, as well as age, were independent factors increasing the UACR. The UACR distribution exhibited a highly skewed pattern, with approximately 60% of untreated non-diabetic hypertensive patients exhibiting a high-normal or higher UACR. Both hypertension and age are independent risk factors that increase the UACR. The present study indicates that a considerable percentage of patients require anti-hypertensive drugs with antiproteinuric effects at the start of treatment.

  11. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Smith, Donald L.; Capote, Roberto

    2013-01-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue
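
    For reference, the transformation the abstract refers to can be stated compactly. If (X, Y) is bivariate normal with standard deviations σX, σY and correlation ρN, the correlation between the lognormal variables e^X and e^Y is (standard result; notation ours):

        \rho_L \;=\; \frac{\exp(\rho_N\,\sigma_X\,\sigma_Y) - 1}
                          {\sqrt{\bigl[\exp(\sigma_X^2) - 1\bigr]\,\bigl[\exp(\sigma_Y^2) - 1\bigr]}}

    Setting ρN = −1 gives the most negative attainable ρL, which is strictly greater than −1 whenever σX, σY > 0; this is the sense in which strong anti-correlations are mathematically forbidden for lognormal variables with large relative uncertainties.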

  12. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)

    2013-11-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.

  13. Recurrent major depression and right hippocampal volume: A bivariate linkage and association study.

    Science.gov (United States)

    Mathias, Samuel R; Knowles, Emma E M; Kent, Jack W; McKay, D Reese; Curran, Joanne E; de Almeida, Marcio A A; Dyer, Thomas D; Göring, Harald H H; Olvera, Rene L; Duggirala, Ravi; Fox, Peter T; Almasy, Laura; Blangero, John; Glahn, David C

    2016-01-01

    Previous work has shown that the hippocampus is smaller in the brains of individuals suffering from major depressive disorder (MDD) than in those of healthy controls. Moreover, right hippocampal volume specifically has been found to predict the probability of subsequent depressive episodes. This study explored the utility of right hippocampal volume as an endophenotype of recurrent MDD (rMDD). We observed a significant genetic correlation between the two traits in a large sample of Mexican American individuals from extended pedigrees (ρg = -0.34, p = 0.013). A bivariate linkage scan revealed a significant pleiotropic quantitative trait locus on chromosome 18p11.31-32 (LOD = 3.61). Bivariate association analysis conducted under the linkage peak revealed a variant (rs574972) within an intron of the gene SMCHD1 meeting the corrected significance level (χ² = 19.0, p = 7.4 × 10⁻⁵). Univariate association analyses of each phenotype separately revealed that the same variant was significant for right hippocampal volume alone, and also revealed a suggestively significant variant (rs12455524) within the gene DLGAP1 for rMDD alone. The results implicate right-hemisphere hippocampal volume as a possible endophenotype of rMDD, and in so doing highlight a potential gene of interest for rMDD risk. © 2015 Wiley Periodicals, Inc.

  14. Comparison of Six Methods for the Detection of Causality in a Bivariate Time Series

    Czech Academy of Sciences Publication Activity Database

    Krakovská, A.; Jakubík, J.; Chvosteková, M.; Coufal, David; Jajcay, Nikola; Paluš, Milan

    2018-01-01

    Roč. 97, č. 4 (2018), č. článku 042207. ISSN 2470-0045 R&D Projects: GA MZd(CZ) NV15-33250A Institutional support: RVO:67985807 Keywords : comparative study * causality detection * bivariate models * Granger causality * transfer entropy * convergent cross mappings Impact factor: 2.366, year: 2016 https://journals.aps.org/pre/abstract/10.1103/PhysRevE.97.042207

  15. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    Science.gov (United States)

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distributions at contaminated sites are usually severely skewed by the presence of hot spots, which causes difficulties for accurate geostatistical data transformation. Three typical normal-distribution transformation methods, the normal score, Johnson and Box-Cox transformations, were applied and compared for spatial interpolation of benzo(b)fluoranthene data at a large-scale coking plant-contaminated site in north China. All three transformations decreased the skewness and kurtosis of the benzo(b)fluoranthene data, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging had a minimum root-mean-square error of 1.17 and a mean error of 0.19, making it more accurate than the other two models. The areas with fewer sampling points and those with high levels of contamination showed the largest prediction standard errors on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy of remediation boundary determination.
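
    Two of the transformations named above are easy to reproduce. A minimal Python sketch on synthetic skewed data (the normal score here uses a simple rank-to-quantile rule; the Johnson family is omitted):

        # Sketch: Box-Cox and normal-score transformations of skewed data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        conc = rng.lognormal(mean=0.0, sigma=1.2, size=200)  # skewed "hot spot" data

        # Box-Cox: scipy estimates the optimal lambda by maximum likelihood
        bc, lam = stats.boxcox(conc)

        # Normal score: replace each value by the normal quantile of its rank
        ranks = stats.rankdata(conc)
        nscore = stats.norm.ppf((ranks - 0.5) / len(conc))

        print("skewness raw/Box-Cox/normal-score:",
              stats.skew(conc).round(2), stats.skew(bc).round(2),
              stats.skew(nscore).round(2))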

  16. Bivariate copulas on the exponentially weighted moving average control chart

    Directory of Open Access Journals (Sweden)

    Sasigarn Kuvattana

    2016-10-01

    This paper proposes four types of copulas on the Exponentially Weighted Moving Average (EWMA) control chart when observations are from an exponential distribution, using a Monte Carlo simulation approach. The performance of the control chart is assessed through the Average Run Length (ARL), which is compared for each copula. Copula functions are used to specify the dependence between random variables, with dependence measured by Kendall's tau. The results show that the Normal copula can be used for almost all shifts.
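
    As a hedged sketch of the simulation ingredient described above, the following Python code draws exponential observation pairs whose dependence is imposed through a Normal (Gaussian) copula calibrated by Kendall's tau, using ρ = sin(πτ/2) for elliptical copulas. The ARL estimation loop itself is omitted, and the exact simulation design is the paper's, not ours.

        # Sketch: exponential pairs with dependence set by a Normal copula.
        import numpy as np
        from scipy import stats

        def exp_pairs_normal_copula(n, tau, scale=1.0, seed=0):
            rho = np.sin(np.pi * tau / 2)          # tau -> rho (elliptical copulas)
            cov = [[1.0, rho], [rho, 1.0]]
            z = np.random.default_rng(seed).multivariate_normal([0, 0], cov, size=n)
            u = stats.norm.cdf(z)                  # uniform marginals, Normal copula
            return stats.expon.ppf(u, scale=scale) # exponential marginals

        x = exp_pairs_normal_copula(10000, tau=0.5)
        tau_hat, _ = stats.kendalltau(x[:, 0], x[:, 1])
        print(round(tau_hat, 2))  # ~0.5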

  17. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN; for simplicity of comparison, age and gender were used to adjust for population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests (p < 0.05), and in the real datasets SAN performed better than normalization using the other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
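
    The core idea, as we read it, is to standardize a laboratory value within clinico-epidemiologic subgroups (here age band × gender) per site so that subgroup distributions align across data sources. The Python sketch below is a simplified reading of that idea, not the paper's exact SAN algorithm; all column names and values are illustrative.

        # Sketch of subgroup-wise standardization across sites (simplified SAN idea).
        import pandas as pd

        def san_like(df, value, by=("site",), subgroup=("age_band", "gender")):
            keys = list(by) + list(subgroup)
            g = df.groupby(keys)[value]
            # z-score within each (site, age band, gender) cell
            return (df[value] - g.transform("mean")) / g.transform("std")

        df = pd.DataFrame({
            "site": ["A", "A", "B", "B"] * 25,
            "age_band": ["<65", ">=65"] * 50,
            "gender": ["F", "M"] * 50,
            "creatinine": pd.Series(range(100)) / 50 + 0.6,
        })
        df["creatinine_norm"] = san_like(df, "creatinine")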

  18. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. The first study investigated different target width parameters, and the second considered the effect of the motion angle. Based on the results of these two studies, a refined model of movement time in human-computer interaction was formulated. A third study, described here in detail, concerned the validation of the refined model. In the validation study, 20 subjects executed a bivariate pointing task on a large touch screen in which 250 rectangular target objects were displayed at randomly chosen positions on the screen, covering a broad range of ID values (ID = [1.01; 4.88]). Compared with existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application for the model is the ergonomic design and evaluation of project management software: using the refined model, software designers can calculate a priori the appropriate angular position and size of buttons, menus and icons.
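
    For context, the classic univariate form of Fitts' Law that the refinement builds on is MT = a + b·log2(D/W + 1). A minimal Python fit on illustrative data (the refined bivariate terms, e.g. target height and motion angle, are in the paper and not reproduced here):

        # Sketch: least-squares fit of the classic (Shannon-form) Fitts' Law.
        import numpy as np

        D = np.array([100, 200, 400, 800, 400, 800.0])   # distance (px)
        W = np.array([ 80,  80,  40,  40,  20,  20.0])   # target width (px)
        MT = np.array([420, 510, 690, 820, 830, 950.0])  # movement time (ms)

        ID = np.log2(D / W + 1)                  # index of difficulty (bits)
        b, a = np.polyfit(ID, MT, 1)             # slope b, intercept a
        print(f"MT ≈ {a:.0f} + {b:.0f} * ID (ms)")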

  19. Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities

    International Nuclear Information System (INIS)

    Waite, D.A.; Denham, D.H.

    1975-01-01

    The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations of nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with environmental surveillance data interpretation using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined to the extent of available data in terms of potential contaminants and the facility's general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. Appropriate consideration is given these…

  20. Normal Approximations to the Distributions of the Wilcoxon Statistics: Accurate to What "N"? Graphical Insights

    Science.gov (United States)

    Bellera, Carine A.; Julien, Marilyse; Hanley, James A.

    2010-01-01

    The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
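
    The comparison the title refers to can be reproduced directly: the exact null distribution of the signed-rank statistic W+ follows from the generating function ∏(1 + x^k)/2^n, and the normal approximation uses mean n(n+1)/4 and variance n(n+1)(2n+1)/24. A minimal Python sketch (the cutoff 44 is an arbitrary illustration):

        # Sketch: exact vs. normal-approximation tail of the Wilcoxon W+.
        import numpy as np
        from scipy.stats import norm

        def exact_pmf(n):
            # coefficients of prod_{k=1..n} (1 + x^k), normalized by 2^n
            pmf = np.zeros(n * (n + 1) // 2 + 1)
            pmf[0] = 1.0
            for k in range(1, n + 1):
                pmf[k:] += pmf[:-k].copy()
            return pmf / 2.0 ** n

        n = 10
        pmf = exact_pmf(n)
        mu = n * (n + 1) / 4
        sigma = np.sqrt(n * (n + 1) * (2 * n + 1) / 24)
        w = np.arange(len(pmf))
        exact_tail = pmf[w >= 44].sum()             # P(W+ >= 44)
        approx_tail = norm.sf(43.5, mu, sigma)      # continuity-corrected
        print(round(exact_tail, 4), round(approx_tail, 4))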

  1. Can the bivariate Hurst exponent be higher than an average of the separate Hurst exponents?

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2015-01-01

    Roč. 431, č. 1 (2015), s. 124-127 ISSN 0378-4371 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: Correlations * Power-law cross-correlations * Bivariate Hurst exponent * Spectrum coherence Subject RIV: AH - Economics Impact factor: 1.785, year: 2015 http://library.utia.cas.cz/separaty/2015/E/kristoufek-0452314.pdf

  2. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    Science.gov (United States)

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution

    DEFF Research Database (Denmark)

    Stentoft, Lars Peter

    In this paper we propose a feasible way to price American options in a model with time-varying volatility and conditional skewness and leptokurtosis, using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk-neutral dynamics can be obtained in this model, we interpret … properties shows that there are important option pricing differences compared to the Gaussian case as well as to the symmetric special case. A large-scale empirical examination shows that our model outperforms the Gaussian case for pricing options on three large US stocks as well as a major index…

  4. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 … in chondrocytes of osteoarthritic cartilage, mainly in the superficial and middle zones of the cartilage rather than the deep zone. There was a tendency towards a high number of YKL-40-positive chondrocytes in areas of the femoral head under considerable biomechanical load. The number of chondrocytes with a positive…

  5. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    Energy Technology Data Exchange (ETDEWEB)

    Goodarzi, Samereh, E-mail: samere.g@gmail.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Pazirandeh, Ali, E-mail: paziran@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Jameie, Seyed Behnamedin, E-mail: behnamjameie@tums.ac.ir [Basic Science Department, Faculty of Allied Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Anatomy, Faculty of Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Baghban Khojasteh, Nasrin, E-mail: khojasteh_n@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of)

    2012-06-15

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g boric acid + 0.005 g borax + 10 ml distilled water, pH 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of male and female animal groups, with the highest value four hours after boron compound injection. Highlights: (1) Boron distribution in male and female rats' normal brain was studied in this research. (2) Coronal sections of animal tissue samples were irradiated with thermal neutrons. (3) Alpha and lithium tracks were counted using alpha autoradiography. (4) Different boron concentrations were seen in brain sections of male and female rats. (5) The highest boron concentration was seen 4 h after boron compound injection.

  6. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    Graphical method requires only a few calculations with simple equations and a table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.

  7. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Contents: Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix; Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers…

  8. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering

    Science.gov (United States)

    Gamayunov, K. V.; Khazanov, G. V.

    2006-01-01

    We calculate the pitch-angle diffusion coefficients using the typical wave normal distributions obtained from our self-consistent ring current-EMIC wave model, and we try to quantify the effect of EMIC wave normal angle characteristics on relativistic electron scattering.

  9. Application of bivariate mapping for hydrological classification and analysis of temporal change and scale effects in Switzerland

    NARCIS (Netherlands)

    Speich, Matthias J.R.; Bernhard, Luzi; Teuling, Ryan; Zappa, Massimiliano

    2015-01-01

    Hydrological classification schemes are important tools for assessing the impacts of a changing climate on the hydrology of a region. In this paper, we present bivariate mapping as a simple means of classifying hydrological data for a quantitative and qualitative assessment of temporal change.

  10. Skewed Normal Distribution Of Return Assets In Call European Option Pricing

    Directory of Open Access Journals (Sweden)

    Evy Sulistianingsih

    2011-12-01

    Options are one type of derivative security. In a financial market, an option is a contract that gives its owner the right (not the obligation) to buy or sell a particular asset for a certain price at a certain time, and it can thus provide a guarantee against risk faced in the market. This paper studies the use of the skewed normal distribution (SN) in European call option pricing. The SN provides a flexible framework that captures the skewness of log returns. We obtain a closed-form solution for the European call option price when log returns follow the SN. We then compare the option prices obtained from the SN and from the Black-Scholes model with market option prices. Keywords: skewed normal distribution, log return, options.
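
    For reference, the Black-Scholes benchmark against which the SN prices are compared is standard. A minimal Python sketch with illustrative inputs:

        # Sketch: Black-Scholes European call price (the comparison baseline).
        import numpy as np
        from scipy.stats import norm

        def bs_call(S, K, T, r, sigma):
            d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
            d2 = d1 - sigma * np.sqrt(T)
            return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        print(round(bs_call(S=100, K=100, T=0.5, r=0.03, sigma=0.2), 4))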

  11. Reply to: Are There More Gifted People than Would Be Expected on a Normal Distribution?

    Science.gov (United States)

    Gallagher, James J.

    2014-01-01

    The author responds to the article by Warne, Godwin, and Smith (2013) on the question of whether there are more gifted people than would be expected in a Gaussian normal distribution. He asserts that the answer to this question is yes, based on (a) data that he and his colleagues have collected, (b) data that are already available and quoted by…

  12. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    Science.gov (United States)

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three meta-analysis sizes (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. By contrast, proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma et al., shows convergence problems, and the random-effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.

  13. Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles

    Science.gov (United States)

    Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya

    2018-03-01

    A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.

  14. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete-continuous mixtures of generalized logit-normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over…
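
    As we read this line of work, the generalized logit transform is y = log(x^ν / (1 − x^ν)) for normalized power x in (0, 1), with ν a shape parameter; treat that exact form as our assumption rather than a quotation from the paper. A minimal Python sketch of the transform and its inverse:

        # Sketch: generalized logit transform (form assumed, see lead-in).
        import numpy as np

        def glogit(x, nu=1.0):
            return np.log(x**nu / (1.0 - x**nu))

        def glogit_inv(y, nu=1.0):
            return (np.exp(y) / (1.0 + np.exp(y))) ** (1.0 / nu)

        x = np.clip(np.array([0.05, 0.5, 0.95]), 1e-6, 1 - 1e-6)
        print(glogit_inv(glogit(x, 2.0), 2.0))  # round-trip check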

  15. Bivariate Genomic Footprinting Detects Changes in Transcription Factor Activity

    Directory of Open Access Journals (Sweden)

    Songjoon Baek

    2017-05-01

    In response to activating signals, transcription factors (TFs) bind DNA and regulate gene expression. TF binding can be measured by protection of the bound sequence from DNase digestion (i.e., a footprint). Here, we report that 80% of TF binding motifs do not show a measurable footprint, partly because of a variable cleavage pattern within the motif sequence. To more faithfully portray the effect of TFs on chromatin, we developed an algorithm that captures two TF-dependent effects on chromatin accessibility: footprinting and motif-flanking accessibility. The algorithm, termed bivariate genomic footprinting (BaGFoot), efficiently detects TF activity. BaGFoot is robust to different accessibility assays (DNase-seq, ATAC-seq), all examined peak-calling programs, and a variety of cut bias correction approaches. BaGFoot reliably predicts TF binding and provides valuable information regarding the TFs affecting chromatin accessibility in various biological systems and following various biological events, including in cases where an absolute footprint cannot be determined.

  16. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    Science.gov (United States)

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression, and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits, and the inflation increased as the minor allele frequency decreased and as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the locations of SNVs with a type I error.
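
    The rank-based inverse normal transformation mentioned above maps ranks to normal quantiles. A minimal Python sketch using the common Blom offset c = 3/8 (the offset choice is a convention, not taken from the workshop data):

        # Sketch: rank-based inverse normal transformation of a skewed trait.
        import numpy as np
        from scipy.stats import norm, rankdata, skew

        def inverse_normal_transform(x, c=3.0 / 8):
            r = rankdata(x)
            return norm.ppf((r - c) / (len(x) - 2 * c + 1))

        rng = np.random.default_rng(4)
        trait = rng.gamma(shape=1.0, scale=2.0, size=1000)   # skewed trait
        print(round(skew(trait), 2),
              round(skew(inverse_normal_transform(trait)), 2))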

  17. Preparation, distribution, stability and tumor imaging properties of [62Zn] Bleomycin complex in normal and tumor-bearing mice

    International Nuclear Information System (INIS)

    Jalilian, A.R.; Fateh, B.; Ghergherehchi, M.; Karimian, A.; Matloobi, M.; Moradkhani, S.; Kamalidehghan, M.; Tabeie, F.

    2003-01-01

    Background: Bleomycin (BLM) has been labeled with radioisotopes and widely used in therapy and diagnosis. In this study, BLM was labeled with [62Zn] zinc chloride for oncologic PET studies. Materials and methods: The complex was obtained at pH 2 in normal saline at 90 °C in 60 min. Radio-TLC showed an overall radiochemical yield of 95-97% (radiochemical purity >97%). The stability of the complex was checked in vitro in mice and in human plasma/urine. Results: Preliminary studies were performed to determine the stability of the complex and the distribution of [62Zn]BLM in normal mice and in mice bearing fibrosarcoma tumors, based on biodistribution/imaging studies. Conclusion: [62Zn]BLM can be used in PET oncology studies owing to its suitable physico-chemical properties and its behavior as a diagnostic complex in higher animals.

  18. Improving risk estimates of runoff producing areas: formulating variable source areas as a bivariate process.

    Science.gov (United States)

    Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd

    2014-05-01

    Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resource Conservation Service-Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. Copyright © 2014 Elsevier Ltd. All rights reserved.
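
    For context, the NRCS (SCS) Curve Number relation that the method builds on is Q = (P − Ia)² / (P − Ia + S) for P > Ia (else Q = 0), with S = 1000/CN − 10 in inches and Ia = 0.2·S. A minimal Python sketch with illustrative values:

        # Sketch: NRCS (SCS) Curve Number runoff equation, US customary units.
        def scs_runoff(P_in, CN):
            S = 1000.0 / CN - 10.0         # potential maximum retention (in)
            Ia = 0.2 * S                   # initial abstraction (in)
            if P_in <= Ia:
                return 0.0
            return (P_in - Ia) ** 2 / (P_in - Ia + S)

        for cn in (60, 75, 90):            # more permeable to less permeable covers
            print(cn, round(scs_runoff(P_in=3.0, CN=cn), 2))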

  19. Exact, time-independent estimation of clone size distributions in normal and mutated cells.

    Science.gov (United States)

    Roshan, A; Jones, P H; Greenman, C D

    2014-10-06

    Biological tools such as genetic lineage tracing, three-dimensional confocal microscopy and next-generation DNA sequencing are providing new ways to quantify the distribution of clones of normal and mutated cells. Understanding population-wide clone size distributions in vivo is complicated by multiple cell types within observed tissues, and overlapping birth and death processes. This has led to the increased need for mathematically informed models to understand their biological significance. Standard approaches usually require knowledge of clonal age. We show that modelling on clone size independent of time is an alternative method that offers certain analytical advantages; it can help parametrize these models, and obtain distributions for counts of mutated or proliferating cells, for example. When applied to a general birth-death process common in epithelial progenitors, this takes the form of a gambler's ruin problem, the solution of which relates to counting Motzkin lattice paths. Applying this approach to mutational processes, alternative, exact, formulations of classic Luria-Delbrück-type problems emerge. This approach can be extended beyond neutral models of mutant clonal evolution. Applications of these approaches are twofold. First, we resolve the probability of progenitor cells generating proliferating or differentiating progeny in clonal lineage tracing experiments in vivo or cell culture assays where clone age is not known. Second, we model mutation frequency distributions that deep sequencing of subclonal samples produce.

  20. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.
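
    For readers unfamiliar with the covariance family used here, the sketch below evaluates a univariate Matérn covariance with scipy. In the bivariate model each field carries its own smoothness, and the cross-covariance is a further Matérn term scaled by a cross-correlation parameter; this generic fragment is not the authors' fitting code, and the parametrization (range parameter a, no sqrt(2*nu) scaling) is one of several in common use.

        import numpy as np
        from scipy.special import gamma, kv

        def matern(h, sigma2=1.0, nu=1.5, a=1.0):
            """Matern covariance: variance sigma2, smoothness nu, range a."""
            h = np.atleast_1d(np.asarray(h, dtype=float))
            c = np.full(h.shape, sigma2)              # C(0) = sigma2
            pos = h > 0
            u = h[pos] / a
            c[pos] = sigma2 * 2 ** (1 - nu) / gamma(nu) * u ** nu * kv(nu, u)
            return c

        # Larger nu gives a smoother field; nu = 0.5 recovers the exponential model.
        print(matern([0.0, 0.5, 1.0], nu=0.5))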

  1. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.; Jun, M.

    2015-01-01

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.

  2. Comparative pharmacokinetic and tissue distribution profiles of four major bioactive components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Wang, Chang-Hong; Tao, Yan-Yan; Zhou, Hua; Liu, Cheng-Hai

    2015-10-10

    Fuzheng Huayu recipe (FZHY) is a herbal product for the treatment of liver fibrosis approved by the Chinese State Food and Drug Administration (SFDA), but its pharmacokinetics and tissue distribution had not been investigated. In this study, the liver fibrotic model was induced with intraperitoneal injection of dimethylnitrosamine (DMN), and FZHY was given orally to the model and normal rats. The plasma pharmacokinetics and tissue distribution profiles of four major bioactive components from FZHY were analyzed in the normal and fibrotic rat groups using an ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. Results revealed that the bioavailabilities of danshensu (DSS), salvianolic acid B (SAB) and rosmarinic acid (ROS) in liver fibrotic rats increased 1.49-, 3.31- and 2.37-fold, respectively, compared to normal rats. There was no obvious difference in the pharmacokinetics of amygdalin (AMY) between the normal and fibrotic rats. The tissue distribution of DSS, SAB, and AMY tended to be mostly in the kidney and lung. The distribution of DSS, SAB, and AMY in liver tissue of the model rats was significantly decreased compared to the normal rats. Significant differences in the pharmacokinetics and tissue distribution profiles of DSS, ROS, SAB and AMY were observed in rats with hepatic fibrosis after oral administration of FZHY. These results provide a meaningful basis for developing a clinical dosage regimen in the treatment of hepatic fibrosis by FZHY. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. SNPMClust: Bivariate Gaussian Genotype Clustering and Calling for Illumina Microarrays

    Directory of Open Access Journals (Sweden)

    Stephen W. Erickson

    2016-07-01

    SNPMClust is an R package for genotype clustering and calling with Illumina microarrays. It was originally developed for studies using the GoldenGate custom genotyping platform but can be used with other Illumina platforms, including Infinium BeadChip. The algorithm first rescales the fluorescent signal intensity data, adds empirically derived pseudo-data to minor allele genotype clusters, then uses the package mclust for bivariate Gaussian model fitting. We compared the accuracy and sensitivity of SNPMClust to that of GenCall, Illumina's proprietary algorithm, on a data set of 94 whole-genome amplified buccal (cheek swab) DNA samples. These samples were genotyped on a custom panel which included 1064 SNPs for which the true genotype was known with high confidence. SNPMClust produced uniformly lower false call rates over a wide range of overall call rates.
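
    The core clustering step can be illustrated with any bivariate Gaussian mixture fitter. The Python sketch below uses scikit-learn on synthetic two-channel intensities as a stand-in for mclust and for real rescaled Illumina signals; SNPMClust's pseudo-data augmentation of minor allele clusters is omitted.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Synthetic two-channel intensities for genotypes AA, AB, BB (illustrative).
        centers = np.array([[0.9, 0.1], [0.5, 0.5], [0.1, 0.9]])
        x = np.vstack([rng.normal(m, 0.05, size=(200, 2)) for m in centers])

        gmm = GaussianMixture(n_components=3, covariance_type="full",
                              random_state=0).fit(x)
        calls = gmm.predict(x)            # cluster labels standing in for genotype calls
        posterior = gmm.predict_proba(x)  # per-sample call confidence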

  4. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  5. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    CERN Document Server

    Smolyar, V A; Eremin, V V

    2002-01-01

    Within a kinetic-equation diffusion model for a beam of electrons incident on a target along the normal, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for a plane directed electron source in an infinite medium of C, Al, Sn and Pb are in good agreement with the Spencer data, which were derived from an accurate solution of the Bethe equation, the starting equation under the diffusion-model assumption.

  6. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    International Nuclear Information System (INIS)

    Smolyar, V.A.; Eremin, A.V.; Eremin, V.V.

    2002-01-01

    Within a kinetic-equation diffusion model for a beam of electrons incident on a target along the normal, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for a plane directed electron source in an infinite medium of C, Al, Sn and Pb are in good agreement with the Spencer data, which were derived from an accurate solution of the Bethe equation, the starting equation under the diffusion-model assumption. [ru]

  7. The Weight of Euro Coins: Its Distribution Might Not Be as Normal as You Would Expect

    Science.gov (United States)

    Shkedy, Ziv; Aerts, Marc; Callaert, Herman

    2006-01-01

    Classical regression models, ANOVA models and linear mixed models are just three examples (out of many) in which the normal distribution of the response is an essential assumption of the model. In this paper we use a dataset of 2000 euro coins containing information (up to the milligram) about the weight of each coin, to illustrate that the…

  8. Statistical Inference for a Class of Multivariate Negative Binomial Distributions

    DEFF Research Database (Denmark)

    Rubak, Ege H.; Møller, Jesper; McCullagh, Peter

    This paper considers statistical inference procedures for a class of models for positively correlated count variables called α-permanental random fields, which can be viewed as a family of multivariate negative binomial distributions. Their appealing probabilistic properties have earlier been studied in the literature, while this is the first statistical paper on α-permanental random fields. The focus is on maximum likelihood estimation, maximum quasi-likelihood estimation and maximum composite likelihood estimation based on uni- and bivariate distributions. Furthermore, new results...

  9. Rough Sets and Stomped Normal Distribution for Simultaneous Segmentation and Bias Field Correction in Brain MR Images.

    Science.gov (United States)

    Banerjee, Abhirup; Maji, Pradipta

    2015-12-01

    The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis techniques, particularly due to the presence of the intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It judiciously integrates the concept of rough sets and the merit of a novel probability distribution, called the stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by the SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of a brain MR image is modeled as a mixture of a finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.

  10. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and the dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  11. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
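
    As a worked example of the normal-distribution properties the article covers, the snippet below computes the probability mass within ±1.96 standard deviations of the mean, the familiar 95% interval behind many confidence-interval calculations.

        from scipy.stats import norm

        # P(mu - 1.96*sigma < X < mu + 1.96*sigma) is the same for every normal
        # distribution, so it can be read off the standard normal CDF.
        p = norm.cdf(1.96) - norm.cdf(-1.96)
        print(round(p, 4))  # 0.95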

  12. Log-normal distribution of 222Rn in the state of Zacatecas, Mexico

    International Nuclear Information System (INIS)

    Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    This work presents an evaluation of the concentration of 222Rn in air for Zacatecas. Solid State Nuclear Track Detectors with cellulose nitrate LR-115, type 2, in open 222Rn chambers were used to carry out the measurements on a large scale. The measurements were carried out over three months at different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors of residential buildings, indoors of occupational buildings, and in the 57 municipal seats of the state of Zacatecas. The concentration statistics varied with the time of year, with higher values in the winter season in both cases. The distribution of the 222Rn concentration is presented on the state map for each municipality, representing the measurement sites across the entire state of Zacatecas. Finally, the places where the 222Rn concentration in air approaches the EPA limit of 148 Bq/m³ are presented. (Author)

  13. Quantum arrival times and operator normalization

    International Nuclear Information System (INIS)

    Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.

    2003-01-01

    A recent approach to arrival times used the fluorescence of an atom entering a laser illuminated region, and the resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal, neither in limiting cases nor after compensation of reflection losses by normalization on the level of expectation values. In this paper we employ a normalization on the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and connected to Kijowski's distribution through operator normalization

  14. Topographical Distribution of Arsenic, Manganese, and Selenium in the Normal Human Brain

    DEFF Research Database (Denmark)

    Larsen, Niels Agersnap; Pakkenberg, H.; Damsgaard, Else

    1979-01-01

    The concentrations of arsenic, manganese and selenium per gram wet tissue weight were determined in samples from 24 areas of normal human brains from 5 persons ranging in age from 15 to 81 years. The concentrations of the 3 elements were determined for each sample by means of neutron activation analysis with radiochemical separation. Distinct patterns of distribution were shown for each of the 3 elements. Variations between individuals were found for some but not all brain areas, resulting in coefficients of variation between individuals of about 30% for arsenic, 10% for manganese and 20% for selenium. The results seem to indicate that arsenic is associated with the lipid phase, manganese with the dry matter and selenium with the aqueous phase of brain tissue.

  15. Statistical distributions applications and parameter estimates

    CERN Document Server

    Thomopoulos, Nick T

    2017-01-01

    This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability.  Understanding statistical distributions is fundamental for researchers in almost all disciplines.  The informed researcher will select the statistical distribution that best fits the data in the study at hand.  Some of the distributions are well known to the general researcher and are in use in a wide variety of ways.  Other useful distributions are less understood and are not in common use.  The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study.  The distributions are for continuous, discrete, and bivariate random variables.  In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values.  In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...

  16. Geovisualization of land use and land cover using bivariate maps and Sankey flow diagrams

    Science.gov (United States)

    Strode, Georgianna; Mesev, Victor; Thornton, Benjamin; Jerez, Marjorie; Tricarico, Thomas; McAlear, Tyler

    2018-05-01

    The terms 'land use' and 'land cover' typically describe categories that convey information about the landscape. Despite the major difference of land use implying some degree of anthropogenic disturbance, the two terms are commonly used interchangeably, especially when anthropogenic disturbance is ambiguous, say managed forestland or abandoned agricultural fields. Cartographically, land use and land cover are also sometimes represented interchangeably within common legends, giving the impression that the landscape is a seamless continuum of land use parcels spatially adjacent to land cover tracts. We believe this is misleading, and feel we need to reiterate the well-established symbiosis of land uses as amalgams of land covers; in other words, land covers are subsets of land use. Our paper addresses this spatially complex, and frequently ambiguous, relationship and posits that bivariate cartographic techniques are an ideal vehicle for representing both land use and land cover simultaneously. In more specific terms, we explore the use of nested symbology to represent land use and land cover graphically, where land covers are circles nested within land use squares. We also investigate bivariate legends for representing statistical covariance as a means of visualizing the combinations of land use and cover. Lastly, we apply Sankey flow diagrams to further illustrate the complex, multifaceted relationships between land use and land cover. Our work is demonstrated on data representing land use and cover for the US state of Florida.

  17. Surface to 90 km winds for Kennedy Space Center, Florida, and Vandenberg AFB, California

    Science.gov (United States)

    Johnson, D. L.; Brown, S. C.

    1979-01-01

    Bivariate normal wind statistics for a 90 degree flight azimuth, from 0 through 90 km altitude, for Kennedy Space Center, Florida, and Vandenberg AFB, California are presented. Wind probability distributions and statistics for any rotation of axes can be computed from the five given parameters.
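
    The five parameters are the two component means, the two standard deviations, and the correlation coefficient. Rotating the axes is ordinary linear-transform algebra for a bivariate normal; the sketch below is a generic illustration, not the report's tabulation code.

        import numpy as np

        def rotated_wind_stats(mu_u, mu_v, sigma_u, sigma_v, rho, theta_deg):
            """Mean and standard deviation of the wind component along an axis
            rotated theta_deg from the u-axis, from the five bivariate normal
            parameters."""
            t = np.radians(theta_deg)
            c, s = np.cos(t), np.sin(t)
            mean = mu_u * c + mu_v * s
            var = (sigma_u * c) ** 2 + 2 * rho * sigma_u * sigma_v * c * s \
                  + (sigma_v * s) ** 2
            return mean, np.sqrt(var)

        print(rotated_wind_stats(5.0, -2.0, 4.0, 3.0, 0.3, 45.0))  # illustrative values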

  18. Effects of a primordial magnetic field with log-normal distribution on the cosmic microwave background

    International Nuclear Information System (INIS)

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Takahashi, Keitaro

    2011-01-01

    We study the effect of primordial magnetic fields (PMFs) on the anisotropies of the cosmic microwave background (CMB). We assume the spectrum of PMFs is described by a log-normal distribution, which has a characteristic scale, rather than by a power-law spectrum. This scale is expected to reflect the generation mechanism, and our analysis is complementary to previous studies with power-law spectra. We calculate the power spectra of the energy density and Lorentz force of the log-normal PMFs, and then calculate the CMB temperature and polarization angular power spectra from the scalar, vector, and tensor modes of perturbations generated by such PMFs. By comparing these spectra with the WMAP7, QUaD, CBI, Boomerang, and ACBAR data sets, we find that the current CMB data place the strongest constraint at k ≅ 10^(-2.5) Mpc^(-1), with the upper limit B ≲ 3 nG.

  19. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  20. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus

    2017-01-01

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  1. Pharmacokinetics and tissue distribution of five active ingredients of Eucommiae cortex in normal and ovariectomized mice by UHPLC-MS/MS.

    Science.gov (United States)

    An, Jing; Hu, Fangdi; Wang, Changhong; Zhang, Zijia; Yang, Li; Wang, Zhengtao

    2016-09-01

    1. Pinoresinol di-O-β-d-glucopyranoside (PDG), geniposide (GE), geniposidic acid (GA), aucubin (AN) and chlorogenic acid (CA) are the representative active ingredients in Eucommiae cortex (EC), which may be estrogenic. 2. The ultra high-performance liquid chromatography/tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous determination of the five ingredients showed good linearity, low limits of quantification and high extraction recoveries, as well as acceptable precision, accuracy and stability in mice plasma and tissue samples (liver, spleen, kidney and uterus). It was successfully applied to a comparative study of the pharmacokinetics and tissue distribution of PDG, GE, GA, AN and CA between normal and ovariectomized (OVX) mice. 3. The results indicated that, except for CA, the plasma and tissue concentrations of PDG, GE and GA in OVX mice were all greater than those in normal mice. AN could only be detected in the plasma and liver homogenate of normal mice; it was poorly absorbed in OVX mice and low in the other measured tissues. PDG, GE and GA appear to be better absorbed in OVX mice than in normal mice, as shown by the markedly increased values of AUC0-∞ and Cmax. It is beneficial that PDG, GE and GA have better plasma absorption and tissue distribution in the pathological state.

  2. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unidentified and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show their dependence on the shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
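
    The Monte Carlo approximation of critical values proceeds as sketched below. For brevity, this stand-in uses the Kolmogorov-Smirnov statistic with a normal null whose parameters are re-estimated from each simulated sample; the study itself does the analogous computation for the three-parameter GNO with L-moment estimation.

        import numpy as np
        from scipy import stats

        def mc_critical_value(n, alpha=0.05, nrep=5000, seed=1):
            """Monte Carlo critical value of the KS statistic when parameters
            are estimated from the sample (Lilliefors-style)."""
            rng = np.random.default_rng(seed)
            d = np.empty(nrep)
            for i in range(nrep):
                x = rng.normal(size=n)
                d[i] = stats.kstest(x, "norm",
                                    args=(x.mean(), x.std(ddof=1))).statistic
            return np.quantile(d, 1 - alpha)

        print(mc_critical_value(50))  # critical value for samples of size 50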

  3. Distribution of erlotinib in rash and normal skin in cancer patients receiving erlotinib visualized by matrix assisted laser desorption/ionization mass spectrometry imaging.

    Science.gov (United States)

    Nishimura, Meiko; Hayashi, Mitsuhiro; Mizutani, Yu; Takenaka, Kei; Imamura, Yoshinori; Chayahara, Naoko; Toyoda, Masanori; Kiyota, Naomi; Mukohara, Toru; Aikawa, Hiroaki; Fujiwara, Yasuhiro; Hamada, Akinobu; Minami, Hironobu

    2018-04-06

    The development of skin rashes is the most common adverse event observed in cancer patients treated with epidermal growth factor receptor tyrosine kinase inhibitors such as erlotinib. However, the pharmacological evidence has not been fully revealed. We examined patients with advanced pancreatic cancer who developed skin rashes after treatment with erlotinib and gemcitabine. We biopsied both the rash and adjacent normal skin tissues, and visualized and compared the distribution of erlotinib within the skin using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The tissue concentration of erlotinib was also measured by liquid chromatography-tandem mass spectrometry (LC-MS/MS) with laser microdissection. Erlotinib distribution in the rashes was more heterogeneous than that in the normal skin, and the rashes contained statistically higher concentrations of erlotinib than adjacent normal skin in the superficial skin layer (229 ± 192 vs. 120 ± 103 ions/mm²; P = 0.009 in paired t-test). LC-MS/MS confirmed that the concentration of erlotinib in the skin rashes was higher than that in normal skin in the superficial skin layer (1946 ± 1258 vs. 1174 ± 662 ng/cm³; P = 0.028 in paired t-test). The results of MALDI-MSI and LC-MS/MS were well correlated (coefficient of correlation 0.879). The distribution of erlotinib in the skin tissue was thus visualized using non-labeled MALDI-MSI, and the erlotinib concentration in the superficial layer of the skin rashes was higher than that in the adjacent normal skin.

  4. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
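
    Of the three BSA techniques, the frequency ratio is the simplest to state: each class's share of hazard occurrences divided by its share of the study area. The minimal Python sketch below shows only this underlying arithmetic; the BSM tool itself wraps such calculations in an ArcMAP interface.

        import numpy as np

        def frequency_ratio(classes, hazard, class_ids):
            """Frequency ratio per class. classes: class id per cell;
            hazard: 0/1 occurrence per cell; both arrays have the same shape."""
            fr = {}
            for c in class_ids:
                in_c = classes == c
                area_share = in_c.sum() / classes.size
                hazard_share = hazard[in_c].sum() / hazard.sum()
                fr[c] = hazard_share / area_share if area_share > 0 else np.nan
            return fr

        # Illustrative grid: three classes, hazards concentrated in class 2.
        classes = np.array([0, 0, 1, 1, 2, 2, 2, 2])
        hazard = np.array([0, 0, 0, 1, 1, 1, 1, 0])
        print(frequency_ratio(classes, hazard, [0, 1, 2]))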

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  6. Visualizing Tensor Normal Distributions at Multiple Levels of Detail.

    Science.gov (United States)

    Abbasloo, Amin; Wiens, Vitalis; Hermann, Max; Schultz, Thomas

    2016-01-01

    Despite the widely recognized importance of symmetric second order tensor fields in medicine and engineering, the visualization of data uncertainty in tensor fields is still in its infancy. A recently proposed tensorial normal distribution, involving a fourth order covariance tensor, provides a mathematical description of how different aspects of the tensor field, such as trace, anisotropy, or orientation, vary and covary at each point. However, this wealth of information is far too rich for a human analyst to take in at a single glance, and no suitable visualization tools are available. We propose a novel approach that facilitates visual analysis of tensor covariance at multiple levels of detail. We start with a visual abstraction that uses slice views and direct volume rendering to indicate large-scale changes in the covariance structure, and locations with high overall variance. We then provide tools for interactive exploration, making it possible to drill down into different types of variability, such as in shape or orientation. Finally, we allow the analyst to focus on specific locations of the field, and provide tensor glyph animations and overlays that intuitively depict confidence intervals at those points. Our system is demonstrated by investigating the effects of measurement noise on diffusion tensor MRI, and by analyzing two ensembles of stress tensor fields from solid mechanics.

  7. Bivariate quadratic method in quantifying the differential capacitance and energy capacity of supercapacitors under high current operation

    Science.gov (United States)

    Goh, Chin-Teng; Cruden, Andrew

    2014-11-01

    Capacitance and resistance are the fundamental electrical parameters used to evaluate the electrical characteristics of a supercapacitor, namely the dynamic voltage response, energy capacity, state of charge and health condition. In the British Standards EN62391 and EN62576, the constant capacitance method can be further improved with a differential capacitance that more accurately describes the dynamic voltage response of supercapacitors. This paper presents a novel bivariate quadratic based method to model the dynamic voltage response of supercapacitors under high current charge-discharge cycling, and to enable the derivation of the differential capacitance and energy capacity directly from terminal measurements, i.e. voltage and current, rather than from multiple pulsed-current or excitation signal tests across different bias levels. The estimation results the author achieves are in close agreement with experimental measurements, within a relative error of 0.2%, at various high current levels (25-200 A), more accurate than the constant capacitance method (4-7%). The archival value of this paper is the introduction of an improved quantification method for the electrical characteristics of supercapacitors, and the disclosure of the distinct properties of supercapacitors: the nonlinear capacitance-voltage characteristic, capacitance variation between charging and discharging, and distribution of energy capacity across the operating voltage window.
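
    A bivariate quadratic response surface of this kind can be fitted by ordinary least squares on six monomial terms. The sketch below is generic; the paper's particular input variables (and its derivation of differential capacitance from the fitted surface) are not reproduced here.

        import numpy as np

        def fit_bivariate_quadratic(x, y, z):
            """Least-squares coefficients b for
            z ~ b0 + b1*x + b2*y + b3*x**2 + b4*x*y + b5*y**2."""
            a = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
            coef, *_ = np.linalg.lstsq(a, z, rcond=None)
            return coef

        # Illustrative use: synthetic voltage samples against charge throughput
        # and current level (hypothetical units and coefficients).
        rng = np.random.default_rng(0)
        q = rng.uniform(0.0, 1.0, 200)
        i = rng.uniform(50.0, 200.0, 200)
        v = 1.0 + 1.5 * q - 0.2 * q**2 + 0.001 * i
        print(fit_bivariate_quadratic(q, i, v))  # recovers the coefficients above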

  8. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    Science.gov (United States)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study analyzed the effect of a non-linear learning (NLL) model in online tutorial (OT) content on students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental study design. The subjects of the study were divided into an experimental class, which was given OT content in the NLL model, and a control class, which was given OT content in the conventional learning (CL) model. Data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. The statistical analysis of the KONDA gain scores showed that, for students with low and moderate SPK scores, the KONDA of students who learned OT content with the NLL model was better than that of students who learned OT content with the CL model. Meanwhile, for students with high SPK scores, the gain score under the NLL model was similar to that under the CL model. Based on those findings, it could be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. An extra, more challenging didactical situation is needed for students at a high SPK level to achieve a significant gain score.

  9. A Platoon Dispersion Model Based on a Truncated Normal Distribution of Speed

    Directory of Open Access Journals (Sweden)

    Ming Wei

    2012-01-01

    Understanding platoon dispersion is critical for the coordination of traffic signal control in an urban traffic network. Assuming that platoon speed follows a truncated normal distribution, ranging from minimum speed to maximum speed, this paper develops a piecewise density function that describes platoon dispersion characteristics as the platoon moves from an upstream to a downstream intersection. Based on this density function, the expected number of cars in the platoon that pass the downstream intersection, and the expected number of cars in the platoon that do not pass the downstream point are calculated. To facilitate coordination in a traffic signal control system, dispersion models for the front and the rear of the platoon are also derived. Finally, a numeric computation for the coordination of successive signals is presented to illustrate the validity of the proposed model.
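
    The truncated normal speed model is available directly in standard libraries, which makes the platoon quantities easy to prototype. In the sketch below the speeds and limits are invented for illustration; scipy's truncnorm takes its truncation bounds in standardized units.

        from scipy.stats import truncnorm

        mu, sigma = 12.0, 3.0       # mean and std of platoon speed (m/s), illustrative
        v_min, v_max = 5.0, 20.0    # minimum and maximum speeds (m/s)
        a, b = (v_min - mu) / sigma, (v_max - mu) / sigma
        speed = truncnorm(a, b, loc=mu, scale=sigma)

        # Fraction of the platoon expected to cover 300 m within 30 s,
        # i.e. P(V >= 10 m/s) under the truncated normal speed model.
        print(speed.sf(300.0 / 30.0))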

  10. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.

  11. Distribution of Basement Membrane Molecules, Laminin and Collagen Type IV, in Normal and Degenerated Cartilage Tissues.

    Science.gov (United States)

    Foldager, Casper Bindzus; Toh, Wei Seong; Gomoll, Andreas H; Olsen, Bjørn Reino; Spector, Myron

    2014-04-01

    The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti-collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage, only collagen type IV was found pericellularly around chondrocytes, but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues, laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins.

  12. Distribution of Basement Membrane Molecules, Laminin and Collagen Type IV, in Normal and Degenerated Cartilage Tissues

    Science.gov (United States)

    Toh, Wei Seong; Gomoll, Andreas H.; Olsen, Bjørn Reino; Spector, Myron

    2014-01-01

    Objective: The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Design: Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti-collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. Results: When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage, only collagen type IV was found pericellularly around chondrocytes, but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. Conclusions: We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues, laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins.

  13. Using bivariate latent basis growth curve analysis to better understand treatment outcome in youth with anorexia nervosa.

    Science.gov (United States)

    Byrne, Catherine E; Wonderlich, Joseph A; Curby, Timothy; Fischer, Sarah; Lock, James; Le Grange, Daniel

    2018-04-25

    This study explored the relation between eating-related obsessionality and weight restoration utilizing bivariate latent basis growth curve modelling. Eating-related obsessionality is a moderator of treatment outcome for adolescents with anorexia nervosa (AN). This study examined the degree to which the rate of change in eating-related obsessionality was associated with the rate of change in weight over time in family-based treatment (FBT) and individual therapy for AN. Data were drawn from a 2-site randomized controlled trial that compared FBT and adolescent focused therapy for AN. Bivariate latent basis growth curves were used to examine the differences of the relations between trajectories of body weight and symptoms associated with eating and weight obsessionality. In the FBT group, the slope of eating-related obsessionality scores and the slope of weight were significantly (negatively) correlated. This finding indicates that a decrease in overall eating-related obsessionality is significantly associated with an increase in weight for individuals who received FBT. However, there was no relation between change in obsessionality scores and change in weight in the adolescent focused therapy group. Results suggest that FBT has a specific impact on both weight gain and obsessive compulsive behaviour that is distinct from individual therapy. Copyright © 2018 John Wiley & Sons, Ltd and Eating Disorders Association.

  14. Bivariate Cointegration Analysis of Energy-Economy Interactions in Iran

    Directory of Open Access Journals (Sweden)

    Ismail Oladimeji Soile

    2015-12-01

    Fixing the prices of energy products below their opportunity cost for welfare and redistribution purposes is common among governments of many oil-producing developing countries. This has often resulted in huge energy consumption in developing countries, and the question that emerges is whether this increased energy consumption results in higher economic activity. Available statistics show that Iran's economic growth shrank for the first time in two decades from 2011, amidst the introduction of pricing reforms in 2010 and 2014, suggesting a relationship between energy use and economic growth. Accordingly, the study examined the causality and the likelihood of a long-term relationship between energy and economic growth in Iran. Unlike previous studies, which have focused on the effects and effectiveness of the reform, this paper investigates the rationale for the reform. The study applied a bivariate cointegration time-series econometric approach. The results reveal a one-way causality running from economic growth to energy, with no feedback, and evidence of a long-run connection. The implication of this is that energy conservation policy is not inimical to economic growth. This evidence lends further support to the ongoing subsidy reforms in Iran as a measure to check excessive and inefficient use of energy.
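
    A bivariate Engle-Granger cointegration check of the kind such studies rely on can be sketched in a few lines with statsmodels. The series below are synthetic stand-ins for log GDP and log energy use, not Iranian data.

        import numpy as np
        from statsmodels.tsa.stattools import coint

        rng = np.random.default_rng(0)
        gdp = np.cumsum(rng.normal(0.02, 0.1, 200))     # random-walk "log GDP"
        energy = 0.8 * gdp + rng.normal(0.0, 0.1, 200)  # cointegrated by construction

        t_stat, p_value, _ = coint(gdp, energy)         # Engle-Granger two-step test
        print(p_value)  # a small p-value rejects "no cointegration"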

  15. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, Addendum

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
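
    For readers who want the flavor of such an iterative procedure, below is a standard EM sketch for a two-component univariate normal mixture. This is the generic textbook recursion, not the specific algorithm or the convergence analysis of the paper.

        import numpy as np
        from scipy.stats import norm

        def em_two_normals(x, iters=200):
            """EM estimates (weights, means, std devs) for a 2-component mixture."""
            w = np.array([0.5, 0.5])
            mu = np.percentile(x, [25, 75])           # crude initial means
            sd = np.array([x.std(), x.std()])
            for _ in range(iters):
                # E-step: responsibility of each component for each point.
                dens = np.vstack([w[k] * norm.pdf(x, mu[k], sd[k]) for k in (0, 1)])
                r = dens / dens.sum(axis=0)
                # M-step: weighted updates of weights, means and std devs.
                nk = r.sum(axis=1)
                w = nk / x.size
                mu = (r * x).sum(axis=1) / nk
                sd = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
            return w, mu, sd

        rng = np.random.default_rng(0)
        x = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 0.5, 200)])
        print(em_two_normals(x))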

  16. Retention and subcellular distribution of 67Ga in normal organs

    International Nuclear Information System (INIS)

    Ando, A.; Ando, I.; Hiraki, T.

    1986-01-01

    Using normal rats, the retention values and subcellular distribution of 67Ga in each organ were investigated. At 10 min after administration of 67Ga-citrate, the retention value of 67Ga in blood was 6.77% dose/g, and this value decreased with time. The values for skeletal muscle, lung, pancreas, adrenal, heart muscle, brain, small intestine, large intestine and spinal cord were highest at 10 min after administration, and they decreased with time. Conversely, the value in bone increased until 10 days after injection. In the liver, kidney, and stomach, the values increased with time after administration and were highest 24 h or 48 h after injection; after that, they decreased with time. The value in spleen reached a plateau 48 h after administration and hardly varied for 10 days. From the results of subcellular fractionation, it was deduced that the lysosome plays quite an important role in the concentration of 67Ga in the small intestine, stomach, lung, kidney and pancreas; a lesser role in its concentration in heart muscle; and hardly any role in the 67Ga accumulation in skeletal muscle. In spleen, the contents of the nuclear, mitochondrial, microsomal, and supernatant fractions all contributed to the accumulation of 67Ga. (orig.) [de]

  17. Are There More Gifted People Than Would Be Expected in a Normal Distribution? An Investigation of the Overabundance Hypothesis

    Science.gov (United States)

    Warne, Russell T.; Godwin, Lindsey R.; Smith, Kyle V.

    2013-01-01

    Among some gifted education researchers, advocates, and practitioners, it is sometimes believed that there is a larger number of gifted people in the general population than would be predicted from a normal distribution (e.g., Gallagher, 2008; N. M. Robinson, Zigler, & Gallagher, 2000; Silverman, 1995, 2009), a belief that we termed the…

  18. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by a Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms and the stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to a Gamma-distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.

  19. M-dwarf exoplanet surface density distribution. A log-normal fit from 0.07 to 400 AU

    Science.gov (United States)

    Meyer, Michael R.; Amara, Adam; Reggiani, Maddalena; Quanz, Sascha P.

    2018-04-01

    Aims: We fit a log-normal function to the M-dwarf orbital surface density distribution of gas giant planets, over the mass range 1-10 times that of Jupiter, from 0.07 to 400 AU. Methods: We used a Markov chain Monte Carlo approach to explore the likelihoods of various parameter values consistent with point estimates of the data given our assumed functional form. Results: This fit is consistent with radial velocity, microlensing, and direct-imaging observations, is well-motivated from theoretical and phenomenological points of view, and predicts results of future surveys. We present probability distributions for each parameter and a maximum likelihood estimate solution. Conclusions: We suggest that this function makes more physical sense than other widely used functions, and we explore the implications of our results on the design of future exoplanet surveys.
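
    The model being fitted is a log-normal profile in orbital separation. The sketch below fits that functional form by simple least squares to invented point estimates, purely to show the shape of the model; the paper itself explores the parameter likelihoods with Markov chain Monte Carlo.

        import numpy as np
        from scipy.optimize import curve_fit

        def lognormal_profile(a, amp, mu, sigma):
            """Surface density per unit ln(a): a log-normal in semimajor axis a."""
            return amp * np.exp(-(np.log(a) - np.log(mu)) ** 2 / (2.0 * sigma ** 2))

        # Hypothetical point estimates (NOT the paper's data): separation in AU
        # against companion surface density.
        sep = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
        dens = np.array([0.002, 0.006, 0.012, 0.015, 0.011, 0.005, 0.001])

        popt, pcov = curve_fit(lognormal_profile, sep, dens, p0=[0.02, 3.0, 1.5])
        print(popt)  # fitted amplitude, peak location (AU), log-width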

  20. Reliability Implications in Wood Systems of a Bivariate Gaussian-Weibull Distribution and the Associated Univariate Pseudo-truncated Weibull

    Science.gov (United States)

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2014-01-01

    Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...

  1. Normal bone and soft tissue distribution of fluorine-18-sodium fluoride and artifacts on 18F-NaF PET/CT bone scan: a pictorial review.

    Science.gov (United States)

    Sarikaya, Ismet; Elgazzar, Abdelhamid H; Sarikaya, Ali; Alfeeli, Mahmoud

    2017-10-01

    Fluorine-18-sodium fluoride (F-NaF) PET/CT is a relatively new and high-resolution bone imaging modality. Since the use of F-NaF PET/CT has been increasing, it is important to accurately assess the images and be aware of normal distribution and major artifacts. In this pictorial review article, we will describe the normal uptake patterns of F-NaF in the bone tissues, particularly in complex structures, as well as its physiologic soft tissue distribution and certain artifacts seen on F-NaF PET/CT images.

  2. Normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man; evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[131I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The normal adrenal glands were seldom seen and faintly imaged in 2% at 24 h after injection and in 16% at 48 h, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 h.

  3. The effect of signal variability on the histograms of anthropomorphic channel outputs: factors resulting in non-normally distributed data

    Science.gov (United States)

    Elshahaby, Fatma E. A.; Ghaly, Michael; Jha, Abhinav K.; Frey, Eric C.

    2015-03-01

    Model observers are widely used in medical imaging for the optimization and evaluation of instrumentation, acquisition parameters, and image reconstruction and processing methods. The channelized Hotelling observer (CHO) is a commonly used model observer in nuclear medicine and has seen increasing use in other modalities. An anthropomorphic CHO consists of a set of channels that model some aspects of the human visual system and the Hotelling observer, which is the optimal linear discriminant. The optimality of the CHO is based on the assumption that the channel outputs for data with and without the signal present have a multivariate normal distribution with equal class covariance matrices. The channel outputs result from the dot product of channel templates with input images and are thus the sum of a large number of random variables. The central limit theorem is thus often used to justify the assumption that the channel outputs are normally distributed. In this work, we aim to examine this assumption for realistically simulated nuclear medicine images when various types of signal variability are present.
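
    Because channel outputs are dot products of channel templates with images, the normality assumption can be probed directly, channel by channel, for example with the Shapiro-Wilk test. In the sketch below the templates and images are synthetic stand-ins, not the realistically simulated nuclear medicine images of the study.

```python
import numpy as np
from scipy import stats

# Compute channel outputs as template/image dot products, then test normality.
rng = np.random.default_rng(2)
npix, nchan, nimg = 64 * 64, 4, 500
templates = rng.normal(size=(nchan, npix))     # stand-in frequency channels
images = rng.poisson(5.0, size=(nimg, npix)).astype(float)  # stand-in images

outputs = images @ templates.T                 # (nimg, nchan) channel outputs
for c in range(nchan):
    W, p = stats.shapiro(outputs[:, c])
    print(f"channel {c}: Shapiro-Wilk W = {W:.4f}, p = {p:.3f}")
```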

  4. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  5. Genetic correlations between body condition scores and fertility in dairy cattle using bivariate random regression models.

    Science.gov (United States)

    De Haas, Y; Janss, L L G; Kadarmideen, H N

    2007-10-01

    Genetic correlations between body condition score (BCS) and fertility traits in dairy cattle were estimated using bivariate random regression models. BCS was recorded by the Swiss Holstein Association on 22,075 lactating heifers (primiparous cows) from 856 sires. Fertility data during first lactation were extracted for 40,736 cows. The fertility traits were days to first service (DFS), days between first and last insemination (DFLI), calving interval (CI), number of services per conception (NSPC) and conception rate to first insemination (CRFI). A bivariate model was used to estimate genetic correlations between BCS as a longitudinal trait, modeled by random regression components, and daughter's fertility at the sire level as a single lactation measurement. Heritability of BCS was 0.17, and heritabilities for fertility traits were low (0.01-0.08). Genetic correlations between BCS and fertility over the lactation varied from -0.45 to -0.14 for DFS, from -0.75 to 0.03 for DFLI, from -0.59 to -0.02 for CI, from -0.47 to 0.33 for NSPC, and from 0.08 to 0.82 for CRFI. These results show (genetic) interactions between fat reserves and reproduction along the lactation trajectory of modern dairy cows, which can be useful in genetic selection as well as in management. Maximum genetic gain in fertility from indirect selection on BCS should be based on measurements taken in mid lactation, when the genetic variance for BCS is largest and the genetic correlations between BCS and fertility are strongest.

  6. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells and subsequent staining with the fluorochromes 33258 Hoechst or DAPI allowed bivariate flow karyotyping of isolated

  7. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  8. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  9. The Analysis of Bankruptcy Risk Using the Normal Distribution Gauss-Laplace in Case of a Company is the Most Modern Romanian Sea-River Port on the Danube

    Directory of Open Access Journals (Sweden)

    Rodica Pripoaie

    2015-08-01

    Full Text Available This work applies the Gauss-Laplace normal distribution to a company that operates the most modern Romanian sea-river port on the Danube, a specialized service provider with a handling capacity of approximately 20,000,000 tons/year. The Gauss-Laplace normal distribution is the best known and most widely used probability distribution, because it captures well the evolution of economic and financial phenomena. Around the average, which has the greatest frequency, gravitate values more or less distant from the average, but with the same standard deviation. It is noted that break-even analysis, although used in forecasting calculations, ignores the risk of decisional operations (deviations between forecasts and achievements), which may, in certain circumstances, strongly influence the activity of the company. This risk can be taken into account by carefully studying whether the evolution of turnover follows a law of probability. When no information on the probability law of turnover exists and there is no reason for one case to appear more often than another, then, according to Laplace's law, we consider that these cases are uniformly distributed and therefore follow a normal distribution.

  10. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Science.gov (United States)

    Watabe, Tadashi; Naka, Sadahiro; Ikeda, Hayato; Horitsugi, Genki; Kanai, Yasukazu; Isohashi, Kayako; Ishibashi, Mana; Kato, Hiroki; Shimosegawa, Eku; Watabe, Hiroshi; Hatazawa, Jun

    2014-01-01

    Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain has not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. The PET analysis using Vt showed that the adrenal glands had the second highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  11. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Directory of Open Access Journals (Sweden)

    Tadashi Watabe

    Full Text Available PURPOSE: Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain has not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. METHODS: The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. RESULTS: The PET analysis using Vt showed that the adrenal glands had the second highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. CONCLUSIONS: We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  12. Simulation study of pO2 distribution in induced tumour masses and normal tissues within a microcirculation environment.

    Science.gov (United States)

    Li, Mao; Li, Yan; Wen, Peng Paul

    2014-01-01

    The biological microenvironment is disrupted when tumour masses are introduced, because of the strong competition for oxygen. During the avascular growth period of a tumour, pre-existing capillaries play a crucial role in supplying oxygen to both tumourous and healthy cells. Because the oxygen supply from capillaries is limited, healthy cells have to compete for oxygen with tumourous cells. In this study, an improved Krogh's cylinder model, more realistic than the previously reported assumption that oxygen is homogeneously distributed in a microenvironment, is proposed to describe the diffusion of oxygen from a capillary to its surrounding environment. The capillary wall permeability is also taken into account. The simulation results show that when tumour masses are implanted at the upstream part of a capillary and followed by normal tissues, the whole normal tissue suffers from hypoxia. In contrast, when normal tissues are ahead of tumour masses, their pO2 is sufficient. In both situations, the pO2 in the whole normal tissue drops significantly owing to axial diffusion at the interface between normal tissues and tumourous cells. Because axial oxygen diffusion cannot supply the whole tumour mass, only those tumourous cells near the interface are partially supplied and have a small chance to survive.

  13. Spin fluctuations in liquid 3He: a strong-coupling calculation of Tc and the normal-state distribution function

    International Nuclear Information System (INIS)

    Fay, D.; Layzer, A.

    1975-01-01

    The Berk-Schrieffer method of strong-coupling superconductivity for nearly ferromagnetic systems is generalized to arbitrary L-state pairing and realistic (hard-core) potentials. Application to 3He yields a P-state transition but very low values for Tc and an unsatisfactory normal-state momentum distribution

  14. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusion about a research question from independent studies. Most research on detection and correction for publication bias in meta-analysis focus mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies of estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators will be obtained using maximum likelihood estimation and method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare with the non-parametric Trim and Fill method based on funnel plot. We find that our methods based on truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
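
    The truncation idea can be sketched as follows, assuming for illustration a fixed-effects setting with a known truncation point c (effects below c never get published) and maximum likelihood recovery of the underlying mean and standard deviation; the paper also treats p-value-driven selection and random-effects models.

```python
import numpy as np
from scipy import stats, optimize

# Publication bias as truncation: only effects above a cutoff c are observed.
rng = np.random.default_rng(3)
mu_true, sigma_true, c = 0.2, 0.5, 0.1
x = rng.normal(mu_true, sigma_true, 20000)
x = x[x > c][:500]                   # the "published" effect sizes only

def nll(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    # log density of N(mu, sigma^2) truncated to (c, infinity)
    return -np.sum(stats.norm.logpdf(x, mu, sigma)
                   - stats.norm.logsf(c, mu, sigma))

res = optimize.minimize(nll, x0=[x.mean(), x.std()], method="Nelder-Mead")
print("naive mean:", x.mean(), " truncation-corrected MLE (mu, sigma):", res.x)
```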

  15. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    Science.gov (United States)

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are given to present the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results from this study and the known results from the literature.
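
    One ingredient of the method, the Chebyshev spectral collocation differentiation matrix, can be sketched as follows (the classic construction on Chebyshev points, in the style of Trefethen's "cheb"; the quasilinearisation and bivariate Lagrange interpolation steps are omitted).

```python
import numpy as np

# Chebyshev differentiation matrix D on n+1 Chebyshev points of [-1, 1].
def cheb(n):
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))  # off-diagonal entries
    return D - np.diag(D.sum(axis=1)), x             # diagonal via row sums

# Check spectral accuracy on a smooth test function.
D, x = cheb(16)
u = np.exp(x) * np.sin(5 * x)
du = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
print("max derivative error on 17 points:", np.max(np.abs(D @ u - du)))
```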

  16. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    Directory of Open Access Journals (Sweden)

    S. S. Motsa

    2014-01-01

    Full Text Available This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are given to present the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results from this study and the known results from the literature.

  17. Association between childhood maltreatment and normal adult personality traits: exploration of an understudied field.

    Science.gov (United States)

    Hengartner, Michael P; Cohen, Lisa J; Rodgers, Stephanie; Müller, Mario; Rössler, Wulf; Ajdacic-Gross, Vladeta

    2015-02-01

    We assessed normal personality traits and childhood trauma in approximately 1170 subjects from a general population-based community sample. In bivariate analyses emotional abuse was most pervasively related to personality, showing significant detrimental associations with neuroticism, extraversion, openness, conscientiousness, and agreeableness. Neuroticism was significantly related to emotional abuse and neglect, physical abuse and neglect, and sexual abuse. Emotional abuse was related to neuroticism in men more profoundly than in women (β = 0.095). Adjusting for the covariance between childhood maltreatment variables, neuroticism was mainly related to emotional abuse (β = 0.193), extraversion to emotional neglect (β = -0.259), openness to emotional abuse (β = 0.175), conscientiousness to emotional abuse (β = -0.110), and agreeableness to emotional neglect (β = -0.153). The proportion of variance explained was highest in neuroticism (5.6%) and lowest in openness (1.9%) and conscientiousness (1.8%). These findings help to understand the complex association between childhood maltreatment and both normal and pathological personality.

  18. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ2 null distribution. The asymptotic distribution of the

  19. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi logarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
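
    The two competing survival models can be fitted to an illustrative downward-concave semi-logarithmic survival curve as follows (synthetic data, not the published E. coli measurements). Under the death-time-distribution view above, S(t) = 1 - CDF(t), i.e. the survival function of the assumed distribution.

```python
import numpy as np
from scipy import optimize, stats

# Fit Weibull and log-normal survival functions to the same semi-log curve.
t = np.array([0.5, 1, 2, 3, 4, 5, 6, 8, 10.0])              # time (illustrative)
logS = np.array([-0.05, -0.15, -0.5, -1.0, -1.6, -2.3, -3.0, -4.5, -6.0])
S = 10.0 ** logS                                            # survival ratios

weib = lambda t, c, scale: stats.weibull_min.sf(t, c, scale=scale)
logn = lambda t, s, scale: stats.lognorm.sf(t, s, scale=scale)

pw, _ = optimize.curve_fit(weib, t, S, p0=[1.5, 3.0], bounds=(1e-3, 100))
pl, _ = optimize.curve_fit(logn, t, S, p0=[0.5, 3.0], bounds=(1e-3, 100))
print("Weibull (shape, scale):", pw)
print("log-normal (sigma, scale):", pl)
# The fitted PDFs then give the mode, mean, spread and skewness used to
# compare lethality across temperatures or essential oil levels.
```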

  20. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    OpenAIRE

    Wang Kuan-Min; Lai Hung-Cheng

    2013-01-01

    This paper extends recent investigations into risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are sourced to empirically validate the contagion effects between stock markets in Vietnam, and China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. Using the...

  1. [Calbindin and parvalbumin distribution in spinal cord of normal and rabies-infected mice].

    Science.gov (United States)

    Monroy-Gómez, Jeison; Torres-Fernández, Orlando

    2013-01-01

    Rabies is a fatal infectious disease of the nervous system; however, knowledge about the pathogenic neural mechanisms in rabies is scarce. In addition, there are few studies of rabies pathology of the spinal cord. To study the distribution of the calcium-binding proteins calbindin and parvalbumin and to assess the effect of rabies virus infection on their expression in the spinal cord of mice. MATERIALS AND METHODS: Mice were inoculated with rabies virus, by intracerebral or intramuscular route. The spinal cord was extracted to perform crosscuts which were treated by immunohistochemistry with monoclonal antibodies to reveal the presence of the two proteins in normal and rabies-infected mice. We performed qualitative and quantitative analyses of the immunoreactivity of the two proteins. Calbindin and parvalbumin showed differential distribution in the Rexed laminae. Rabies infection produced a decrease in the expression of calbindin. On the contrary, the infection caused an increased expression of parvalbumin. The effect of rabies infection on the expression of the two proteins was similar for both routes of inoculation. The differential effect of rabies virus infection on the expression of calbindin and parvalbumin in the spinal cord of mice was similar to that previously reported for brain areas. This result suggests uniformity in the response to rabies infection throughout the central nervous system. This is an important contribution to the understanding of the pathogenesis of rabies.

  2. Segmentation and intensity estimation of microarray images using a gamma-t mixture model.

    Science.gov (United States)

    Baek, Jangsun; Son, Young Sook; McLachlan, Geoffrey J

    2007-02-15

    We present a new approach to the analysis of images for complementary DNA microarray experiments. The image segmentation and intensity estimation are performed simultaneously by adopting a two-component mixture model. One component of this mixture corresponds to the distribution of the background intensity, while the other corresponds to the distribution of the foreground intensity. The intensity measurement is a bivariate vector consisting of red and green intensities. The background intensity component is modeled by the bivariate gamma distribution, whose marginal densities for the red and green intensities are independent three-parameter gamma distributions with different parameters. The foreground intensity component is taken to be the bivariate t distribution, with the constraint that the mean of the foreground is greater than that of the background for each of the two colors. The degrees of freedom of this t distribution are inferred from the data but they could be specified in advance to reduce the computation time. Also, the covariance matrix is not restricted to being diagonal and so it allows for nonzero correlation between R and G foreground intensities. This gamma-t mixture model is fitted by maximum likelihood via the EM algorithm. A final step is executed whereby nonparametric (kernel) smoothing is undertaken of the posterior probabilities of component membership. The main advantages of this approach are: (1) it enjoys the well-known strengths of a mixture model, namely flexibility and adaptability to the data; (2) it considers the segmentation and intensity simultaneously and not separately as in commonly used existing software, and it also works with the red and green intensities in a bivariate framework as opposed to their separate estimation via univariate methods; (3) the use of the three-parameter gamma distribution for the background red and green intensities provides a much better fit than the normal (log normal) or t distributions; (4) the
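
    The three-parameter gamma ingredient can be illustrated with SciPy, whose gamma.fit returns exactly the shape, location and scale parameters used for each background color channel. The pixel intensities below are synthetic stand-ins for one channel of a microarray image.

```python
import numpy as np
from scipy import stats

# Fit a three-parameter gamma to synthetic background intensities.
rng = np.random.default_rng(4)
background_red = 200.0 + rng.gamma(3.0, 40.0, 5000)   # synthetic background
shape, loc, scale = stats.gamma.fit(background_red)    # MLE of all 3 parameters
print(f"shape = {shape:.2f}, location = {loc:.1f}, scale = {scale:.1f}")
# A symmetric normal (or t) fit would miss the right skew typical of such
# intensity data, which is why the gamma background component fits better.
```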

  3. Distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbot, Scophthalmus Maximus

    Institute of Scientific and Technical Information of China (English)

    GUO Huarong; HUANG Bing; QI Fei; ZHANG Shicui

    2007-01-01

    The distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbots were examined with transmission electron microscopy (TEM). Three types of pigment cells, melanophores, iridophores and xanthophores, have been recognized in adult turbot skins. The skin color depends mainly on the amount and distribution of melanophores and iridophores, as xanthophores are quite rare. No pigment cells can be found in the epidermis of the skins. In the pigmented ocular skin of the turbot, melanophores and iridophores are usually co-localized in the dermis. This is quite different from the distribution in larval skin. In the albino and white blind-side skins of adult turbots, however, only the iridophore monolayer still exists, while the melanophore monolayer disappears. This cytological evidence explains why the albino adult turbot, unlike its larvae, can never resume its body color no matter what environmental and nutritional conditions are provided. Endocytosis is quite active in the cellular membrane of the iridophore. This might be related to the formation of reflective platelets and the stability of the iridophore.

  4. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
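
    Simultaneous 1-α intervals of this kind can be approximated by simulation, as sketched below: tighten a pointwise envelope on standardized normal order statistics until whole samples fall inside it with probability 1-α. This is a generic construction under stated assumptions, not necessarily the authors' exact method.

```python
import numpy as np
from scipy import stats

# Calibrate a simultaneous envelope for standardized normal order statistics.
rng = np.random.default_rng(5)
n, alpha, nsim = 50, 0.05, 20000
sims = np.sort(rng.normal(size=(nsim, n)), axis=1)
sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(axis=1, keepdims=True)

def envelope(gamma):
    lo = np.quantile(sims, gamma / 2, axis=0)
    hi = np.quantile(sims, 1 - gamma / 2, axis=0)
    cover = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
    return cover, lo, hi

gamma = alpha
cover, lo, hi = envelope(gamma)
while cover < 1 - alpha:                 # shrink until simultaneous coverage
    gamma *= 0.9
    cover, lo, hi = envelope(gamma)
print(f"pointwise level {gamma:.4f} gives simultaneous coverage {cover:.3f}")
# Plot a sample's standardized order statistics against the plotting positions
# stats.norm.ppf((np.arange(1, n + 1) - 0.5) / n), together with lo and hi.
```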

  5. TOTAL NUMBER, DISTRIBUTION, AND PHENOTYPE OF CELLS EXPRESSING CHONDROITIN SULPHATE PROTEOGLYCANS IN THE NORMAL HUMAN AMYGDALA

    Science.gov (United States)

    Pantazopoulos, Harry; Murray, Elisabeth A.; Berretta, Sabina

    2009-01-01

    Chondroitin sulphate proteoglycans (CSPGs) are a key structural component of the brain extracellular matrix. They are involved in critical neurodevelopmental functions and are one of the main components of pericellular aggregates known as perineuronal nets. As a step toward investigating their functional and pathophysiological roles in the human amygdala, we assessed the pattern of CSPG expression in the normal human amygdala using wisteria floribunda agglutinin (WFA) lectin-histochemistry. Total numbers of WFA-labeled elements were measured in the lateral (LN), basal (BN), accessory basal (ABN) and cortical (CO) nuclei of the amygdala from 15 normal adult human subjects. For interspecies qualitative comparison, we also investigated the pattern of WFA labeling in the amygdala of naïve rats (n=32) and rhesus monkeys (Macaca mulatta; n=6). In human amygdala, WFA lectin-histochemistry resulted in labeling of perineuronal nets and cells with clear glial morphology, while neurons did not show WFA-labeling. Total numbers of WFA-labeled glial cells showed high interindividual variability. These cells aggregated in clusters with a consistent between-subjects spatial distribution. In a subset of human subjects (n=5), dual color fluorescence using an antibody raised against glial fibrillary acidic protein (GFAP) and WFA showed that the majority (93.7%) of WFA-labeled glial cells correspond to astrocytes. In rat and monkey amygdala, WFA histochemistry labeled perineuronal nets, but not glial cells. These results suggest that astrocytes are the main cell type expressing CSPGs in the adult human amygdala. Their highly segregated distribution pattern suggests that these cells serve specialized functions within human amygdalar nuclei. PMID:18374308

  6. Random simulation of photosynthetically active radiation and air temperature through different methods

    Directory of Open Access Journals (Sweden)

    Thomas Newton Martin

    2007-09-01

    Full Text Available The objective of this work was to compare three methods for simulating photosynthetically active radiation and air temperature data, using daily parameters from 17 stations in the State of São Paulo, Brazil. The simulation was carried out for the 1st and the 16th day of each month under three approaches: a normal distribution truncated at ±1.96 standard deviations, an asymmetric triangular distribution, and a bivariate normal distribution. Estimates from the simulated data were compared with the respective parameters (obtained from the observed data) using the F and Bartlett tests of homogeneity of variance, the t-test for comparison of means, Pearson's correlation coefficient, Willmott's agreement index, Camargo's performance index, the angular coefficient, and a test of normality of the data. Simulation using the bivariate normal distribution is the most appropriate for representing these climate variables.

  7. The normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man: evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[131I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The "normal" adrenal glands were seldom seen and faintly imaged in 2% at 24 hr after injection and in 16% at 48 hr, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 hr.

  8. Applying Emax model and bivariate thin plate splines to assess drug interactions.

    Science.gov (United States)

    Kong, Maiying; Lee, J Jack

    2010-01-01

    We review the semiparametric approach previously proposed by Kong and Lee and extend it to a case in which the dose-effect curves follow the Emax model instead of the median effect equation. When the maximum effects for the investigated drugs are different, we provide a procedure to obtain the additive effect based on the Loewe additivity model. Then, we apply a bivariate thin plate spline approach to estimate the effect beyond additivity along with its 95 per cent point-wise confidence interval as well as its 95 per cent simultaneous confidence interval for any combination dose. Thus, synergy, additivity, and antagonism can be identified. The advantages of the method are that it provides an overall assessment of the combination effect on the entire two-dimensional dose space spanned by the experimental doses, and it enables us to identify complex patterns of drug interaction in combination studies. In addition, this approach is robust to outliers. To illustrate this procedure, we analyzed data from two case studies.
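
    The Emax dose-effect model at the heart of this approach can be fitted per agent as below, on illustrative data; the Loewe-additive surface and the thin plate spline departure-from-additivity steps are noted in comments only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Emax dose-effect model: E(d) = E0 + Emax * d^h / (ED50^h + d^h).
def emax(d, e0, emax_, ed50, h):
    return e0 + emax_ * d ** h / (ed50 ** h + d ** h)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # illustrative
effect = np.array([0.02, 0.05, 0.12, 0.28, 0.48, 0.70, 0.82, 0.86])

p, _ = curve_fit(emax, dose, effect, p0=[0.0, 1.0, 1.0, 1.0],
                 bounds=([0, 0, 1e-3, 0.1], [1, 2, 100, 10]))
print("E0, Emax, ED50, h:", p)
# With single-agent fits in hand, the Loewe-additive effect is computed for
# each combination dose, and the bivariate thin plate spline then models the
# departure from additivity (synergy or antagonism) over the dose plane.
```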

  9. Advection-diffusion model for normal grain growth and the stagnation of normal grain growth in thin films

    International Nuclear Information System (INIS)

    Lou, C.

    2002-01-01

    An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)

  10. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and then all the possible pairwise combinations among the 30-feature set are used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features among those and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h(-1).
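
    A toy version of the feature pipeline: Welch band powers in the five bands for six channels give 30 features, and their pairwise combinations give the 435-dimensional vector fed to an SVM. The pairwise operation (a ratio here), the signals and the labels are all illustrative assumptions, not the EPILEPSIAE data or the paper's exact construction.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

rng = np.random.default_rng(6)
fs, n_ch, n_win, win = 256, 6, 40, 1024
bands = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 45)]   # delta..gamma, Hz

X = []
for _ in range(n_win):
    eeg = rng.normal(size=(n_ch, win))            # stand-in EEG window
    feats = []
    for ch in range(n_ch):
        f, P = welch(eeg[ch], fs=fs, nperseg=256)
        feats += [P[(f >= lo) & (f < hi)].sum() for lo, hi in bands]
    feats = np.asarray(feats)                     # 30 spectral-power features
    i, j = np.triu_indices(len(feats), k=1)
    X.append(feats[i] / (feats[j] + 1e-12))       # 435 pairwise features
X, y = np.array(X), rng.integers(0, 2, n_win)     # random "preictal" labels

clf = SVC(kernel="rbf").fit(X, y)
print("toy training accuracy:", clf.score(X, y))  # real work uses held-out data
```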

  11. Similar distributions of repaired sites in chromatin of normal and xeroderma pigmentosum variant cells damaged by ultraviolet light

    International Nuclear Information System (INIS)

    Cleaver, J.E.

    1979-01-01

    Excision repair of damage from ultraviolet light in both normal and xeroderma pigmentosum variant fibroblasts at early times after irradiation occurred preferentially in regions of DNA accessible to micrococcal nuclease digestion. These regions are predominantly the linker regions between nucleosomes in chromatin. The alterations reported at polymerization and ligation steps of excision repair in the variant are therefore not associated with changes in the relative distributions of repair sites in linker and core particle regions of DNA. (Auth.)

  12. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1978-01-01

    This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
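
    The successive-approximations procedure with step-size 1 is the familiar EM iteration for a normal mixture; a minimal univariate two-component sketch on synthetic data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.5, 700)])

w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior probability of each component for each observation
    dens = w * stats.norm.pdf(x[:, None], mu, sd)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted updates of mixing weights, means, standard deviations
    nk = resp.sum(axis=0)
    w = nk / x.size
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
print("weights:", w, "means:", mu, "sds:", sd)
# A step-size between 0 and 2 "deflects" this update along the same direction;
# step-size 1 recovers the classical procedure analyzed above.
```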

  13. Comparative pharmacokinetics and tissue distribution profiles of lignan components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Zheng, Tian-Hui; Tao, Yan-Yan; Liu, Cheng-Hai

    2015-05-26

    Fuzheng Huayu recipe (FZHY) is formulated on the basis of Chinese medicine theory for treating liver fibrosis. The aim was to illuminate the influence of the pathological state of liver fibrosis on the pharmacokinetics and tissue distribution profiles of lignan components from FZHY. Male Wistar rats were randomly divided into a normal group and a hepatic fibrosis group (induced by dimethylnitrosamine). Six lignan components were detected and quantified by ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) in the plasma and tissues of normal and hepatic fibrosis rats. A rapid, sensitive and convenient UHPLC-MS/MS method was successfully developed for the simultaneous determination of six lignan components in different rat biological samples. After oral administration of FZHY at a dose of 15 g/kg, the pharmacokinetic behaviors of schizandrin A (SIA), schizandrin B (SIB), schizandrin C (SIC), schisandrol A (SOA), schisandrol B (SOB) and schisantherin A (STA) were significantly changed in hepatic fibrosis rats compared with the normal rats, and their AUC(0-t) values were increased by 235.09%, 388.44%, 223.30%, 669.30%, 295.08% and 267.63%, respectively (P < 0.05). The tissue distribution results showed that the amounts of SIA, SIB, SOA and SOB were significantly increased in the heart, lung, spleen and kidney of hepatic fibrosis rats compared with normal rats at most time points (P < 0.05), revealing differences in the distribution of lignan components between normal and hepatic fibrosis rats. Hepatic fibrosis could alter the pharmacokinetic and tissue distribution properties of lignan components in rats after administration of FZHY. These results might be helpful to guide the clinical application of this medicine. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. [Study on the relationship between Terra-MODIS image and the snail distribution in marshland of Jiangning county, Jiangsu province].

    Science.gov (United States)

    Zhang, Bo; Zhang, Zhi-ying; Xu, De-zhong; Sun, Zhi-dong; Zhou, Xiao-nong; Gong, Zi-li; Liu, Shi-jun; Liu, Cheng; Xu, Bin; Zhou, Yun

    2003-04-01

    To analyze the relationship between the normalized difference vegetation index (NDVI) and the snail distribution in the marshland of Jiangning county, Jiangsu province, and to explore the utility of Terra-MODIS imagery in small-scale snail habitat surveillance. NDVI values were extracted from MODIS images by a vector chart of the snail distribution using ArcView 8.1 and ERDAS 8.5 software. The relationship between NDVI and the snail distribution was investigated using bivariate correlations and stepwise linear regression. The snail density on the marshland was positively correlated with the mean NDVI in the first ten-day period of May and the maximum NDVI (N(20max)) in the last ten-day period of May. The incidence of pixels with live snails on the marshland was positively correlated with the mean NDVI (N(2mean)) in the first ten-day period of May. The equations Y(1) = 0.00947 × N(20max) (R(2) = 0.73) and Y(2) = 0.0186 × N(2mean) (R(2) = 0.906) were established. This study showed that Terra-MODIS satellite images, which reflect the status of the vegetation on the marshland of Jiangning county, could be applied to surveillance of snail habitats. The results suggested that MODIS images could be used to survey small-scale snail habitats on marshland.

  15. A Basic Bivariate Structure of Personality Attributes Evident Across Nine Languages.

    Science.gov (United States)

    Saucier, Gerard; Thalmayer, Amber Gayle; Payne, Doris L; Carlson, Robert; Sanogo, Lamine; Ole-Kotikash, Leonard; Church, A Timothy; Katigbak, Marcia S; Somer, Oya; Szarota, Piotr; Szirmák, Zsofia; Zhou, Xinyue

    2014-02-01

    Here, two studies seek to characterize a parsimonious common-denominator personality structure with optimal cross-cultural replicability. Personality differences are observed in all human populations and cultures, but lexicons for personality attributes contain so many distinctions that parsimony is lacking. Models stipulating the most important attributes have been formulated by experts or by empirical studies drawing on experience in a very limited range of cultures. Factor analyses of personality lexicons of nine languages of diverse provenance (Chinese, Korean, Filipino, Turkish, Greek, Polish, Hungarian, Maasai, and Senoufo) were examined, and their common structure was compared to that of several prominent models in psychology. A parsimonious bivariate model showed evidence of substantial convergence and ubiquity across cultures. Analyses involving key markers of these dimensions in English indicate that they are broad dimensions involving the overlapping content of the interpersonal circumplex, models of communion and agency, and morality/warmth and competence. These "Big Two" dimensions-Social Self-Regulation and Dynamism-provide a common-denominator model involving the two most crucial axes of personality variation, ubiquitous across cultures. The Big Two might serve as an umbrella model serving to link diverse theoretical models and associated research literatures. © 2013 Wiley Periodicals, Inc.

  16. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
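
    The copula step can be sketched with a Gaussian copula (the study may use a different copula family; all numbers below are synthetic): transform drought duration and severity to normal scores, estimate the copula correlation, and convert a joint exceedance probability into an "AND" return period.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
duration = rng.gamma(2.0, 1.5, 300)                    # drought durations, yr
severity = 5.0 * duration + rng.normal(0.0, 2.0, 300)  # correlated severities

# Empirical ranks -> normal scores -> Gaussian copula correlation
u = stats.rankdata(duration) / (duration.size + 1)
v = stats.rankdata(severity) / (severity.size + 1)
rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]
mvn = stats.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]])

def and_return_period(p_d, p_s, interarrival=1.0):
    """Return period of duration and severity both exceeding their levels."""
    uu, vv = 1.0 - p_d, 1.0 - p_s
    joint = 1.0 - uu - vv + mvn.cdf(stats.norm.ppf([uu, vv]))  # P(D>d, S>s)
    return interarrival / joint

print("joint 'AND' return period (both in worst 5%):",
      and_return_period(0.05, 0.05), "years")
```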

  17. Aquaculture in artificially developed wetlands in urban areas: an application of the bivariate relationship between soil and surface water in landscape ecology.

    Science.gov (United States)

    Paul, Abhijit

    2011-01-01

    Wetlands show a strong bivariate relationship between soil and surface water. Artificially developed wetlands help to build landscape ecology and make built environments sustainable. The bheries, wetlands of eastern Calcutta (India), utilize the city sewage to develop urban aquaculture that supports the local fish industries and opens a new frontier in sustainable environmental planning research.

  18. Spatial Distribution of Iron Within the Normal Human Liver Using Dual-Source Dual-Energy CT Imaging.

    Science.gov (United States)

    Abadia, Andres F; Grant, Katharine L; Carey, Kathleen E; Bolch, Wesley E; Morin, Richard L

    2017-11-01

    Explore the potential of dual-source dual-energy (DSDE) computed tomography (CT) to retrospectively analyze the uniformity of iron distribution and establish iron concentration ranges and distribution patterns found in healthy livers. Ten mixtures consisting of an iron nitrate solution and deionized water were prepared in test tubes and scanned using a DSDE 128-slice CT system. Iron images were derived from a 3-material decomposition algorithm (optimized for the quantification of iron). A conversion factor (mg Fe/mL per Hounsfield unit) was calculated from this phantom study as the quotient of known tube concentrations and their corresponding CT values. Retrospective analysis was performed of patients who had undergone DSDE imaging for renal stones. Thirty-seven patients with normal liver function were randomly selected (mean age, 52.5 years). The examinations were processed for iron concentration. Multiple regions of interest were analyzed, and iron concentration (mg Fe/mL) and distribution was reported. The mean conversion factor obtained from the phantom study was 0.15 mg Fe/mL per Hounsfield unit. Whole-liver mean iron concentrations yielded a range of 0.0 to 2.91 mg Fe/mL, with 94.6% (35/37) of the patients exhibiting mean concentrations below 1.0 mg Fe/mL. The most important finding was that iron concentration was not uniform and patients exhibited regionally high concentrations (36/37). These regions of higher concentration were observed to be dominant in the middle-to-upper part of the liver (75%), medially (72.2%), and anteriorly (83.3%). Dual-source dual-energy CT can be used to assess the uniformity of iron distribution in healthy subjects. Applying similar techniques to unhealthy livers, future research may focus on the impact of hepatic iron content and distribution for noninvasive assessment in diseased subjects.
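
    The phantom-derived calibration maps mean ROI values on the iron image to concentration through a single factor; a minimal sketch with hypothetical ROI values:

```python
import numpy as np

CONVERSION = 0.15   # mg Fe/mL per HU, from the phantom study above

def iron_concentration(hu_roi):
    """Mean iron concentration (mg Fe/mL) for iron-image ROI values in HU."""
    return CONVERSION * float(np.mean(hu_roi))

roi = np.array([4.2, 5.1, 3.8, 6.0])   # hypothetical iron-image ROI samples
print(f"{iron_concentration(roi):.2f} mg Fe/mL")
```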

  19. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to make non-misleading inferences, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  20. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to make inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  1. A bivariate optimal replacement policy for a multistate repairable system

    International Nuclear Information System (INIS)

    Zhang Yuanlin; Yam, Richard C.M.; Zuo, Ming J.

    2007-01-01

    In this paper, a deteriorating simple repairable system with k+1 states, including k failure states and one working state, is studied. It is assumed that the system after repair is not 'as good as new' and the deterioration of the system is stochastic. We consider a bivariate replacement policy, denoted by (T,N), in which the system is replaced when its working age has reached T or the number of failures it has experienced has reached N, whichever occurs first. The objective is to determine the optimal replacement policy (T,N)* such that the long-run expected profit per unit time is maximized. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal replacement policy can be determined analytically or numerically. We prove that the optimal policy (T,N)* is better than the optimal policy N* for a multistate simple repairable system. We also show that a general monotone process model for a multistate simple repairable system is equivalent to a geometric process model for a two-state simple repairable system in the sense that they have the same structure for the long-run expected profit (or cost) per unit time and the same optimal policy. Finally, a numerical example is given to illustrate the theoretical results
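
    The long-run expected profit per unit time of such a bivariate policy can also be estimated by renewal-reward simulation. The sketch below is not the paper's analytical derivation; purely for illustration it assumes exponential lifetimes forming a decreasing geometric process, repair times forming an increasing one, and invented reward and cost constants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model assumptions (not from the paper): successive working
# times follow a decreasing geometric process X_k = X_1 / a**(k-1) with X_1
# exponential; successive repair times follow an increasing geometric process.
MEAN_LIFE, A = 100.0, 1.1      # first lifetime mean, lifetime ratio a > 1
MEAN_REPAIR, B = 5.0, 0.95     # first repair mean, repair ratio 0 < b < 1
REWARD, REPAIR_COST, REPLACE_COST = 10.0, 30.0, 2000.0

def cycle(T, N):
    """Simulate one replacement cycle under policy (T, N); return (profit, length)."""
    work, down = 0.0, 0.0
    for k in range(1, N + 1):
        x = rng.exponential(MEAN_LIFE / A**(k - 1))  # k-th working time
        if work + x >= T:                  # working age reaches T first
            work = T
            break
        work += x                          # k-th failure occurs
        if k == N:                         # N-th failure triggers replacement
            break
        down += rng.exponential(MEAN_REPAIR / B**(k - 1))  # k-th repair time
    profit = REWARD * work - REPAIR_COST * down - REPLACE_COST
    return profit, work + down

def long_run_rate(T, N, n=20000):
    sims = np.array([cycle(T, N) for _ in range(n)])
    return sims[:, 0].sum() / sims[:, 1].sum()   # renewal-reward ratio

# Crude grid search for the optimal bivariate policy (T, N)*.
best = max(((long_run_rate(T, N), T, N)
            for T in (150, 250, 350) for N in (2, 4, 6)))
print("best rate %.2f at (T, N) = (%s, %s)" % best)
```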

  2. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. I - Pressure distribution

    Science.gov (United States)

    Messiter, A. F.

    1980-01-01

    Asymptotic solutions are derived for the pressure distribution in the interaction of a weak normal shock wave with a turbulent boundary layer. The undisturbed boundary layer is characterized by the law of the wall and the law of the wake for compressible flow. In the limiting case considered, for 'high' transonic speeds, the sonic line is very close to the wall. Comparisons with experiment are shown, with corrections included for the effect of longitudinal wall curvature and for the boundary-layer displacement effect in a circular pipe.

  3. Tritium distribution ratios between the 30 % tributyl phosphate(TBP)-normal dodecane(nDD) organic phase and uranyl nitrate-nitric acid aqueous phase

    International Nuclear Information System (INIS)

    Fujine, Sachio; Uchiyama, Gunzou; Sugikawa, Susumu; Maeda, Mitsuru; Tsujino, Takeshi.

    1989-10-01

    Tritium distribution ratios between the organic and aqueous phases were measured for the system of 30 % tributyl phosphate (TBP)-normal dodecane (nDD)/uranyl nitrate-nitric acid water. It was confirmed that tritium is extracted by TBP into the organic phase in both chemical forms of tritiated water (HTO) and tritiated nitric acid (TNO3). The value of tritium distribution ratio ranged from 0.002 to 0.005 for the conditions of 0-6 mol/L nitric acid, 0.5-800 mCi/L tritium in aqueous phase, and 0-125 g-U/L uranium in organic phase. Isotopic distribution coefficient of tritium between the organic and aqueous phases was observed to be about 0.95. (author)

  4. Queue Length and Server Content Distribution in an Infinite-Buffer Batch-Service Queue with Batch-Size-Dependent Service

    Directory of Open Access Journals (Sweden)

    U. C. Gupta

    2015-01-01

    Full Text Available We analyze an infinite-buffer batch-size-dependent batch-service queue with Poisson arrival and arbitrarily distributed service time. Using the supplementary variable technique, we derive a bivariate probability generating function from which the joint distribution of queue and server content at the departure epoch of a batch is extracted and presented in terms of roots of the characteristic equation. We also obtain the joint distribution of queue and server content at an arbitrary epoch. Finally, the utility of the analytical results is demonstrated by the inclusion of some numerical examples, which also include the investigation of multiple zeros.

  5. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

    Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the least number of data groups considered in this study.
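
    A minimal Python sketch of this kind of goodness-of-fit comparison, using scipy's Shapiro-Wilk W test for the normal and lognormal cases and a Kolmogorov-Smirnov check of a fitted Weibull; the data are simulated stand-ins, not the surveillance records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.lognormal(mean=0.0, sigma=0.8, size=120)  # stand-in surveillance data

# Shapiro-Wilk W test on the raw data (normal hypothesis) and on the
# log-transformed data (lognormal hypothesis).
w_n, p_n = stats.shapiro(x)
w_ln, p_ln = stats.shapiro(np.log(x))
print("normal:    W = %.3f, p = %.3g" % (w_n, p_n))
print("lognormal: W = %.3f, p = %.3g" % (w_ln, p_ln))

# Weibull: fit shape and scale, then test via the probability integral
# transform (data drawn from the fitted Weibull become uniform on [0, 1]).
shape, loc, scale = stats.weibull_min.fit(x, floc=0)
u = stats.weibull_min.cdf(x, shape, loc=loc, scale=scale)
print("Weibull:   KS p = %.3g" % stats.kstest(u, "uniform").pvalue)
```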

  6. Heterogeneous distribution of a diffusional tracer in the aortic wall of normal and atherosclerotic rabbits

    International Nuclear Information System (INIS)

    Tsutsui, H.; Tomoike, H.; Nakamura, M.

    1990-01-01

    Tracer distribution as an index of nutritional support across the thoracic and abdominal aortas in rabbits in the presence or absence of atherosclerotic lesions was evaluated using [14C]antipyrine, a metabolically inert, diffusible indicator. Intimal plaques were produced by endothelial balloon denudation of the thoracic aorta and a 1% cholesterol diet. After a steady intravenous infusion of 200 microCi of [14C]antipyrine for 60 seconds, thoracic and abdominal aortas and the heart were excised, and autoradiograms of 20-microns-thick sections were quantified using microcomputer-aided densitometry. Regional radioactivity and regional diffusional support, as an index of nutritional flow estimated from the timed collections of arterial blood, was 367 and 421 nCi·g⁻¹ (82 and 106 ml·min⁻¹·100 g⁻¹) in the thoracic aortic media of the normal and atherosclerotic rabbits, respectively. Radioactivity at the thickened intima was 179 nCi·g⁻¹ (p less than 0.01 versus media). The gruel was noted at a deeper site within the thickened intima, and diffusional support here was 110 nCi·g⁻¹ (p less than 0.01 versus the average radioactivity at the thickened intima). After ligating the intercostal arteries, regional tracer distribution in the media beneath the fibrofatty lesion, but not the plaque-free intima, was reduced to 46%. Thus, in the presence of advanced intimal thickening, the heterogeneous distribution of diffusional flow is prominent across the vessel wall, and abluminal routes are crucial to meet the increased demands of nutritional requirements.

  7. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, 2

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1976-01-01

    The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.
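
    A compact sketch of the procedure for a univariate two-component mixture: the usual EM update is computed and then deflected by a step-size omega, with omega = 1 recovering the classical successive-approximations iteration and 0 < omega < 2 the range for which local convergence is established. The data and starting values below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def em_step(x, w, mu, sd, omega=1.0):
    """One deflected-gradient iteration for a univariate normal mixture.

    omega = 1 is the classical successive-approximations (EM) update;
    values between 0 and 2 are the range with guaranteed local convergence.
    """
    # E-step: posterior responsibilities of each component for each point.
    dens = w * norm.pdf(x[:, None], mu, sd)            # shape (n, K)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step targets (the usual EM update).
    nk = r.sum(axis=0)
    w_new = nk / len(x)
    mu_new = (r * x[:, None]).sum(axis=0) / nk
    var_new = (r * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk
    # Deflected step: move a fraction omega from current point to EM target
    # (omega near 2 can overshoot, hence the variance floor).
    step = lambda old, new: old + omega * (new - old)
    sd_next = np.sqrt(np.maximum(step(sd**2, var_new), 1e-12))
    return step(w, w_new), step(mu, mu_new), sd_next

# Hypothetical usage on a two-component sample.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 200)])
w, mu, sd = np.array([.5, .5]), np.array([-1., 1.]), np.array([1., 1.])
for _ in range(50):
    w, mu, sd = em_step(x, w, mu, sd, omega=1.5)
print(np.round(w, 2), np.round(mu, 2), np.round(sd, 2))
```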

  8. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  9. Measurement of activity-weighted size distributions of radon decay products in a normally occupied home

    International Nuclear Information System (INIS)

    Hopke, P.K.; Wasiolek, P.; Montassier, N.; Cavallo, A.; Gadsby, K.; Socolow, R.

    1992-01-01

    In order to assess the exposure of individuals to the presence of indoor radioactivity arising from the decay of radon, an automated, semicontinuous graded screen array system was developed to permit the measurement of the activity-weighted size distributions of the radon progeny in homes. The system has been modified so that the electronics and sampling heads can be separated from the pump by approximately 15 m. The system was placed in the living room of a one-storey house with basement in Princeton, NJ and operated for 2 weeks while the house was occupied by the home owners in their normal manner. One of the house occupants was a cigarette smoker. Radon and potential alpha energy concentration (PAEC) measurements were also made, but condensation nuclei counts were not performed. PAEC values ranged from 23.4 to 461.6 mWL. In the measured activity size distributions, the amount of activity in the 0.5-1.5 nm size range can be considered to be the unattached fraction. The mean value for the ²¹⁸Po unattached fraction is 0.217 with a range of 0.054-0.549. The median value for the unattached fraction of PAEC is 0.077 with a range of 0.022-0.178. (author)

  10. Application of a frequency distribution method for determining instars of the beet armyworm (Lepidoptera: Noctuidae) from widths of cast head capsules

    Science.gov (United States)

    Y. Chen; S. J. Seybold

    2013-01-01

    Instar determination of field-collected insect larvae has generally been based on the analysis of head capsule width frequency distributions or bivariate plotting, but few studies have tested the validity of such methods. We used head capsules from exuviae of known instars of the beet armyworm, Spodoptera exigua (Hübner) (Lepidoptera: Noctuidae),...

  11. Finite-size effects in transcript sequencing count distribution: its power-law correction necessarily precedes downstream normalization and comparative analysis.

    Science.gov (United States)

    Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank

    2018-02-12

    The power-law correction improves the signal-to-noise ratio by 50% and the statistical/detection sensitivity by as much as 30%, regardless of the downstream mapping and normalization methods. Most importantly, the power-law correction improves concordance in significant calls among different normalization methods of a data series by 22% on average. When presented with a higher sequence depth (4 times difference), the improvement in concordance is asymmetrical (32% for the higher sequencing depth instance versus 13% for the lower instance) and demonstrates that the simple power-law correction can increase significant detection with higher sequencing depths. Finally, the correction dramatically enhances the statistical conclusions and elucidates the metastasis potential of the NUGC3 cell line against AGS in our dilution analysis. The finite-size effect due to undersampling generally plagues transcript count data with reproducibility issues but can be minimized through a simple power-law correction of the count distribution. This distribution correction has direct implications for the biological interpretation of the study and the rigor of the scientific findings. This article was reviewed by Oliviero Carugo, Thomas Dandekar and Sandor Pongor.

  12. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

    Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)

  13. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB

    Science.gov (United States)

    Khazanov, G. V.; Gamayunov, K. V.

    2007-01-01

    We present the equatorial and bounce average pitch angle diffusion coefficients for scattering of relativistic electrons by the H+ mode of EMIC waves. Both the model (prescribed) and self-consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field-aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low energy electrons (E less than 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch angles, and despite their greatest contribution in the case of field-aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| greater than 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary, and increasing diffusion at small pitch angles by orders of magnitude.

  14. Financing options and economic impact: distributed generation using solar photovoltaic systems in Normal, Illinois

    Directory of Open Access Journals (Sweden)

    Jin H. Jo

    2016-04-01

    Full Text Available Due to increasing price volatility in fossil-fuel-produced energy, the demand for clean, renewable, and abundant energy is more prevalent than in past years. Solar photovoltaic (PV systems have been well documented for their ability to produce electrical energy while at the same time offering support to mitigate the negative externalities associated with fossil fuel combustion. Prices for PV systems have decreased over the past few years, however residential and commercial owners may still opt out of purchasing a system due to the overall price required for a PV system installation. Therefore, determining optimal financing options for residential and small-scale purchasers is a necessity. We report on payment methods currently used for distributed community solar projects throughout the US and suggest appropriate options for purchasers in Normal, Illinois given their economic status. We also examine the jobs and total economic impact of a PV system implementation in the case study area.

  15. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization.

    Directory of Open Access Journals (Sweden)

    Margaretha A Vink

    Full Text Available Post-vaccine monitoring programs for human papillomavirus (HPV have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination.

  16. Determination and correlation of spatial distribution of trace elements in normal and neoplastic breast tissues evaluated by μ-XRF

    International Nuclear Information System (INIS)

    Silva, M.P.; Oliveira, M.A.; Poletti, M.E.

    2012-01-01

    Full text: Some trace elements, naturally present in breast tissues, participate in a large number of biological processes, which include, among others, activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and tumor progression of breast cancers. Concentrations of trace elements like Ca, Fe, Cu and Zn, previously studied at LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared to normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the μ-XRF technique. We analyzed 22 samples of normal and neoplastic breast tissues (malignant and benign) obtained from paraffin blocks available for study at the Department of Pathology HC-FMRP/USP. From the blocks, a small fraction of material was removed and subjected to histological sections of 60 μm thickness made with a microtome. The slices were placed in sample holders and covered with ultralene film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45 degrees with respect to the incident beam on a table with 3 degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 μm microcapillary and samples were fully scanned. At each step, a spectrum was detected for 10 s. The fluorescence emitted by elements present in the sample was detected by a Si(Li) detector with 165 eV energy resolution at 5.9 keV, placed at 90 deg with respect to the incident beam. Results reveal that the trace element pairs Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by Spearman

  17. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  18. Global Bi-ventricular endocardial distribution of activation rate during long duration ventricular fibrillation in normal and heart failure canines.

    Science.gov (United States)

    Luo, Qingzhi; Jin, Qi; Zhang, Ning; Han, Yanxin; Wang, Yilong; Huang, Shangwei; Lin, Changjian; Ling, Tianyou; Chen, Kang; Pan, Wenqi; Wu, Liqun

    2017-04-13

    The objective of this study was to detect differences in the distribution of the left and right ventricle (LV & RV) activation rate (AR) during short-duration ventricular fibrillation (SDVF, <1 min) and long-duration ventricular fibrillation (LDVF, >1 min) in normal and heart failure (HF) canine hearts. Ventricular fibrillation (VF) was electrically induced in six healthy dogs (control group) and six dogs with right ventricular pacing-induced congestive HF (HF group). Two 64-electrode basket catheters deployed in the LV and RV were used for global endocardial electrical mapping. The AR of VF was estimated by fast Fourier transform analysis from each electrode. In the control group, the LV was activated faster than the RV in the first 20 s, after which there was no detectable difference in the AR between them. When analyzing the distribution of the AR within the bi-ventricles at 3 min of LDVF, the posterior LV was activated fastest, while the anterior was slowest. In the HF group, a detectable AR gradient existed between the two ventricles within 3 min of VF, with the LV activating more quickly than the RV. When analyzing the distribution of the AR within the bi-ventricles at 3 min of LDVF, the septum of the LV was activated fastest, while the anterior was activated slowest. A global bi-ventricular endocardial AR gradient existed within the first 20 s of VF but disappeared in LDVF in healthy hearts. However, the AR gradient was always observed in both SDVF and LDVF in HF hearts. The findings of this study suggest that LDVF in HF hearts can be maintained differently from normal hearts, which accordingly should lead to the development of different management strategies for LDVF resuscitation.
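
    The activation rate estimation named above (a dominant frequency obtained by Fourier analysis of each electrode signal) can be sketched as follows; the sampling rate, frequency band and test signal are assumptions for illustration, not the study's acquisition settings.

```python
import numpy as np
from scipy.signal import welch

FS = 1000  # sampling rate in Hz (assumed)

def activation_rate(electrogram, fs=FS):
    """Estimate one electrode's activation rate as the dominant frequency
    of its power spectrum (Welch periodogram)."""
    f, pxx = welch(electrogram, fs=fs, nperseg=4 * fs)
    band = (f >= 2) & (f <= 15)          # plausible VF frequency band
    return f[band][np.argmax(pxx[band])]

# Hypothetical signal: ~7 Hz fibrillatory activity plus noise.
t = np.arange(0, 10, 1 / FS)
sig = (np.sin(2 * np.pi * 7 * t)
       + 0.5 * np.random.default_rng(4).normal(size=t.size))
print("dominant frequency: %.2f Hz" % activation_rate(sig))
```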

  19. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    Science.gov (United States)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109

  20. A preliminary evaluation of myoelectrical energy distribution of the front neck muscles in pharyngeal phase during normal swallowing.

    Science.gov (United States)

    Mingxing Zhu; Wanzhang Yang; Samuel, Oluwarotimi Williams; Yun Xiang; Jianping Huang; Haiqing Zou; Guanglin Li

    2016-08-01

    Pharyngeal phase is a central hub of swallowing in which the food bolus passes from the oral cavity to the esophagus. Proper understanding of the muscular activities in the pharyngeal phase is useful for assessing swallowing function and the occurrence of dysphagia in humans. In this study, high-density (HD) surface electromyography (sEMG) was used to study the muscular activities in the pharyngeal phase during swallowing tasks involving three healthy male subjects. The root mean square (RMS) of the HD sEMG data was computed by using a series of segmented windows as myoelectrical energy, and the RMS of each window covering all channels (16×5) formed a matrix. During the pharyngeal phase of swallowing, three of the matrices were chosen and normalized to obtain the HD energy maps and the statistical parameter. The maps across different viscosity levels offered the energy distribution, which showed the muscular activities of the left and right sides of the front neck muscles. In addition, the normalized average RMS (NARE) across different viscosity levels revealed a significant left-right correlation (r=0.868±0.629, p<0.05) and a stronger correlation when swallowing water. This pilot study suggests that HD sEMG would be a potential tool to evaluate muscular activities in the pharyngeal phase during normal swallowing. Also, it might provide useful information for dysphagia diagnosis.

  1. Stereology of extremes; bivariate models and computation

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Hlubinka, D.

    2003-01-01

    Roč. 5, č. 3 (2003), s. 289-308 ISSN 1387-5841 R&D Projects: GA AV ČR IAA1075201; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z1075907 Keywords : sample extremes * domain of attraction * normalizing constants Subject RIV: BA - General Mathematics

  2. Aerosol lung inhalation scintigraphy in normal subjects

    Energy Technology Data Exchange (ETDEWEB)

    Sui, Osamu; Shimazu, Hideki

    1985-03-01

    We previously reported basic and clinical evaluation of aerosol lung inhalation scintigraphy with ⁹⁹ᵐTc-millimicrosphere albumin (milli MISA) and concluded that aerosol inhalation scintigraphy with ⁹⁹ᵐTc-milli MISA was useful for routine examination. But central airway deposit of aerosol particles was found not only in patients with chronic obstructive pulmonary disease (COPD) but also in normal subjects. So we performed aerosol inhalation scintigraphy in normal subjects and evaluated their scintigrams. The subjects had normal values of FEV1.0% (more than 70%) in lung function tests, no abnormal findings in chest X-ray films and no symptoms and signs. The findings of aerosol inhalation scintigrams in them were classified into 3 patterns; type I: homogeneous distribution without central airway deposit, type II: homogeneous distribution with central airway deposit, type III: inhomogeneous distribution. These patterns were compared with lung function tests. There was no significant difference between type I and type II in lung function tests. Type III differed from types I and II in its inhomogeneous distribution. This finding showed no correlation with %VC, FEV1.0%, MMF, V̇50 and V̇50/V̇25, but good correlation with V̇25 in a maximum forced expiratory flow-volume curve. The flow-volume curve is one of the sensitive methods for early detection of COPD, so the inhomogeneous distribution of type III is considered to be due to small airway dysfunction.

  3. Stellar Distributions and NIR Colours of Normal Galaxies

    NARCIS (Netherlands)

    Peletier, R. F.; Grijs, R. de

    1997-01-01

    Abstract: We discuss some results of a morphological study of edge-on galaxies, based on optical and especially near-infrared surface photometry. We find that the vertical surface brightness distributions of galaxies are fitted very well by exponential profiles, much better than by isothermal

  4. The Semiparametric Normal Variance-Mean Mixture Model

    DEFF Research Database (Denmark)

    Korsholm, Lars

    1997-01-01

    We discuss the normal variance-mean mixture model from a semi-parametric point of view, i.e. we let the mixing distribution belong to a non-parametric family. The main results are consistency of the non-parametric maximum likelihood estimator in this case, and construction of an asymptotically normal and efficient estimator.

  5. Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.

    Science.gov (United States)

    Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.

    1998-01-01

    The normality of number distributions generated by various random-number generators was studied, focusing on when the random-number generator reached a normal distribution and at what sample size. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)

  6. Group normalization for genomic data.

    Directory of Open Access Journals (Sweden)

    Mahmoud Ghandi

    Full Text Available Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  7. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions

    International Nuclear Information System (INIS)

    Hassanein, A.; Konkashbaev, I.

    1999-01-01

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature, edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide an insight to the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters

  8. Partial LVAD restores ventricular outputs and normalizes LV but not RV stress distributions in the acutely failing heart in silico

    OpenAIRE

    Sack, Kevin L.; Baillargeon, Brian; Acevedo-Bolton, Gabriel; Genet, Martin; Rebelo, Nuno; Kuhl, Ellen; Klein, Liviu; Weiselthaler, Georg M.; Burkhoff, Daniel; Franz, Thomas; Guccione, Julius M.

    2016-01-01

    Purpose: Heart failure is a worldwide epidemic that is unlikely to change as the population ages and life expectancy increases. We sought to detail significant recent improvements to the Dassault Systèmes Living Heart Model (LHM) and use the LHM to compute left ventricular (LV) and right ventricular (RV) myofiber stress distributions under the following 4 conditions: (1) normal cardiac function; (2) acute left heart failure (ALHF); (3) ALHF treated using an LV assist device (LVAD) flow rate o...

  9. Computer modeling the boron compound factor in normal brain tissue

    International Nuclear Information System (INIS)

    Gavin, P.R.; Huiskamp, R.; Wheeler, F.J.; Griebenow, M.L.

    1993-01-01

    The macroscopic distribution of borocaptate sodium (Na₂B₁₂H₁₁SH or BSH) in normal tissues has been determined and can be accurately predicted from the blood concentration. The compound para-borono-phenylalanine (p-BPA) has also been studied in dogs and normal tissue distribution has been determined. The total physical dose required to reach a biological isoeffect appears to increase directly as the proportion of boron capture dose increases. This effect, together with knowledge of the macrodistribution, led to estimates of the influence of the microdistribution of the BSH compound. This paper reports a computer model that was used to predict the compound factor for BSH and p-BPA and, hence, the equivalent radiation in normal tissues. The compound factor would need to be calculated for other compounds with different distributions. This information is needed to design appropriate normal tissue tolerance studies for different organ systems and/or different boron compounds.

  10. Absolute quantification of pharmacokinetic distribution of RES colloids in individuals with normal liver function

    International Nuclear Information System (INIS)

    Herzog, H.; Spohr, G.; Notohamiprodjo, G.; Feinendegen, L.E.

    1987-01-01

    Estimates of the radiation dose resulting from liver-spleen scintigraphy with ⁹⁹ᵐTc-labelled colloids are based on pharmacokinetic data mainly determined in animals. The aim of this study was to check the pharmacokinetic data by direct, absolute in vivo quantification in man. Liver and spleen activities were directly measured using a double-energy window technique. Activities in other organs were quantified by conjugate whole-body scans. All measurement procedures were checked using the whole-body Alderson phantom. Pharmacokinetic data for sulphur colloid, tin colloid, human serum albumin (HSA) millimicrospheres, and phytate were obtained in 13 to 20 normal subjects for each type of colloid. Depending on the colloid type, liver uptake was between 54 and 75% of the total administered dose (TAD) and spleen uptake was 3.5 to 21% TAD. Activity measured in blood, urine, lung and thyroid proved to be far from negligible. The results of this work suggest a correction of the animal-based data of colloid distribution and radiation dose on the basis of the direct measurement of absolute uptake in man. (author)

  11. On the maximum entropy distributions of inherently positive nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Taavitsainen, A., E-mail: aapo.taavitsainen@gmail.com; Vanhanen, R.

    2017-05-11

    The multivariate log-normal distribution is used by many authors and statistical uncertainty propagation programs for inherently positive quantities. Sometimes it is claimed that the log-normal distribution results from the maximum entropy principle, if only means, covariances and inherent positiveness of quantities are known or assumed to be known. In this article we show that this is not true. Assuming a constant prior distribution, the maximum entropy distribution is in fact a truncated multivariate normal distribution – whenever it exists. However, its practical application to multidimensional cases is hindered by lack of a method to compute its location and scale parameters from means and covariances. Therefore, regardless of its theoretical disadvantage, use of other distributions seems to be a practical necessity. - Highlights: • Statistical uncertainty propagation requires a sampling distribution. • The objective distribution of inherently positive quantities is determined. • The objectivity is based on the maximum entropy principle. • The maximum entropy distribution is the truncated normal distribution. • Applicability of log-normal or normal distribution approximation is limited.
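
    In one dimension, where location and scale can still be recovered numerically from a given mean and variance, the highlighted claim can be illustrated directly: a moment-matched normal truncated to positive values has larger entropy than the log-normal with the same moments. A minimal sketch (the moment values are arbitrary):

```python
import numpy as np
from scipy import optimize, stats

def truncnorm_from_moments(mean, var):
    """Find location/scale of a normal truncated to (0, inf) whose first
    two moments match the given mean and variance (1-D illustration)."""
    def eqs(p):
        loc, log_scale = p
        scale = np.exp(log_scale)
        a = (0.0 - loc) / scale              # lower bound in standard units
        d = stats.truncnorm(a, np.inf, loc=loc, scale=scale)
        return [d.mean() - mean, d.var() - var]
    loc, log_scale = optimize.fsolve(eqs, x0=[mean, np.log(np.sqrt(var))])
    return loc, np.exp(log_scale)

# Match the moments of a positive quantity and compare entropies: the
# truncated normal should have the larger (i.e. maximal) entropy.
m, v = 2.0, 1.5
loc, scale = truncnorm_from_moments(m, v)
tn = stats.truncnorm(-loc / scale, np.inf, loc=loc, scale=scale)
sigma2 = np.log(1 + v / m**2)                       # matched log-normal
ln = stats.lognorm(np.sqrt(sigma2), scale=m * np.exp(-sigma2 / 2))
print("entropy truncated normal: %.4f" % tn.entropy())
print("entropy log-normal:       %.4f" % ln.entropy())
```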

  12. powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks

    Science.gov (United States)

    Murray, Steven G.

    2018-05-01

    powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
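
    The underlying methodology, sketched here in plain numpy rather than through the powerbox API: draw Fourier-space noise weighted by the square root of a chosen power spectrum, transform to real space for a Gaussian field, and exponentiate for a log-normal mock. The grid size, box length and spectral slope below are arbitrary choices.

```python
import numpy as np

# Gaussian random field on a 2-D grid with power spectrum P(k) ~ k^-2.
N, L = 256, 100.0                           # grid cells per side, box length
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)  # wavenumbers along one axis
kx, ky = np.meshgrid(k, k, indexing="ij")
kk = np.sqrt(kx**2 + ky**2)
with np.errstate(divide="ignore"):
    pk = np.where(kk > 0, kk**-2.0, 0.0)    # avoid the k = 0 singularity

# Draw Fourier amplitudes with variance ~ P(k); inverse FFT gives the field.
rng = np.random.default_rng(5)
noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
delta = np.fft.ifft2(noise * np.sqrt(pk)).real
delta *= 0.5 / delta.std()                  # normalize to a chosen sigma

# Log-normal transform: a strictly positive field with mean ~1, suitable
# as a mock galaxy overdensity.
density = np.exp(delta - delta.var() / 2)
print(density.mean(), density.min())
```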

  13. Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Directory of Open Access Journals (Sweden)

    Amin Zadeh Sarraf

    2013-09-01

    Full Text Available Purpose: Numerous companies are expecting their knowledge management (KM) to be performed effectively in order to leverage and transform the knowledge into competitive advantages. However, this raises a critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selection of a number of externally determined alternatives through distance measures. The entropy method is often used for assessing weights in the TOPSIS method. Entropy in information theory is a criterion used for measuring the amount of disorder represented by a discrete probability distribution. To reduce the degree of employees' resistance to implementing a new strategy, it seems necessary to take all managers' opinions into account. The normal distribution, considered the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, by considering 6 criteria for evaluating alternatives, the most appropriate KM strategy to implement in our company was ''Personalization''. Research limitations/implications: In this research, there are some assumptions that might affect the accuracy of the approach, such as the normal distribution of the sample and community. These assumptions can be changed in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the represented solution, the opinions of all managers are gathered and normalized by using the standard normal distribution and the central limit theorem. Keywords: Knowledge management; strategy; TOPSIS; Normal distribution; entropy
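
    A minimal sketch of the entropy-weighted TOPSIS ranking described above; the decision matrix (three hypothetical KM strategies scored on four benefit-type criteria) is invented, and the normalization here is the common vector normalization rather than the paper's statistical normalization.

```python
import numpy as np

def entropy_topsis(X, benefit):
    """Rank alternatives (rows of X) by TOPSIS with entropy-based weights.

    benefit[j] is True when larger values of criterion j are better.
    """
    # Entropy weights: more discriminating (lower-entropy) criteria weigh more.
    p = X / X.sum(axis=0)
    e = -(p * np.log(p)).sum(axis=0) / np.log(len(X))
    w = (1 - e) / (1 - e).sum()

    # Weighted vector-normalized matrix, ideal and anti-ideal points.
    v = w * X / np.sqrt((X**2).sum(axis=0))
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))

    # Relative closeness to the ideal solution (higher is better).
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((v - worst) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

# Hypothetical scores of 3 KM strategies on 4 benefit-type criteria.
X = np.array([[7., 8., 6., 9.],
              [8., 6., 7., 7.],
              [6., 9., 8., 6.]])
score = entropy_topsis(X, benefit=np.array([True] * 4))
print("closeness:", np.round(score, 3), "-> best:", int(score.argmax()))
```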

  14. A Vehicle for Bivariate Data Analysis

    Science.gov (United States)

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  15. Pharmacokinetics of tritiated water in normal and dietary-induced obese rats

    International Nuclear Information System (INIS)

    Shum, L.Y.; Jusko, W.J.

    1986-01-01

    Tritiated water disposition was characterized in normal and dietary-induced obese rats to assess pharmacokinetic concerns in calculating water space and estimating body fat. A monoexponential decline in serum tritium activity was observed in both groups of rats, thus facilitating use of various computational methods. The volume of distribution and the total clearance of tritium in obese rats were larger than in normal rats because of the increased body weight. The values of water space (volume of distribution) estimated from moment analysis or dose divided by serum tritium activity at time zero (extrapolated) or at 2 hr were all similar. Thus, obesity does not alter the distribution equilibrium time and distribution pattern of tritium, and the conventional 2-hr single blood sampling after intravenous injection is adequate to estimate the water space of normal and obese rats
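
    The "dose divided by extrapolated activity at time zero" estimate mentioned above amounts to a log-linear fit of the monoexponential decline; a minimal sketch with invented serum readings (units and values are hypothetical):

```python
import numpy as np

# Hypothetical serum readings after an intravenous dose; the monoexponential
# decline C(t) = C0 * exp(-k t) is fitted log-linearly, and the water space
# is the dose divided by the extrapolated activity at time zero.
t = np.array([2.0, 6.0, 12.0, 24.0, 48.0])   # h
c = np.array([9.1, 8.8, 8.3, 7.5, 6.1])      # serum tritium activity, nCi/ml
dose = 2500.0                                # administered dose, nCi

slope, log_c0 = np.polyfit(t, np.log(c), 1)
c0, k = np.exp(log_c0), -slope
print("water space V = dose/C0: %.0f ml" % (dose / c0))
print("total clearance k*V:     %.2f ml/h" % (k * dose / c0))
```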

  16. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    International Nuclear Information System (INIS)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu; Hirunuma, Rieko; Enomoto, Shuichi

    2001-01-01

    Because vanadium ions such as vanadyl (VO²⁺) and vanadate (VO₃⁻) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS) treatment on the tissue distribution of trace vanadium (⁴⁸V) and zinc (⁶⁵Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  17. Immunolocalization of transforming growth factor alpha in normal human tissues

    DEFF Research Database (Denmark)

    Christensen, M E; Poulsen, Steen Seier

    1996-01-01

    anchorage-independent growth of normal cells and was, therefore, considered as an "oncogenic" growth factor. Later, its immunohistochemical presence in normal human cells as well as its biological effects in normal human tissues have been demonstrated. The aim of the present investigation was to elucidate the distribution of the growth factor in a broad spectrum of normal human tissues. Indirect immunoenzymatic staining methods were used. The polypeptide was detected with a polyclonal as well as a monoclonal antibody. The polyclonal and monoclonal antibodies demonstrated almost identical immunoreactivity. TGF-alpha was found to be widely distributed in cells of normal human tissues derived from all three germ layers, most often in differentiated cells. In epithelial cells, three different kinds of staining patterns were observed, either diffuse cytoplasmic, cytoplasmic in the basal parts of the cells, or distinctly

  18. What is common becomes normal: the effect of obesity prevalence on maternal perception.

    Science.gov (United States)

    Binkin, N; Spinelli, A; Baglio, G; Lamberti, A

    2013-05-01

    This analysis investigates the poorly-known effect of the local prevalence of childhood obesity on mothers' perception of their children's weight status. In 2008, a national nutritional survey of children attending the third grade of elementary school was conducted in Italy. Children were measured and classified as underweight, normal weight, overweight and obese, using the International Obesity Task Force cut-offs for body mass index (BMI). A parental questionnaire included parental perception of their child's weight status (underweight, normal, a little overweight and a lot overweight). Regions were classified by childhood obesity prevalence; the associations between maternal perception and regional obesity prevalence, and maternal and child characteristics, were examined using bivariate and logistic regression analyses. Complete data were available for 37 590 children, of whom 24% were overweight and 12% obese. Mothers correctly identified the status of 84% of normal weight, 52% of overweight and 14% of obese children. Among overweight children, factors associated with underestimation of the child's weight included lower maternal education (adjusted odds ratio, aOR, 1.9; 95% confidence interval (CI) 1.6-2.4), residence in a high-obesity region (aOR 2.2; 95% CI 1.9-2.6), male gender (aOR 1.4; 95% CI 1.2-1.6) and the child's BMI. Higher regional obesity prevalence is associated with lower maternal perception, suggesting that what is common has a greater likelihood of being perceived as normal. As perception is a first step to change, it may be harder to intervene in areas with high obesity prevalence where intervention is most urgent. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    Science.gov (United States)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from the univariate analysis approach, which only considers the intensity of rainfalls at fixed time intervals. As several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved through utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are attained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded based on the typical IDF empirical formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.
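
    For the Gaussian copula case, the fit can be sketched by inverting Kendall's tau (rho = sin(pi*tau/2)) and simulating intensity-duration pairs through the empirical margins; the storm records below are simulated stand-ins, not the Malaysian station data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Hypothetical storm records: duration (h) and intensity (mm/h), negatively
# correlated, in the spirit of the paper's Kendall's-tau estimates.
dur = rng.gamma(2.0, 3.0, 500)
inten = 30.0 / (1.0 + dur) * rng.lognormal(0, 0.3, 500)

# Fit a Gaussian copula by inverting Kendall's tau: rho = sin(pi * tau / 2).
tau, _ = stats.kendalltau(inten, dur)
rho = np.sin(np.pi * tau / 2)

# Simulate (I, D) pairs: draw correlated normals, map to uniforms, then back
# through the empirical margins (quantile transform).
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], 2000)
u = stats.norm.cdf(z)
sim_i = np.quantile(inten, u[:, 0])
sim_d = np.quantile(dur, u[:, 1])
print("tau data: %.2f, tau simulated: %.2f"
      % (tau, stats.kendalltau(sim_i, sim_d)[0]))
```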

  20. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the capability of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of the outcomes displayed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which shows that the SI method has a slightly better performance than the DST technique. Therefore, SI and DST methods are advantageous to analyze groundwater capacity and scrutinize the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.

  1. Multivariate Marshall and Olkin Exponential Minification Process ...

    African Journals Online (AJOL)

    A stationary bivariate minification process with bivariate Marshall-Olkin exponential distribution that was earlier studied by Miroslav et al [15] is in this paper extended to a multivariate minification process with multivariate Marshall and Olkin exponential distribution as its stationary marginal distribution. The innovation and the ...

  2. The use of copulas to practical estimation of multivariate stochastic differential equation mixed effects models

    International Nuclear Information System (INIS)

    Rupšys, P.

    2015-01-01

    A system of stochastic differential equations (SDE) with mixed-effects parameters and multivariate normal copula density function were used to develop tree height model for Scots pine trees in Lithuania. A two-step maximum likelihood parameter estimation method is used and computational guidelines are given. After fitting the conditional probability density functions to outside bark diameter at breast height, and total tree height, a bivariate normal copula distribution model was constructed. Predictions from the mixed-effects parameters SDE tree height model calculated during this research were compared to the regression tree height equations. The results are implemented in the symbolic computational language MAPLE

  3. The use of copulas to practical estimation of multivariate stochastic differential equation mixed effects models

    Energy Technology Data Exchange (ETDEWEB)

    Rupšys, P. [Aleksandras Stulginskis University, Studenų g. 11, Akademija, Kaunas district, LT – 53361 Lithuania (Lithuania)

    2015-10-28

    A system of stochastic differential equations (SDE) with mixed-effects parameters and multivariate normal copula density function were used to develop tree height model for Scots pine trees in Lithuania. A two-step maximum likelihood parameter estimation method is used and computational guidelines are given. After fitting the conditional probability density functions to outside bark diameter at breast height, and total tree height, a bivariate normal copula distribution model was constructed. Predictions from the mixed-effects parameters SDE tree height model calculated during this research were compared to the regression tree height equations. The results are implemented in the symbolic computational language MAPLE.

  4. Discrete factor approximations in simultaneous equation models: estimating the impact of a dummy endogenous variable on a continuous outcome.

    Science.gov (United States)

    Mroz, T A

    1999-10-01

    This paper contains a Monte Carlo evaluation of estimators used to control for endogeneity of dummy explanatory variables in continuous outcome regression models. When the true model has bivariate normal disturbances, estimators using discrete factor approximations compare favorably to efficient estimators in terms of precision and bias; these approximation estimators dominate all the other estimators examined when the disturbances are non-normal. The experiments also indicate that one should liberally add points of support to the discrete factor distribution. The paper concludes with an application of the discrete factor approximation to the estimation of the impact of marriage on wages.

  5. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available The study deals with an analysis of data with the aim of improving the quality of statistical tools in the assembly processes of automobile seats. A normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are constantly more approaches to handling non-normal data. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution with the theoretical normal distribution on the basis of hypothesis testing, using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of 1st-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study closely processes the measured data of an airbag assembly and aims to obtain normally distributed data and apply statistical process control. The results of the contribution conclude in a statement of rejection of the null hypothesis (the measured variables do not follow the normal distribution); therefore it is necessary to begin working on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to a quality output of the whole statistical control of manufacturing processes.
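
    One common transformation route of the kind alluded to above is the Box-Cox family with a maximum-likelihood lambda; a minimal sketch on simulated skewed data (not the airbag assembly measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.lognormal(1.0, 0.6, 200)   # hypothetical skewed quality characteristic

# Goodness of fit before transformation: normality is rejected for skewed data.
print("raw:     Shapiro-Wilk p = %.3g" % stats.shapiro(x).pvalue)

# Box-Cox transformation with maximum-likelihood lambda, then retest.
xt, lam = stats.boxcox(x)
print("Box-Cox: lambda = %.2f, Shapiro-Wilk p = %.3g"
      % (lam, stats.shapiro(xt).pvalue))
```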

  6. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    Energy Technology Data Exchange (ETDEWEB)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu [Department of Analytical and Bioinorganic Chemistry, Kyoto Pharmaceutical University, Kyoto (Japan); Hirunuma, Rieko; Enomoto, Shuichi [Radioisotope Technology Division, Cyclotron Center, Institute of Physical and Chemical Research (RIKEN), Wako, Saitama (Japan)

    2001-05-01

    Because vanadium ions such as vanadyl (VO²⁺) and vanadate (VO₃⁻) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS) treatment on the tissue distribution of trace vanadium (⁴⁸V) and zinc (⁶⁵Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  7. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.

  8. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In the analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated.
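
    To make the Weibull-versus-normal question concrete, the fragment below fits both families to a strength sample and compares log-likelihoods; the data are simulated, not the tablet measurements of the study, and the units are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
strength = rng.normal(loc=80.0, scale=5.0, size=100)   # hypothetical sample

mu, sigma = stats.norm.fit(strength)
shape, loc, scale = stats.weibull_min.fit(strength)    # shape = Weibull modulus

# Compare the two fits via their log-likelihoods (higher is better).
ll_norm = stats.norm.logpdf(strength, mu, sigma).sum()
ll_weib = stats.weibull_min.logpdf(strength, shape, loc, scale).sum()
print(f"normal ll = {ll_norm:.1f}, Weibull ll = {ll_weib:.1f}, "
      f"Weibull modulus = {shape:.1f}")
```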

  9. UPLC-MS method for quantification of pterostilbene and its application to comparative study of bioavailability and tissue distribution in normal and Lewis lung carcinoma bearing mice.

    Science.gov (United States)

    Deng, Li; Li, Yongzhi; Zhang, Xinshi; Chen, Bo; Deng, Yulin; Li, Yujuan

    2015-10-10

    A UPLC-MS method was developed for the determination of pterostilbene (PTS) in the plasma and tissues of mice. PTS was separated on an Agilent Zorbax XDB-C18 column (50 × 2.1 mm, 1.8 μm) with a gradient mobile phase at a flow rate of 0.2 ml/min. Detection was performed by negative ion electrospray ionization in multiple reaction monitoring mode. The linear calibration curves of PTS in mouse plasma and tissues ranged from 1.0 to 5000 and 0.50 to 500 ng/ml (r(2)>0.9979), respectively, with lower limits of quantification (LLOQ) between 0.5 and 2.0 ng/ml. The accuracy and precision of the assay were satisfactory. The validated method was applied to the study of the bioavailability and tissue distribution of PTS in normal and Lewis lung carcinoma (LLC) bearing mice. The bioavailability of PTS (doses 14, 28 and 56 mg/kg) in normal mice was 11.9%, 13.9% and 26.4%, respectively, and the maximum level (82.1 ± 14.2 μg/g) was found in the stomach (dose 28 mg/kg). The bioavailability, peak concentration (Cmax), and time to peak concentration (Tmax) of PTS in LLC mice were increased compared with normal mice. The results indicated that the UPLC-MS method is reliable and that the bioavailability and tissue distribution of PTS in normal and LLC mice are dramatically different. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Spatial distribution of cannabinoid receptor type 1 (CB1) in normal canine central and peripheral nervous system.

    Directory of Open Access Journals (Sweden)

    Jessica Freundt-Revilla

    Full Text Available The endocannabinoid system is a regulatory pathway consisting of two main types of cannabinoid receptors (CB1 and CB2) and their endogenous ligands, the endocannabinoids. The CB1 receptor is highly expressed in the central and peripheral nervous systems (CNS and PNS) of mammals and is involved in neuromodulatory functions. Since endocannabinoids have been shown to be elevated in the cerebrospinal fluid of epileptic dogs, knowledge of the species-specific CB receptor expression in the nervous system is required. Therefore, we assessed the spatial distribution of CB1 receptors in the normal canine CNS and PNS. Immunohistochemistry of several regions of the brain, spinal cord, and peripheral nerves from a healthy four-week-old puppy, three six-month-old dogs, and one ten-year-old dog revealed strong dot-like immunoreactivity in the neuropil of the cerebral cortex, the Cornu Ammonis (CA) and dentate gyrus of the hippocampus, the midbrain, cerebellum, medulla oblongata, and grey matter of the spinal cord. Dense CB1 expression was found in fibres of the globus pallidus and substantia nigra surrounding immunonegative neurons. Astrocytes were consistently positive in all examined regions. CB1 also labelled neurons and satellite cells of the dorsal root ganglia, and myelinating Schwann cells in the PNS. These results demonstrate for the first time the spatial distribution of CB1 receptors in the healthy canine CNS and PNS, and can be used as a basis for further studies aiming to elucidate the physiological consequences of this particular anatomical and cellular distribution.

  11. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first concerns the continuous distributions and their relations. The second presents the discrete distributions. The third depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.

  12. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts (York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to the usual quantities, leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well-known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts for extreme structural models, i.e. where the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter (Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for the related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (±σ) for both
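
    For unweighted data, two of the subcases named above have closed-form slopes; the sketch below computes the genuine orthogonal (O) and reduced major-axis (R) slopes from sample moments. The synthetic data and variable names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 1.5 * x + rng.normal(scale=0.5, size=200)

sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]

# (O) orthogonal regression minimizes perpendicular distances:
b_orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
# (R) reduced major-axis slope is the geometric mean of the two OLS slopes:
b_rma = np.sign(sxy) * np.sqrt(syy / sxx)
print(f"orthogonal slope = {b_orth:.3f}, RMA slope = {b_rma:.3f}")
```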

  13. The rank of a normally distributed matrix and positive definiteness of a noncentral Wishart distributed matrix

    NARCIS (Netherlands)

    Steerneman, A. G. M.; van Perlo-ten Kleij, Frederieke

    2008-01-01

    If X ~ N_{n×k}(M, I_n ⊗ Σ), then S = X'X has the noncentral Wishart distribution W'_k(n, Σ; Λ), where Λ = M'M. Here Σ is allowed to be singular. It is well known that if Λ = 0, then S has a (central) Wishart distribution and S is positive definite with

  14. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach to estimating the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is resolved by penalizing the likelihood function; in the Bayesian framework, this amounts to incorporating an inverted-gamma prior into the likelihood function. A penalized version of the EM algorithm is derived, which remains explicit and intrinsically ensures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
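
    The sketch below is one plausible reading of such a penalized EM for a univariate two-component normal mixture: an inverted-gamma prior IG(alpha, beta) on each variance adds 2*beta to the numerator of the variance update, which keeps every estimate strictly positive and thus away from the likelihood singularities. Hyperparameters and initialization are illustrative, not those of the paper.

```python
import numpy as np
from scipy.stats import norm

def penalized_em(x, iters=200, alpha=1.0, beta=0.1):
    rng = np.random.default_rng(4)
    pi, mu = np.array([0.5, 0.5]), rng.choice(x, 2)
    var = np.array([np.var(x), np.var(x)])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * norm.pdf(x[:, None], mu, np.sqrt(var))
        w = dens / dens.sum(axis=1, keepdims=True)
        nk = w.sum(axis=0)
        # M-step; the 2*beta and 2*alpha + 2 terms come from the IG prior.
        pi = nk / len(x)
        mu = (w * x[:, None]).sum(axis=0) / nk
        s = (w * (x[:, None] - mu) ** 2).sum(axis=0)
        var = (s + 2 * beta) / (nk + 2 * alpha + 2)   # never collapses to zero
    return pi, mu, var

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(4, 0.5, 50)])
print(penalized_em(x))
```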

  15. X-ray emission from normal galaxies

    International Nuclear Information System (INIS)

    Speybroeck, L. van; Bechtold, J.

    1981-01-01

    A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)

  16. The clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast

    International Nuclear Information System (INIS)

    Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung

    2006-01-01

    We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From April 2003 to February 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included the age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, whether there was a past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue, and isoechoic subcutaneous fat tissue) at the sites of clinical concern, whether there was a change in imaging and/or physical examination results at follow-up, and whether there were biopsy results. The study period was chosen to allow a follow-up of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years), and 62 (58%) of the 107 patients were between 41 and 50 years old. The most common location of the palpable abnormalities was the upper outer portion of the breast (45%), and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed a similar distribution across all types of sonographic echo patterns. Twenty-three patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that unnecessary biopsies can be avoided, in favour of follow-up, in women with palpable abnormalities when both mammography and ultrasonography show normal tissue; however, this study was limited by its small sample size. Therefore, a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings

  17. Data Normalization to Accelerate Training for Linear Neural Net to Predict Tropical Cyclone Tracks

    Directory of Open Access Journals (Sweden)

    Jian Jin

    2015-01-01

    Full Text Available When a pure linear neural network (PLNN) is used to predict tropical cyclone tracks (TCTs) in the South China Sea, whether the data are normalized or not greatly affects the training process. In this paper, the min.-max. method and the normal distribution method, instead of the standard normal distribution, are applied to TCT data before modeling. We propose experimental schemes in which, with the min.-max. method, the min.-max. value pair of each variable is mapped to (−1, 1) and (0, 1); with the normal distribution method, each variable's mean and standard deviation pair is set to (0, 1) and (100, 1). We present the following results: (1) data scaled to similar intervals have similar effects, whether the min.-max. or normal distribution method is used; (2) mapping data to around 0 gives much faster training than mapping them to intervals far away from 0 or using unnormalized raw data, although all approaches can approach the same lower error level after a certain number of steps, as seen from their training error curves. This could be useful for deciding on a data normalization method when PLNN is used individually.
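
    The four scalings compared above reduce to two one-line functions; the target intervals and (mean, standard deviation) pairs below follow the experimental schemes in the abstract, while the raw data are stand-ins for actual TCT variables.

```python
import numpy as np

def min_max(x, lo=-1.0, hi=1.0):
    """Map min(x)..max(x) linearly onto (lo, hi)."""
    return lo + (hi - lo) * (x - x.min()) / (x.max() - x.min())

def mean_std(x, mean=0.0, std=1.0):
    """Rescale x to the given mean and standard deviation."""
    return mean + std * (x - x.mean()) / x.std()

x = np.random.default_rng(7).uniform(900, 1100, size=500)  # stand-in variable
for scaled in (min_max(x, -1, 1), min_max(x, 0, 1),
               mean_std(x, 0, 1), mean_std(x, 100, 1)):
    print(round(scaled.mean(), 3), round(scaled.min(), 3), round(scaled.max(), 3))
```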

  18. Distribution of age at menopause in two Danish samples

    DEFF Research Database (Denmark)

    Boldsen, J L; Jeune, B

    1990-01-01

    We analyzed the distribution of reported age at natural menopause in two random samples of Danish women (n = 176 and n = 150) to determine the shape of the distribution and to disclose any possible trends in the distribution parameters. It was necessary to correct the frequencies of the reported...... ages for the effect of differing ages at reporting. The corrected distribution of age at menopause differs from the normal distribution in the same way in both samples. Both distributions could be described by a mixture of two normal distributions. It appears that most of the parameters of the normal...... distribution mixtures remain unchanged over a 50-year time lag. The position of the distribution, that is, the mean age at menopause, however, increases slightly but significantly....
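
    Fitting a mixture of two normal distributions of the kind used above takes only a few lines with an EM implementation; here is a sketch assuming scikit-learn is available, applied to simulated ages rather than the Danish survey data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)
ages = np.concatenate([rng.normal(44, 4, 40),      # minor early component
                       rng.normal(51, 2.5, 260)])  # main component

gm = GaussianMixture(n_components=2, random_state=0).fit(ages.reshape(-1, 1))
print("weights:", gm.weights_.round(2))
print("means:  ", gm.means_.ravel().round(1))
print("sds:    ", np.sqrt(gm.covariances_).ravel().round(1))
```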

  19. MASCOT user's guide--Version 2.0: Analytical solutions for multidimensional transport of a four-member radionuclide decay chain in ground water

    International Nuclear Information System (INIS)

    Gureghian, A.B.

    1988-07-01

    The MASCOT code computes the two- and three-dimensional space-time dependent convective-dispersive transport of a four-member radionuclide decay chain in unbounded homogeneous porous media, for constant and radionuclide-dependent release, and assuming steady-state isothermal ground-water flow and parallel streamlines. The model can handle a single or multiple finite line source or a Gaussian distributed source in the two-dimensional case, and a single or multiple patch source or bivariate-normal distributed source in the three-dimensional case. The differential equations are solved by Laplace and Fourier transforms and a Gauss-Legendre integration scheme. 33 figs., 3 tabs
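
    The quadrature step of the solution method is standard Gauss-Legendre integration; the fragment below only illustrates that step on a known integral and is not the MASCOT transport kernel.

```python
import numpy as np

nodes, weights = np.polynomial.legendre.leggauss(16)  # 16-point rule on [-1, 1]

# Integrate f over [a, b] by mapping the nodes from [-1, 1].
a, b = 0.0, 2.0
f = lambda t: np.exp(-t) * np.cos(t)
x = 0.5 * (b - a) * nodes + 0.5 * (b + a)
integral = 0.5 * (b - a) * np.sum(weights * f(x))
print(integral)   # matches the analytic value 0.5*(1 + exp(-2)*(sin(2) - cos(2)))
```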

  20. Subchondral bone density distribution of the talus in clinically normal Labrador Retrievers.

    Science.gov (United States)

    Dingemanse, W; Müller-Gerbl, M; Jonkers, I; Vander Sloten, J; van Bree, H; Gielen, I

    2016-03-15

    Bones continually adapt their morphology to their load-bearing function. At the level of the subchondral bone, the density distribution is highly correlated with the loading distribution of the joint. Therefore, subchondral bone density distribution can be used to study joint biomechanics non-invasively. In addition, physiological and pathological joint loading is an important aspect of orthopaedic disease, and research focusing on joint biomechanics will benefit veterinary orthopaedics. This study was conducted to evaluate the density distribution in the subchondral bone of the canine talus, as a parameter reflecting long-term joint loading in the tarsocrural joint. Two main density maxima were found, one proximally on the medial trochlear ridge and one distally on the lateral trochlear ridge. All joints showed very similar density distribution patterns, and no significant differences were found in the localisation of the density maxima between left and right limbs or between dogs. Based on the density distribution, the lateral trochlear ridge is most likely subjected to the highest loads within the tarsocrural joint. The joint loading distribution is very similar between dogs of the same breed. In addition, the joint loading distribution supports previous suggestions of the important role of biomechanics in the development of OC lesions in the tarsus. Important benefits of computed tomographic osteoabsorptiometry (CTOAM), i.e. the possibility of in vivo imaging and temporal evaluation, make this technique a valuable addition to the field of veterinary orthopaedic research.

  1. Vibrational Spectra And Potential Energy Distributions of Normal Modes of N,N'-Etilenbis(P-Toluen sulfonamide)

    International Nuclear Information System (INIS)

    Alyar, S.

    2008-01-01

    N-substituted sulfonamides are well known for their diuretic, antidiabetic, antibacterial, antifungal, and anticancer activities, and are widely used in the therapy of patients. These important bioactive properties are strongly affected by the special features of the -CH2-SO2-NR- linker and by intramolecular motion. Thus, studies of the energetic and spatial properties of N-substituted sulfonamides are of great importance for improving our understanding of their biological activities and enhancing our ability to predict new drugs. Density functional theory at the B3LYP/6-31G(d,p) level has been applied to obtain the vibrational force field for the most stable conformation of N,N'-etilenbis(p-toluensulfonamit) (ptsen), which has a sulfonamide moiety. The results of these calculations have been compared with spectroscopic data to verify the accuracy of the calculation and the applicability of the DFT approach to ptsen. Additionally, complete normal coordinate analyses with scaled quantum mechanical (SQM) force fields were performed to derive the potential energy distributions (PED)

  2. Multivariate phase type distributions - Applications and parameter estimation

    DEFF Research Database (Denmark)

    Meisch, David

    The best known univariate probability distribution is the normal distribution. It is used throughout the literature in a broad field of applications. In cases where it is not sensible to use the normal distribution alternative distributions are at hand and well understood, many of these belonging...... and statistical inference, is the multivariate normal distribution. Unfortunately only little is known about the general class of multivariate phase type distribution. Considering the results concerning parameter estimation and inference theory of univariate phase type distributions, the class of multivariate...... projects and depend on reliable cost estimates. The Successive Principle is a group analysis method primarily used for analyzing medium to large projects in relation to cost or duration. We believe that the mathematical modeling used in the Successive Principle can be improved. We suggested a novel...

  3. Normal Anti-Invariant Submanifolds of Paraquaternionic Kähler Manifolds

    Directory of Open Access Journals (Sweden)

    Novac-Claudiu Chiriac

    2006-12-01

    Full Text Available We introduce normal anti-invariant submanifolds of paraquaternionic Kähler manifolds and study the geometric structures induced on them. We obtain necessary and sufficient conditions for the integrability of the distributions defined on a normal anti-invariant submanifold. Also, we present characterizations of local (global) anti-invariant products.

  4. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model.

    Directory of Open Access Journals (Sweden)

    Habib Baghirov

    Full Text Available The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma.

  5. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model

    Science.gov (United States)

    Baghirov, Habib; Snipstad, Sofie; Sulheim, Einar; Berg, Sigrid; Hansen, Rune; Thorsen, Frits; Mørch, Yrr; Åslund, Andreas K. O.

    2018-01-01

    The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma. PMID:29338016

  6. Comparison of the procedures of Fleishman and Ramberg et al. for generating non-normal data in simulation studies

    Directory of Open Access Journals (Sweden)

    Rebecca Bendayan

    2014-01-01

    Full Text Available Simulation techniques must be able to generate the types of distributions most commonly encountered in real data, for example, non-normal distributions. Two recognized procedures for generating non-normal data are Fleishman's linear transformation method and the method proposed by Ramberg et al., which is based on a generalization of the Tukey lambda distribution. This study compares these procedures in terms of the extent to which the distributions they generate fit their respective theoretical models, and it also examines the number of simulations needed to achieve this fit. To this end, the paper considers, in addition to the normal distribution, a series of non-normal distributions that are commonly found in real data, and then analyses fit according to the extent to which normality is violated and the number of simulations performed. The results show that the two data generation procedures behave similarly. As the degree of contamination of the theoretical distribution increases, so does the number of simulations required to ensure a good fit to the generated data. The two procedures generate more accurate normal and non-normal distributions when at least 7000 simulations are performed, although when the degree of contamination is severe (with values of skewness and kurtosis of 2 and 6, respectively) it is advisable to perform 15000 simulations.
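
    Fleishman's method draws Y = a + bZ + cZ^2 + dZ^3 from a standard normal Z, with a = -c and (b, c, d) solving the moment equations below for target skewness g1 and excess kurtosis g2. The system shown is the standard Fleishman one, while the targets, starting point, and sample size are illustrative choices.

```python
import numpy as np
from scipy.optimize import fsolve

def fleishman_system(p, g1, g2):
    b, c, d = p
    v = b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1                   # unit variance
    s = 2*c*(b**2 + 24*b*d + 105*d**2 + 2) - g1               # skewness
    k = 24*(b*d + c**2*(1 + b**2 + 28*b*d)
            + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - g2  # excess kurtosis
    return [v, s, k]

g1, g2 = 1.0, 2.0
b, c, d = fsolve(fleishman_system, x0=[1.0, 0.1, 0.0], args=(g1, g2))

z = np.random.default_rng(9).normal(size=200_000)
y = -c + b*z + c*z**2 + d*z**3
print(y.std(), ((y - y.mean())**3).mean())   # ~1 and ~g1, respectively
```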

  7. On the generation of log-Levy distributions and extreme randomness

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2011-01-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Levy distributions. The log-Levy distributions are the Levy counterparts of the log-normal distribution, they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Levy distributions emerge universally: the former in the case of a deterministic underlying setting, and the latter in the case of a stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot's extreme randomness. (paper)
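
    The mechanism in the deterministic case can be seen numerically: the log of a product of many positive random factors is a sum of independent terms, so the CLT pushes the product toward log-normality. The factor distribution and step count below are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
factors = rng.uniform(0.9, 1.1, size=(50_000, 200))   # 200 multiplicative steps
product = factors.prod(axis=1)

logs = np.log(product)
# Both statistics should be near 0 if the product is close to log-normal.
print("skewness:", stats.skew(logs), " excess kurtosis:", stats.kurtosis(logs))
```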

  8. Log-normality of indoor radon data in the Walloon region of Belgium

    International Nuclear Information System (INIS)

    Cinelli, Giorgia; Tondeur, François

    2015-01-01

    The deviations of the distribution of Belgian indoor radon data from the log-normal trend are examined. Simulated data are generated to provide a theoretical frame for understanding these deviations. It is shown that the 3-component structure of indoor radon (radon from subsoil, outdoor air and building materials) generates deviations in the low- and high-concentration tails, but this low-C trend can be almost completely compensated by the effect of measurement uncertainties and by possible small errors in background subtraction. The predicted low-C and high-C deviations are well observed in the Belgian data, when considering the global distribution of all data. The agreement with the log-normal model is improved when considering data organised in homogeneous geological groups. As the deviation from log-normality is often due to the low-C tail for which there is no interest, it is proposed to use the log-normal fit limited to the high-C half of the distribution. With this prescription, the vast majority of the geological groups of data are compatible with the log-normal model, the remaining deviations being mostly due to a few outliers, and rarely to a “fat tail”. With very few exceptions, the log-normal modelling of the high-concentration part of indoor radon data is expected to give reasonable results, provided that the data are organised in homogeneous geological groups. - Highlights: • Deviations of the distribution of Belgian indoor Rn data from the log-normal trend. • 3-component structure of indoor Rn: subsoil, outdoor air and building materials. • Simulated data generated to provide a theoretical frame for understanding deviations. • Data organised in homogeneous geological groups; better agreement with the log-normal

  9. The distribution of interlaboratory comparison data

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    2008-01-01

    The distribution of mutually consistent results from interlaboratory comparisons is expected to be leptokurtic, and readers are warned against accepting conclusions based on simulations assuming normality.

  10. Relationship between the spatial distribution of SMS messages reporting needs and building damage in 2010 Haiti disaster

    Directory of Open Access Journals (Sweden)

    C. Corbane

    2012-02-01

    Full Text Available Just 4 days after the M = 7.1 earthquake on 12 January 2010, Haitians could send SMS messages about their location and urgent needs through the on-line mapping platform Ushahidi. This real-time crowdsourcing of crisis information provided direct support to key humanitarian resources on the ground, including Search and Rescue teams. In addition to its use as a knowledge base for rescue operations and aid provision, the spatial distribution of geolocated SMS messages may represent an early indicator on the spatial distribution and on the intensity of building damage.

    This work explores the relationship between the spatial patterns of SMS messages and building damage. The latter is derived from the detailed damage assessment of individual buildings interpreted in post-earthquake airborne photos. The interaction between SMS messages and building damage is studied by analyzing the spatial structure of the corresponding bivariate patterns.

    The analysis is performed through the implementation of cross Ripley's K-function which is suitable for characterizing the spatial structure of a bivariate pattern, and more precisely the spatial relationship between two types of point sets located in the same study area.

    The results show a strong attraction between the patterns exhibited by SMS messages and building damage. The interactions identified between the two patterns suggest that geolocated SMS messages can be used as early indicators of the spatial distribution of building damage. Accordingly, a statistical model has been developed to map the distribution of building damage from the geolocated SMS pattern.

    The study presented in this paper is the first attempt to derive quantitative estimates on the spatial patterns of novel crowdsourced information and correlate these to established methods in damage assessment using remote sensing data. The consequences of the study findings for rapid damage detection in

  11. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2013-01-01

    Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but abortive colonies that fail to continue growing remain poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on the log-linear or log-log plot. By applying the simple model of branching processes to the linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit performed better in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components in the early (probability over 5 generations, whereas the abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.

  12. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution.

    Science.gov (United States)

    Lo, Kenneth; Gottardo, Raphael

    2012-01-01

    Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, this provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components.

  13. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in the social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.

  14. The Role of Wealth and Health in Insurance Choice: Bivariate Probit Analysis in China

    Directory of Open Access Journals (Sweden)

    Yiding Yue

    2014-01-01

    Full Text Available This paper captures the correlation between the choices of health insurance and pension insurance using the bivariate probit model and then studies the effect of wealth and health on insurance choice. Our empirical evidence shows that people who participate in a health care program are more likely to participate in a pension plan at the same time, while wealth and health have different effects on the choices of the health care program and the pension program. Generally, the higher an individual's wealth level, the more likely he is to participate in a health care program, but wealth has no effect on pension participation. Health status has opposite effects on the choices of health care programs and pension plans: the poorer an individual's health, the more likely he is to participate in health care programs, while the better his health, the more likely he is to participate in pension plans. When the investigation scope narrows down to commercial insurance, only health status has a significant effect, and only on commercial health insurance. The commercial insurance choices, and the insurance choices of the agricultural population, are more complicated.

  15. A NEW STATISTICAL PERSPECTIVE TO THE COSMIC VOID DISTRIBUTION

    International Nuclear Information System (INIS)

    Pycke, J-R; Russell, E.

    2016-01-01

    In this study, we obtain the size distribution of voids as a three-parameter redshift-independent log-normal void probability function (VPF) directly from the Cosmic Void Catalog (CVC). Although many statistical models of void distributions are based on the counts in randomly placed cells, the log-normal VPF that we obtain here is independent of the shape of the voids due to the parameter-free void finder of the CVC. We use three void populations drawn from the CVC generated by the Halo Occupation Distribution (HOD) Mocks, which are tuned to three mock SDSS samples to investigate the void distribution statistically and to investigate the effects of the environments on the size distribution. As a result, it is shown that void size distributions obtained from the HOD Mock samples are satisfied by the three-parameter log-normal distribution. In addition, we find that there may be a relation between the hierarchical formation, skewness, and kurtosis of the log-normal distribution for each catalog. We also show that the shape of the three-parameter distribution from the samples is strikingly similar to the galaxy log-normal mass distribution obtained from numerical studies. This similarity between void size and galaxy mass distributions may possibly indicate evidence of nonlinear mechanisms affecting both voids and galaxies, such as large-scale accretion and tidal effects. Considering the fact that in this study, all voids are generated by galaxy mocks and show hierarchical structures in different levels, it may be possible that the same nonlinear mechanisms of mass distribution affect the void size distribution.

  16. STOCHASTIC PRICING MODEL FOR THE REAL ESTATE MARKET: FORMATION OF LOG-NORMAL GENERAL POPULATION

    Directory of Open Access Journals (Sweden)

    Oleg V. Rusakov

    2015-01-01

    Full Text Available We construct a stochastic model of real estate pricing. The pricing construction is based on a sequential comparison of supply prices. We prove that, under standard assumptions imposed on the comparison coefficients, there exists a unique non-degenerate limit in distribution, and this limit has the log-normal law of distribution. We verify the agreement of empirical price distributions with the theoretically obtained log-normal distribution against extensive statistical data on real estate prices from Saint-Petersburg (Russia). For establishing this agreement we essentially apply the efficient and sensitive Kolmogorov-Smirnov goodness-of-fit test. Basing on “The Russian Federal Estimation Standard N2”, we conclude that the most probable price, i.e. the mode of the distribution, is correctly and uniquely defined under the log-normal approximation. Since the mean of a log-normal distribution exceeds its mode (the most probable value), it follows that prices valued by the mathematical expectation are systematically overstated.
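
    The verification step can be sketched as follows: fit a log-normal to price data, apply the Kolmogorov-Smirnov test, and then compare mode, median, and mean. The prices below are simulated, not the Saint-Petersburg listings, and fitting before testing makes the p-value somewhat optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
prices = rng.lognormal(mean=11.0, sigma=0.35, size=1000)   # hypothetical data

shape, loc, scale = stats.lognorm.fit(prices, floc=0)
ks = stats.kstest(prices, 'lognorm', args=(shape, loc, scale))
print("KS p-value:", ks.pvalue)          # large p => log-normal not rejected

# For a log-normal: mode < median < mean, so valuing by the mean overstates.
mode = scale * np.exp(-shape**2)
print(mode, np.median(prices), prices.mean())
```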

  17. Fluid Distribution Pattern in Adult-Onset Congenital, Idiopathic, and Secondary Normal-Pressure Hydrocephalus: Implications for Clinical Care.

    Science.gov (United States)

    Yamada, Shigeki; Ishikawa, Masatsune; Yamamoto, Kazuo

    2017-01-01

    In spite of growing evidence on idiopathic normal-pressure hydrocephalus (NPH), views on clinical care for idiopathic NPH remain controversial. This continuing divergence of viewpoints may be due to the confusing classification of idiopathic and adult-onset congenital NPH. To clarify the classification of NPH, we propose that adult-onset congenital NPH be explicitly distinguished from idiopathic and secondary NPH. On the basis of conventional CT scans or MRI, idiopathic NPH was defined as narrow sulci at the high convexity concurrent with enlargement of the ventricles, basal cistern, and Sylvian fissure, whereas adult-onset congenital NPH was defined as huge ventricles without high-convexity tightness. We compared clinical characteristics and cerebrospinal fluid distribution among 85 patients diagnosed with idiopathic NPH, 17 patients with secondary NPH, and 7 patients with adult-onset congenital NPH. All patients underwent 3-T MRI examinations and tap tests. The volumes of the ventricles and subarachnoid spaces were measured using a 3D workstation based on T2-weighted 3D sequences. The mean intracranial volume for the patients with adult-onset congenital NPH was almost 100 mL larger than the volumes for patients with idiopathic and secondary NPH. Compared with the patients with idiopathic or secondary NPH, patients with adult-onset congenital NPH exhibited larger ventricles but normally sized subarachnoid spaces. The mean volume ratio of the high-convexity subarachnoid space was significantly smaller in idiopathic NPH than in adult-onset congenital NPH, whereas the mean volume ratio of the basal cistern and Sylvian fissure in idiopathic NPH was more than twice that in adult-onset congenital NPH. The symptoms of gait disturbance, cognitive impairment, and urinary incontinence in patients with adult-onset congenital NPH tended to progress more slowly than in patients with idiopathic NPH. Cerebrospinal fluid distributions and

  18. Estimation of rank correlation for clustered data.

    Science.gov (United States)

    Rosner, Bernard; Glynn, Robert J

    2017-06-30

    It is well known that the sample correlation coefficient (R_xy) is the maximum likelihood estimator of the Pearson correlation (ρ_xy) for independent and identically distributed (i.i.d.) bivariate normal data. However, this is not true for ophthalmologic data where X (e.g., visual acuity) and Y (e.g., visual field) are available for each eye and there is positive intraclass correlation for both X and Y in fellow eyes. In this paper, we provide a regression-based approach for obtaining the maximum likelihood estimator of ρ_xy for clustered data, which can be implemented using standard mixed effects model software. This method is also extended to allow for estimation of partial correlation by controlling both X and Y for a vector U of other covariates. In addition, these methods can be extended to allow for estimation of rank correlation for clustered data by (i) converting ranks of both X and Y to the probit scale, (ii) estimating the Pearson correlation between probit scores for X and Y, and (iii) using the relationship between Pearson and rank correlation for bivariate normally distributed data. The validity of the methods in finite-sized samples is supported by simulation studies. Finally, two examples from ophthalmology and analgesic abuse are used to illustrate the methods. Copyright © 2017 John Wiley & Sons, Ltd.
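
    An i.i.d. illustration of steps (i)-(iii), leaving aside the clustered mixed-model machinery of the paper: ranks are mapped to the probit scale, the Pearson correlation of the probit scores estimates ρ, and the bivariate-normal relation r_s = (6/π)·arcsin(ρ/2) converts it to a rank correlation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
rho = 0.5
x, y = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000).T

def probit_scores(v):
    ranks = stats.rankdata(v)
    return stats.norm.ppf(ranks / (len(v) + 1))   # ranks mapped into (0, 1)

r = np.corrcoef(probit_scores(x), probit_scores(y))[0, 1]
r_s = (6 / np.pi) * np.arcsin(r / 2)
print(r_s, stats.spearmanr(x, y).correlation)     # the two should be close
```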

  19. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of the normal, log-normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W-test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of each data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
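
    The W test referred to above is the Shapiro-Wilk test; a quick sketch of screening one data set against the three candidate families might test the raw values, their logarithms, and a fitted Weibull, as below on surrogate data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
data = rng.lognormal(0.0, 0.5, size=60)        # surrogate surveillance data

print("normal     W-test p =", stats.shapiro(data).pvalue)
print("log-normal W-test p =", stats.shapiro(np.log(data)).pvalue)

c, loc, scale = stats.weibull_min.fit(data, floc=0)
print("Weibull KS p =",
      stats.kstest(data, 'weibull_min', args=(c, loc, scale)).pvalue)
```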

  20. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2010-12-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.

  1. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Genton, Marc G.

    2010-01-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.

  2. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit would correspond to an annual death rate comparable with 'high-risk' industries if all workers were continuously exposed at the dose limit. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report, UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality typically observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an ''effective dose limit'' and such a model fits very well the departure from log-normality observed in actual dose distributions. (author)

  3. Combining counts and incidence data: an efficient approach for estimating the log-normal species abundance distribution and diversity indices.

    Science.gov (United States)

    Bellier, Edwige; Grøtan, Vidar; Engen, Steinar; Schartau, Ann Kristin; Diserud, Ola H; Finstad, Anders G

    2012-10-01

    Obtaining accurate estimates of diversity indices is difficult because the number of species encountered in a sample increases with sampling intensity. We introduce a novel method that requires only that the presence of species in a sample be assessed, while counts of the number of individuals per species are needed for just a small part of the sample. To account for species included as incidence data in the species abundance distribution, we modify the likelihood function of the classical Poisson log-normal distribution. Using simulated community assemblages, we contrast diversity estimates based on a community sample, a subsample randomly extracted from the community sample, and a mixture sample where incidence data are added to a subsample. We show that the mixture sampling approach provides more accurate estimates than the subsample, at little extra cost. Diversity indices estimated from a freshwater zooplankton community sampled using the mixture approach show the same pattern of results as the simulation study. Our method efficiently increases the accuracy of diversity estimates and the comprehension of the left tail of the species abundance distribution. We show how to choose the sample sizes needed to strike a compromise between information gained, accuracy of the estimates, and cost expended when assessing biological diversity. The sample size estimates are obtained from key community characteristics, such as the expected number of species in the community, the expected number of individuals in a sample, and the evenness of the community.

  4. Distribution Functions of Sizes and Fluxes Determined from Supra-Arcade Downflows

    Science.gov (United States)

    McKenzie, D.; Savage, S.

    2011-01-01

    The frequency distributions of sizes and fluxes of supra-arcade downflows (SADs) provide information about the process of their creation. For example, a fractal creation process may be expected to yield a power-law distribution of sizes and/or fluxes. We examine 120 cross-sectional areas and magnetic flux estimates found by Savage & McKenzie for SADs, and find that (1) the areas are consistent with a log-normal distribution and (2) the fluxes are consistent with both a log-normal and an exponential distribution. Neither set of measurements is compatible with a power-law distribution nor a normal distribution. As a demonstration of the applicability of these findings to improved understanding of reconnection, we consider a simple SAD growth scenario with minimal assumptions, capable of producing a log-normal distribution.
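
    The model comparison described above can be sketched by fitting the candidate families by maximum likelihood and ranking them with an information criterion; the sample below stands in for the SAD area or flux measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
sizes = rng.lognormal(mean=2.0, sigma=0.7, size=120)   # stand-in for SAD areas

for name, dist in [("log-normal", stats.lognorm),
                   ("exponential", stats.expon),
                   ("power-law (Pareto)", stats.pareto)]:
    params = dist.fit(sizes)
    ll = dist.logpdf(sizes, *params).sum()
    aic = 2 * len(params) - 2 * ll
    print(f"{name:20s} AIC = {aic:7.1f}")            # smaller AIC fits better
```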

  5. A Hybrid Forecasting Model Based on Bivariate Division and a Backpropagation Artificial Neural Network Optimized by Chaos Particle Swarm Optimization for Day-Ahead Electricity Price

    Directory of Open Access Journals (Sweden)

    Zhilong Wang

    2014-01-01

    Full Text Available In the electricity market, the electricity price plays an inevitable role. Nevertheless, accurate price forecasting, a vital factor affecting both government regulatory agencies and public power companies, remains a huge challenge and a critical problem. Determining how to address the accurate forecasting problem becomes an even more significant task in an era in which electricity is increasingly important. Based on chaos particle swarm optimization (CPSO), the backpropagation artificial neural network (BPANN), and the idea of bivariate division, this paper proposes a bivariate division BPANN (BD-BPANN) method and the CPSO-BD-BPANN method for forecasting electricity price. The former method creatively transforms the electricity demand and price into a new variable, named DV, which is calculated using the division principle, to forecast the day-ahead electricity price by multiplying the forecasted values of the DVs and the forecasted values of demand. Next, to improve the accuracy of BD-BPANN, chaos particle swarm optimization and BD-BPANN are synthesized to form a novel model, CPSO-BD-BPANN. In this study, CPSO is utilized to optimize the initial parameters of BD-BPANN to make its output more stable than that of the original model. Finally, two forecasting strategies are proposed for different situations.

  6. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.

  7. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000), based on four independent metrics: sales or revenues, profits, assets, and market value. Each of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% for sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or log-normal distributions. In addition, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.

  8. Continuous multivariate exponential extension

    International Nuclear Information System (INIS)

    Block, H.W.

    1975-01-01

    The Freund-Weinman multivariate exponential extension is generalized to the case of nonidentically distributed marginal distributions. A fatal shock model is given for the resulting distribution. Results in the bivariate case and the concept of constant multivariate hazard rate lead to a continuous distribution related to the multivariate exponential distribution (MVE) of Marshall and Olkin. This distribution is shown to be a special case of the extended Freund-Weinman distribution. A generalization of the bivariate model of Proschan and Sullo leads to a distribution which contains both the extended Freund-Weinman distribution and the MVE

  9. Hedging effectiveness and volatility models for crude oil market: a dynamic approach; Modelos de volatilidade e a efetividade do hedge no mercado de petroleo: um abordagem dinamica

    Energy Technology Data Exchange (ETDEWEB)

    Salles, Andre Assis de [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)

    2012-07-01

    Hedge strategies give traders holding short and long positions in the market protection against price fluctuations. This paper examines the performance of bivariate volatility models for the spot and future returns of the West Texas Intermediate (WTI) crude oil barrel price. Besides the volatility of the spot and future return series, the hedge ratio strategy is examined through its hedge effectiveness. The study thus presents hedge strategies built on methodologies for modeling the variance of crude oil returns in the spot and future markets, and the covariance between these two market returns, which are the inputs of the hedge strategy shown in this work. Among the models studied, the bivariate GARCH in Diagonal VECH and BEKK representations was chosen, combined with three different models for the mean: a bivariate autoregressive model, a vector autoregressive model and a vector error correction model. The methodologies used here relax the assumptions of homoscedasticity and normality for the return distributions. The data are logarithmic returns of daily prices quoted in dollars per barrel from November 2008 to May 2010 for spot and future contracts, in particular the June contract. (author)
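    The core quantity in any such strategy is the minimum-variance hedge ratio h_t = Cov_t(r_s, r_f) / Var_t(r_f). A minimal sketch, substituting a simple EWMA filter for the paper's bivariate GARCH (Diagonal VECH/BEKK) estimates and synthetic returns for the WTI data:

```python
import numpy as np

def ewma_hedge_ratio(spot_ret, fut_ret, lam=0.94):
    """Minimum-variance hedge ratio h_t = cov_t(s, f) / var_t(f),
    with EWMA conditional moments standing in for bivariate GARCH."""
    cov = np.cov(spot_ret, fut_ret)[0, 1]   # initialize at sample moments
    var = np.var(fut_ret)
    h = np.empty(len(fut_ret))
    for t, (s, f) in enumerate(zip(spot_ret, fut_ret)):
        cov = lam * cov + (1 - lam) * s * f
        var = lam * var + (1 - lam) * f * f
        h[t] = cov / var
    return h

def hedge_effectiveness(spot_ret, fut_ret, h):
    """Variance reduction of the hedged position versus the unhedged spot."""
    return 1 - np.var(spot_ret - h * fut_ret) / np.var(spot_ret)

rng = np.random.default_rng(0)
f = rng.normal(0.0, 0.02, 500)            # synthetic future returns
s = 0.9 * f + rng.normal(0.0, 0.01, 500)  # synthetic spot returns
h = ewma_hedge_ratio(s, f)
print(round(h[-1], 3), round(hedge_effectiveness(s, f, h), 3))
```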

  10. A new normalization method based on electrical field lines for electrical capacitance tomography

    International Nuclear Information System (INIS)

    Zhang, L F; Wang, H X

    2009-01-01

    Electrical capacitance tomography (ECT) is considered to be one of the most promising process tomography techniques. Image reconstruction for ECT is an inverse problem: finding the spatially distributed permittivities in a pipe. Usually, the capacitance measurements obtained from the ECT system are normalized between the high- and low-permittivity references for image reconstruction. The parallel normalization model is commonly used during the normalization process; it assumes that the materials are distributed in parallel, so the normalized capacitance is a linear function of the measured capacitance. A more recently used model is the series normalization model, which makes the normalized capacitance a nonlinear function of the measured capacitance. The newest model is based on electrical field centre lines (EFCL) and is a mixture of the two normalization models. The multi-threshold method of this model is presented in this paper. Sensitivity matrices based on the different normalization models were obtained, and image reconstruction was carried out accordingly. Simulation results indicate that reconstructed images of higher quality can be obtained with the presented model.
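    The parallel and series models referred to above have simple closed forms; a minimal sketch (the EFCL-based mixture model itself depends on field-line computations not reproduced here):

```python
def normalize_parallel(cm, cl, ch):
    """Parallel model: normalized capacitance is linear in the measurement."""
    return (cm - cl) / (ch - cl)

def normalize_series(cm, cl, ch):
    """Series model: nonlinear in the measurement, equivalent to
    normalizing the reciprocals 1/C."""
    return (1.0 / cm - 1.0 / cl) / (1.0 / ch - 1.0 / cl)

# cl, ch: capacitances with the pipe full of low/high permittivity material.
cl, ch, cm = 1.0, 4.0, 2.0
print(normalize_parallel(cm, cl, ch), normalize_series(cm, cl, ch))
```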

  11. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...

  12. Bimodal distribution of the magnetic dipole moment in nanoparticles with a monomodal distribution of the physical size

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2015-01-01

    High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal distribution of the magnetic dipole moment. Here, we test this assumption for different types of superparamagnetic iron oxide nanoparticles in the 5–20 nm range, by multimodal fitting of magnetization curves using the MINORIM inversion method. The particles are studied while in dilute colloidal dispersion in a liquid, thereby preventing hysteresis and diminishing the effects of magnetic anisotropy on the interpretation of the magnetization curves. For two different types of well crystallized particles, the magnetic distribution is indeed log-normal, as expected from the physical size distribution. However, two other types of particles, with twinning defects or inhomogeneous oxide phases, are found to have a bimodal magnetic distribution. Our qualitative explanation is that relatively low fields are sufficient to begin aligning the particles in the liquid on the basis of their net dipole moment, whereas higher fields are required to align the smaller domains or less magnetic phases inside the particles. - Highlights: • Multimodal fits of dilute ferrofluids reveal when the particles are multidomain. • No a priori shape of the distribution is assumed by the MINORIM inversion method. • Well crystallized particles have log-normal TEM and magnetic size distributions. • Defective particles can combine a monomodal size and a bimodal dipole moment.

  13. Neutron and PIMC determination of the longitudinal momentum distribution of HCP, BCC and normal liquid 4He

    International Nuclear Information System (INIS)

    Blasdell, R.C.; Ceperley, D.M.; Simmons, R.O.

    1993-07-01

    Deep inelastic neutron scattering has been used to measure the neutron Compton profile (NCP) of a series of condensed ⁴He samples at densities from 28.8 atoms/nm³ (essentially the minimum possible density in the solid phase) up to 39.8 atoms/nm³, using a chopper spectrometer at the Argonne National Laboratory Intense Pulsed Neutron Source. At the lowest density, the NCP was measured along an isochore through the hcp, bcc, and normal liquid phases. Average atomic kinetic energies are extracted from each of the data sets and are compared to both published and new path-integral Monte Carlo (PIMC) calculations as well as other theoretical predictions. In this preliminary analysis of the data, account is taken of the effects of instrumental resolution, multiple scattering, and final-state interactions. Both our measurements and the PIMC theory show that there are only small differences in the kinetic energy and longitudinal momentum distribution of isochoric helium samples, regardless of their phase or crystal structure.

  14. Dose distribution following selective internal radiation therapy

    International Nuclear Information System (INIS)

    Fox, R.A.; Klemp, P.F.; Egan, G.; Mina, L.L.; Burton, M.A.; Gray, B.N.

    1991-01-01

    Selective Internal Radiation Therapy is the intrahepatic arterial injection of microspheres labelled with ⁹⁰Y. The microspheres lodge in the precapillary circulation of the tumor, resulting in internal radiation therapy. The activity of the ⁹⁰Y injected is managed by successive administrations of labelled microspheres, the liver being probed after each injection with a calibrated beta probe to assess the dose to the superficial layers of normal tissue. Predicted doses of 75 Gy have been delivered without subsequent evidence of radiation damage to normal cells. This contrasts with the complications resulting from doses in excess of 30 Gy delivered by external beam radiotherapy. Detailed analysis of the microsphere distribution in a cubic centimeter of normal liver, and calculation of the dose on a three-dimensional fine grid, have shown that the radiation distribution created by the finite size and distribution of the microspheres results in a highly heterogeneous dose pattern. It has been shown that a third of normal liver will receive less than 33.7% of the dose predicted by assuming a homogeneous distribution of ⁹⁰Y.

  15. Organ distribution of 111In-oxine labeled lymphocytes in normal subjects and in patients with chronic lymphocytic leukemia and malignant lymphoma

    International Nuclear Information System (INIS)

    Matsuda, Shin; Uchida, Tatsumi; Yui, Tokuo; Kariyone, Shigeo

    1982-01-01

    T and B lymphocyte survival and organ distribution were studied by using ¹¹¹In-oxine labeled autologous lymphocytes in 3 normal subjects, 3 patients with chronic lymphocytic leukemia (CLL) and 9 with malignant lymphoma (ML). Disappearance curves of the labeled lymphocytes showed two exponential components in all cases. The half time of the first component was within 1 hour in all cases. That of the second component was 50.7 ± 6.4 hours for all lymphocytes, 52.0 ± 5.5 hours for T lymphocytes and 31.6 ± 4.9 hours for B lymphocytes in normal subjects; 192.6 hours for T-CLL and 57.7 ± 46.9 hours for B-CLL; and 60.2 ± 30.7 hours for the T cell type of malignant lymphoma (T-ML) and 63.7 ± 24.5 hours for the B cell type (B-ML). These data suggest that the all-lymphocyte disappearance curve chiefly reflected the T lymphocyte disappearance curve, and that the half time of B lymphocytes was shorter than that of T lymphocytes. In T-CLL, the half time of the second component was greatly prolonged in comparison with that of normal T lymphocytes. The labeled cells accumulated in the lungs, spleen and liver immediately after the infusion, and most remarkably in the spleen 1 hour after the infusion in all cases. Radioactivity over the bone marrow was observed from 1 hour in all cases; that over the lymph nodes was first noticed 18 hours after the infusion in T-CLL and T-ML and 68 hours in B-CLL, but was not noticed in normal subjects or B-ML. The recovery of labeled cells in the blood was 28.5 ± 7.9% for all lymphocytes, 19.7 ± 1.9% for T lymphocytes and 11.0 ± 5.1% for B lymphocytes in normal subjects; 25.8 ± 1.6% for CLL; and 17.6 ± 11.0% for T-ML and 7.7 ± 5.2% for B-ML. (J.P.N.)

  16. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performance with 90% sensitivity and 89% specificity. (author)

  17. Research on Normal Human Plantar Pressure Test

    Directory of Open Access Journals (Sweden)

    Liu Xi Yang

    2016-01-01

    Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 SCM were used to design an insole pressure collection system, with an HMI for data acquisition built in LabVIEW. A quantity of normal human foot pressure data was collected, and the pressure distribution relations over the five stages of the swing phase during walking were statistically analysed. Grid closeness degree was used for plantar pressure distribution pattern recognition, and simulation of the algorithm together with experimental results demonstrated that the method is feasible.

  18. Sonographic measurement of normal thyroid gland in the neonates

    International Nuclear Information System (INIS)

    Jeon, Gwun; Shin, Ji Hoon; Hong, Hyun Sook; Kim, Jung Hoon; Hwang, Jung Hwa; Goo, Dong Erk; Choi, Deuk Lin; Yang, Seung Boo

    2001-01-01

    The purpose of this study was to establish sonographic measurements of the normal thyroid gland in neonates. Ultrasonographic evaluation of the thyroid gland was performed in the first week of life in 107 term neonates. The serum level of thyroid stimulating hormone was normal in all neonates. Their gestational age at birth ranged from 37 to 41 weeks and birth weight was 3305 ± 457.74 g. Sonography was performed with a 7.5 MHz linear array transducer (SA-8800 MT, Medison, Korea) within the first week of postnatal age. Maximal transverse (T), anteroposterior (AP), and longitudinal (L) dimensions of the thyroid gland were measured. The volume of each lobe was estimated by using the standard geometric formula for a prolate ellipsoid: volume = T × AP × L × π/6. The total volume of the thyroid gland was calculated as the sum of the two lobes. Correlations of total thyroid volume with the weight, height, body surface area and gestational age of the neonate were estimated by Pearson's coefficient on bivariate correlation analysis. Total thyroid volume was 0.68 ± 0.23 cm³, and the left and right lobe volumes were 0.32 ± 0.12 cm³ and 0.36 ± 0.14 cm³, respectively. The T, AP, and L dimensions of the right lobe were 0.69 ± 0.14 cm, 0.71 ± 0.13 cm and 1.37 ± 0.22 cm, respectively, and those of the left lobe were 0.70 ± 0.11 cm, 0.65 ± 0.13 cm and 1.31 ± 0.21 cm, respectively. The Pearson's coefficients for total thyroid volume with the weight and body surface area of the neonate were 0.385 and 0.395 (p < 0.05). In conclusion, the total thyroid volume was 0.68 ± 0.23 cm³ and was significantly correlated with the weight and body surface area.
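    The lobe-volume arithmetic is easy to check; a small sketch using the mean lobe dimensions reported above (small differences from the reported volumes are due to rounding):

```python
import math

def lobe_volume(t_cm, ap_cm, l_cm):
    """Prolate ellipsoid volume: V = T * AP * L * pi / 6 (cm^3)."""
    return t_cm * ap_cm * l_cm * math.pi / 6.0

right = lobe_volume(0.69, 0.71, 1.37)  # mean right-lobe dimensions (cm)
left = lobe_volume(0.70, 0.65, 1.31)   # mean left-lobe dimensions (cm)
print(round(right, 2), round(left, 2), round(right + left, 2))
# ~0.35, ~0.31 and ~0.66 cm^3, consistent with the reported 0.36, 0.32, 0.68
```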

  19. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, but few investigations have examined the velocity distributions of human mobility. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analysis of the finish-time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of the subset of finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners across eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (the first few courses) to a Gaussian distribution at the middle stage (the middle courses), and back to a log-normal distribution at the last stage (the final courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens again in the last course of the middle stage, a transition from Gaussian back to log-normal occurs at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
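    A minimal sketch of the kind of distribution fitting involved, using synthetic velocities in place of the marathon split records (the parameter values are ours):

```python
import numpy as np
from scipy import stats

# Synthetic velocities (m/s) standing in for marathon split records.
rng = np.random.default_rng(1)
velocities = rng.lognormal(mean=1.0, sigma=0.15, size=5000)

# Fit a log-normal (location fixed at zero), then test the fit.
shape, loc, scale = stats.lognorm.fit(velocities, floc=0)
ks = stats.kstest(velocities, 'lognorm', args=(shape, loc, scale))
print(round(shape, 3), round(scale, 3), round(ks.pvalue, 3))
```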

  20. Site-dependent distribution of macrophages in normal human extraocular muscles

    NARCIS (Netherlands)

    Schmidt, E. D.; van der Gaag, R.; Mourits, M. P.; Koornneef, L.

    1993-01-01

    PURPOSE: Clinical data indicate that extraocular muscles have different susceptibilities for some orbital immune disorders depending on their anatomic location. The resident immunocompetent cells may be important mediators in the local pathogenesis of such disorders, so the distribution of these cells in normal human extraocular muscles was determined.

  1. Analysis of the normal optical, Michel and molecular potentials on ...

    Indian Academy of Sciences (India)

    Journal of physics, June 2016, pp. 1275–1286. ... the levels are obtained for the three optical potentials to estimate the quality ... The experimental angular distribution data for the 40Ca(6Li, d)44Ti reaction ... analysed using the normal optical, Michel and molecular potentials within the framework.

  2. On the distribution of DDO galaxies

    International Nuclear Information System (INIS)

    Sharp, N.A.; Jones, B.J.T.; Jones, J.E.

    1978-01-01

    The distribution of DDO galaxies on the sky and their relationship to normal galaxies have been examined. The results appear to contradict the universality of the luminosity function for galaxies. They also indicate that DDO galaxies are preferentially found in the vicinity of normal galaxies, but not uniformly in that they tend to avoid clusters. This may be due to the dependence of distribution upon morphological type. (author)

  3. PHARMACOKINETIC VARIATIONS OF OFLOXACIN IN NORMAL AND FEBRILE RABBITS

    Directory of Open Access Journals (Sweden)

    M. AHMAD, H. RAZA, G. MURTAZA AND N. AKHTAR

    2008-12-01

    Full Text Available The influence of experimentally Escherichia coli-induced fever (EEIF) on the pharmacokinetics of ofloxacin was evaluated. Ofloxacin was administered at 20 mg/kg body weight intravenously to a group of eight healthy rabbits, and these results were compared to values in the same eight rabbits with EEIF. Pharmacokinetic parameters of ofloxacin in normal and febrile rabbits were determined by using a two-compartment open kinetic model. Peak plasma level (Cmax) and area under the plasma concentration-time curve (AUC0-α) in normal and febrile rabbits did not differ (P>0.05). However, the area under the first moment of the plasma concentration-time curve (AUMC0-α) in febrile rabbits was significantly (P<0.05) higher than that in normal rabbits. Mean values for the elimination rate constant (Ke), elimination half life (t1/2β) and apparent volume of distribution (Vd) were significantly (P<0.05) lower in febrile rabbits compared to normal rabbits, while the mean residence time (MRT) and total body clearance (Cl) of ofloxacin did not show any significant difference between normal and febrile rabbits. The clinical significance of the above results relates to the changes in the volume of distribution and elimination half life, which illustrate an altered steady state in the febrile condition; hence an adjustment of the dosage regimen in EEIF is required.

  4. Determining prescription durations based on the parametric waiting time distribution

    DEFF Research Database (Denmark)

    Støvring, Henrik; Pottegård, Anton; Hallas, Jesper

    2016-01-01

    ... a two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD, and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies (Log-Normal IAD). When the IAD consisted of a mixture of two Log-Normal distributions but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide...
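    Under a Log-Normal IAD, the 80%-redemption duration is simply its 0.8 quantile; a minimal sketch with hypothetical parameters (these are not estimates from the paper):

```python
import math
from scipy import stats

# Hypothetical Log-Normal IAD parameters (log-scale mean and sd).
mu, sigma = 4.5, 0.5
iad = stats.lognorm(s=sigma, scale=math.exp(mu))

# Duration: the time within which 80% of users in continued treatment
# will have redeemed their next prescription.
print(round(iad.ppf(0.80), 1))   # ~137 days for these parameters
```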

  5. Apparent Transition in the Human Height Distribution Caused by Age-Dependent Variation during Puberty Period

    Science.gov (United States)

    Iwata, Takaki; Yamazaki, Yoshihiro; Kuninaka, Hiroto

    2013-08-01

    In this study, we examine the validity of the transition of the human height distribution from the log-normal distribution to the normal distribution during puberty, as suggested in an earlier study [Kuninaka et al.: J. Phys. Soc. Jpn. 78 (2009) 125001]. Our data analysis reveals that, in late puberty, the variation in height decreases as children grow. Thus, the classification of a height dataset by age at this stage leads us to analyze a mixture of distributions with larger means and smaller variations. This mixture distribution has a negative skewness and is consequently closer to the normal distribution than to the log-normal distribution. The opposite case occurs in early puberty and the mixture distribution is positively skewed, which resembles the log-normal distribution rather than the normal distribution. Thus, this scenario mimics the transition during puberty. Additionally, our scenario is realized through a numerical simulation based on a statistical model. The present study does not support the transition suggested by the earlier study.

  6. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    International Nuclear Information System (INIS)

    Smallidge, S.T.; Baker, T.T.; VanLeeuwen, D.; Gould, W.R.; Thompson, B.C.

    2010-01-01

    Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.

  7. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

    The hypothesis that a very large number of 1° × 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ² and the unit normal deviate tests. However, the 5° equal-area mean anomalies derived from the 1° × 1° data have been found to be normally distributed at the same ...

  8. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    Science.gov (United States)

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  9. Microscopic prediction of speech intelligibility in spatially distributed speech-shaped noise for normal-hearing listeners.

    Science.gov (United States)

    Geravanchizadeh, Masoud; Fallah, Ali

    2015-12-01

    A binaural and psychoacoustically motivated intelligibility model, based on a well-known monaural microscopic model, is proposed. This model simulates a phoneme recognition task in the presence of spatially distributed speech-shaped noise in anechoic scenarios. In the proposed model, binaural advantage effects are considered by generating a feature vector for a dynamic-time-warping speech recognizer. This vector consists of three subvectors incorporating two monaural subvectors to model the better-ear hearing, and a binaural subvector to simulate the binaural unmasking effect. The binaural unit of the model is based on equalization-cancellation theory. This model operates blindly, which means separate recordings of speech and noise are not required for the predictions. Speech intelligibility tests were conducted with 12 normal hearing listeners by collecting speech reception thresholds (SRTs) in the presence of single and multiple sources of speech-shaped noise. The comparison of the model predictions with the measured binaural SRTs, and with the predictions of a macroscopic binaural model called extended equalization-cancellation, shows that this approach predicts the intelligibility in anechoic scenarios with good precision. The square of the correlation coefficient (r²) and the mean absolute error between the model predictions and the measurements are 0.98 and 0.62 dB, respectively.

  10. Improving runoff risk estimates: Formulating runoff as a bivariate process using the SCS curve number method

    Science.gov (United States)

    Shaw, Stephen B.; Walter, M. Todd

    2009-03-01

    The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
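    For contrast, the traditional single-variable computation the authors modify looks like this; a minimal sketch of the classic SCS-CN runoff equation (depths in inches, with the standard Ia = 0.2S abstraction):

```python
def scs_runoff(p_in, cn, ia_ratio=0.2):
    """Classic SCS-CN storm runoff (all depths in inches).

    S = 1000/CN - 10 is the potential retention, Ia = ia_ratio * S the
    initial abstraction, and Q = (P - Ia)^2 / (P - Ia + S) for P > Ia.
    The paper's modification lets the soil moisture state (inferred from
    base flow) enter as a second random variable; here CN is simply fixed.
    """
    s = 1000.0 / cn - 10.0
    ia = ia_ratio * s
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

print(round(scs_runoff(3.0, 75), 2))  # ~0.96 in of runoff from a 3 in storm
```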

  11. Modeling both of the number of pausibacillary and multibacillary leprosy patients by using bivariate poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important public health issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with 18,994 cases (8.7% of the world total). This number automatically places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modeling the numbers of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so modeling is conducted using the bivariate Poisson regression method. The experimental units are in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.

  12. Deformation associated with continental normal faults

    Science.gov (United States)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation that are similar to those observed by interferometric satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ˜20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  13. Local charge nonequilibrium and anomalous energy dependence of normalized moments in narrow rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

    From the study of even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows, we propose a simple picture of charge correlation with nonzero correlation length, and calculate the multiplicity distributions and the normalized moments in different rapidity windows at different energies. The results explain the experimentally observed coincidence and separation of the even and odd distributions, as well as the anomalous energy dependence of normalized moments in narrow rapidity windows. The reason for the separation of the even and odd distributions, appearing first at large multiplicities, is shown to be energy conservation. The special role of no-particle events in narrow rapidity windows is pointed out.

  14. The skin immune system (SIS): distribution and immunophenotype of lymphocyte subpopulations in normal human skin

    NARCIS (Netherlands)

    Bos, J. D.; Zonneveld, I.; Das, P. K.; Krieg, S. R.; van der Loos, C. M.; Kapsenberg, M. L.

    1987-01-01

    The complexity of immune response-associated cells present in normal human skin was recently redefined as the skin immune system (SIS). In the present study, the exact immunophenotypes of lymphocyte subpopulations with their localizations in normal human skin were determined quantitatively. B cells

  15. Unit Root Testing and Estimation in Nonlinear ESTAR Models with Normal and Non-Normal Errors.

    Directory of Open Access Journals (Sweden)

    Umair Khalil

    Full Text Available Exponential Smooth Transition Autoregressive (ESTAR) models can capture non-linear adjustment of deviations from equilibrium conditions, which may explain the economic behavior of many variables that appear non-stationary from a linear viewpoint. Many researchers employ the Kapetanios test, which has a unit root as the null and a stationary nonlinear model as the alternative. However, this test statistic is based on the assumption of normally distributed errors in the DGP. Cook has analyzed the size of this nonlinear unit root test in the presence of a heavy-tailed innovation process and obtained critical values for both the finite-variance and infinite-variance cases. However, the test statistics of Cook are oversized. Researchers have found that using conventional tests is dangerous, though the best performance among these is achieved by an HCCME. The oversizing of LM tests can be reduced by employing fixed-design wild bootstrap remedies, which provide a valuable alternative to the conventional tests. In this paper, the size of the Kapetanios test statistic employing heteroscedasticity-consistent covariance matrices is derived, and results are reported for various sample sizes in which the size distortion is reduced. The properties of estimates of ESTAR models are investigated when errors are assumed non-normal. We compare results obtained by nonlinear least squares fitting with those of quantile regression fitting in the presence of outliers, with the error distribution taken to be a t-distribution, for various sample sizes.

  16. Competition between clonal plasma cells and normal cells for potentially overlapping bone marrow niches is associated with a progressively altered cellular distribution in MGUS vs myeloma.

    Science.gov (United States)

    Paiva, B; Pérez-Andrés, M; Vídriales, M-B; Almeida, J; de las Heras, N; Mateos, M-V; López-Corral, L; Gutiérrez, N C; Blanco, J; Oriol, A; Hernández, M T; de Arriba, F; de Coca, A G; Terol, M-J; de la Rubia, J; González, Y; Martín, A; Sureda, A; Schmidt-Hieber, M; Schmitz, A; Johnsen, H E; Lahuerta, J-J; Bladé, J; San-Miguel, J F; Orfao, A

    2011-04-01

    Disappearance of normal bone marrow (BM) plasma cells (PC) predicts malignant transformation of monoclonal gammopathy of undetermined significance (MGUS) and smoldering myeloma (SMM) into symptomatic multiple myeloma (MM). The homing, behavior and survival of normal PC, but also CD34(+) hematopoietic stem cells (HSC), B-cell precursors, and clonal PC largely depends on their interaction with stromal cell-derived factor-1 (SDF-1) expressing, potentially overlapping BM stromal cell niches. Here, we investigate the distribution, phenotypic characteristics and competitive migration capacity of these cell populations in patients with MGUS, SMM and MM vs healthy adults (HA) aged >60 years. Our results show that BM and peripheral blood (PB) clonal PC progressively increase from MGUS to MM, the latter showing a slightly more immature immunophenotype. Of note, such increased number of clonal PC is associated with progressive depletion of normal PC, B-cell precursors and CD34(+) HSC in the BM, also with a parallel increase in PB. In an ex vivo model, normal PC, B-cell precursors and CD34(+) HSC from MGUS and SMM, but not MM patients, were able to abrogate the migration of clonal PC into serial concentrations of SDF-1. Overall, our results show that progressive competition and replacement of normal BM cells by clonal PC is associated with more advanced disease in patients with MGUS, SMM and MM.

  17. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    Full Text Available Contemporary traffic safety research offers little information on quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. The potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern that needs investigation. This study therefore investigated the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on the enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws was investigated through the development of a bivariate ordered probit model, using data extracted from WHO's global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with the statistically significant correlation between response outcomes, validate the statistical supremacy of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of the enforcement levels of speed limit and drink driving laws, respectively. The model also encapsulates the effect of several other agency-related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on the intermediate categories of the response outcomes. The results of this study are expected to provide necessary insights for elemental enforcement programs, and the marginal effects of the explanatory variables may provide useful directions for formulating effective policy countermeasures against speeding and drink driving.

  18. A closer look at the effect of preliminary goodness-of-fit testing for normality for the one-sample t-test.

    Science.gov (United States)

    Rochon, Justine; Kieser, Meinhard

    2011-11-01

    Student's one-sample t-test is a commonly used method when inference about the population mean is made. As advocated in textbooks and articles, the assumption of normality is often checked by a preliminary goodness-of-fit (GOF) test. In a paper recently published by Schucany and Ng it was shown that, for the uniform distribution, screening of samples by a pretest for normality leads to a more conservative conditional Type I error rate than application of the one-sample t-test without preliminary GOF test. In contrast, for the exponential distribution, the conditional level is even more elevated than the Type I error rate of the t-test without pretest. We examine the reasons behind these characteristics. In a simulation study, samples drawn from the exponential, lognormal, uniform, Student's t-distribution with 2 degrees of freedom (t₂) and the standard normal distribution that had passed normality screening, as well as the ingredients of the test statistics calculated from these samples, are investigated. For non-normal distributions, we found that preliminary testing for normality may change the distribution of means and standard deviations of the selected samples as well as the correlation between them (if the underlying distribution is non-symmetric), thus leading to altered distributions of the resulting test statistics. It is shown that for skewed distributions the excess in Type I error rate may be even more pronounced when testing one-sided hypotheses. ©2010 The British Psychological Society.
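    The conditional Type I error rate described here is easy to reproduce by simulation; a minimal sketch (pretest and test levels both 5%; the sampler and sample size are arbitrary choices of ours):

```python
import numpy as np
from scipy import stats

def conditional_size(sampler, n=20, alpha=0.05, reps=5000):
    """Type I error of the one-sample t-test, conditional on the sample
    first passing a Shapiro-Wilk normality pretest at the 5% level."""
    rng = np.random.default_rng(2)
    passed = rejected = 0
    for _ in range(reps):
        x = sampler(rng, n)
        if stats.shapiro(x).pvalue < 0.05:
            continue          # sample screened out by the GOF pretest
        passed += 1
        if stats.ttest_1samp(x, 0.0).pvalue < alpha:
            rejected += 1
    return rejected / max(passed, 1)

# Uniform on [-1, 1] has true mean 0; compare the conditional size to 0.05.
print(conditional_size(lambda rng, n: rng.uniform(-1.0, 1.0, n)))
```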

  19. A Post-Truncation Parameterization of Truncated Normal Technical Inefficiency

    OpenAIRE

    Christine Amsler; Peter Schmidt; Wen-Jen Tsay

    2013-01-01

    In this paper we consider a stochastic frontier model in which the distribution of technical inefficiency is truncated normal. In standard notation, technical inefficiency u is distributed as N^+ (μ,σ^2). This distribution is affected by some environmental variables z that may or may not affect the level of the frontier but that do affect the shortfall of output from the frontier. We will distinguish the pre-truncation mean (μ) and variance (σ^2) from the post-truncation mean μ_*=E(u) and var...

  20. Are your covariates under control? How normalization can re-introduce covariate effects.

    Science.gov (United States)

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
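    A minimal sketch of the effect, using a deliberately skewed, covariate-dependent outcome (the data-generating model is ours, not the authors'):

```python
import numpy as np
from scipy import stats

def rank_int(x, c=3.0 / 8):
    """Rank-based inverse normal transformation (Blom offset)."""
    r = stats.rankdata(x)
    return stats.norm.ppf((r - c) / (len(x) - 2 * c + 1))

rng = np.random.default_rng(3)
z = rng.normal(size=20000)                    # covariate
y = np.exp(0.7 * z + rng.normal(size=20000))  # skewed dependent variable

# Adjust-then-transform: INT of the residuals regains a correlation with z.
resid = y - np.polyval(np.polyfit(z, y, 1), z)
print(round(np.corrcoef(rank_int(resid), z)[0, 1], 3))

# Transform-then-adjust: residuals of the INT'd outcome stay uncorrelated.
yt = rank_int(y)
resid2 = yt - np.polyval(np.polyfit(z, yt, 1), z)
print(round(np.corrcoef(resid2, z)[0, 1], 3))
```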

  1. Quasi-normal modes from non-commutative matrix dynamics

    Science.gov (United States)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

    We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.

  2. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study.

    Science.gov (United States)

    Seed, Mike; van Amerom, Joshua F P; Yoo, Shi-Joon; Al Nafisi, Bahiyah; Grosse-Wortmann, Lars; Jaeggi, Edgar; Jansz, Michael S; Macgowan, Christopher K

    2012-11-26

    We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30-39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540 ± 101, main pulmonary artery (MPA) 327 ± 68, ascending aorta (AAo) 198 ± 38, superior vena cava (SVC) 147 ± 46, ductus arteriosus (DA) 220 ± 39, pulmonary blood flow (PBF) 106 ± 59, descending aorta (DAo) 273 ± 85, umbilical vein (UV) 160 ± 62, foramen ovale (FO) 107 ± 54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60 ± 4, AAo 37 ± 4, SVC 28 ± 7, DA 41 ± 8, PBF 19 ± 10, DAo 50 ± 12, UV 30 ± 9, FO 21 ± 12. This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.

  3. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Seed Mike

    2012-11-01

    Full Text Available Abstract Background We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). Methods A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30–39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. Results There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540±101, main pulmonary artery (MPA) 327±68, ascending aorta (AAo) 198±38, superior vena cava (SVC) 147±46, ductus arteriosus (DA) 220±39, pulmonary blood flow (PBF) 106±59, descending aorta (DAo) 273±85, umbilical vein (UV) 160±62, foramen ovale (FO) 107±54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60±4, AAo 37±4, SVC 28±7, DA 41±8, PBF 19±10, DAo 50±12, UV 30±9, FO 21±12. Conclusion This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.

  4. Evaluation of distribution patterns and decision of distribution coefficients of trace elements in high-purity aluminium by INAA

    International Nuclear Information System (INIS)

    Hayakawa, Yasuhiro; Suzuki, Shogo; Hirai, Shoji

    1986-01-01

    Recently, high-purity aluminium has come into use in semiconductor devices and similar applications. It is required that trace impurities be reduced and that their content be quantitatively evaluated. In this study, the distribution patterns of many trace impurities in 99.999% aluminium ingots, purified using a normal freezing method, were evaluated by INAA. The effective distribution coefficient k for each detected element was calculated using the theoretical distribution equation of the normal freezing method. As a result, the detected elements showed k < 1, except for Hf, which showed k > 1. In particular, La, Sm, U and Th could be effectively purified, but Sc and Hf could scarcely be purified. Furthermore, it was found that slower freezing gave an effective distribution coefficient closer to the equilibrium distribution coefficient, and that the effective distribution coefficient became smaller with larger atomic radius. (author)
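    The theoretical distribution equation referred to is commonly written in Pfann's normal-freezing form; a minimal sketch:

```python
import numpy as np

def normal_freezing_profile(g, c0, k):
    """Pfann's normal-freezing relation C(g) = k * C0 * (1 - g)**(k - 1),
    where g is the fraction solidified and k the effective distribution
    coefficient fitted to the measured impurity profile."""
    g = np.asarray(g, dtype=float)
    return k * c0 * (1.0 - g) ** (k - 1.0)

g = np.linspace(0.0, 0.9, 4)
print(normal_freezing_profile(g, c0=1.0, k=0.1))  # k < 1: swept to the last-frozen end
print(normal_freezing_profile(g, c0=1.0, k=1.5))  # k > 1: enriched at the start
```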

  5. Normal modes of vibration in nickel

    Energy Technology Data Exchange (ETDEWEB)

    Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B

    1964-07-01

    The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)

  6. Is Middle-Upper Arm Circumference “normally” distributed? Secondary data analysis of 852 nutrition surveys

    Directory of Open Access Journals (Sweden)

    Severine Frison

    2016-05-01

    Full Text Available Abstract Background Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. Methods This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. Results The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro–Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe–Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised", and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality, with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4 and 82.7 %, respectively.
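    The three normality checks named above are all available off the shelf; a minimal sketch on a synthetic survey (scipy's skewtest and kurtosistest implement the D'Agostino and Anscombe-Glynn tests, respectively):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
muac_mm = rng.normal(145.0, 12.0, size=900).round()  # synthetic survey, 1 mm precision

print(stats.shapiro(muac_mm).pvalue)       # overall departure from normality
print(stats.skewtest(muac_mm).pvalue)      # D'Agostino skewness test
print(stats.kurtosistest(muac_mm).pvalue)  # Anscombe-Glynn kurtosis test
```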

  7. On Introducing Asymmetry into Circular Distributions

    Directory of Open Access Journals (Sweden)

    Dale Umbach

    2012-07-01

    Full Text Available We give a brief history of the results which led to the introduction of asymmetry into symmetric circular distributions. This is followed by the presentation of another method of introducing asymmetry. Some properties of the induced distributions are studied. Finally, this new distribution is shown to be a reasonable fit to the Jander ant data as presented in Fisher (1993).

  8. Exponential model normalization for electrical capacitance tomography with external electrodes under gap permittivity conditions

    International Nuclear Information System (INIS)

    Baidillah, Marlin R; Takei, Masahiro

    2017-01-01

    A nonlinear normalization model, called the exponential model, has been developed for electrical capacitance tomography (ECT) with external electrodes under gap permittivity conditions. The exponential model normalization is proposed based on the inherently nonlinear relationship between the mixture permittivity and the measured capacitance caused by the gap permittivity of the inner wall. The parameters of the exponential equation are derived from an exponential fitting curve based on simulation, and a scaling function is added to adjust for the experimental system conditions. The exponential model normalization was applied to two-dimensional low- and high-contrast dielectric distribution phantoms in simulation and experimental studies. The proposed normalization model has been compared with other normalization models, i.e. the Parallel, Series, Maxwell and Böttcher models. Based on the comparison of image reconstruction results, the exponential model is reliable for predicting the nonlinear normalization of measured capacitance for low- and high-contrast dielectric distributions. (paper)

  9. Approach of the value of an annuity when non-central moments of the capitalization factor are known: an R application with interest rates following normal and beta distributions

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    2015-07-01

    Full Text Available This paper proposes an expression of the value of an annuity with payments of 1 unit each when the interest rate is random. In order to attain this objective, we proceed on the assumption that the non-central moments of the capitalization factor are known. Specifically, to calculate the value of these annuities, we propose two different expressions. First, we suppose that the random interest rate is normally distributed; then, we assume that it follows the beta distribution. A practical application of these two methodologies is also implemented using the R statistical software.
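    A Monte-Carlo cross-check of such annuity values is straightforward; a minimal sketch with iid per-period rates, one normal and one beta (the parameter values are ours; the paper itself works analytically from the non-central moments of the capitalization factor):

```python
import numpy as np

def annuity_value_mc(n_payments, rate_sampler, n_sims=100_000, seed=5):
    """Monte-Carlo value of an annuity-immediate of 1 per period when the
    per-period interest rate is random (iid across periods here)."""
    rng = np.random.default_rng(seed)
    value = np.zeros(n_sims)
    discount = np.ones(n_sims)
    for _ in range(n_payments):
        i = rate_sampler(rng, n_sims)
        discount /= 1.0 + i          # random capitalization factor per period
        value += discount
    return value.mean()

# Normally distributed rate vs. a beta-distributed rate scaled to [0, 0.1].
print(round(annuity_value_mc(10, lambda rng, m: rng.normal(0.03, 0.01, m)), 4))
print(round(annuity_value_mc(10, lambda rng, m: 0.1 * rng.beta(2.0, 5.0, m)), 4))
```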

  10. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter based distributed generation. The similarity of the equivalent models for inverter based distributed generation under normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculations, are analyzed in this paper. An integrated power flow and short circuit calculation method for distribution networks with inverter based distributed generation is then proposed.

  11. Normal human bone marrow and its variations in MRI

    International Nuclear Information System (INIS)

    Vahlensieck, M.; Schmidt, H.M.

    2000-01-01

    The physiology and age-dependent changes of human bone marrow are described. The resulting normal distribution patterns of active and inactive bone marrow, including the various contrasts on different MR sequences, are discussed. (orig.) [de]

  12. Back to Normal! Gaussianizing posterior distributions for cosmological probes

    Science.gov (United States)

    Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.

    2014-05-01

    We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
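    A minimal sketch of the basic Box-Cox step on a skewed, unimodal sample (the full method generalizes the transformation family and judges the result with Gaussianity tests):

```python
import numpy as np
from scipy import stats

# A skewed, unimodal "posterior" sample, e.g. thinned from a Markov chain.
rng = np.random.default_rng(6)
chain = rng.gamma(shape=3.0, scale=2.0, size=50_000)

# Maximum-likelihood Box-Cox parameter; skewness before and after.
transformed, lam = stats.boxcox(chain)
print(round(lam, 3), round(stats.skew(chain), 3), round(stats.skew(transformed), 3))
```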

  13. Prenatal ultrasonographic findings of multicystic dysplastic kidney: Emphasis on cyst distribution

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Min Hoan; Cho, Jeong Yeon [Samsung Cheil Hospital, Sungkunkwan University school of Medicine, Seoul (Korea, Republic of)

    2003-09-15

    To characterize the ultrasonographic findings of multicystic dysplastic kidney on prenatal ultrasonography (US), with special emphasis on the distribution of cysts. From January 1998 to March 2003, the medical records of sixty-two subjects with multicystic dysplastic kidney diagnosed on prenatal US examination were retrospectively reviewed, and forty-three patients confirmed either by pathology or by postnatal follow-up US were selected for this study. US assessment included the time of diagnosis, laterality, the sizes of the multicystic dysplastic and contralateral normal kidneys, the distribution of cysts, and associated anomalies. The distribution of cysts was categorized as subcapsular or random, and interobserver agreement was determined using cross-table analysis. The largest longitudinal diameters of the multicystic and contralateral normal kidneys were measured, and the data were plotted on a normal reference chart. Multicystic dysplastic kidney was left-sided in 55.8%, right-sided in 34.8%, and bilateral in 9.3% of cases. A subcapsular distribution of cysts was observed in 68.2% (n=15) of cases by radiologist 1 and in 59.1% (n=13) by radiologist 2, showing excellent interobserver agreement (κ=0.697). The longitudinal diameter of the multicystic dysplastic kidney was above the 95th percentile in 68% of cases, whereas the diameter of the contralateral kidney was more commonly normal (70%). Fetal karyotyping was performed in 18 cases, including 2 cases with associated major anomalies, and all karyotypes were normal. On prenatal US, a subcapsular distribution of cysts in multicystic dysplastic kidney is more common than a random distribution. This characteristic distribution of cysts may be helpful in the prenatal diagnosis of multicystic dysplastic kidney.

  15. Three-dimensional finite analysis of acetabular contact pressure and contact area during normal walking.

    Science.gov (United States)

    Wang, Guangye; Huang, Wenjun; Song, Qi; Liang, Jinfeng

    2017-11-01

    This study aims to analyze the contact areas and pressure distributions between the femoral head and the acetabulum during normal walking using a three-dimensional finite element model (3D-FEM). Computed tomography (CT) scanning and a computer image processing system were used to establish the 3D-FEM. The acetabular model was used to simulate the pressures during 32 consecutive phases of normal walking, and the contact areas at the different phases were calculated. The distribution of the peak pressure values over the 32 consecutive walking phases was bimodal, reaching its maximum (4.2 MPa) at the initial phase, where the contact area was significantly larger than at the stepping phase. The sites that remained in contact throughout were concentrated on the acetabular roof and leaned inwards, while the anterior and posterior acetabular horns showed no pressure concentration. The pressure distributions on the acetabular cartilage differed significantly between phases: the zone of increased pressure during the support phase was located at the acetabular roof, while during the stepping phase it was located on the inside of the acetabular cartilage. The zones of increased contact pressure and the distributions of the acetabular contact areas are of importance for clinical research and may point to factors that induce acetabular osteoarthritis. Copyright © 2016. Published by Elsevier Taiwan.

  16. Normal zone soliton in large composite superconductors

    International Nuclear Information System (INIS)

    Kupferman, R.; Mints, R.G.; Ben-Jacob, E.

    1992-01-01

    The study of normal zones of finite size (normal domains) in superconductors has long been a subject of interest in the field of applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable, so that if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found: the existence of stable propagating normal solitons. The formation of these propagating domains was shown to be a result of the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and the current redistribution in the conductor, and showed the existence of stable propagating normal domains; they compared the computed velocity of normal domain propagation with experimental data, obtaining reasonable agreement. Dresner presented an analytical method for solving this problem when the time dependence of the Joule power is given, and performed explicit calculations of the normal domain velocity assuming that the Joule power decays exponentially during current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and current density distributions along the conductor. Numerical simulations of these equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
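
    A schematic of such a coupled system is sketched below: an explicit finite-difference integration of two one-dimensional diffusion equations for temperature and current density with a Joule source. All coefficients, source terms, and boundary handling are placeholders chosen for illustration, not the authors' equations.

```python
import numpy as np

nx, nt = 200, 5000
dx, dt = 1e-2, 1e-5
D_T, D_J = 1.0, 0.5        # diffusivities (hypothetical)
alpha, T_c = 10.0, 0.5     # Joule heating strength, switching temperature

def lap(u):
    """Second difference on the interior; endpoints held fixed (sketch boundary condition)."""
    out = np.zeros_like(u)
    out[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return out

T = np.zeros(nx)                       # temperature along the conductor
J = np.ones(nx)                        # current density along the conductor
T[nx // 2 - 5 : nx // 2 + 5] = 1.0     # seed a normal zone in the middle

for _ in range(nt):
    normal = T > T_c                   # region currently driven normal
    T = T + dt * (D_T * lap(T) + alpha * J**2 * normal - T)  # heating vs. cooling
    J = J + dt * (D_J * lap(J) - 0.1 * J * normal)           # current redistribution
```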

  17. The self-normalized Donsker theorem revisited

    OpenAIRE

    Parczewski, Peter

    2016-01-01

    We extend the Poincaré–Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space $D([0,1])$. This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to $\ell_p$-norms are given.
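
    For reference, the Poincaré–Borel lemma being extended is usually stated as follows (a standard textbook formulation, not quoted from the paper):

```latex
% Poincaré–Borel: for X^{(n)} uniform on the sphere of radius sqrt(n) in R^n,
% any fixed number k of coordinates is asymptotically i.i.d. standard normal.
\[
  \bigl(X_1^{(n)},\dots,X_k^{(n)}\bigr) \;\xrightarrow{\;d\;}\; \mathcal{N}(0, I_k)
  \qquad (n \to \infty).
\]
```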

  18. Neuronal variability during handwriting: lognormal distribution.

    Directory of Open Access Journals (Sweden)

    Valery I Rupasov

    Full Text Available We examined the time-dependent statistical properties of electromyographic (EMG) signals recorded from intrinsic hand muscles during handwriting. Our analysis showed that the trial-to-trial neuronal variability of EMG signals is well described by the lognormal distribution, clearly distinguished from the Gaussian (normal) distribution. This finding indicates that EMG formation cannot be described by a conventional model in which the signal is normally distributed because it is composed of a summation of many random sources. We found that the variability of the temporal parameters of handwriting (handwriting duration and response time) is also well described by a lognormal distribution. Although the exact mechanism behind the lognormal statistics remains an open question, the results obtained should significantly impact experimental research, theoretical modeling, and bioengineering applications of motor networks. In particular, our results suggest that accounting for the lognormal distribution of EMGs can improve biomimetic systems that strive to reproduce EMG signals in artificial actuators.
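
    The distributional comparison underlying this claim can be illustrated as follows (synthetic amplitudes, lognormal by construction, standing in for the recorded EMG data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical stand-in for trial-to-trial EMG amplitudes (lognormal by construction).
amps = rng.lognormal(mean=0.0, sigma=0.5, size=500)

# Fit both candidate distributions and compare Kolmogorov-Smirnov statistics.
shape, loc, scale = stats.lognorm.fit(amps, floc=0)
mu, sd = stats.norm.fit(amps)

ks_lognorm = stats.kstest(amps, "lognorm", args=(shape, loc, scale))
ks_normal = stats.kstest(amps, "norm", args=(mu, sd))
print("lognormal KS:", ks_lognorm.statistic, " normal KS:", ks_normal.statistic)
```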

  19. Modeling and forecasting foreign exchange daily closing prices with normal inverse Gaussian

    Science.gov (United States)

    Teneng, Dean

    2013-09-01

    We fit the normal inverse Gaussian (NIG) distribution to foreign exchange closing prices using the open-source software package R and select the best models by the strategy proposed by Käärik and Umbleja (2011). We observe that the daily closing prices (12/04/2008 - 07/08/2012) of CHF/JPY, AUD/JPY, GBP/JPY, NZD/USD, QAR/CHF, QAR/EUR, SAR/CHF, SAR/EUR, TND/CHF and TND/EUR are excellent fits, while EGP/EUR and EUR/GBP are good fits with Kolmogorov-Smirnov test p-values of 0.062 and 0.08, respectively. It was impossible to estimate the NIG parameters for JPY/CHF (a computational problem with maximum likelihood estimation), yet CHF/JPY was an excellent fit. Thus, while the stochastic properties of an exchange rate can be completely modeled with a probability distribution in one direction, this may be impossible in the other direction. We also demonstrate that foreign exchange closing prices can be forecasted with the normal inverse Gaussian (NIG) Lévy process, both in cases where the daily closing prices can and cannot be modeled by the NIG distribution.
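
    An analogous fit in Python (the paper works in R; the series below is synthetic, standing in for actual FX closes) might look like:

```python
from scipy import stats

# Synthetic stand-in for a series of daily closing prices (the paper fits real
# FX closes such as CHF/JPY in R; parameters here are arbitrary).
prices = stats.norminvgauss.rvs(a=1.5, b=0.3, loc=100, scale=5,
                                size=1000, random_state=3)

# Maximum-likelihood fit of the four NIG parameters; as the authors report for
# JPY/CHF, this step can fail to converge for some series.
a, b, loc, scale = stats.norminvgauss.fit(prices)

# Goodness of fit via the same Kolmogorov-Smirnov criterion used in the paper.
ks = stats.kstest(prices, "norminvgauss", args=(a, b, loc, scale))
print(f"a={a:.3f} b={b:.3f} loc={loc:.2f} scale={scale:.2f} p={ks.pvalue:.3f}")
```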

  20. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    Directory of Open Access Journals (Sweden)

    Wang Kuan-Min

    2013-01-01

    Full Text Available This paper extends recent investigations of risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are used to empirically validate contagion effects between the stock market of Vietnam and those of China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. Using the correlation contagion test and Dungey et al.'s (2005) contagion test, we first find contagion effects between the Vietnamese stock market and the four others, namely those of Japan, Singapore, China, and the US. Second, we show that the Japanese stock market causes stronger contagion risk in the Vietnamese stock market than the stock markets of China, Singapore, and the US. Finally, we show that the Chinese and US stock markets cause weaker contagion effects in the Vietnamese stock market because of stronger interdependence effects between these two markets.
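
    The paper's bivariate EGARCH model with dynamic conditional correlations is more involved than can be shown here; the sketch below is a simplified two-step proxy (univariate EGARCH fits with the `arch` package, then a rolling correlation of standardized residuals) on synthetic returns standing in for the Vietnamese and Japanese indices.

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(4)

# Synthetic daily returns standing in for the Vietnamese and Japanese indices.
rets = pd.DataFrame({
    "VNI": rng.standard_t(df=5, size=1500) * 0.01,
    "NKY": rng.standard_t(df=5, size=1500) * 0.01,
})

# Step 1: univariate EGARCH(1,1) fits with an asymmetry term (o=1).
std_resid = {}
for col in rets:
    res = arch_model(100 * rets[col], vol="EGARCH", p=1, o=1, q=1).fit(disp="off")
    std_resid[col] = res.resid / res.conditional_volatility

# Step 2: rolling correlation of standardized residuals as a crude stand-in
# for the dynamic conditional correlation coefficient of the bivariate model.
z = pd.DataFrame(std_resid)
dcc_proxy = z["VNI"].rolling(60).corr(z["NKY"])
print(dcc_proxy.dropna().describe())
```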