WorldWideScience

Sample records for normal distributions

  1. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...

  2. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
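The correlated-sampling idea described above can be sketched in a few lines: draw correlated normals via a Cholesky factor of the covariance, and exponentiate them to get correlated log-normals. This is a minimal illustration with made-up means, standard deviations and a 0.8 correlation, not the paper's resonance-parameter setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target parameters for two correlated variables.
mean = np.array([1.0, 2.0])
std = np.array([0.1, 0.3])
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
cov = corr * np.outer(std, std)

# Correlated normal samples via the Cholesky factor of the covariance.
L = np.linalg.cholesky(cov)
z = rng.standard_normal((100_000, 2))
normal_samples = mean + z @ L.T

# Correlated log-normal samples: the correlation is imposed in log space,
# so the correlation of the exponentiated values differs slightly.
lognormal_samples = np.exp(normal_samples)

print(np.corrcoef(normal_samples.T)[0, 1])  # close to 0.8
```

Mixed cases (some variables normal, some log-normal) follow the same pattern: exponentiate only the log-normal coordinates after the correlated normal draw.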

  3. A New Distribution-Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  4. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
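The questions posed above ("are the data normal, and what can be done if not?") are commonly answered with a normality test followed by a transformation. A small sketch with synthetic right-skewed data (a hypothetical example, not the article's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Right-skewed (non-normal) sample, e.g. log-normally distributed data.
x = rng.lognormal(mean=0.0, sigma=0.7, size=200)

# Shapiro-Wilk test: a small p-value is evidence against normality.
_, p_raw = stats.shapiro(x)

# A log transform often normalises multiplicative data.
_, p_log = stats.shapiro(np.log(x))

print(f"raw p={p_raw:.4g}, log-transformed p={p_log:.4g}")
```

When no transformation helps, nonparametric or rank-based methods are the usual fallback.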

  5. The exp-normal distribution is infinitely divisible

    OpenAIRE

    Pinelis, Iosif

    2018-01-01

    Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.

  6. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. These NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.

  7. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
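The distinction the abstract emphasizes can be shown concretely: a mixture of normal densities has a CDF that is the weighted sum of component CDFs, and its quantiles must be found numerically, whereas a linear combination of independent normal variables is itself normal. A sketch with illustrative weights and parameters (not the paper's tabulated cases):

```python
import numpy as np
from scipy import stats, optimize

# Two-component normal mixture: weights, means, standard deviations.
w = np.array([0.3, 0.7])
mu = np.array([-1.0, 2.0])
sd = np.array([0.5, 1.0])

def mixture_cdf(x):
    # CDF of the mixture: weighted sum of component normal CDFs.
    return np.sum(w * stats.norm.cdf(x, loc=mu, scale=sd))

def mixture_quantile(p):
    # The mixture CDF is monotone, so bracket and root-find.
    return optimize.brentq(lambda x: mixture_cdf(x) - p, -10.0, 10.0)

q50 = mixture_quantile(0.5)

# Contrast: the linear combination 0.3*X1 + 0.7*X2 of independent
# normals is itself normal, so its median is just the weighted mean.
lin_median = np.sum(w * mu)
print(q50, lin_median)
```

The two medians differ, which is exactly the point: mixing densities is not the same operation as combining random variables.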

  8. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon

    2011-08-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions. In particular, we describe the characteristic function of skew-normal, skew-t, and other related distributions. © 2011 Elsevier Inc.

  9. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses a nature of concentration distributions of trace elements in biomedical samples, which were measured by using the X-ray fluorescence techniques (XRF, TXRF). Our earlier observation, that the lognormal distribution well describes the measured concentration distribution is explained here on a more general ground. Particularly, the role of random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, appearing when the multiplicative process is driven by normal distribution, can be generalized to the so-called log-stable distribution. Such distribution describes the random multiplicative process, which is driven, instead of normal distribution, by more general stable distribution, being known as the Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained by using the conventional (XRF) and (TXRF) X-ray fluorescence methods. Particularly, the first observation of log-stable concentration distribution of trace elements is reported and discussed here in detail

  10. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
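The acceptance-rejection idea underlying such generators can be sketched without the subregion machinery or the FORTRAN procedures of the report. Below is a classic single-envelope version: sample a half-normal by rejection from an Exp(1) envelope (envelope constant c = sqrt(2e/pi)), then attach a random sign.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_half_normal(n):
    """Rejection sampling of |Z|, Z ~ N(0,1), with an Exp(1) envelope.

    The acceptance probability for a candidate y ~ Exp(1) is
    exp(-(y - 1)^2 / 2), which follows from the density ratio
    f(y) / (c * g(y)) with c = sqrt(2e/pi).
    """
    out = []
    while len(out) < n:
        y = rng.exponential(size=n)
        u = rng.uniform(size=n)
        out.extend(y[u <= np.exp(-0.5 * (y - 1.0) ** 2)].tolist())
    return np.array(out[:n])

# Attach a random sign to turn half-normal draws into N(0,1) draws.
z = sample_half_normal(50_000) * rng.choice([-1.0, 1.0], size=50_000)
print(z.mean(), z.std())
```

Subregion schemes like the one in the record accept a larger fraction of candidates by tailoring the envelope piecewise, but the accept/reject step is the same.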

  11. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  12. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGM) where the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability of fitting to the software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple for implementation as software application. Numerical experiment is devoted to investigating the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects

  13. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest replacing this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density ... entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models ...

  14. Ventilation-perfusion distribution in normal subjects.

    Science.gov (United States)

    Beck, Kenneth C; Johnson, Bruce D; Olson, Thomas P; Wilson, Theodore A

    2012-09-01

    Functional values of LogSD of the ventilation distribution (σ(V)) have been reported previously, but functional values of LogSD of the perfusion distribution (σ(q)) and the coefficient of correlation between ventilation and perfusion (ρ) have not been measured in humans. Here, we report values for σ(V), σ(q), and ρ obtained from wash-in data for three gases, helium and two soluble gases, acetylene and dimethyl ether. Normal subjects inspired gas containing the test gases, and the concentrations of the gases at end-expiration during the first 10 breaths were measured with the subjects at rest and at increasing levels of exercise. The regional distribution of ventilation and perfusion was described by a bivariate log-normal distribution with parameters σ(V), σ(q), and ρ, and these parameters were evaluated by matching the values of expired gas concentrations calculated for this distribution to the measured values. Values of cardiac output and LogSD ventilation/perfusion (Va/Q) were obtained. At rest, σ(q) is high (1.08 ± 0.12). With the onset of exercise, σ(q) decreases to 0.85 ± 0.09 but remains higher than σ(V) (0.43 ± 0.09) at all exercise levels. ρ increases to 0.87 ± 0.07, and the value of LogSD Va/Q for light and moderate exercise is primarily the result of the difference between the magnitudes of σ(q) and σ(V). With known values for the parameters, the bivariate distribution describes the comprehensive distribution of ventilation and perfusion that underlies the distribution of the Va/Q ratio.

  15. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Vol. 17, No. 5 (2016), pp. 1-20. ISSN 2231-0851. Institutional support: RVO:67985556. Keywords: transfer entropy; time series; Kullback-Leibler divergence; causality; generalized normal distribution. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf

  16. Modified Normal Demand Distributions in (R,S)-Inventory Models

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    2003-01-01

    To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R, S) ...

  17. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon; Genton, Marc G.

    2011-01-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions ...

  18. Computation of distribution of minimum resolution for log-normal distribution of chromatographic peak heights.

    Science.gov (United States)

    Davis, Joe M

    2011-10-28

    General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations

  20. Normal and Student's t distributions and their applications

    CERN Document Server

    Ahsanullah, Mohammad; Shakil, Mohammad

    2014-01-01

    The most important properties of the normal and Student's t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of independent normal and Student's t random variables are presented. The materials will be useful to advanced undergraduate and graduate students and practitioners in the various fields of science and engineering.

  1. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Ferreira, Clécio S.; Genton, Marc G.

    2018-01-01

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations ...

  2. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)

    2017-01-20

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets is of critical importance. If the percentage of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
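A three-parameter log-normal corresponds directly to `scipy.stats.lognorm` with its shape, location (shift) and scale parameters. A sketch of such a fit on synthetic "radii" (not the Cosmic Void Catalog data; for stability the shift is held at a known value here, whereas the study fits all three parameters):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic shifted (three-parameter) log-normal "void radii".
true_s, true_loc, true_scale = 0.5, 2.0, 10.0
radii = stats.lognorm.rvs(true_s, loc=true_loc, scale=true_scale,
                          size=5_000, random_state=rng)

# Maximum-likelihood fit; floc pins the shift, leaving shape and scale
# free (a full three-parameter fit would drop the floc argument).
s, loc, scale = stats.lognorm.fit(radii, floc=2.0)
print(s, loc, scale)
```

The fitted shape parameter plays the role of the distribution's log-scale width, which is what the study relates to substructure via skewness and variance.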

  3. Normal distribution of standing balance for healthy Danish children

    DEFF Research Database (Denmark)

    Pedersen, Line Kjeldgaard; Ghasemi, Habib; Rahbek, Ole

    2013-01-01

    Title: Normal distribution of standing balance for healthy Danish children – Reproducibility of parameters of balance. Authors: Line Kjeldgaard Pedersen, Habib Ghasemi, Ole Rahbek, Bjarne Møller-Madsen. Background: Pedobarographic measurements are increasingly used in child ...

  4. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for the random variable obtained on the basis of the backward transformation of the standard normal ...

  5. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the ...

  6. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

    AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  7. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
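Consequence (1) above — assuming normality when the data are really log-normal — is easy to demonstrate by Monte Carlo. Here the "normal" percentile estimate (mean plus two standard deviations) is compared against the correct log-normal estimate; the parameters are illustrative, not survey values from the record.

```python
import numpy as np

rng = np.random.default_rng(4)

# Data are truly log-normal; an analyst wrongly assumes normality.
data = rng.lognormal(mean=1.0, sigma=0.5, size=100_000)

# True 97.7th percentile of LogN(1, 0.5^2): exp(mu + 2*sigma).
true_p977 = np.exp(1.0 + 2 * 0.5)

# "Normal" estimate: sample mean plus two sample standard deviations.
normal_est = data.mean() + 2 * data.std()

# Log-normal estimate: mean + 2 sd computed on logs, then exponentiated.
logs = np.log(data)
lognormal_est = np.exp(logs.mean() + 2 * logs.std())

print(true_p977, normal_est, lognormal_est)
```

The normal-theory estimate lands well below the true upper percentile, which is the kind of bias the Monte Carlo study in this record quantifies.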

  8. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution.

  9. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…

  10. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
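The claim — a sum (additive process) of positive variables staying close to log-normal — can be checked with a small simulation. This is an illustrative setup with log-normal summands, not the analytical example or turbulence application of the paper: the sum remains clearly right-skewed, while its logarithm is much closer to symmetric.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Sum N positive, right-skewed summands, repeated over many trials.
N, trials = 20, 50_000
sums = rng.lognormal(mean=0.0, sigma=1.0, size=(trials, N)).sum(axis=1)

# The sum is still noticeably skewed, but its *logarithm* is close to
# symmetric -- i.e. the sum looks closer to log-normal than to normal.
skew_sum = stats.skew(sums)
skew_log = stats.skew(np.log(sums))
print(skew_sum, skew_log)
```

By the central limit theorem the skewness of the sum does shrink as N grows, but slowly; for moderate N the log-normal description dominates, which is the paper's point.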

  11. A general approach to double-moment normalization of drop size distributions

    NARCIS (Netherlands)

    Lee, G.W.; Zawadzki, I.; Szyrmer, W.; Sempere Torres, D.; Uijlenhoet, R.

    2004-01-01

    Normalization of drop size distributions (DSDs) is reexamined here. First, an extension of the scaling normalization that uses one moment of the DSD as a scaling parameter to a more general scaling normalization that uses two moments as scaling parameters of the normalization is presented. In

  12. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present the application of the normal mixture distributions model in risk analysis, where we apply it to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating value at risk (VaR) and conditional value at risk (CVaR), capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
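For a two-component normal mixture, VaR is the lower-tail quantile of the mixture CDF and CVaR follows from the closed form for the partial expectation of each truncated normal component. A sketch with illustrative parameters (hypothetical values, not fitted FBMKLCI estimates):

```python
import numpy as np
from scipy import stats, optimize

# Two-component normal mixture for returns (illustrative parameters).
w  = np.array([0.85, 0.15])   # component weights
mu = np.array([0.01, -0.03])  # component means
sd = np.array([0.03, 0.10])   # component standard deviations

def mixture_cdf(x):
    return np.sum(w * stats.norm.cdf(x, loc=mu, scale=sd))

def var_level(alpha):
    """Value at risk: the alpha-quantile of the return distribution."""
    return optimize.brentq(lambda x: mixture_cdf(x) - alpha, -1.0, 1.0)

def cvar_level(alpha):
    """Conditional VaR, E[X | X <= VaR], from per-component partial
    expectations: E[X 1{X<=q}] = mu*Phi(z) - sd*phi(z), z=(q-mu)/sd."""
    q = var_level(alpha)
    zs = (q - mu) / sd
    partial = w * (mu * stats.norm.cdf(zs) - sd * stats.norm.pdf(zs))
    return partial.sum() / alpha

v = var_level(0.05)
c = cvar_level(0.05)
print(v, c)
```

The heavy second component drags both measures below what a single normal with the overall mean and variance would give, which is the leptokurtosis effect the abstract refers to.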

  13. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad-hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients who underwent 18F-Fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when ...
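The optimization described in the Methods — scan the Box-Cox parameter λ and keep the value that maximizes the Shapiro-Wilk p-value — is a few lines in Python. The data below are synthetic skewed values standing in for SUVmax measurements, not the study's patients.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Skewed, positive "SUVmax-like" values (synthetic stand-in data).
suv = rng.lognormal(mean=1.2, sigma=0.6, size=60)

# Scan the Box-Cox parameter and keep the lambda that maximizes the
# Shapiro-Wilk p-value of the transformed sample.
lambdas = np.linspace(-2.0, 2.0, 401)
pvals = [stats.shapiro(stats.boxcox(suv, lmbda=lam)).pvalue
         for lam in lambdas]
best_lam = lambdas[int(np.argmax(pvals))]
print(best_lam, max(pvals))
```

For log-normal-like data the selected λ sits near 0 (the log transform), while more strongly skewed post-treatment distributions would push it lower, matching the trend the Results describe.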

  14. Generating log-normally distributed random numbers by using the Ziggurat algorithm

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2016-01-01

    Uncertainty analyses are usually based on the Monte Carlo method. Using an efficient random number generator(RNG) is a key element in success of Monte Carlo simulations. Log-normal distributed variates are very typical in NPP PSAs. This paper proposes an approach to generate log normally distributed variates based on the Ziggurat algorithm and evaluates the efficiency of the proposed Ziggurat RNG. The proposed RNG can be helpful to improve the uncertainty analysis of NPP PSAs. This paper focuses on evaluating the efficiency of the Ziggurat algorithm from a NPP PSA point of view. From this study, we can draw the following conclusions. - The Ziggurat algorithm is one of perfect random number generators to product normal distributed variates. - The Ziggurat algorithm is computationally much faster than the most commonly used method, Marsaglia polar method
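The reduction behind such a generator is that a log-normal variate is the exponential of a normal variate, so any fast normal RNG yields log-normals directly. NumPy's `Generator.standard_normal` is itself ziggurat-based, so the sketch below mirrors the approach without reimplementing the Ziggurat tables; the parameters are illustrative, not PSA inputs.

```python
import numpy as np

rng = np.random.default_rng(7)

def lognormal_via_normal(mu, sigma, size):
    """Log-normal variates as exp of normal variates.

    standard_normal uses a ziggurat method internally, so this builds
    log-normal sampling on top of a Ziggurat normal generator.
    """
    return np.exp(mu + sigma * rng.standard_normal(size))

x = lognormal_via_normal(0.0, 0.5, 200_000)
# Theoretical mean of LogN(0, 0.5^2) is exp(0.125).
print(x.mean(), np.exp(0.125))
```

A hand-rolled Ziggurat would precompute rectangle tables for the normal density and fall back to the tail algorithm only rarely, which is where its speed advantage over polar-type methods comes from.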

  15. The retest distribution of the visual field summary index mean deviation is close to normal.

    Science.gov (United States)

    Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz

    2016-09-01

    When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilks test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilks normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.

  16. Austenite Grain Size Estimation from Chord Lengths of Logarithmic-Normal Distribution

    Directory of Open Access Journals (Sweden)

    Adrian H.

    2017-12-01

    Linear sections of grains in a polyhedral material microstructure form a system of chords. The mean length of the chords is the linear grain size of the microstructure. For the prior austenite grains of low-alloy structural steels, the chord length is a random variable with a gamma or logarithmic-normal distribution. Statistical grain size estimation belongs to the problems of quantitative metallography. The so-called point estimation is a well known procedure. The interval estimation (grain size confidence interval) for the gamma distribution was given elsewhere; for the logarithmic-normal distribution it is the subject of the present contribution. The statistical analysis is analogous to the one for the gamma distribution.

  17. Kullback–Leibler Divergence of the γ–ordered Normal over t–distribution

    OpenAIRE

    Toulias, T-L.; Kitsos, C-P.

    2012-01-01

    The aim of this paper is to evaluate and study the Kullback–Leibler divergence of the γ–ordered Normal distribution, a generalization of Normal distribution emerged from the generalized Fisher’s information measure, over the scaled t–distribution. We investigate this evaluation through a series of bounds and approximations while the asymptotic behavior of the divergence is also studied. Moreover, we obtain a generalization of the known Kullback–Leibler information measure betwe...

  18. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2018-02-26

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model. In particular, we study linear transformations, marginal distributions, selection representations, stochastic representations and hierarchical representations. We also describe an EM-type algorithm for maximum likelihood estimation of the parameters of the model and demonstrate its implementation on a wind dataset. Our family of multivariate distributions unifies and extends many existing models of the literature that can be seen as submodels of our proposal.

  19. Improved Root Normal Size Distributions for Liquid Atomization

    Science.gov (United States)

    2015-11-01


  20. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    The statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimation. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
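A truncated-normal time-to-failure model of the kind described can be sketched with SciPy's `truncnorm`; the parameter values below are illustrative, not from the report:

```python
import numpy as np
from scipy.stats import truncnorm

# Time-to-failure ~ normal(1000 h, 300 h) truncated to t >= 0,
# since failure times cannot be negative.
mu, sd = 1000.0, 300.0
a, b = (0.0 - mu) / sd, np.inf          # standardized truncation bounds
ttf = truncnorm(a, b, loc=mu, scale=sd)

reliability_500h = ttf.sf(500.0)                 # P(T > 500 h)
hazard_500h = ttf.pdf(500.0) / ttf.sf(500.0)     # instantaneous failure rate
```

Unlike the exponential model, the hazard of this distribution increases with age, which is the age-dependent behaviour the abstract exploits.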

  1. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed: proximal and distal. There were 77.0 MCs/mm² at proximal body sites and 108.2 MCs/mm² at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.

  2. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    Science.gov (United States)

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. 
The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  3. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
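A standard way to realize correlated multivariate normal sampling of this kind (not necessarily the report's exact technique) is a Cholesky factorization of the covariance matrix, sketched here with NumPy and invented parameter values:

```python
import numpy as np

def correlated_normals(mean, cov, size, seed=None):
    """Sample correlated multivariate normal variates via the Cholesky
    factor L of the covariance matrix: x = mean + L z, with z ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov)                     # cov = L @ L.T
    z = rng.standard_normal((size, len(mean)))      # independent N(0, 1)
    return mean + z @ L.T                           # induce the correlation

# Hypothetical pair of correlated inputs: sd = 2 each, correlation 0.6
mean = np.array([10.0, 5.0])
cov = np.array([[4.0, 2.4],
                [2.4, 4.0]])
x = correlated_normals(mean, cov, size=200_000, seed=7)
```

The empirical correlation of the output converges to the off-diagonal value specified in `cov`, which is the property the independent-sampling shortcut criticized above cannot reproduce.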

  4. A general approach to double-moment normalization of drop size distributions

    Science.gov (United States)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. This provides a unified view of the question of DSD normalization and a good model representation of DSDs. Data analysis shows that, from the point of view of moment estimation, least squares regression is slightly more effective than moment estimation from the normalized average DSD.
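For reference, the two-moment scaling normalization discussed here can be written as follows (notation adapted; $h$ is the intrinsic shape function, with its $i$th and $j$th moments fixed to constants to remove the remaining freedom):

```latex
M_k = \int_0^\infty D^k\, N(D)\,\mathrm{d}D, \qquad
N(D) = M_i^{\frac{j+1}{j-i}}\, M_j^{-\frac{i+1}{j-i}}\,
       h\!\left(\frac{D}{D'}\right), \qquad
D' = \left(\frac{M_j}{M_i}\right)^{\frac{1}{j-i}} .
```

Choosing a single reference moment recovers the one-moment scaling of Sempere-Torres et al., and the choice $(i, j) = (3, 4)$ gives $D' = M_4/M_3 = D_m$, the normalization of Testud et al.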

  5. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation of our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  6. Distributive justice and cognitive enhancement in lower, normal intelligence.

    Science.gov (United States)

    Dunlop, Mikael; Savulescu, Julian

    2014-01-01

    There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. 
In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at

  7. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
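The contrast the article draws, a symmetric normal-theory interval versus the asymmetric distribution of the product of two coefficients, is easy to see numerically. The regression coefficients and standard errors below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical mediation paths and their standard errors
a, se_a = 0.30, 0.10      # X -> M
b, se_b = 0.40, 0.12      # M -> Y, adjusting for X

# Monte Carlo approximation to the distribution of the product a*b
n = 1_000_000
draws = rng.normal(a, se_a, n) * rng.normal(b, se_b, n)
lo, hi = np.percentile(draws, [2.5, 97.5])          # asymmetric 95% CI

# Symmetric normal-theory (Sobel) CI for comparison
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
lo_normal, hi_normal = a * b - 1.96 * se_ab, a * b + 1.96 * se_ab
```

With both paths positive the product distribution is positively skewed, so the upper tail of the product-based interval extends further above the point estimate than the lower tail extends below it; the normal-theory interval ignores this asymmetry.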

  8. Radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo; Tsujimura, Norio

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of the position sensing is based on a time-of-flight technique. The characteristics of this monitor for beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr/⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point. This shows that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)

  9. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets
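A sketch of the model class discussed: a per-patient Poisson TCP averaged over a log-normal radiosensitivity distribution. The linear-quadratic β term and the fitting of clonogen density are omitted for brevity, and all parameter values are illustrative:

```python
import numpy as np

def tcp_lognormal(dose, alpha_mean, alpha_sd, n_clonogens,
                  n_draws=100_000, seed=3):
    """Population TCP with log-normally distributed radiosensitivity alpha.
    Per-patient Poisson TCP: exp(-N * exp(-alpha * D)); the population TCP
    is the average over the interpatient alpha distribution."""
    rng = np.random.default_rng(seed)
    # Parameterize the log-normal so its mean and sd match alpha_mean, alpha_sd
    sigma2 = np.log(1.0 + (alpha_sd / alpha_mean) ** 2)
    mu = np.log(alpha_mean) - 0.5 * sigma2
    alpha = rng.lognormal(mu, np.sqrt(sigma2), n_draws)
    return float(np.mean(np.exp(-n_clonogens * np.exp(-alpha * dose))))

tcp_60 = tcp_lognormal(dose=60.0, alpha_mean=0.3, alpha_sd=0.07, n_clonogens=1e7)
```

Averaging over a heterogeneous population flattens the dose-response curve relative to a single α value, which is why the δ-function model fits clinical data poorly.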

  10. Computer program determines exact two-sided tolerance limits for normal distributions

    Science.gov (United States)

    Friedman, H. A.; Webb, S. R.

    1968-01-01

    Computer program determines by numerical integration the exact statistical two-sided tolerance limits, when the proportion between the limits is at least a specified number. The program is limited to situations in which the underlying probability distribution for the population sampled is the normal distribution with unknown mean and variance.
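The exact limits require the numerical integration the program performs; a widely used closed-form approximation (Howe's) gives a feel for the quantity being computed. This is a sketch of the approximation, not the program's exact method:

```python
import numpy as np
from scipy import stats

def normal_tolerance_k(n, coverage=0.90, confidence=0.95):
    """Approximate two-sided tolerance factor k (Howe's method) such that
    xbar +/- k*s covers at least `coverage` of a normal population with
    the stated confidence, for a sample of size n with unknown mean and
    variance."""
    df = n - 1
    z = stats.norm.ppf((1.0 + coverage) / 2.0)
    chi2 = stats.chi2.ppf(1.0 - confidence, df)
    return z * np.sqrt(df * (1.0 + 1.0 / n) / chi2)

k = normal_tolerance_k(n=20)   # close to the tabulated exact value (~2.31)
```

As the sample size grows, k shrinks toward the plain normal quantile, reflecting reduced uncertainty about the mean and variance.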

  11. Distribution of Different Sized Ocular Surface Vessels in Diabetics and Normal Individuals.

    Science.gov (United States)

    Banaee, Touka; Pourreza, Hamidreza; Doosti, Hassan; Abrishami, Mojtaba; Ehsaei, Asieh; Basiry, Mohsen; Pourreza, Reza

    2017-01-01

    To compare the distribution of different-sized vessels using digital photographs of the ocular surface of diabetic and normal individuals. In this cross-sectional study, red-free conjunctival photographs of diabetic and normal individuals, aged 30-60 years, were taken under defined conditions and analyzed using a Radon transform-based algorithm for vascular segmentation. The image areas occupied by vessels (AOV) of different diameters were calculated. The main outcome measure was the distribution curve of mean AOV of different-sized vessels. Secondary outcome measures included total AOV and standard deviation (SD) of AOV of different-sized vessels. Two hundred and sixty-eight diabetic patients and 297 normal (control) individuals were included, differing in age (45.50 ± 5.19 vs. 40.38 ± 6.19 years; P […]). The distribution curves of mean AOV differed between patients and controls (smaller AOV for larger vessels in patients; P […]) distribution curve of vessels compared to controls. Presence of diabetes mellitus is associated with contraction of larger vessels in the conjunctiva. Smaller vessels dilate with diabetic retinopathy. These findings may be useful in the photographic screening of diabetes mellitus and retinopathy.

  12. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.

  13. Distribution of normal superficial ocular vessels in digital images.

    Science.gov (United States)

    Banaee, Touka; Ehsaei, Asieh; Pourreza, Hamidreza; Khajedaluee, Mohammad; Abrishami, Mojtaba; Basiri, Mohsen; Daneshvar Kakhki, Ramin; Pourreza, Reza

    2014-02-01

    To investigate the distribution of different-sized vessels in the digital images of the ocular surface, an endeavor which may provide useful information for future studies. This study included 295 healthy individuals. From each participant, four digital photographs of the superior and inferior conjunctivae of both eyes, with a fixed succession of photography (right upper, right lower, left upper, left lower), were taken with a slit lamp mounted camera. Photographs were then analyzed by a previously described algorithm for vessel detection in the digital images. The area (of the image) occupied by vessels (AOV) of different sizes was measured. Height, weight, fasting blood sugar (FBS) and hemoglobin levels were also measured and the relationship between these parameters and the AOV was investigated. These findings indicated a statistically significant difference in the distribution of the AOV among the four conjunctival areas. No significant correlations were noted between the AOV of each conjunctival area and the different demographic and biometric factors. Medium-sized vessels were the most abundant vessels in the photographs of the four investigated conjunctival areas. The AOV of the different sizes of vessels follows a normal distribution curve in the four areas of the conjunctiva. The distribution of the vessels in successive photographs changes in a specific manner, with the mean AOV becoming larger as the photos were taken from the right upper to the left lower area. The AOV of vessel sizes has a normal distribution curve and medium-sized vessels occupy the largest area of the photograph. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  14. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2016-01-01

    a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady-state) should be estimated from a set of previous samples, but, in practice, decisions based on reference change value are often based on only two consecutive results. The original reference change value …-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of estimated set point) performed worst both on normally…
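The classical normal-theory reference change value underlying the methods compared here combines analytical and within-subject variation; a minimal sketch with illustrative CVs (not the paper's data):

```python
import math

def reference_change_value(cv_analytical, cv_within, z=1.96):
    """Classical (normal-theory) RCV in percent: the smallest change
    between two consecutive results that is significant at the given
    two-sided z. The sqrt(2) reflects the difference of two results."""
    return math.sqrt(2.0) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Illustrative serum-creatinine values: CVa = 2.5 %, CVi = 5.9 %
rcv = reference_change_value(2.5, 5.9)
```

For ln-normally distributed data, asymmetric limits of the form exp(±√2·z·σ) − 1 are used instead, which is one reason the five methods compared in the paper yield different false-positive percentages.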

  15. Radiation distribution sensing with normal optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo [Nagoya Univ., Dept. of Nuclear Engineering, Nagoya, Aichi (Japan); Tsujimura, Norio [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan)

    2002-12-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of the position sensing is based on a time-of-flight technique. The characteristics of this monitor for beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr/⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point. This shows that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)

  16. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher

  17. Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution

    NARCIS (Netherlands)

    Belitser, E.; Ghosal, S.

    2003-01-01

    We consider the problem of estimating the mean of an infinite-break dimensional normal distribution from the Bayesian perspective. Under the assumption that the unknown true mean satisfies a "smoothness condition," we first derive the convergence rate of the posterior distribution for a prior that

  18. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  19. Size distribution of interstellar particles. III. Peculiar extinctions and normal infrared extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.; Wallenhorst, S.G.

    1981-01-01

    The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^(-q) is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a₋, roughly 0.005 or 0.01 μm, the large-size cutoff, a₊, about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars σ Sco, ρ Oph, and θ¹ Ori C, with values of R_V [≡ A_V/E(B-V)] of 3.4, 4.4, and 5.5 compared to the normal 3.1. Two (σ Sco, ρ Oph) are in a neutral dense cloud; θ¹ Ori C is in the Orion Nebula. We find that σ Sco has a normal graphite distribution but has had its small silicate particles removed, so that a₋(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In ρ Oph, the graphite is still normal, but both a₋(sil) and a₊(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In θ¹ Ori, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in ρ Oph. The calculated λ2175 bump is broader than the observed one, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for ρ Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to ''hide'' the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists

  20. The PDF of fluid particle acceleration in turbulent flow with underlying normal distribution of velocity fluctuations

    International Nuclear Information System (INIS)

    Aringazin, A.K.; Mazhitov, M.I.

    2003-01-01

    We describe a formal procedure to obtain and specify the general form of a marginal distribution for the Lagrangian acceleration of a fluid particle in developed turbulent flow using a Langevin type equation and the assumption that the velocity fluctuation u follows a normal distribution with zero mean, in accord with the Heisenberg-Yaglom picture. For a particular representation, β=exp[u], of the fluctuating parameter β, we reproduce the underlying log-normal distribution and the associated marginal distribution, which was found to be in very good agreement with the new experimental data by Crawford, Mordant, and Bodenschatz on the acceleration statistics. We discuss possibilities for refining the log-normal model

  1. Confidence Intervals for True Scores Using the Skew-Normal Distribution

    Science.gov (United States)

    Garcia-Perez, Miguel A.

    2010-01-01

    A recent comparative analysis of alternative interval estimation approaches and procedures has shown that confidence intervals (CIs) for true raw scores determined with the Score method--which uses the normal approximation to the binomial distribution--have actual coverage probabilities that are closest to their nominal level. It has also recently…

  2. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...

  3. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Process capability indices are very important process quality assessment tools in the automotive industries. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various surrogate process capability indices have been proposed for non-normal conditions, but few sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness ≤ 1.5).
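
The conventional indices can be contrasted with a percentile-based index in the spirit of the Clements method, where the 6σ spread is replaced by the empirical 0.135%-99.865% range. A sketch with illustrative specification limits (not the silicon-wafer data):

```python
import numpy as np

# Conventional Cp/Cpk (normality assumed) vs. a Clements-style
# percentile index; for normal data the two should nearly coincide.
rng = np.random.default_rng(1)
x = rng.normal(10.0, 0.5, size=100_000)   # in-control normal process
lsl, usl = 8.0, 12.0                      # hypothetical spec limits

mu, sigma = x.mean(), x.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

p_lo, p_med, p_hi = np.percentile(x, [0.135, 50, 99.865])
cp_percentile = (usl - lsl) / (p_hi - p_lo)
```

For skewed data the two indices diverge, which is the motivation for the surrogate methods the paper compares.
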

  4. A study of the up-and-down method for non-normal distribution functions

    DEFF Research Database (Denmark)

    Vibholm, Svend; Thyregod, Poul

    1988-01-01

    The assessment of breakdown probabilities is examined by the up-and-down method. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimates...
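
The up-and-down procedure itself is easy to simulate. A toy sketch with an assumed probit (normal) breakdown curve and illustrative parameters; the crude level-averaging estimator below stands in for the maximum-likelihood estimates the study computes:

```python
import random
from statistics import NormalDist, fmean

# Up-and-down (staircase) experiment: the test level steps down after a
# breakdown and up after a withstand. Breakdown probability follows a
# normal (probit) curve with median 30 and sd 2 -- illustrative values.
random.seed(42)
true_median, true_sd, step = 30.0, 2.0, 1.0
curve = NormalDist(true_median, true_sd)

level, levels = 30.0, []
for _ in range(2000):
    levels.append(level)
    breakdown = random.random() < curve.cdf(level)
    level += -step if breakdown else step

# Averaging the visited levels gives a rough estimate of the median.
median_estimate = fmean(levels)
```
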

  5. The approximation of the normal distribution by means of chaotic expression

    International Nuclear Information System (INIS)

    Lawnik, M

    2014-01-01

    The approximation of the normal distribution by a chaotic expression is achieved by means of the Weierstrass function, where, for a certain set of parameters, the density of the derived recurrence renders a good approximation of the bell curve

  6. Dobinski-type relations and the log-normal distribution

    International Nuclear Information System (INIS)

    Blasiak, P; Penson, K A; Solomon, A I

    2003-01-01

    We consider sequences of generalized Bell numbers B(n), n = 1, 2, ..., which can be represented by Dobinski-type summation formulae, i.e. B(n) = (1/C) Σ_{k=0}^{∞} [P(k)]^n / D(k), with P(k) a polynomial, D(k) a function of k and C = const. They include the standard Bell numbers (P(k) = k, D(k) = k!, C = e), their generalizations B_{r,r}(n), r = 2, 3, ..., appearing in the normal ordering of powers of boson monomials (P(k) = (k+r)!/k!, D(k) = k!, C = e), variants of 'ordered' Bell numbers B_o^(p)(n) (P(k) = k, D(k) = ((p+1)/p)^k, C = 1 + p, p = 1, 2, ...), etc. We demonstrate that for non-negative integers α, β, γ, t (α, t ≠ 0), [B(αn² + βn + γ)]^t is the nth moment of a positive function on (0, ∞) which is a weighted infinite sum of log-normal distributions. (letter to the editor)
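
The standard-Bell-number case quoted above (P(k) = k, D(k) = k!, C = e) can be evaluated directly from a truncated Dobinski sum:

```python
from math import e, factorial

def bell_number(n, terms=60):
    """Standard Bell number via the truncated Dobinski formula
    B(n) = (1/e) * sum_{k>=0} k**n / k!  (P(k)=k, D(k)=k!, C=e).
    The series converges fast, so 60 terms are ample for small n."""
    return round(sum(k**n / factorial(k) for k in range(terms)) / e)

bells = [bell_number(n) for n in range(1, 7)]
```
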

  7. The law of distribution of light beam direction fluctuations in telescopes. [normal density functions

    Science.gov (United States)

    Divinskiy, M. L.; Kolchinskiy, I. G.

    1974-01-01

    The distribution of deviations from mean star trail directions was studied on the basis of 105 star trails. It was found that about 93% of the trails yield a distribution in agreement with the normal law. About 4% of the star trails agree with the Charlier distribution.

  8. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data that cannot contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…

  9. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  10. MR imaging of the bone marrow using short TI IR, 1. Normal and pathological intensity distribution of the bone marrow

    Energy Technology Data Exchange (ETDEWEB)

    Ishizaka, Hiroshi; Kurihara, Mikiko; Tomioka, Kuniaki; Kobayashi, Kanako; Sato, Noriko; Nagai, Teruo; Heshiki, Atsuko; Amanuma, Makoto; Mizuno, Hitomi.

    1989-02-01

    Normal vertebral bone marrow intensity distribution and its alteration in various anemias were evaluated on short TI IR sequences. The material consisted of 73 individuals: 48 normal subjects and 25 anemic patients, excluding neoplastic conditions. All normal and reactive hypercellular bone marrow revealed a characteristic intensity distribution: marginal high intensity and central low intensity, corresponding well to the normal distribution of red and yellow marrow and its physiological or reactive conversion. Aplastic anemia did not reveal the normal intensity distribution, presumably due to its autonomous condition.

  11. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia, where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
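
The central step, reading a value of specified exceedance probability off a fitted log-normal, can be sketched as follows; the rainfall figures are synthetic placeholders, not the Top End records, and the χ²-test and counting-error (ζ) steps are not shown:

```python
import math
from statistics import NormalDist

def lognormal_exceedance_value(data, p_exceed):
    """Fit a log-normal by the mean and sd of the log-data, then return
    the value exceeded with probability p_exceed."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sd = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))
    z = NormalDist().inv_cdf(1.0 - p_exceed)
    return math.exp(mu + sd * z)

# Illustrative annual rainfall totals (mm):
rain = [1450, 1600, 1320, 1780, 1510, 1390, 1660, 1250, 1540, 1470]
x100 = lognormal_exceedance_value(rain, 0.01)   # 1-in-100-year value
```
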

  12. Radiation distribution sensing with normal optical fiber

    CERN Document Server

    Kawarabayashi, J; Naka, R; Uritani, A; Watanabe, K I; Iguchi, T; Tsujimura, N

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point. This fact shows that t...

  13. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
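
The core reformulation such chance-constrained models rely on can be sketched for a single constraint: when the random right-hand side is log-normal, Pr(load ≤ B) ≥ α reduces to a deterministic bound at the (1−α) quantile of B. The numbers below are hypothetical, not the Erhai Lake parameters:

```python
import math
from statistics import NormalDist

# Chance constraint  Pr(pollutant load L <= capacity B) >= alpha,
# with B log-normal (log-mean mu, log-sd sigma). Deterministic
# equivalent:  L <= exp(mu + sigma * Phi^{-1}(1 - alpha)).
mu, sigma = math.log(100.0), 0.3   # hypothetical capacity distribution

max_load = {
    alpha: math.exp(mu + sigma * NormalDist().inv_cdf(1.0 - alpha))
    for alpha in (0.90, 0.95, 0.99)
}
```

The shrinking allowable load as α grows is exactly the economy-versus-reliability trade-off the abstract evaluates.
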

  14. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
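
The crude Monte Carlo baseline the paper improves upon can be sketched directly; parameters are illustrative, and the hazard-rate-twisting importance sampler itself is not reproduced here:

```python
import numpy as np

# Crude MC estimate of the CCDF P(sum_i X_i > gamma) for independent,
# not identically distributed log-normal RVs.
rng = np.random.default_rng(7)
mus = np.array([0.0, 0.2, 0.4, 0.1])
sigmas = np.array([0.5, 0.6, 0.4, 0.7])
gamma = 10.0
n = 200_000

x = rng.lognormal(mean=mus, sigma=sigmas, size=(n, len(mus)))
ccdf_hat = (x.sum(axis=1) > gamma).mean()
# Relative error of the estimator -- blows up for rare events,
# which is the paper's motivation for importance sampling.
rel_err = np.sqrt(ccdf_hat * (1 - ccdf_hat) / n) / max(ccdf_hat, 1e-12)
```
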

  15. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  16. Spatial arrangement and size distribution of normal faults, Buckskin detachment upper plate, Western Arizona

    Science.gov (United States)

    Laubach, S. E.; Hundley, T. H.; Hooker, J. N.; Marrett, R. A.

    2018-03-01

    Fault arrays typically include a wide range of fault sizes, and those faults may be randomly located, clustered together, or regularly or periodically located in a rock volume. Here, we investigate the size distribution and spatial arrangement of normal faults using rigorous size-scaling methods and normalized correlation count (NCC). Outcrop data from Miocene sedimentary rocks in the immediate upper plate of the regional Buckskin detachment (a low-angle normal fault) have differing patterns of spatial arrangement as a function of displacement (offset). Using lower size-thresholds of 1, 0.1, 0.01, and 0.001 m, displacements range over 5 orders of magnitude and have power-law frequency distributions spanning ∼four orders of magnitude, from less than 0.001 m to more than 100 m, with exponents of -0.6 and -0.9. The largest faults, with >1 m displacement, have a shallower size-distribution slope and regular spacing of about 20 m. In contrast, smaller faults have steep size-distribution slopes and irregular spacing, with NCC plateau patterns indicating imposed clustering. Cluster widths are 15 m for the 0.1-m threshold, 14 m for the 0.01-m, and 1 m for the 0.001-m displacement threshold faults. Results demonstrate that normalized correlation count effectively characterizes the spatial arrangement patterns of these faults. Our example from a high-strain fault pattern above a detachment is compatible with size and spatial organization that was influenced primarily by boundary conditions such as fault shape, mechanical unit thickness and internal stratigraphy on a range of scales, rather than purely by interaction among faults during their propagation.
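
Power-law size distributions above a lower threshold can be characterized with a maximum-likelihood (Hill-type) exponent estimator; the synthetic "displacements" and the estimator choice below are ours, not the study's workflow:

```python
import numpy as np

# Synthetic Pareto-distributed displacements with exponent 0.8 above a
# 0.001 m lower threshold, standing in for fault-offset data. For a pure
# power law, N(>= d) ~ d^(-alpha); the MLE of alpha given threshold xm is
#   alpha_hat = n / sum(ln(x_i / xm)).
rng = np.random.default_rng(3)
true_exponent = 0.8
xm = 0.001
d = xm * (1.0 - rng.uniform(size=5000)) ** (-1.0 / true_exponent)

exponent_hat = d.size / np.log(d / xm).sum()
```
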

  17. Even-odd charged multiplicity distributions and energy dependence of normalized multiplicity moments in different rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

    The even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows are calculated, starting from a simple picture of charge correlation with non-zero correlation length. The coincidence and separation of these distributions are explained. The calculated window- and energy-dependence of normalized moments recovers the behaviour found in experiments. A new definition of normalized moments is proposed, especially suitable for narrow rapidity windows

  18. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  19. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...

  20. Normal cranial bone marrow MR imaging pattern with age-related ADC value distribution

    International Nuclear Information System (INIS)

    Li Qi; Pan Shinong; Yin Yuming; Li Wei; Chen Zhian; Liu Yunhui; Wu Zhenhua; Guo Qiyong

    2011-01-01

    Objective: To determine the MRI appearance of normal age-related cranial bone marrow and the relationship between MRI patterns and apparent diffusion coefficient (ADC) values. Methods: Five hundred subjects were divided into seven groups based on age. Cranial bone marrow MRI patterns were defined based on the thickness of the diploe and signal intensity distribution characteristics. ADC values of the frontal, parietal, occipital and temporal bones on DWI were measured and calculated. Correlations between age and ADC values, between patterns and ADC values, as well as the distribution of ADC values were analyzed. Results: Normal cranial bone marrow was divided into four types and six subtypes. Types I, II, III and IV correlated positively with increasing age (χ² = 266.36, P < 0.05). In addition, there was a significant negative correlation between the ADC values and MRI patterns in the normal parietal and occipital bones (r = -0.691 and -0.750, P < 0.01). Conclusion: The combination of MRI features and ADC value changes in different cranial bones showed significant correlation with increasing age. Familiarity with the MRI appearance of the normal bone marrow conversion pattern in different age groups and their ADC values will aid the diagnosis and differentiation of cranial bone pathology.

  1. Breast cancer subtype distribution is different in normal weight, overweight, and obese women.

    Science.gov (United States)

    Gershuni, Victoria; Li, Yun R; Williams, Austin D; So, Alycia; Steel, Laura; Carrigan, Elena; Tchou, Julia

    2017-06-01

    Obesity is associated with tumor-promoting pathways related to insulin resistance and chronic low-grade inflammation, which have been linked to various disease states, including cancer. Many studies have focused on the relationship between obesity and increased estrogen production, which contributes to the pathogenesis of estrogen receptor-positive breast cancers. The link between obesity and other breast cancer subtypes, such as triple-negative breast cancer (TNBC) and Her2/neu+ (Her2+) breast cancer, is less clear. We hypothesize that obesity may be associated with the pathogenesis of specific breast cancer subtypes, resulting in a subtype distribution different from that in normal-weight women. A single-institution, retrospective analysis of tumor characteristics of 848 patients diagnosed with primary operable breast cancer between 2000 and 2013 was performed to evaluate the association between BMI and clinical outcome. Patients were grouped based on their BMI at the time of diagnosis, stratified into three subgroups: normal weight (BMI = 18-24.9), overweight (BMI = 25-29.9), and obese (BMI > 30). The distribution of breast cancer subtypes across the three BMI subgroups was compared. Obese and overweight women were more likely to present with TNBC, and normal-weight women with Her2+ breast cancer (p = 0.008). We demonstrated, for the first time, that breast cancer subtype distribution varied significantly according to BMI status. Our results suggest that obesity might activate molecular pathways other than the well-known obesity/estrogen circuit in the pathogenesis of breast cancer. Future studies are needed to understand the molecular mechanisms that drive the variation in subtype distribution across BMI subgroups.

  2. Carbon K-shell photoionization of CO: Molecular frame angular distributions of normal and conjugate shakeup satellites

    International Nuclear Information System (INIS)

    Jahnke, T.; Titze, J.; Foucar, L.; Wallauer, R.; Osipov, T.; Benis, E.P.; Jagutzki, O.; Arnold, W.; Czasch, A.; Staudte, A.; Schoeffler, M.; Alnaser, A.; Weber, T.; Prior, M.H.; Schmidt-Boecking, H.; Doerner, R.

    2011-01-01

    We have measured the molecular frame angular distributions of photoelectrons emitted from the Carbon K-shell of fixed-in-space CO molecules for the case of simultaneous excitation of the remaining molecular ion. Normal and conjugate shakeup states are observed. Photoelectrons belonging to normal Σ-satellite lines show an angular distribution resembling that observed for the main photoline at the same electron energy. Surprisingly a similar shape is found for conjugate shakeup states with Π-symmetry. In our data we identify shake rather than electron scattering (PEVE) as the mechanism producing the conjugate lines. The angular distributions clearly show the presence of a Σ shape resonance for all of the satellite lines.

  3. Distribution of the anticancer drugs doxorubicin, mitoxantrone and topotecan in tumors and normal tissues.

    Science.gov (United States)

    Patel, Krupa J; Trédan, Olivier; Tannock, Ian F

    2013-07-01

    Pharmacokinetic analyses estimate the mean concentration of drug within a given tissue as a function of time, but do not give information about the spatial distribution of drugs within that tissue. Here, we compare the time-dependent spatial distribution of three anticancer drugs within tumors, heart, kidney, liver and brain. Mice bearing various xenografts were treated with doxorubicin, mitoxantrone or topotecan. At various times after injection, tumors and samples of heart, kidney, liver and brain were excised. Within solid tumors, the distribution of doxorubicin, mitoxantrone and topotecan was limited to perivascular regions at 10 min after administration and the distance from blood vessels at which drug intensity fell to half was ~25-75 μm. Although drug distribution improved after 3 and 24 h, there remained a significant decrease in drug fluorescence with increasing distance from tumor blood vessels. Drug distribution was relatively uniform in the heart, kidney and liver with substantially greater perivascular drug uptake than in tumors. There was significantly higher total drug fluorescence in the liver than in tumors after 10 min, 3 and 24 h. Little to no drug fluorescence was observed in the brain. There are marked differences in the spatial distributions of three anticancer drugs within tumor tissue and normal tissues over time, with greater exposure to most normal tissues and limited drug distribution to many cells in tumors. Studies of the spatial distribution of drugs are required to complement pharmacokinetic data in order to better understand and predict drug effects and toxicities.

  4. Neutron importance and the generalized Green function for the conventionally critical reactor with normalized neutron distribution

    International Nuclear Information System (INIS)

    Khromov, V.V.

    1978-01-01

    The notion of neutron importance is considered as applied to nuclear reactor statics problems described by time-independent homogeneous neutron transport equations with provision for normalization of the neutron distribution. An equation is obtained for the neutron importance function in a conditionally critical reactor with respect to an arbitrary nonlinear functional determined on the normalized neutron distribution. The relation between this function and the generalized Green function of the self-conjugate operator of the reactor equation is established, and the small-perturbation formula for functionals of a conditionally critical reactor is deduced

  5. Elastin distribution in the normal uterus, uterine leiomyomas, adenomyosis and adenomyomas: a comparison.

    Science.gov (United States)

    Zheng, Wei-Qiang; Ma, Rong; Zheng, Jian-Ming; Gong, Zhi-Jing

    2006-04-01

    To describe the histologic distribution of elastin in the nonpregnant human uterus, uterine leiomyomas, adenomyosis and adenomyomas. Uteri were obtained from women undergoing hysterectomy for benign conditions, including 26 cases of uterine leiomyomas, 24 cases of adenomyosis, 18 adenomyomas and 6 cases of autopsy specimens. Specific histochemical staining techniques were employed in order to demonstrate the distribution of elastin. The distribution of elastin components in the uterus was markedly uneven and showed a decreasing gradient from outer to inner myometrium. No elastin was present within leiomyomas, adenomyomas or adenomyosis. The distribution of elastin may help explain the normal function of the myometrium in labor. It implies that the uneven distribution of elastin components and absence of elastin within leiomyomas, adenomyomas and adenomyosis could be of some clinical significance. The altered elastin distribution in disease states may help explain such symptoms as dysmenorrhea in uterine endometriosis.

  6. The normal distribution of thoracoabdominal aorta small branch artery ostia

    International Nuclear Information System (INIS)

    Cronin, Paul; Williams, David M.; Vellody, Ranjith; Kelly, Aine Marie; Kazerooni, Ella A.; Carlos, Ruth C.

    2011-01-01

    The purpose of this study was to determine the normal distribution of aortic branch artery ostia. CT scans of 100 subjects were retrospectively reviewed. The angular distributions of the aorta with respect to the center of the T3 to L4 vertebral bodies, and of branch artery origins with respect to the center of the aorta were measured. At each vertebral body level the distribution of intercostal/lumbar arteries and other branch arteries were calculated. The proximal descending aorta is posteriorly placed becoming a midline structure, at the thoracolumbar junction, and remains anterior to the vertebral bodies within the abdomen. The intercostal and lumbar artery ostia have a distinct distribution. At each vertebral level from T3 caudally, one intercostal artery originates from the posterior wall of the aorta throughout the thoracic aorta, while the other intercostal artery originates from the medial wall of the descending thoracic aorta high in the chest, posteromedially from the mid-thoracic aorta, and from the posterior wall of the aorta low in the chest. Mediastinal branches of the thoracic aorta originate from the medial and anterior wall. Lumbar branches originate only from the posterior wall of the abdominal aorta. Aortic branch artery origins arise with a bimodal distribution and have a characteristic location. Mediastinal branches of the thoracic aorta originate from the medial and anterior wall. Knowing the location of aortic branch artery ostia may help distinguish branch artery pseudoaneurysms from penetrating ulcers.

  7. Probability distribution of atmospheric pollutants: comparison among four methods for the determination of the log-normal distribution parameters; La distribuzione di probabilità degli inquinanti atmosferici: confronto tra quattro metodi per la determinazione dei parametri della distribuzione log-normale

    Energy Technology Data Exchange (ETDEWEB)

    Bellasio, R [Enviroware s.r.l., Agrate Brianza, Milan (Italy). Centro Direzionale Colleoni; Lanzani, G; Ripamonti, M; Valore, M [Amministrazione Provinciale, Como (Italy)

    1998-04-01

    This work illustrates the possibility of interpolating the concentrations of CO, NO, NO₂, O₃ and SO₂ measured during one year (1995) at the 13 stations of the air quality monitoring network of the Provinces of Como and Lecco (Italy) by means of a log-normal distribution. Particular attention was given to choosing the method for determining the log-normal distribution parameters among four possible methods: I natural, II percentiles, III moments, IV maximum likelihood. In order to evaluate the goodness of fit, a ranking procedure was carried out over the values of four indices: absolute deviation, weighted absolute deviation, the Kolmogorov-Smirnov index and the Cramer-von Mises-Smirnov index. The capability of the log-normal distribution to fit the measured data is then discussed as a function of the pollutant and of the monitoring station. Finally, an example of application is given: the effect of an emission reduction strategy in the Lombardy Region (the so-called 'bollino blu') is evaluated using a log-normal distribution.
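
Two of the four parameter-determination routes mentioned (moments and maximum likelihood) can be sketched directly; the concentration values below are synthetic, not the Como/Lecco measurements:

```python
import math

data = [12.0, 8.5, 20.1, 15.3, 9.9, 30.2, 11.4, 18.8, 7.6, 25.0]
n = len(data)

# Maximum likelihood: sample mean and sd of the log-concentrations.
logs = [math.log(x) for x in data]
mu_mle = sum(logs) / n
sigma_mle = math.sqrt(sum((v - mu_mle) ** 2 for v in logs) / n)

# Method of moments: match E[X] = exp(mu + sigma^2/2) and
# Var[X] = (exp(sigma^2) - 1) * exp(2*mu + sigma^2).
m = sum(data) / n
v = sum((x - m) ** 2 for x in data) / n
sigma_mom = math.sqrt(math.log(1.0 + v / m**2))
mu_mom = math.log(m) - sigma_mom**2 / 2.0
```

For data that are genuinely close to log-normal, the two routes give similar parameters; the paper's ranking procedure quantifies which route fits each pollutant and station best.
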

  8. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming; Jin, Ick-Hoon

    2013-01-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.

  9. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming

    2013-08-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
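
The ratio-replacement idea can be sketched on a toy model where the normalizing constant is in fact known, so the output is checkable; this is an illustrative reconstruction, not the authors' code. With likelihood f(y|θ) = exp(θy − y²/2)/Z(θ) and a flat prior, the exact posterior from one observation y is N(y, 1), and the ratio Z(θ)/Z(θ′) equals the expectation of exp((θ−θ′)x) under x ~ f(·|θ′) (here simply N(θ′, 1)), which we replace by a Monte Carlo average over auxiliary draws:

```python
import math
import random

random.seed(0)
y = 1.5                 # single observation; exact posterior is N(y, 1)
theta, chain = 0.0, []
m = 30                  # auxiliary sample size per step

for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.8)
    # MC estimate of Z(theta)/Z(prop) from draws x ~ f(.|prop) = N(prop, 1)
    aux = [random.gauss(prop, 1.0) for _ in range(m)]
    z_ratio_hat = sum(math.exp((theta - prop) * x) for x in aux) / m
    # Acceptance probability with the unknown ratio replaced by its estimate
    accept = math.exp((prop - theta) * y) * z_ratio_hat
    if random.random() < accept:
        theta = prop
    chain.append(theta)

burned = chain[2000:]
post_mean = sum(burned) / len(burned)
post_sd = (sum((t - post_mean) ** 2 for t in burned) / len(burned)) ** 0.5
```

The chain's mean and spread should approximate the exact posterior N(1.5, 1), illustrating convergence despite the noisy ratio.
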

  10. Externally studentized normal midrange distribution

    Directory of Open Access Journals (Sweden)

    Ben Dêivide de Oliveira Batista

    The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The wide use of the externally studentized range in multiple comparisons was also a motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions and generating random values. This is a new distribution, for which the authors could find no report in the literature. A second objective was to build an R package for obtaining numerically the probability density, cumulative distribution and quantile functions and to make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, proved by using Monte Carlo simulations and by comparing results with different numbers of quadrature points. Regarding the precision of the quantiles for cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.

  11. Generalization of the normal-exponential model: exploration of a more accurate parametrisation for the signal distribution on Illumina BeadArrays.

    Science.gov (United States)

    Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv

    2012-12-11

    Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method has been developed that models the observed intensities as the sum of an exponentially distributed signal and normally distributed noise. Nevertheless, Wang and Ye (2012) display a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would be a better model of the signal density. Hence, the normal-exponential model may not be appropriate for Illumina data, and background corrections derived from it may lead to erroneous estimates. We propose a more flexible model based on a gamma-distributed signal and normally distributed background noise, and develop the associated background correction, implemented in the R package NormalGamma. Our model proves to be markedly more accurate for Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a better fit of the observed intensities. On the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validation of the normal-gamma model. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures representing various experimental designs. Surprisingly, we observe that implementing a more accurate parametrisation in the model-based background correction does not increase the sensitivity. These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement
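
    The normal-exponential correction that both models build on replaces each observed intensity x = signal + noise by the posterior mean of the signal, which has a closed form when signal ~ Exp(mean alpha) and noise ~ N(mu, sigma^2). A minimal sketch under those assumptions; the parameter values are illustrative, not estimated from array data:

```python
import numpy as np
from scipy.stats import norm

def normexp_signal(x, mu, sigma, alpha):
    """Posterior mean E[signal | observed x] for the normal-exponential
    convolution model: signal ~ Exp(mean alpha), noise ~ N(mu, sigma^2).
    Given x, the signal is normal with location mu_sf and scale sigma,
    truncated to (0, inf), hence the Mills-ratio correction term."""
    mu_sf = x - mu - sigma**2 / alpha
    return mu_sf + sigma * norm.pdf(mu_sf / sigma) / norm.cdf(mu_sf / sigma)

# Corrected intensities are always positive, unlike plain background subtraction.
corrected = normexp_signal(np.array([0.0, 500.0]), mu=100.0, sigma=10.0, alpha=200.0)
```

    For intensities far above background the correction reduces to x - mu - sigma^2/alpha; near zero it shrinks smoothly toward a small positive value instead of going negative.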

  12. Energy dependence of angular distributions of sputtered particles by ion-beam bombardment at normal incidence

    International Nuclear Information System (INIS)

    Matsuda, Yoshinobu; Ueda, Yasutoshi; Uchino, Kiichiro; Muraoka, Katsunori; Maeda, Mitsuo; Akazaki, Masanori; Yamamura, Yasunori.

    1986-01-01

    The angular distributions of sputtered Fe atoms were measured using the laser fluorescence technique during Ar-ion bombardment at normal incidence for energies of 0.6, 1, 2 and 3 keV. The cosine distribution measured at 0.6 keV progressively deviated towards an over-cosine distribution at higher energies; at 3 keV the angular distribution was over-cosine by about 20%. The experimental results agree qualitatively with calculations by a recent computer simulation code, ACAT. The results are explained by the competition between surface scattering and the effects of primary knock-on atoms, which tend to make the angular distributions over-cosine and under-cosine, respectively. (author)
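
    Over- and under-cosine emission is commonly parameterized as a cos^n(theta) distribution (n > 1 over-cosine, n < 1 under-cosine, n = 1 pure cosine). Sampling polar angles from it only needs the closed-form inverse CDF; a small sketch under that standard parameterization (function name and values are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_cosn_theta(n, size):
    """Sample polar emission angles with pdf proportional to cos^n(theta)*sin(theta)
    on [0, pi/2] (the sin factor is the solid-angle weight). The CDF is
    F(theta) = 1 - cos^(n+1)(theta), which inverts in closed form."""
    u = rng.uniform(size=size)
    return np.arccos((1.0 - u) ** (1.0 / (n + 1.0)))

theta = sample_cosn_theta(n=1.0, size=200_000)   # n = 1: pure cosine emission
# For a cos^n distribution, E[cos(theta)] = (n + 1)/(n + 2); here 2/3.
mean_cos = np.cos(theta).mean()
```

    Raising n above 1 concentrates emission toward the surface normal, reproducing the over-cosine trend reported at higher ion energies.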

  13. Normal distribution of /sup 111/In chloride on scintigram

    Energy Technology Data Exchange (ETDEWEB)

    Oyama, K; Machida, K; Hayashi, S; Watari, T; Akaike, A

    1977-05-01

    Indium-111 chloride (/sup 111/InCl/sub 3/) was used as a bone marrow imaging and tumor-localizing agent in 38 patients (46 scintigrams) who were suspected of having, or diagnosed as having, malignant disease and who were irradiated for it. Regions of suspected malignant disease, regions of abnormal accumulation on the scintigrams, and the irradiated target were excluded in order to estimate the normal distribution of /sup 111/InCl/sub 3/. Scintigrams were taken 48 hr after intravenous injection of 1 to 3 mCi of /sup 111/InCl/sub 3/. The percentage and score distributions of /sup 111/InCl/sub 3/ were recorded in 23 regions. As the liver showed the highest accumulation of /sup 111/In on all scintigrams, it was designated 2+. Compared with the radioactivity in the liver, other regions showed similar (2+), moderately decreased (+), or severely decreased (-) accumulation on the scintigram. A score of 1 was given for 2+, 0.5 for +, and 0 for -. The score and percentage distributions were: liver 100 (100%), lumbar vertebra 58.5 (100%), mediastinum 55 (100%), nasopharynx 50 (100%), testis 47.5 (59%), heart 44.5 (89%), and pelvis 43.5 (78%). Comparing this study with a previous study of /sup 111/In-BLM, the score distributions in the lumbar vertebra, pelvis, and skull were similar. /sup 111/In-BLM is excreted rapidly after injection, but little /sup 111/InCl/sub 3/ is excreted. Accumulation of /sup 111/In in bone marrow depends upon the amount of /sup 111/In-transferrin in blood. The high accumulation in the lumbar vertebra and pelvis shows that /sup 111/InCl/sub 3/ would be effective as a bone marrow imaging agent.

  14. Impact of foot progression angle on the distribution of plantar pressure in normal children.

    Science.gov (United States)

    Lai, Yu-Cheng; Lin, Huey-Shyan; Pan, Hui-Fen; Chang, Wei-Ning; Hsu, Chien-Jen; Renn, Jenn-Huei

    2014-02-01

    Plantar pressure distribution during walking is affected by several gait factors, especially the foot progression angle, which has been studied in children with neuromuscular diseases. However, in normal children this relationship has been reported in only a few studies. The purpose of this study was to clarify the correlation between foot progression angle and plantar pressure distribution in normal children, as well as the impact of age and sex on this correlation. This study retrospectively reviewed dynamic pedobarographic data included in the gait laboratory database of our institution. In total, 77 normally developed children aged 5-16 years who were treated between 2004 and 2009 were included. Each child's footprint was divided into 5 segments: lateral forefoot, medial forefoot, lateral midfoot, medial midfoot, and heel. The percentages of impulse exerted at the medial foot, forefoot, midfoot, and heel were calculated. The average foot progression angle was 5.03° toe-out. Most of the total impulse was exerted on the forefoot (52.0%). Toe-out gait was positively correlated with high medial (r = 0.274; P plantar pressure as part of the treatment of various foot pathologies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. An empirical multivariate log-normal distribution representing uncertainty of biokinetic parameters for 137Cs

    International Nuclear Information System (INIS)

    Miller, G.; Martz, H.; Bertelli, L.; Melo, D.

    2008-01-01

    A simplified biokinetic model for 137Cs has six parameters representing transfer of material to and from various compartments. Using a Bayesian analysis, the joint probability distribution of these six parameters is determined empirically for two cases with extensive bioassay data. The distribution is found to be multivariate log-normal, and correlations between the different parameters are obtained. The method utilises a fairly large number of pre-determined forward biokinetic calculations, whose results are stored in interpolation tables. Four different methods of sampling the multidimensional parameter space with a limited number of samples are investigated: random, stratified, Latin Hypercube sampling with a uniform distribution of parameters, and importance sampling using a log-normal distribution that approximates the posterior distribution. The importance sampling method gives much smaller sampling uncertainty. No sampling-method-dependent differences are perceptible among the uniform-distribution methods. (authors)
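
    A correlated multivariate log-normal is sampled by drawing from a multivariate normal in log-space and exponentiating, which preserves the specified log-space correlations. A minimal two-parameter sketch with illustrative values (not the fitted biokinetic parameters):

```python
import numpy as np

rng = np.random.default_rng(42)

# Log-space means and covariance (illustrative values only).
mu = np.array([0.0, -1.0])
rho, s1, s2 = 0.7, 0.5, 0.3                  # log-space correlation and std devs
cov = np.array([[s1**2,       rho * s1 * s2],
                [rho * s1 * s2, s2**2      ]])

# Draw correlated normals in log-space, then exponentiate.
z = rng.multivariate_normal(mu, cov, size=50_000)
samples = np.exp(z)                          # multivariate log-normal draws

emp_rho = np.corrcoef(np.log(samples).T)[0, 1]   # recovers rho in log-space
```

    The same construction extends directly to the six-parameter case: only the mean vector and covariance matrix grow.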

  16. On the Use of the Log-Normal Particle Size Distribution to Characterize Global Rain

    Science.gov (United States)

    Meneghini, Robert; Rincon, Rafael; Liao, Liang

    2003-01-01

    Although most parameterizations of the drop size distribution (DSD) use the gamma function, there are several advantages to the log-normal form, particularly if we want to characterize the large-scale space-time variability of the DSD and rain rate. The advantages of the distribution are twofold: the logarithm of any moment can be expressed as a linear combination of the individual parameters of the distribution, and the parameters of the distribution are approximately normally distributed. Since all radar and rainfall-related parameters can be written approximately as a moment of the DSD, the first property allows us to express the logarithm of any radar/rainfall variable as a linear combination of the individual DSD parameters. Another consequence is that any power-law relationship between rain rate, reflectivity factor, specific attenuation or water content can be expressed in terms of the covariance matrix of the DSD parameters. The joint-normal property of the DSD parameters has applications to the description of the space-time variation of rainfall, in the sense that any radar-rainfall quantity can be specified by the covariance matrix associated with the DSD parameters at two arbitrary space-time points. As such, the parameterization provides a means by which we can use the spaceborne radar-derived DSD parameters to specify in part the covariance matrices globally. However, since satellite observations have coarse temporal sampling, the specification of the temporal covariance must be derived from ancillary measurements and models. Work is presently underway to determine whether the use of instantaneous rain rate data from the TRMM Precipitation Radar can provide good estimates of the spatial correlation in rain rate from data collected in 5° x 5° x 1 month space-time boxes. 
    To characterize the temporal characteristics of the DSD parameters, disdrometer data are being used from the Wallops Flight Facility site, where as many as 4 disdrometers have been
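
    The first advantage can be checked directly: for a log-normal DSD with total number N_T and log-space parameters mu, sigma, the nth moment is M_n = N_T exp(n*mu + n^2*sigma^2/2), so ln M_n = ln N_T + n*mu + n^2*sigma^2/2 is linear in (ln N_T, mu, sigma^2). A quick numerical check of that identity (symbols and values are ours, not the paper's notation):

```python
import numpy as np
from scipy.stats import lognorm

N_T, mu, sigma = 1000.0, 0.5, 0.4       # illustrative log-normal DSD parameters

orders = np.array([1, 2, 3, 6])         # e.g. n=3 ~ water content, n=6 ~ reflectivity
# nth moments of the size distribution N(D) = N_T * (log-normal pdf in D)
log_numeric = np.log([N_T * lognorm.moment(int(n), s=sigma, scale=np.exp(mu))
                      for n in orders])
# Closed form: ln M_n = ln N_T + n*mu + n^2*sigma^2/2 -- linear in the parameters
log_closed = np.log(N_T) + orders * mu + 0.5 * orders**2 * sigma**2
```

    Because every radar/rainfall quantity is approximately a moment, its logarithm inherits this linearity in the (approximately joint-normal) DSD parameters.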

  17. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    Full Text Available BACKGROUND: The Gaussian or normal distribution is the most established model for characterizing quantitative variation in original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, as mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard way to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to consider the causes of variation. Multiplicative causes are in general far more important than additive ones, and call for a multiplicative (or log-normal) approach. Fortunately, quite similarly to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, x/ ("times-divide"), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean mean* and the multiplicative standard deviation s* in the form mean* x/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both the recognition of data distributions and the quality of interpretation. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
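
    The multiplicative summary is computed by taking mean and SD on the log scale and back-transforming: mean* = exp(mean(ln x)), s* = exp(sd(ln x)). For log-normal data the interval from mean*/(s*)^2 to mean*·(s*)^2 (the 2-sigma multiplicative range, written mean* x/ (s*)^2) then covers about 95% of values. A small sketch on simulated skewed data:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.lognormal(mean=1.0, sigma=0.5, size=100_000)   # skewed, multiplicative data

logs = np.log(x)
gm = np.exp(logs.mean())       # multiplicative (geometric) mean, mean*
s_star = np.exp(logs.std())    # multiplicative standard deviation, s*

# mean* x/ (s*)^2 : the 2-sigma multiplicative interval
lo, hi = gm / s_star**2, gm * s_star**2
coverage = np.mean((x >= lo) & (x <= hi))   # ~0.95 for log-normal data
```

    The additive summary mean ± 2 SD on the same data would extend below zero, which is exactly the "95% range check" failure the authors describe.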

  18. Drug binding affinities and potencies are best described by a log-normal distribution and use of geometric means

    International Nuclear Information System (INIS)

    Stanisic, D.; Hancock, A.A.; Kyncl, J.J.; Lin, C.T.; Bush, E.N.

    1986-01-01

    (-)-Norepinephrine (NE) is used as an internal standard in the authors' in vitro adrenergic assays, and the concentration of NE which produces half-maximal inhibition of specific radioligand binding (affinity; K/sub I/) or a half-maximal contractile response (potency; ED 50 ) has been measured numerous times. The goodness-of-fit test for normality was performed on both normal (Gaussian) and log 10 -normal frequency histograms of these data using the SAS Univariate procedure. Specific binding of 3 H-prazosin to rat liver (α 1 -), 3 H-rauwolscine to rat cortex (α 2 -) and 3 H-dihydroalprenolol to rat ventricle (β 1 -) or rat lung (β 2 -receptors) was inhibited by NE; the distributions of NE K/sub I/'s at all these sites were skewed to the right, with highly significant (p 50 's of NE in isolated rabbit aorta (α 1 ), phenoxybenzamine-treated dog saphenous vein (α 2 ) and guinea pig atrium (β 1 ). The vasorelaxant potency of atrial natriuretic hormone in histamine-contracted rabbit aorta was also better described by a log-normal distribution, indicating that log-normality is probably a general phenomenon of drug-receptor interactions. Because data of this type appear to be log-normally distributed, geometric means should be used in parametric statistical analyses.

  19. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
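
    Plain quantile normalization, the method qsmooth generalizes, forces every sample (column) to share one reference distribution: sort each column, average across columns at each rank, and map each value back to its rank-average. A minimal sketch, ignoring ties, which real implementations handle more carefully; the data values are made up:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns of X (features x samples), ignoring ties."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each entry per column
    ref = np.sort(X, axis=0).mean(axis=1)               # rank-wise mean = reference dist
    return ref[ranks]

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.5, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
# After normalization, every column contains exactly the same set of values.
```

    qsmooth relaxes exactly this step: instead of one global reference distribution, it smoothly interpolates between group-level and global references.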

  20. Statistical properties of the normalized ice particle size distribution

    Science.gov (United States)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists of scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation imply that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N*0 and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N*0 and Dm have then been evaluated in order to further reduce the number of unknowns. It has been shown that a parameterization of N*0 and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean-maximum-dimension-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than that of Kristjánsson et al. (2000).

  1. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    Science.gov (United States)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is obtained; namely, the distribution increases the tail for small (when q < 1) or large (when q > 1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q = 1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
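
    In the traditional Kapteyn picture (the q = 1 case), a variable built as a product of many independent positive random factors is log-normal, because its logarithm is a sum of iid terms to which the central limit theorem applies. A quick simulation of that baseline process (all sizes and factor ranges are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Kapteyn multiplicative process: x = f_1 * f_2 * ... * f_n, f_i > 0 iid
n_steps, n_paths = 200, 20_000
factors = rng.uniform(0.9, 1.1, size=(n_paths, n_steps))   # iid positive growth factors
x = factors.prod(axis=1)                                   # final value of each path

# log(x) is a sum of iid terms -> approximately normal, so x ~ log-normal
logs = np.log(x)
skew = np.mean(((logs - logs.mean()) / logs.std()) ** 3)   # ~0 for a normal
```

    The q-deformation replaces the ordinary product above by the q-product, which is what reshapes the tails away from the plain log-normal.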

  2. Comparing of Normal Stress Distribution in Static and Dynamic Soil-Structure Interaction Analyses

    International Nuclear Information System (INIS)

    Kholdebarin, Alireza; Massumi, Ali; Davoodi, Mohammad; Tabatabaiefar, Hamid Reza

    2008-01-01

    It is important to consider the vertical component of earthquake loading and the inertia force in soil-structure interaction analyses. In most circumstances, design engineers are primarily concerned with the behavior of foundations subjected to earthquake-induced forces transmitted from the bedrock. In this research, a single rigid foundation with designated geometrical parameters located on sandy-clay soil has been modeled in FLAC software with the finite difference method and subjected to the vertical components of three different earthquake records. In such cases, it is important to evaluate the effect of the footing on the underlying soil and to consider the normal stress in the soil with and without the footing. The distribution of normal stress under the footing in the static and dynamic states has been studied and compared. This comparison indicated that the increase in normal stress under the footing caused by the vertical component of the ground excitation decreased the dynamic vertical settlement in comparison with the static state.

  3. Elastic microfibril distribution in the cornea: Differences between normal and keratoconic stroma.

    Science.gov (United States)

    White, Tomas L; Lewis, Philip N; Young, Robert D; Kitazawa, Koji; Inatomi, Tsutomu; Kinoshita, Shigeru; Meek, Keith M

    2017-06-01

    The optical and biomechanical properties of the cornea are largely governed by the collagen-rich stroma, a layer that represents approximately 90% of the total thickness. Within the stroma, the specific arrangement of superimposed lamellae provides the tissue with tensile strength, whilst the spatial arrangement of individual collagen fibrils within the lamellae confers transparency. In keratoconus, this precise stromal arrangement is lost, resulting in ectasia and visual impairment. In the normal cornea, we previously characterised the three-dimensional arrangement of an elastic fiber network spanning the posterior stroma from limbus-to-limbus. In the peripheral cornea/limbus there are elastin-containing sheets or broad fibers, most of which become microfibril bundles (MBs) with little or no elastin component when reaching the central cornea. The purpose of the current study was to compare this network with the elastic fiber distribution in post-surgical keratoconic corneal buttons, using serial block face scanning electron microscopy and transmission electron microscopy. We have demonstrated that the MB distribution is very different in keratoconus. MBs are absent from a region of stroma anterior to Descemet's membrane, an area that is densely populated in normal cornea, whilst being concentrated below the epithelium, an area in which they are absent in normal cornea. We contend that these latter microfibrils are produced as a biomechanical response to provide additional strength to the anterior stroma in order to prevent tissue rupture at the apex of the cone. A lack of MBs anterior to Descemet's membrane in keratoconus would alter the biomechanical properties of the tissue, potentially contributing to the pathogenesis of the disease. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. Thermal modelling of normal distributed nanoparticles through thickness in an inorganic material matrix

    Science.gov (United States)

    Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.

    2017-10-01

    Nanoscale materials exhibiting superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion so as to reduce or even eliminate thermal bridges, allowing an industrially relevant process that balances the high material cost against the superior thermal insulation properties. A methodology is therefore required to measure and describe these nanoscale particle distributions within the inorganic matrix material. The distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy yields a thermal model that predicts the thermal behavior of the nanoscale particles and their distributions. The thermal model is validated by the hot-wire technique. At present, a good correlation is found between the numerical results and experimental data for nanoparticles randomly distributed in all directions.
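
    For conduction through the thickness, a slab whose local conductivity varies with depth behaves like thermal resistances in series, so the effective conductivity is the harmonic mean of the local values. A sketch (all parameter values and the linear mixing rule are illustrative assumptions, not the paper's model) comparing a uniform dispersion with a normally distributed through-thickness concentration profile at equal total loading:

```python
import numpy as np

# Slab discretized into equal layers through the thickness; the local
# conductivity drops with the local particle volume fraction phi
# (simple linear mixing rule, illustrative values).
n_layers = 1000
z = np.linspace(0.0, 1.0, n_layers)       # normalized depth
k_matrix, k_reduction = 1.0, 0.9          # matrix k = 1.0 W/mK, up to 90% local drop

def effective_k(phi):
    """Series (through-thickness) effective conductivity = harmonic mean."""
    k_local = k_matrix * (1.0 - k_reduction * phi)
    return n_layers / np.sum(1.0 / k_local)

phi_total = 0.2                           # same mean particle loading in both cases

# Case 1: uniform dispersion through the thickness
k_uniform = effective_k(np.full(n_layers, phi_total))

# Case 2: Gaussian profile centred at mid-thickness, rescaled to the same mean
profile = np.exp(-0.5 * ((z - 0.5) / 0.1) ** 2)
k_gauss = effective_k(phi_total * profile / profile.mean())
```

    Because the harmonic mean is at most the arithmetic mean, concentrating the same loading into a through-thickness band lowers the effective conductivity relative to a uniform dispersion, which is why the through-thickness distribution, not just the total loading, matters.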

  5. Distribution and elimination of intravenously administered atrial natriuretic hormone(ANH) to normal and nephrectomized rats

    International Nuclear Information System (INIS)

    Devine, E.; Artman, L.; Budzik, G.; Bush, E.; Holleman, W.

    1986-01-01

    The 24-amino-acid peptide ANH(5-28) was N-terminally labeled with I-125 Bolton-Hunter reagent, iodo-N-succinimidyl 3-(4-hydroxyphenyl)propionate. The I-125 peptide plus 1 μg/kg of the I-127 Bolton-Hunter peptide was injected into normal and nephrectomized anesthetized (Nembutal) rats. Blood samples were drawn into a cocktail developed to inhibit plasma-induced degradation. Radiolabeled peptides were analyzed by HPLC. A biphasic curve of I-125 ANH(5-28) elimination was obtained, the first phase (t 1/2 = 15 sec) representing in vivo distribution and the second phase (t 1/2 = 7-10 min) a measurement of elimination. This biphasic elimination curve was similar in normal and nephrectomized rats. The apparent volumes of distribution were 15-20 ml for the first phase and > 300 ml for the second phase. To examine the tissue distribution of the peptide, animals were sacrificed at 2 minutes and the I-125 tissue contents were quantitated. The majority of the label was located in the liver (50%), kidneys (21%) and lungs (5%). The degradative peptides appearing in the plasma and urine of normal rats were identical. No intact radiolabeled ANH(5-28) was found in the urine. In conclusion, iodinated Bolton-Hunter-labeled ANH(5-28) is rapidly removed from the circulation by the liver and, to a lesser extent, by the kidney, but the rate of elimination is not decreased by nephrectomy.
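
    Biphasic elimination of this kind is conventionally summarized by fitting a two-exponential model C(t) = A e^(-alpha*t) + B e^(-beta*t) and reporting t1/2 = ln 2/alpha and ln 2/beta for the two phases. A sketch on synthetic data with half-lives chosen to mimic the reported 15 s and ~8 min phases; all numbers are illustrative, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

def biexp(t, A, alpha, B, beta):
    """Two-compartment elimination: fast (distribution) + slow (elimination) phase."""
    return A * np.exp(-alpha * t) + B * np.exp(-beta * t)

# Synthetic "plasma activity": t in minutes, half-lives 0.25 min and 8 min
a_true = np.log(2) / 0.25
b_true = np.log(2) / 8.0
t = np.concatenate([np.linspace(0.05, 2.0, 40),    # dense early sampling (fast phase)
                    np.linspace(2.5, 30.0, 56)])
c = biexp(t, 80.0, a_true, 20.0, b_true) * (1 + 0.01 * rng.standard_normal(t.size))

popt, _ = curve_fit(biexp, t, c, p0=[50.0, 2.0, 10.0, 0.05])
t_half_fast = np.log(2) / popt[1]   # ~0.25 min (the 15 s distribution phase)
t_half_slow = np.log(2) / popt[3]   # ~8 min (the elimination phase)
```

    Dense early sampling is essential here: the fast (distribution) phase is nearly gone after a few half-lives, so sparse early points leave alpha poorly determined.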

  6. Effects of adipose tissue distribution on maximum lipid oxidation rate during exercise in normal-weight women.

    Science.gov (United States)

    Isacco, L; Thivel, D; Duclos, M; Aucouturier, J; Boisseau, N

    2014-06-01

    Fat mass localization affects lipid metabolism differently at rest and during exercise in overweight and normal-weight subjects. The aim of this study was to investigate the impact of a low vs high ratio of abdominal to lower-body fat mass (an index of adipose tissue distribution) on the exercise intensity (Lipox(max)) that elicits the maximum lipid oxidation rate in normal-weight women. Twenty-one normal-weight women (22.0 ± 0.6 years, 22.3 ± 0.1 kg.m(-2)) were separated into two groups with either a low or a high abdominal to lower-body fat mass ratio [L-A/LB (n = 11) or H-A/LB (n = 10), respectively]. Lipox(max) and the maximum lipid oxidation rate (MLOR) were determined during a submaximal incremental exercise test. Abdominal and lower-body fat mass were determined from DXA scans. The two groups did not differ in aerobic fitness, total fat mass, or total and localized fat-free mass. Lipox(max) and MLOR were significantly lower in H-A/LB vs L-A/LB women (43 ± 3% VO(2max) vs 54 ± 4% VO(2max), and 4.8 ± 0.6 mg min(-1) kg FFM(-1) vs 8.4 ± 0.9 mg min(-1) kg FFM(-1), respectively; P normal-weight women, a predominantly abdominal fat mass distribution compared with a predominantly peripheral fat mass distribution is associated with a lower capacity to maximize lipid oxidation during exercise, as evidenced by their lower Lipox(max) and MLOR. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  7. Using an APOS Framework to Understand Teachers' Responses to Questions on the Normal Distribution

    Science.gov (United States)

    Bansilal, Sarah

    2014-01-01

    This study is an exploration of teachers' engagement with concepts embedded in the normal distribution. The participants were a group of 290 in-service teachers enrolled in a teacher development program. The research instrument was an assessment task that can be described as an "unknown percentage" problem, which required the application…

  8. On the possible ''normalization'' of experimental curves of 230Th vertical distribution in abyssal oceanic sediments

    International Nuclear Information System (INIS)

    Kuznetsov, Yu.V.; Al'terman, Eh.I.; Lisitsyn, A.P. (AN SSSR, Moscow. Inst. Okeanologii)

    1981-01-01

    The possibilities of the method of normalization of experimental ionic curves with reference to the dating of abyssal sediments and the determination of their accumulation rates are studied. The method is based on the correlation between the extrema of the ionic curves and the variations of Fe, Mn, organic C, and P contents in abyssal oceanic sediments. It has been found that the method can be successfully applied to correct 230 Th vertical distribution data obtained by low-background γ-spectrometry. The method gives the most reliable results in those cases where the vertical distribution curves of the elements that concentrate 230 Th vary in parallel with one another. Normalization of the experimental ionic curves in many cases makes it possible to establish the age stratification of the sediment. [ru]

  9. One-dimensional time-dependent conduction states and temperature distribution along a normal zone during a quench

    International Nuclear Information System (INIS)

    Lopez, G.

    1991-01-01

    Quench simulation of a superconducting (s.c.) magnet requires assumptions about the evolution of the normal zone and its temperature profile. The axial evolution of the normal zone is described by the longitudinal quench velocity, while the transversal quench propagation may be described either by a transversal quench velocity or by a turn-to-turn time-delay propagation. The temperature distribution has been assumed to be adiabatic-like or cosine-like in two different computer programs. Although the two profiles are different, they give more or less the same qualitative quench results, differing only by about 8%. Unfortunately, there are no experimental data for the temperature profile along the conductor during a quench event that would allow a realistic comparison. The temperature profile has received little attention, mainly because it is not a critical parameter in quench analysis. Nonetheless, a confident quench analysis requires that the temperature distribution along the normal zone be taken into account with good approximation. In this paper, an analytical study of the temperature profile is made.
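
    The underlying one-dimensional time-dependent conduction problem can be sketched with an explicit finite-difference (FTCS) scheme: deposit Joule heating in the region occupied by the normal zone and march dT/dt = D d²T/dx² + q forward in time. All values below are illustrative and in dimensionless units, not a real conductor model:

```python
import numpy as np

# Explicit FTCS integration of dT/dt = D * d2T/dx2 + q(x),
# with Joule heating confined to a central "normal zone".
nx, L, D = 201, 1.0, 1.0
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D              # respects the stability limit dt <= dx^2 / (2D)
x = np.linspace(0.0, L, nx)

q = np.where(np.abs(x - 0.5) < 0.05, 100.0, 0.0)   # heat source in the normal zone
T = np.zeros(nx)                  # uniform initial (bath) temperature; ends held fixed

for _ in range(2000):
    lap = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
    T[1:-1] += dt * (D * lap + q[1:-1])

# T now shows a smooth peak over the heated zone, decaying along the conductor --
# the kind of longitudinal profile the adiabatic-like/cosine-like assumptions idealize.
```

    Replacing q and D with temperature-dependent material properties, and letting the heated region grow at the quench velocity, turns this skeleton into a basic quench-propagation model.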

  10. Distribution of CD163-positive cell and MHC class II-positive cell in the normal equine uveal tract.

    Science.gov (United States)

    Sano, Yuto; Matsuda, Kazuya; Okamoto, Minoru; Takehana, Kazushige; Hirayama, Kazuko; Taniyama, Hiroyuki

    2016-02-01

    Antigen-presenting cells (APCs) in the uveal tract participate in ocular immunity, including immune homeostasis and the pathogenesis of uveitis. In horses, although uveitis is the most common ocular disorder, little is known about ocular immunity, such as the distribution of APCs. In this study, we investigated the distribution of CD163-positive and MHC II-positive cells in the normal equine uveal tract using an immunofluorescence technique. Eleven eyes from 10 Thoroughbred horses aged 1 to 24 years were used. Indirect immunofluorescence was performed using primary antibodies against CD163, MHC class II (MHC II) and CD20. To identify the site of greatest distribution, positive cells were manually counted in 3 different parts of the uveal tract (ciliary body, iris and choroid), and their average numbers were compared statistically. Pleomorphic CD163- and MHC II-expressing cells were detected throughout the equine uveal tract, but no CD20-expressing cells were detected. The statistical analysis showed that the distribution of CD163- and MHC II-positive cells was concentrated in the ciliary body. These results demonstrate that the ciliary body is the largest site of their distribution in the normal equine uveal tract, and the ciliary body is considered to play an important role in uveal and/or ocular immune homeostasis. The data provided in this study will further the understanding of equine ocular immunity in the normal state and might be beneficial for understanding the mechanisms of ocular disorders such as equine uveitis.

  11. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Full Text Available Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory – the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a “thicker tailed” mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and future analyses of risk management. Traditional option pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture
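    The "thicker tailed" behavior of a two-component normal mixture can be illustrated numerically: with two zero-mean components of different variance (a high-uncertainty "exploration" state and a low-uncertainty "exploitation" state), the mixture has positive excess kurtosis, i.e. heavier tails than any single Gaussian. A minimal sketch — the weights and standard deviations below are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # With prob w draw from the high-uncertainty ("exploration") component,
    # otherwise from the low-uncertainty ("exploitation") component.
    w, s_explore, s_exploit = 0.2, 3.0, 1.0

    n = 200_000
    in_exploration = rng.random(n) < w
    returns = np.where(in_exploration,
                       rng.normal(0.0, s_explore, n),
                       rng.normal(0.0, s_exploit, n))

    # Excess kurtosis of a single normal is 0; the mixture's is positive,
    # so its tails are thicker than those of a Gaussian with the same variance.
    m = returns.mean()
    excess_kurtosis = ((returns - m) ** 4).mean() / returns.var() ** 2 - 3
    print(excess_kurtosis > 0)  # True
    ```

    Analytically, for this mixture the excess kurtosis is about 4.5, far from the Gaussian value of 0.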

  12. Optimization of b-value distribution for biexponential diffusion-weighted MR imaging of normal prostate.

    Science.gov (United States)

    Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko

    2014-05-01

    To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte-Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. Using low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm2), mid-range (650-1200 s/mm2), and high b-values (1700-2000 s/mm2). Higher noise levels resulted in less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid, and high ranges and was shown to improve the estimation quality of biexponential DWI parameters of in vivo experiments. Copyright © 2013 Wiley Periodicals, Inc.
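    The three-parameter biexponential model referred to above has the standard form S(b)/S0 = f·exp(-b·Df) + (1-f)·exp(-b·Ds). A minimal fitting sketch with a clustered b-value set of the kind described (low/mid/high); the specific b-values, tissue parameters, and noise level below are assumptions for illustration:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Standard biexponential DWI signal model: S(b)/S0 = f*exp(-b*Df) + (1-f)*exp(-b*Ds)
    def biexp(b, f, Df, Ds):
        return f * np.exp(-b * Df) + (1 - f) * np.exp(-b * Ds)

    # Hypothetical clustered b-value distribution (s/mm^2): low, mid-range, high,
    # mimicking the three clusters reported for low noise levels.
    b = np.array([0, 100, 200, 400, 650, 900, 1200, 1700, 1850, 2000], float)

    # Simulated noisy signal with assumed tissue-like parameters (mm^2/s).
    rng = np.random.default_rng(0)
    f_true, Df_true, Ds_true = 0.15, 2.5e-3, 0.4e-3
    signal = biexp(b, f_true, Df_true, Ds_true) + rng.normal(0, 0.005, b.size)

    (f, Df, Ds), _ = curve_fit(biexp, b, signal, p0=(0.2, 2e-3, 0.5e-3),
                               bounds=([0, 0, 0], [1, 1e-2, 5e-3]))
    print(round(f, 2), Df > Ds)  # recovered fast fraction; Df exceeds Ds
    ```

    In practice the Monte-Carlo optimization in the paper asks which set of b-values minimizes the variance of these fitted parameters under noise; the fit itself is the inner step sketched here.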

  13. Basic study on radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Naka, R.; Kawarabayashi, J.; Uritani, A.; Iguchi, T.; Kaneko, J.; Takeuchi, H.; Kakuta, T.

    2000-01-01

    Recently, some methods of radiation distribution sensing with optical fibers have been proposed. These methods employ scintillating fibers or scintillators with wavelength-shifting fibers. The positions of radiation interactions are detected by applying a time-of-flight (TOF) technique to the scintillation photon propagation. In the former method, the attenuation length for the scintillation photons in the scintillating fiber is relatively short, so the operating length of the sensor is limited to several meters. In the latter method, a radiation distribution can be obtained only discretely, not continuously. To overcome these shortcomings, a normal optical fiber made of polymethyl methacrylate (PMMA) is used in this study. Although the scintillation efficiency of PMMA is very low, several photons are emitted per interaction with radiation. Because the fiber is transparent to the emitted photons, a relatively long operating length can be achieved and a radiation distribution can be obtained continuously. This paper describes the principle of the position sensing method based on the time-of-flight technique and preliminary results obtained for 90Sr-90Y beta rays, 137Cs gamma rays, and 14 MeV neutrons. The spatial resolutions for the above three kinds of radiation are 0.30 m, 0.37 m and 0.13 m, and the detection efficiencies are 1.1 x 10^-3, 1.6 x 10^-7 and 5.4 x 10^-6, respectively, for a 10 m operating length. The results of a spectroscopic study on the optical properties of the fiber are also described. (author)
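    The position reconstruction underlying such TOF methods can be sketched as follows, assuming (for illustration — the paper's exact readout scheme may differ) that the scintillation photons from one interaction are detected at both fiber ends and the arrival-time difference localizes the event:

    ```python
    # Hypothetical time-of-flight position reconstruction for a fiber sensor.
    C = 3.0e8                # speed of light in vacuum, m/s
    N_CORE = 1.49            # assumed refractive index of a PMMA core
    V = C / N_CORE           # photon propagation speed along the fiber, m/s

    def position_from_tof(t1, t2, length):
        """Interaction position (m from end 1), given arrival times t1, t2 (s)
        at ends 1 and 2 of a fiber of the given length (m)."""
        return 0.5 * (length - V * (t2 - t1))

    # Example: 10 m fiber, interaction 3 m from end 1.
    L, x_true = 10.0, 3.0
    t1, t2 = x_true / V, (L - x_true) / V
    print(round(position_from_tof(t1, t2, L), 6))  # 3.0
    ```

    The ~0.1-0.4 m spatial resolutions quoted above then correspond to timing resolutions of order 1-2 ns at this propagation speed.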

  14. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 in osteoarthritic (n=9) and macroscopically normal (n=5) human articular cartilage, collected from 12 pre-selected areas of the femoral head, to discover a potential role for YKL-40 in cartilage remodelling in osteoarthritis. Immunohistochemical analysis showed that YKL-40 staining was found...... in chondrocytes of osteoarthritic cartilage mainly in the superficial and middle zone of the cartilage rather than the deep zone. There was a tendency for high number of YKL-40 positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive...

  15. Validation of MCDS by comparison of predicted with experimental velocity distribution functions in rarefied normal shocks

    Science.gov (United States)

    Pham-Van-diep, Gerald C.; Erwin, Daniel A.

    1989-01-01

    Velocity distribution functions in normal shock waves in argon and helium are calculated using Monte Carlo direct simulation. These are compared with experimental results for argon at M = 7.18 and for helium at M = 1.59 and 20. For both argon and helium, the variable-hard-sphere (VHS) model is used for the elastic scattering cross section, with the velocity dependence derived from a viscosity-temperature power-law relationship in the way normally used by Bird (1976).

  16. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    International Nuclear Information System (INIS)

    Goodarzi, Samereh; Pazirandeh, Ali; Jameie, Seyed Behnamedin; Baghban Khojasteh, Nasrin

    2012-01-01

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g Boric Acid+0.005 g Borax+10 ml distilled water, pH: 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of male and female animal groups, with the highest value four hours after boron compound injection. - Highlights: ► Boron distribution in male and female rats' normal brain was studied in this research. ► Coronal sections of animal tissue samples were irradiated with thermal neutrons. ► Alpha and Lithium tracks were counted using alpha autoradiography. ► Different boron concentration was seen in brain sections of male and female rats. ► The highest boron concentration was seen 4 h after boron compound injection.

  17. Analysis of a hundred-years series of magnetic activity indices. III. Is the frequency distribution logarithmo-normal

    International Nuclear Information System (INIS)

    Mayaud, P.N.

    1976-01-01

    Because of the various components of positive conservation existing in the series of aa indices, their frequency distribution is necessarily distorted with respect to any random distribution. However, when one takes these various components into account, the observed distribution can be considered to be a logarithmo-normal (log-normal) distribution. This implies that geomagnetic activity satisfies the conditions of the central limit theorem, according to which a phenomenon presenting such a distribution is due to independent causes whose effects are multiplicative. Furthermore, the distortion of the frequency distribution caused by the 11-year and 90-year cycles corresponds to a pure attenuation effect; an interpretation in terms of solar 'coronal holes' is proposed [fr]

  18. Distribution pattern of urine albumin creatinine ratio and the prevalence of high-normal levels in untreated asymptomatic non-diabetic hypertensive patients.

    Science.gov (United States)

    Ohmaru, Natsuki; Nakatsu, Takaaki; Izumi, Reishi; Mashima, Keiichi; Toki, Misako; Kobayashi, Asako; Ogawa, Hiroko; Hirohata, Satoshi; Ikeda, Satoru; Kusachi, Shozo

    2011-01-01

    Even high-normal albuminuria is reportedly associated with cardiovascular events. We determined the urine albumin creatinine ratio (UACR) in spot urine samples and analyzed the UACR distribution and the prevalence of high-normal levels. The UACR was determined using immunoturbidimetry in 332 untreated asymptomatic non-diabetic Japanese patients with hypertension and in 69 control subjects. Microalbuminuria and macroalbuminuria were defined as a UACR ≥30 µg/mg·creatinine and a UACR ≥300 µg/mg·creatinine, respectively. The distribution patterns showed a highly skewed distribution for the lower levels, and a common logarithmic transformation produced a close fit to a Gaussian distribution with median, 25th and 75th percentile values of 22.6, 13.5 and 48.2 µg/mg·creatinine, respectively. When a high-normal UACR was set at >20 to <30 µg/mg·creatinine, 19.9% (66/332) of the hypertensive patients exhibited a high-normal UACR. Microalbuminuria and macroalbuminuria were observed in 36.1% (120/332) and 2.1% (7/332) of the patients, respectively. The UACR was significantly correlated with the systolic and diastolic blood pressures and the pulse pressure. A stepwise multivariate analysis revealed that these pressures as well as age were independent factors that increased the UACR. The UACR distribution exhibited a highly skewed pattern, with approximately 60% of untreated, non-diabetic hypertensive patients exhibiting a high-normal or larger UACR. Both hypertension and age are independent risk factors that increase the UACR. The present study indicated that a considerable percentage of patients require anti-hypertensive drugs with antiproteinuric effects at the start of treatment.

  19. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Smith, Donald L.; Capote, Roberto

    2013-01-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue
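    The exact transformation discussed above is a standard result for the bivariate lognormal: if (X1, X2) is bivariate normal with correlation ρ and log-scale standard deviations σ1, σ2, then the correlation of (exp(X1), exp(X2)) is (exp(ρσ1σ2) − 1) / √[(exp(σ1²) − 1)(exp(σ2²) − 1)]. A short sketch showing the forbidden strong anti-correlations (the σ value below is an illustrative assumption):

    ```python
    import math

    def lognormal_corr(rho, s1, s2):
        """Correlation of (exp(X1), exp(X2)) when (X1, X2) is bivariate normal
        with correlation rho and standard deviations s1, s2 of the logs."""
        return (math.exp(rho * s1 * s2) - 1.0) / math.sqrt(
            math.expm1(s1 ** 2) * math.expm1(s2 ** 2))

    # With large log-scale spreads (i.e. large relative uncertainties), even a
    # perfectly anti-correlated normal pair (rho = -1) maps to a lognormal
    # correlation only slightly below zero, never near -1:
    s = 1.5
    print(round(lognormal_corr(-1.0, s, s), 3))  # -0.105
    print(round(lognormal_corr(+1.0, s, s), 3))  # 1.0 (full positive correlation survives)
    ```

    This asymmetry — +1 is attainable but −1 is not — is exactly the mathematically forbidden anti-correlation regime the paper describes.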

  20. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)

    2013-11-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.

  1. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    Science.gov (United States)

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distribution is usually skewed severely by the presence of hot spots in contaminated sites. This causes difficulties for accurate geostatistical data transformation. Three types of typical normal distribution transformation methods termed the normal score, Johnson, and Box-Cox transformations were applied to compare the effects of spatial interpolation with normal distribution transformation data of benzo(b)fluoranthene in a large-scale coking plant-contaminated site in north China. Three normal transformation methods decreased the skewness and kurtosis of the benzo(b)fluoranthene, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging has a minimum root-mean-square error of 1.17 and a mean error of 0.19, which was more accurate than the other two models. The area with fewer sampling points and that with high levels of contamination showed the largest prediction standard errors based on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy for determination of remediation boundaries.
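    One of the three transformations compared above, Box-Cox, is readily available in scipy; a minimal sketch on synthetic right-skewed "hot spot" data (the lognormal sample below is an illustrative stand-in for the benzo(b)fluoranthene concentrations, not the study's data):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical severely right-skewed contaminant concentrations (hot spots).
    conc = rng.lognormal(mean=0.0, sigma=1.2, size=500)

    # Box-Cox transformation: y = (x**lmbda - 1)/lmbda (log for lmbda = 0);
    # scipy estimates lmbda by maximum likelihood.
    transformed, lmbda = stats.boxcox(conc)

    print(stats.skew(conc) > stats.skew(transformed))  # True: skewness is reduced
    # A Kolmogorov-Smirnov test on the standardized transformed data:
    z = (transformed - transformed.mean()) / transformed.std(ddof=1)
    print(stats.kstest(z, "norm").pvalue > 0.05)
    ```

    The normal-score and Johnson transformations used in the paper play the same role — mapping skewed data to approximate normality before kriging — and the back-transformed kriging predictions are then compared by cross validation.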

  2. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform-distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
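    The standard construction for such pairs (not necessarily the report's exact algorithm) builds the second variate from two independent standard normals, y = ρ·z1 + √(1−ρ²)·z2, then shifts and scales by the requested means and standard deviations:

    ```python
    import math
    import random

    random.seed(7)

    def bivariate_normal_pair(mu1, mu2, s1, s2, rho):
        """One (x, y) draw from a bivariate normal with the given means,
        standard deviations, and correlation coefficient rho."""
        z1 = random.gauss(0.0, 1.0)
        z2 = random.gauss(0.0, 1.0)
        x = mu1 + s1 * z1
        y = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
        return x, y

    # Sanity check: the empirical correlation approaches the requested rho.
    n, rho = 100_000, 0.8
    xs, ys = zip(*(bivariate_normal_pair(1.0, -2.0, 2.0, 0.5, rho) for _ in range(n)))
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    vx = sum((a - mx) ** 2 for a in xs) / n
    vy = sum((b - my) ** 2 for b in ys) / n
    print(abs(cov / math.sqrt(vx * vy) - rho) < 0.01)  # True
    ```

    This construction is exact in theory, so — as the abstract notes — any residual error comes from the underlying uniform generator and floating-point function evaluation.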

  3. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests (p < 0.05), and SAN normalization performed better than normalization using other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
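    The core idea — matching stratum-wise means and standard deviations to a reference under a population-structure adjustment — can be sketched as follows. This is an illustrative reconstruction in the spirit of SAN, not the authors' implementation; the function name, strata, and reference values are all assumptions:

    ```python
    import numpy as np

    def subgroup_normalize(values, strata, ref_stats):
        """Rescale lab values so that, within each stratum (e.g. age group x
        gender), the mean and SD match those of a reference site.
        ref_stats maps stratum -> (reference mean, reference SD)."""
        values = np.asarray(values, float)
        strata = np.asarray(strata)
        out = np.empty_like(values)
        for s in np.unique(strata):
            idx = strata == s
            m, sd = values[idx].mean(), values[idx].std(ddof=1)
            ref_m, ref_sd = ref_stats[s]
            out[idx] = (values[idx] - m) / sd * ref_sd + ref_m
        return out

    rng = np.random.default_rng(3)
    site = rng.normal(15, 4, 200)                      # e.g. one site's BUN values
    strata = np.where(np.arange(200) < 100, "F", "M")  # toy gender strata
    ref = {"F": (12.0, 3.0), "M": (14.0, 3.5)}
    norm = subgroup_normalize(site, strata, ref)
    print(round(norm[:100].mean(), 1), round(norm[100:].mean(), 1))  # 12.0 14.0
    ```

    After the transform, each stratum's mean and SD equal the reference exactly, which is the "identical under population structure-adjusted conditions" property stated in the abstract.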

  4. Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities

    International Nuclear Information System (INIS)

    Waite, D.A.; Denham, D.H.

    1975-01-01

    The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations of nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with environmental surveillance data interpretation using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined to the extent of available data in terms of potential contaminants and facility general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. Appropriate consideration is given these
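    Log-normal analysis of surveillance data of the kind described typically summarizes measurements by a geometric mean and geometric standard deviation, computed on the logarithms. A minimal sketch with hypothetical concentration measurements (the sample values are invented for illustration):

    ```python
    import math

    # Hypothetical environmental surveillance measurements (e.g. a contaminant
    # concentration); log-normal analysis works on the logs of the data.
    samples = [1.2, 0.8, 2.5, 4.1, 0.6, 1.9, 3.3, 0.9, 1.4, 2.2]

    logs = [math.log(x) for x in samples]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / (len(logs) - 1)

    gm = math.exp(mu)                # geometric mean
    gsd = math.exp(math.sqrt(var))   # geometric standard deviation (dimensionless)

    # Under log-normality, ~95% of values fall within gm/gsd**2 .. gm*gsd**2.
    lo, hi = gm / gsd ** 2, gm * gsd ** 2
    print(round(gm, 2), round(gsd, 2))
    ```

    Plotting the ordered data against normal quantiles of their log values (a log-probability plot) gives the straight-line fit traditionally used to check log-normality in such programmes.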

  5. Normal Approximations to the Distributions of the Wilcoxon Statistics: Accurate to What "N"? Graphical Insights

    Science.gov (United States)

    Bellera, Carine A.; Julien, Marilyse; Hanley, James A.

    2010-01-01

    The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…

  6. American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution

    DEFF Research Database (Denmark)

    Stentoft, Lars Peter

    In this paper we propose a feasible way to price American options in a model with time varying volatility and conditional skewness and leptokurtosis using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk neutral dynamics can be obtained in this model, we interpret...... properties shows that there are important option pricing differences compared to the Gaussian case as well as to the symmetric special case. A large scale empirical examination shows that our model outperforms the Gaussian case for pricing options on three large US stocks as well as a major index...

  7. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL...... in chondrocytes of osteoarthritic cartilage mainly in the superficial and middle zone of the cartilage rather than the deep zone. There was a tendency for high number of YKL-40 positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive...

  8. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    Energy Technology Data Exchange (ETDEWEB)

    Goodarzi, Samereh, E-mail: samere.g@gmail.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Pazirandeh, Ali, E-mail: paziran@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Jameie, Seyed Behnamedin, E-mail: behnamjameie@tums.ac.ir [Basic Science Department, Faculty of Allied Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Anatomy, Faculty of Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Baghban Khojasteh, Nasrin, E-mail: khojasteh_n@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of)

    2012-06-15

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g Boric Acid+0.005 g Borax+10 ml distilled water, pH: 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of male and female animal groups, with the highest value four hours after boron compound injection. - Highlights: ► Boron distribution in male and female rats' normal brain was studied in this research. ► Coronal sections of animal tissue samples were irradiated with thermal neutrons. ► Alpha and Lithium tracks were counted using alpha autoradiography. ► Different boron concentration was seen in brain sections of male and female rats. ► The highest boron concentration was seen 4 h after boron compound injection.

  9. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    A graphical method requires only a few calculations with simple equations and a table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.

  10. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering

    Science.gov (United States)

    Gamayunov, K. V.; Khazanov, G. V.

    2006-01-01

    We calculate the pitch-angle diffusion coefficients using the typical wave normal distributions obtained from our self-consistent ring current-EMIC wave model, and try to quantify the effect of EMIC wave normal angle characteristics on relativistic electron scattering.

  11. Skewed Normal Distribution Of Return Assets In Call European Option Pricing

    Directory of Open Access Journals (Sweden)

    Evy Sulistianingsih

    2011-12-01

    Full Text Available An option is a security derivative. In a financial market, an option is a contract that gives its owner the right (not the obligation) to buy or sell a particular asset for a certain price at a certain time. An option can provide protection against a risk faced in a market. This paper studies the use of the Skewed Normal Distribution (SN) in European call option pricing. The SN provides a flexible framework that captures the skewness of the log return. We obtain a closed form solution for the European call option price when the log return follows the SN. Then, we compare the option prices obtained by the SN and the Black-Scholes model with market option prices. Keywords: skewed normal distribution, log return, options.
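    The effect of log-return skewness on a call price can be illustrated by Monte Carlo (this is a sketch, not the paper's closed-form solution or risk-neutral calibration; all parameter values are assumptions): draw log returns from a skew-normal law, then from a symmetric normal with the same mean and variance, and compare the average call payoffs.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    S0, K, n = 100.0, 100.0, 500_000  # spot, strike, number of draws (assumed)

    # Skew-normal log return over the option's life: shape a < 0 gives left skew.
    a, loc, scale = -4.0, 0.05, 0.25
    r_skew = stats.skewnorm.rvs(a, loc, scale, size=n, random_state=rng)

    # Symmetric normal log return with the same mean and variance.
    m, v = stats.skewnorm.stats(a, loc, scale, moments="mv")
    r_norm = rng.normal(m, np.sqrt(v), size=n)

    # Average (undiscounted) call payoffs under the two return laws.
    call_skew = np.maximum(S0 * np.exp(r_skew) - K, 0.0).mean()
    call_norm = np.maximum(S0 * np.exp(r_norm) - K, 0.0).mean()
    print(call_skew < call_norm)  # True: the thinner right tail lowers the call value
    ```

    Matching only the first two moments, as Black-Scholes implicitly does, therefore misprices the option whenever the log return is skewed — the gap the SN model is designed to close.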

  12. Reply to: Are There More Gifted People than Would Be Expected on a Normal Distribution?

    Science.gov (United States)

    Gallagher, James J.

    2014-01-01

    The author responds to the article by Warne, Godwin, and Smith (2013) on the question of whether there are more gifted people than would be expected in a Gaussian normal distribution. He asserts that the answer to this question is yes, based on (a) data that he and his colleagues have collected, (b) data that are already available and quoted by…

  13. Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles

    Science.gov (United States)

    Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya

    2018-03-01

    A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.
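    The single-body multiplicative baseline that the paper contrasts its result against is easy to demonstrate: the log of a product of i.i.d. random factors is a sum of i.i.d. terms, so by the central limit theorem the quantity itself becomes log-normal. A minimal simulation (the factor distribution and sizes are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Each particle's quantity is a product of i.i.d. random factors, so its
    # log is a sum of i.i.d. terms and converges to a normal by the CLT.
    n_particles, n_steps = 50_000, 200
    factors = rng.uniform(0.5, 1.5, size=(n_particles, n_steps))
    q = factors.prod(axis=1)

    logs = np.log(q)
    # The standardized third moment (skewness) of log q is near 0, i.e. the
    # distribution of q is close to log-normal.
    z = (logs - logs.mean()) / logs.std()
    print(abs((z ** 3).mean()) < 0.1)  # True
    ```

    The paper's point is that with pairwise multiplicative *interactions* and a weight parameter, a log-normal-type tail still emerges for negative parameter values, which is not obvious from this single-body argument.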

  14. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete–continuous mixtures of generalized logit–normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over......
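    A generalized logit transform of the kind used for double-bounded wind power (normalized to (0, 1)) maps the bounded variable to the real line, where a normal law then induces a generalized logit–normal distribution on the original scale. A minimal sketch, with the transform written as y = ln(x^ν / (1 − x^ν)) and ν a shape parameter (the specific values below are assumptions):

    ```python
    import math

    def glogit(x, nu):
        """Generalized logit transform of x in (0, 1) with shape nu > 0."""
        return math.log(x ** nu / (1.0 - x ** nu))

    def glogit_inv(y, nu):
        """Inverse transform: back from the real line to (0, 1)."""
        p = 1.0 / (1.0 + math.exp(-y))   # standard logistic function
        return p ** (1.0 / nu)

    # Round trip: forecasting is done on the transformed (unbounded) scale,
    # then mapped back to the bounded power scale.
    x, nu = 0.73, 1.6
    print(round(glogit_inv(glogit(x, nu), nu), 10))  # 0.73
    ```

    The probability masses at the bounds mentioned in the abstract handle the cases x = 0 and x = 1 exactly, where the continuous transform is undefined.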

  15. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    Science.gov (United States)

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
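    The rank-based inverse normal transformation mentioned above replaces each trait value with the normal quantile of its (offset) rank; a minimal sketch using the common Blom offset of 3/8 (an assumption — the study does not state which offset it used):

    ```python
    import numpy as np
    from scipy import stats

    def rank_inverse_normal(x, c=3.0 / 8.0):
        """Rank-based inverse normal transform with offset c (Blom: c = 3/8):
        each value is mapped to norm.ppf((rank - c) / (n - 2c + 1))."""
        x = np.asarray(x, float)
        ranks = stats.rankdata(x)
        return stats.norm.ppf((ranks - c) / (len(x) - 2.0 * c + 1.0))

    rng = np.random.default_rng(11)
    gamma_trait = rng.gamma(shape=1.0, scale=2.0, size=2_000)  # right-skewed trait

    transformed = rank_inverse_normal(gamma_trait)
    print(stats.skew(gamma_trait) > 1)            # True: heavily right-skewed before
    print(abs(stats.skew(transformed)) < 0.01)    # True: essentially symmetric after
    ```

    Because the transformed values depend only on ranks, the result is the same set of near-normal quantiles regardless of how skewed the original trait was — which is why this transformation tames the rare-variant type I error inflation seen for the raw gamma trait.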

  16. Preparation, distribution, stability and tumor imaging properties of [62Zn] Bleomycin complex in normal and tumor-bearing mice

    International Nuclear Information System (INIS)

    Jalilian, A.R.; Fateh, B.; Ghergherehchi, M.; Karimian, A.; Matloobi, M.; Moradkhani, S.; Kamalidehghan, M.; Tabeie, F.

    2003-01-01

    Background: Bleomycin (BLM) has been labeled with radioisotopes and widely used in therapy and diagnosis. In this study, BLM was labeled with [62Zn] zinc chloride for oncologic PET studies. Materials and methods: The complex was obtained at pH 2 in normal saline at 90 °C in 60 min. Radio-TLC showed an overall radiochemical yield of 95-97% (radiochemical purity > 97%). Stability of the complex was checked in vitro in mice and in human plasma/urine. Results: Preliminary in vitro studies were performed to determine the stability of the complex, and the distribution of [62Zn]BLM in normal and fibrosarcoma tumor-bearing mice was assessed by bio-distribution/imaging studies. Conclusion: [62Zn]BLM can be used in PET oncology studies owing to its suitable physico-chemical properties and its behavior as a diagnostic complex in higher animals.

  17. Exact, time-independent estimation of clone size distributions in normal and mutated cells.

    Science.gov (United States)

    Roshan, A; Jones, P H; Greenman, C D

    2014-10-06

    Biological tools such as genetic lineage tracing, three-dimensional confocal microscopy and next-generation DNA sequencing are providing new ways to quantify the distribution of clones of normal and mutated cells. Understanding population-wide clone size distributions in vivo is complicated by multiple cell types within observed tissues, and overlapping birth and death processes. This has led to the increased need for mathematically informed models to understand their biological significance. Standard approaches usually require knowledge of clonal age. We show that modelling on clone size independent of time is an alternative method that offers certain analytical advantages; it can help parametrize these models, and obtain distributions for counts of mutated or proliferating cells, for example. When applied to a general birth-death process common in epithelial progenitors, this takes the form of a gambler's ruin problem, the solution of which relates to counting Motzkin lattice paths. Applying this approach to mutational processes, alternative, exact, formulations of classic Luria-Delbrück-type problems emerge. This approach can be extended beyond neutral models of mutant clonal evolution. Applications of these approaches are twofold. First, we resolve the probability of progenitor cells generating proliferating or differentiating progeny in clonal lineage tracing experiments in vivo or cell culture assays where clone age is not known. Second, we model mutation frequency distributions that deep sequencing of subclonal samples produce.
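
    The general birth-death framing of this abstract can be illustrated with a minimal Monte Carlo sketch (a critical birth-death process with made-up rates, not the paper's exact model or its exact analytical solution):

```python
import random

def clone_size_at(t_max, birth=1.0, death=1.0, seed=None):
    """Size at time t_max of one clone evolving under a continuous-time
    birth-death process, started from a single progenitor cell."""
    rng = random.Random(seed)
    n, t = 1, 0.0
    while n > 0:
        t += rng.expovariate(n * (birth + death))  # time to next event
        if t >= t_max:
            break
        # Each event is a birth with probability birth / (birth + death).
        n += 1 if rng.random() < birth / (birth + death) else -1
    return n

# For a critical process (birth == death), many clones go extinct while
# surviving clones grow, so the mean size of surviving clones exceeds 1.
sizes = [clone_size_at(2.0, seed=i) for i in range(2000)]
surviving = [s for s in sizes if s > 0]
print(len(surviving) / len(sizes), sum(surviving) / len(surviving))
```

    The paper's point is that the distribution of such sizes can also be characterized exactly and independently of clone age; the simulation above only shows the process being modeled.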

  18. Comparative pharmacokinetic and tissue distribution profiles of four major bioactive components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Wang, Chang-Hong; Tao, Yan-Yan; Zhou, Hua; Liu, Cheng-Hai

    2015-10-10

    Fuzheng Huayu recipe (FZHY) is a herbal product for the treatment of liver fibrosis approved by the Chinese State Food and Drug Administration (SFDA), but its pharmacokinetics and tissue distribution had not been investigated. In this study, the liver fibrotic model was induced with intraperitoneal injection of dimethylnitrosamine (DMN), and FZHY was given orally to the model and normal rats. The plasma pharmacokinetics and tissue distribution profiles of four major bioactive components from FZHY were analyzed in the normal and fibrotic rat groups using an ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. Results revealed that the bioavailabilities of danshensu (DSS), salvianolic acid B (SAB) and rosmarinic acid (ROS) in liver fibrotic rats increased 1.49-, 3.31- and 2.37-fold, respectively, compared to normal rats. There was no obvious difference in the pharmacokinetics of amygdalin (AMY) between the normal and fibrotic rats. The tissue distribution of DSS, SAB, and AMY tended to be greatest in the kidney and lung. The distribution of DSS, SAB, and AMY in liver tissue of the model rats was significantly decreased compared to the normal rats. Significant differences in the pharmacokinetics and tissue distribution profiles of DSS, ROS, SAB and AMY were observed in rats with hepatic fibrosis after oral administration of FZHY. These results provide a meaningful basis for developing a clinical dosage regimen in the treatment of hepatic fibrosis by FZHY. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  20. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    CERN Document Server

    Smolyar, V A; Eremin, V V

    2002-01-01

    Within a diffusion model of the kinetic equation for a beam of electrons incident normally on a target, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for a plane directed electron source in an infinite medium of C, Al, Sn and Pb are in good agreement with the data of Spencer, which were derived from an accurate solution of the Bethe equation, the equation that also serves as the starting point of the diffusion model.

  1. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    International Nuclear Information System (INIS)

    Smolyar, V.A.; Eremin, A.V.; Eremin, V.V.

    2002-01-01

    Within a diffusion model of the kinetic equation for a beam of electrons incident normally on a target, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for a plane directed electron source in an infinite medium of C, Al, Sn and Pb are in good agreement with the data of Spencer, which were derived from an accurate solution of the Bethe equation, the equation that also serves as the starting point of the diffusion model.

  2. The Weight of Euro Coins: Its Distribution Might Not Be as Normal as You Would Expect

    Science.gov (United States)

    Shkedy, Ziv; Aerts, Marc; Callaert, Herman

    2006-01-01

    Classical regression models, ANOVA models and linear mixed models are just three examples (out of many) in which the normal distribution of the response is an essential assumption of the model. In this paper we use a dataset of 2000 euro coins containing information (up to the milligram) about the weight of each coin, to illustrate that the…

  3. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing it within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
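
    The conditional-probability machinery described here rests on the fact that, for a bivariate normal pair, the conditional distribution of one variable given the other is itself normal. A minimal sketch with made-up parameters (not the borehole data of the paper), `scipy` assumed:

```python
from scipy.stats import norm

def conditional_tail_prob(x, y_threshold, mu_x, mu_y, sd_x, sd_y, rho):
    """P(Y > y_threshold | X = x) under a bivariate normal (X, Y)."""
    cond_mean = mu_y + rho * (sd_y / sd_x) * (x - mu_x)   # E[Y | X = x]
    cond_sd = sd_y * (1.0 - rho**2) ** 0.5                # SD[Y | X = x]
    return norm.sf(y_threshold, loc=cond_mean, scale=cond_sd)

# E.g. probability that flow percentage exceeds 50 given a depth of 30 m,
# for hypothetical depth/flow means, spreads, and correlation.
p = conditional_tail_prob(x=30, y_threshold=50, mu_x=40, mu_y=45,
                          sd_x=15, sd_y=20, rho=-0.6)
print(round(p, 3))
```

    The paper's tail probabilities at various depths are conditional probabilities of exactly this form, evaluated for each fitted parameter pair.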

  4. Rough Sets and Stomped Normal Distribution for Simultaneous Segmentation and Bias Field Correction in Brain MR Images.

    Science.gov (United States)

    Banerjee, Abhirup; Maji, Pradipta

    2015-12-01

    The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis techniques, particularly due to the presence of the intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It integrates judiciously the concept of rough sets and the merit of a novel probability distribution, called the stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by an SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of the brain MR image is modeled as a mixture of a finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.

  5. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  6. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
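
    The properties of the standard normal distribution summarized in this article can be reproduced in a few lines (`scipy` assumed available):

```python
from scipy.stats import norm

# Probability mass within 1, 2 and 3 standard deviations of the mean
# of a standard normal distribution (the 68-95-99.7 rule).
for k in (1, 2, 3):
    print(f"within {k} SD: {norm.cdf(k) - norm.cdf(-k):.4f}")

# The z critical value behind a 95% confidence interval for a mean:
z95 = norm.ppf(0.975)
print(round(z95, 2))  # 1.96
```

    The same `norm.ppf` call supplies the critical values used when constructing the confidence intervals discussed later in the series.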

  7. Log-normal distribution of 222 Rn in the state of Zacatecas, Mexico

    International Nuclear Information System (INIS)

    Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    In this work, the evaluation of the concentration of 222 Rn in air for Zacatecas is shown. Solid State Nuclear Track Detectors (cellulose nitrate LR-115, type 2, in open 222 Rn chambers) were used as the technique for large-scale measurements. The measurements were carried out over three months at different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors of residential constructions, indoors of occupational constructions, and in the 57 municipal seats of the state of Zacatecas. The statistics of the concentration values showed variation with the time of year, with higher values obtained in the winter season in both cases. The distribution of the 222 Rn concentration is presented on the state map for each of the municipalities, representing the measurement places in the entire state of Zacatecas. Finally, the places where the 222 Rn concentrations in air approach the EPA reference level of 148 Bq/m 3 are presented. (Author)
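
    The arithmetic-mean/geometric-mean distinction drawn in this abstract is a defining feature of log-normal data. A small synthetic sketch (assumed geometric mean and geometric standard deviation, not the Zacatecas survey values):

```python
import numpy as np

# Synthetic indoor-radon-like concentrations (Bq/m^3), log-normal with an
# assumed geometric mean of 30 and geometric standard deviation of 2.5.
rng = np.random.default_rng(42)
gm_true, gsd_true = 30.0, 2.5
conc = rng.lognormal(mean=np.log(gm_true), sigma=np.log(gsd_true), size=10_000)

arith_mean = conc.mean()
geo_mean = np.exp(np.log(conc).mean())

# For a log-normal, AM = GM * exp(sigma^2 / 2), so AM > GM.
print(geo_mean < arith_mean)   # True
print(np.mean(conc > 148))     # fraction above the 148 Bq/m^3 reference level
```

    This is why radon surveys report both means: the geometric mean characterizes the typical dwelling, while the arithmetic mean is pulled upward by the long right tail.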

  8. Quantum arrival times and operator normalization

    International Nuclear Information System (INIS)

    Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.

    2003-01-01

    A recent approach to arrival times used the fluorescence of an atom entering a laser illuminated region, and the resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal, neither in limiting cases nor after compensation of reflection losses by normalization on the level of expectation values. In this paper we employ a normalization on the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and connected to Kijowski's distribution through operator normalization

  9. Topographical Distribution of Arsenic, Manganese, and Selenium in the Normal Human Brain

    DEFF Research Database (Denmark)

    Larsen, Niels Agersnap; Pakkenberg, H.; Damsgaard, Else

    1979-01-01

    The concentrations of arsenic, manganese and selenium per gram wet tissue weight were determined in samples from 24 areas of normal human brains from 5 persons ranging from 15 to 81 years of age. The concentrations of the 3 elements were determined for each sample by means of neutron activation analysis with radiochemical separation. Distinct patterns of distribution were shown for each of the 3 elements. Variations between individuals were found for some but not all brain areas, resulting in coefficients of variation between individuals of about 30% for arsenic, 10% for manganese and 20% for selenium. The results seem to indicate that arsenic is associated with the lipid phase, manganese with the dry matter and selenium with the aqueous phase of brain tissue.

  10. Effects of a primordial magnetic field with log-normal distribution on the cosmic microwave background

    International Nuclear Information System (INIS)

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Takahashi, Keitaro

    2011-01-01

    We study the effect of primordial magnetic fields (PMFs) on the anisotropies of the cosmic microwave background (CMB). We assume the spectrum of PMFs is described by a log-normal distribution, which has a characteristic scale, rather than by a power-law spectrum. This scale is expected to reflect the generation mechanisms, and our analysis is complementary to previous studies with power-law spectra. We calculate power spectra of the energy density and Lorentz force of the log-normal PMFs, and then calculate CMB temperature and polarization angular power spectra from scalar, vector, and tensor modes of perturbations generated from such PMFs. By comparing these spectra with WMAP7, QUaD, CBI, Boomerang, and ACBAR data sets, we find that the current CMB data set places the strongest constraint at k ≅ 10^-2.5 Mpc^-1, with the upper limit B ≲ 3 nG.

  11. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  12. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus

    2017-01-01

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  13. Pharmacokinetics and tissue distribution of five active ingredients of Eucommiae cortex in normal and ovariectomized mice by UHPLC-MS/MS.

    Science.gov (United States)

    An, Jing; Hu, Fangdi; Wang, Changhong; Zhang, Zijia; Yang, Li; Wang, Zhengtao

    2016-09-01

    1. Pinoresinol di-O-β-d-glucopyranoside (PDG), geniposide (GE), geniposidic acid (GA), aucubin (AN) and chlorogenic acid (CA) are the representative active ingredients in Eucommiae cortex (EC), which may be estrogenic. 2. The ultra high-performance liquid chromatography/tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous determination of the five ingredients showed good linearity, low limits of quantification and high extraction recoveries, as well as acceptable precision, accuracy and stability in mice plasma and tissue samples (liver, spleen, kidney and uterus). It was successfully applied to the comparative study on pharmacokinetics and tissue distribution of PDG, GE, GA, AN and CA between normal and ovariectomized (OVX) mice. 3. The results indicated that, except for CA, the plasma and tissue concentrations of PDG, GE and GA in OVX mice were all greater than those in normal mice. AN could only be detected in the plasma and liver homogenate of normal mice; it was poorly absorbed in OVX mice and low in the other measured tissues. PDG, GE and GA appear to be better absorbed in OVX mice than in normal mice, as shown by the remarkably increased values of AUC0-∞ and Cmax. It is beneficial that PDG, GE and GA have better plasma absorption and tissue distribution in the pathological state.

  14. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unidentified and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show the dependence of shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
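
    The Monte Carlo approximation of critical values described in this abstract can be sketched for the simpler normal case (Anderson-Darling with moment-estimated parameters; this is a stand-in for the paper's GNO/L-moment setup, which would follow the same pattern with a different parent distribution and estimator):

```python
import numpy as np
from scipy.stats import norm

def anderson_darling(sample, cdf):
    """Anderson-Darling statistic of `sample` against a fitted CDF callable."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    u = np.clip(cdf(x), 1e-12, 1 - 1e-12)
    i = np.arange(1, n + 1)
    # A^2 = -n - (1/n) * sum (2i-1) [ln u_i + ln(1 - u_{n+1-i})]
    return -n - np.mean((2 * i - 1) * (np.log(u) + np.log1p(-u[::-1])))

def mc_critical_value(n, alpha=0.05, reps=2000, seed=0):
    """Approximate the (1 - alpha) critical value when the parameters are
    re-estimated from each simulated sample (here: normal, moment fits)."""
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(reps):
        s = rng.normal(size=n)
        mu, sd = s.mean(), s.std(ddof=1)
        stats.append(anderson_darling(s, lambda v: norm.cdf(v, mu, sd)))
    return float(np.quantile(stats, 1 - alpha))

print(mc_critical_value(50))  # near the tabulated ~0.75 for this case
```

    Because the null distribution of the statistic depends on the estimated shape parameter and sample size, the paper summarizes such Monte Carlo critical values in chart and regression-equation form rather than a single table.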

  15. Distribution of erlotinib in rash and normal skin in cancer patients receiving erlotinib visualized by matrix assisted laser desorption/ionization mass spectrometry imaging.

    Science.gov (United States)

    Nishimura, Meiko; Hayashi, Mitsuhiro; Mizutani, Yu; Takenaka, Kei; Imamura, Yoshinori; Chayahara, Naoko; Toyoda, Masanori; Kiyota, Naomi; Mukohara, Toru; Aikawa, Hiroaki; Fujiwara, Yasuhiro; Hamada, Akinobu; Minami, Hironobu

    2018-04-06

    The development of skin rashes is the most common adverse event observed in cancer patients treated with epidermal growth factor receptor tyrosine kinase inhibitors such as erlotinib. However, the pharmacological evidence has not been fully revealed. We examined patients with advanced pancreatic cancer who developed skin rashes after treatment with erlotinib and gemcitabine. We biopsied both the rash and adjacent normal skin tissues, and visualized and compared the distribution of erlotinib within the skin using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The tissue concentration of erlotinib was also measured by liquid chromatography-tandem mass spectrometry (LC-MS/MS) with laser microdissection. Erlotinib distribution in the rashes was more heterogeneous than that in the normal skin, and the rashes contained statistically higher concentrations of erlotinib than adjacent normal skin in the superficial skin layer (229 ± 192 vs. 120 ± 103 ions/mm 2 ; P = 0.009, paired t-test). LC-MS/MS confirmed that the concentration of erlotinib in the skin rashes was higher than that in normal skin in the superficial skin layer (1946 ± 1258 vs. 1174 ± 662 ng/cm 3 ; P = 0.028, paired t-test). The results of MALDI-MSI and LC-MS/MS were well correlated (coefficient of correlation 0.879). The distribution of erlotinib in the skin tissue was visualized using non-labeled MALDI-MSI, and the erlotinib concentration in the superficial layer of the skin rashes was higher than that in the adjacent normal skin.

  16. Visualizing Tensor Normal Distributions at Multiple Levels of Detail.

    Science.gov (United States)

    Abbasloo, Amin; Wiens, Vitalis; Hermann, Max; Schultz, Thomas

    2016-01-01

    Despite the widely recognized importance of symmetric second order tensor fields in medicine and engineering, the visualization of data uncertainty in tensor fields is still in its infancy. A recently proposed tensorial normal distribution, involving a fourth order covariance tensor, provides a mathematical description of how different aspects of the tensor field, such as trace, anisotropy, or orientation, vary and covary at each point. However, this wealth of information is far too rich for a human analyst to take in at a single glance, and no suitable visualization tools are available. We propose a novel approach that facilitates visual analysis of tensor covariance at multiple levels of detail. We start with a visual abstraction that uses slice views and direct volume rendering to indicate large-scale changes in the covariance structure, and locations with high overall variance. We then provide tools for interactive exploration, making it possible to drill down into different types of variability, such as in shape or orientation. Finally, we allow the analyst to focus on specific locations of the field, and provide tensor glyph animations and overlays that intuitively depict confidence intervals at those points. Our system is demonstrated by investigating the effects of measurement noise on diffusion tensor MRI, and by analyzing two ensembles of stress tensor fields from solid mechanics.

  17. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    Science.gov (United States)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study aimed to analyze the enhancement from non-linear learning (NLL) in online tutorial (OT) content to students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental study design. The subjects of the study were divided into an experimental class, which was given OT content in the NLL model, and a control class, which was given OT content in a conventional learning (CL) model. Data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. The statistical analysis of KONDA gain scores for students with low and moderate SPK scores showed that the KONDA of students who learned OT content with the NLL model was better than that of students who learned OT content with the CL model. Meanwhile, for students with high SPK scores, the gain score under the NLL model was similar to that under the CL model. Based on those findings, it could be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. An extra and more challenging didactical situation is needed for students at the high SPK level to achieve a significant gain score.

  18. A Platoon Dispersion Model Based on a Truncated Normal Distribution of Speed

    Directory of Open Access Journals (Sweden)

    Ming Wei

    2012-01-01

    Full Text Available Understanding platoon dispersion is critical for the coordination of traffic signal control in an urban traffic network. Assuming that platoon speed follows a truncated normal distribution, ranging from minimum speed to maximum speed, this paper develops a piecewise density function that describes platoon dispersion characteristics as the platoon moves from an upstream to a downstream intersection. Based on this density function, the expected number of cars in the platoon that pass the downstream intersection, and the expected number of cars in the platoon that do not pass the downstream point are calculated. To facilitate coordination in a traffic signal control system, dispersion models for the front and the rear of the platoon are also derived. Finally, a numeric computation for the coordination of successive signals is presented to illustrate the validity of the proposed model.
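
    A truncated normal speed distribution of the kind assumed in this model can be sketched with `scipy.stats.truncnorm` (illustrative speed limits and moments, not the paper's calibration):

```python
from scipy.stats import truncnorm

# Platoon speeds assumed normal (mean 50 km/h, SD 10 km/h), truncated to a
# physically feasible range [20, 80] km/h (made-up values).
mu, sigma, v_min, v_max = 50.0, 10.0, 20.0, 80.0
a, b = (v_min - mu) / sigma, (v_max - mu) / sigma  # standardized bounds
speed = truncnorm(a, b, loc=mu, scale=sigma)

# All probability mass lies inside [v_min, v_max] ...
print(speed.cdf(v_max) - speed.cdf(v_min))  # ≈ 1.0
# ... and a symmetric truncation leaves the mean at mu.
print(speed.mean())                         # ≈ 50.0
```

    The paper's piecewise density for the platoon as it travels downstream is built by propagating a density of this truncated form over the travel-time range implied by `v_min` and `v_max`.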

  19. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as "Riemannian priors". Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.

  20. Distribution of Basement Membrane Molecules, Laminin and Collagen Type IV, in Normal and Degenerated Cartilage Tissues.

    Science.gov (United States)

    Foldager, Casper Bindzus; Toh, Wei Seong; Gomoll, Andreas H; Olsen, Bjørn Reino; Spector, Myron

    2014-04-01

    The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti-collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage only collagen type IV was found pericellularly around chondrocytes but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins.

  2. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, Addendum

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
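The kind of iterative procedure discussed above is, in modern terms, an EM iteration. A minimal sketch for a two-component univariate normal mixture, on synthetic data with illustrative starting values (not the paper's algorithm verbatim):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Synthetic two-component mixture (illustrative values, not from the paper).
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 700)])

# EM iteration for a two-component univariate normal mixture.
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior responsibility of each component for each point.
    dens = w * norm.pdf(x[:, None], mu, sd)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted maximum-likelihood updates of weights, means, sds.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)

print(np.round(mu, 1))  # close to the true means (-2, 3)
```

The local-convergence result in the addendum corresponds to this iteration converging to the consistent maximum-likelihood estimate when started sufficiently close to it.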

  3. Retention and subcellular distribution of 67Ga in normal organs

    International Nuclear Information System (INIS)

    Ando, A.; Ando, I.; Hiraki, T.

    1986-01-01

    Using normal rats, retention values and subcellular distribution of 67 Ga in each organ were investigated. At 10 min after administration of 67 Ga-citrate the retention value of 67 Ga in blood was 6.77% dose/g, and this value decreased with time. The values for skeletal muscle, lung, pancreas, adrenal, heart muscle, brain, small intestine, large intestine and spinal cord were the highest at 10 min after administration, and they decreased with time. Conversely, the value in bone increased until 10 days after injection. In the liver, kidney, and stomach, the values increased with time after administration, were highest 24 h or 48 h after injection, and decreased with time thereafter. The value in spleen reached a plateau 48 h after administration, and hardly varied for 10 days. From the results of subcellular fractionation, it was deduced that the lysosome plays quite an important role in the concentration of 67 Ga in small intestine, stomach, lung, kidney and pancreas; a lesser role in its concentration in heart muscle; and hardly any role in the 67 Ga accumulation in skeletal muscle. In spleen, the contents in nuclear, mitochondrial, microsomal, and supernatant fractions all contributed to the accumulation of 67 Ga. (orig.)

  4. Are There More Gifted People Than Would Be Expected in a Normal Distribution? An Investigation of the Overabundance Hypothesis

    Science.gov (United States)

    Warne, Russell T.; Godwin, Lindsey R.; Smith, Kyle V.

    2013-01-01

    Among some gifted education researchers, advocates, and practitioners, it is sometimes believed that there is a larger number of gifted people in the general population than would be predicted from a normal distribution (e.g., Gallagher, 2008; N. M. Robinson, Zigler, & Gallagher, 2000; Silverman, 1995, 2009), a belief that we termed the…

  5. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. For a Gamma-distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.

  6. M-dwarf exoplanet surface density distribution. A log-normal fit from 0.07 to 400 AU

    Science.gov (United States)

    Meyer, Michael R.; Amara, Adam; Reggiani, Maddalena; Quanz, Sascha P.

    2018-04-01

    Aims: We fit a log-normal function to the M-dwarf orbital surface density distribution of gas giant planets, over the mass range 1-10 times that of Jupiter, from 0.07 to 400 AU. Methods: We used a Markov chain Monte Carlo approach to explore the likelihoods of various parameter values consistent with point estimates of the data given our assumed functional form. Results: This fit is consistent with radial velocity, microlensing, and direct-imaging observations, is well-motivated from theoretical and phenomenological points of view, and predicts results of future surveys. We present probability distributions for each parameter and a maximum likelihood estimate solution. Conclusions: We suggest that this function makes more physical sense than other widely used functions, and we explore the implications of our results on the design of future exoplanet surveys.
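The fitting approach described above can be sketched with a toy random-walk Metropolis sampler for the two log-normal parameters. Everything below (synthetic data, flat priors, step sizes) is an illustrative assumption, not the paper's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in data drawn from a log-normal; the "true" parameters
# here are illustrative assumptions, not the paper's fitted values.
true_mu, true_sigma = np.log(3.0), 1.0
a = rng.lognormal(true_mu, true_sigma, 500)
loga = np.log(a)

def log_post(mu, sigma):
    # Log-posterior with flat priors (sigma > 0): log(a) ~ N(mu, sigma^2).
    if sigma <= 0:
        return -np.inf
    return -len(loga) * np.log(sigma) - np.sum((loga - mu) ** 2) / (2 * sigma**2)

# Random-walk Metropolis over (mu, sigma).
theta = np.array([0.0, 2.0])          # deliberately poor starting point
lp = log_post(*theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)
post = np.array(chain[5000:])          # discard burn-in

print(np.round(post.mean(axis=0), 1))  # posterior means near (log 3, 1.0)
```

The posterior samples play the same role as the paper's parameter probability distributions: any credible interval or maximum-likelihood-like point estimate can be read off the chain.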

  7. Normal bone and soft tissue distribution of fluorine-18-sodium fluoride and artifacts on 18F-NaF PET/CT bone scan: a pictorial review.

    Science.gov (United States)

    Sarikaya, Ismet; Elgazzar, Abdelhamid H; Sarikaya, Ali; Alfeeli, Mahmoud

    2017-10-01

    Fluorine-18-sodium fluoride (18F-NaF) PET/CT is a relatively new and high-resolution bone imaging modality. Since the use of 18F-NaF PET/CT has been increasing, it is important to accurately assess the images and be aware of the normal distribution and major artifacts. In this pictorial review article, we describe the normal uptake patterns of 18F-NaF in bone tissues, particularly in complex structures, as well as its physiologic soft tissue distribution and certain artifacts seen on 18F-NaF PET/CT images.

  8. Normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man; evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[131I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The normal adrenal glands were seldom seen and faintly imaged in 2% of cases at 24 h after injection and in 16% at 48 h, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 h.

  9. The effect of signal variability on the histograms of anthropomorphic channel outputs: factors resulting in non-normally distributed data

    Science.gov (United States)

    Elshahaby, Fatma E. A.; Ghaly, Michael; Jha, Abhinav K.; Frey, Eric C.

    2015-03-01

    Model Observers are widely used in medical imaging for the optimization and evaluation of instrumentation, acquisition parameters and image reconstruction and processing methods. The channelized Hotelling observer (CHO) is a commonly used model observer in nuclear medicine and has seen increasing use in other modalities. An anthropomorphic CHO consists of a set of channels that model some aspects of the human visual system and the Hotelling Observer, which is the optimal linear discriminant. The optimality of the CHO is based on the assumption that the channel outputs for data with and without the signal present have a multivariate normal distribution with equal class covariance matrices. The channel outputs result from the dot product of channel templates with input images and are thus the sum of a large number of random variables. The central limit theorem is thus often used to justify the assumption that the channel outputs are normally distributed. In this work, we aim to examine this assumption for realistically simulated nuclear medicine images when various types of signal variability are present.
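The channel-output mechanism described above can be illustrated with a toy simulation. The setup below (white-noise images, a random template, a two-level amplitude standing in for variability) is an assumption for illustration, not the paper's nuclear-medicine simulation:

```python
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(2)
# Toy setup (an assumption for illustration, not the paper's simulation):
# a channel output is the dot product of a channel template with an input
# image, i.e. a sum of many random pixel values -- which is why the central
# limit theorem is invoked to argue the outputs are approximately normal.
n_pix, n_images = 4096, 400
template = rng.normal(size=n_pix)

# Pure Gaussian pixel noise: channel outputs are then exactly normal.
outputs = rng.normal(size=(n_images, n_pix)) @ template

# Variability modelled as a random per-image noise amplitude: the outputs
# become a scale mixture of normals and lose exact normality.
amp = rng.choice([0.5, 2.0], size=n_images)
outputs_var = (amp[:, None] * rng.normal(size=(n_images, n_pix))) @ template

# Shapiro-Wilk normality test: the mixture case is typically rejected.
p_fixed, p_var = shapiro(outputs).pvalue, shapiro(outputs_var).pvalue
```

This is the qualitative point of the abstract: summing many pixels does not by itself guarantee normal channel outputs once between-image variability makes the outputs a mixture.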

  10. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
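When only summarized data (mean ± standard deviation) are available, a log-normal distribution is typically characterized by moment matching. A small sketch of that standard conversion (the numbers used are hypothetical):

```python
import numpy as np

def lognormal_from_summary(mean, sd):
    """Moment-matched log-scale parameters (mu, sigma) of a log-normal from
    a reported arithmetic mean +/- standard deviation (standard conversion,
    shown as a sketch; not code from the study)."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    return np.log(mean) - sigma2 / 2.0, np.sqrt(sigma2)

# Example: a summarized response reported as 250 +/- 25 (e.g. body weight).
mu, sigma = lognormal_from_summary(250.0, 25.0)

# Round trip: the log-normal with these parameters reproduces the mean.
print(round(float(np.exp(mu + sigma**2 / 2.0)), 6))  # → 250.0
```

This conversion is what makes it possible to approximate a BMD from summarized literature data under the lognormality assumption, which is the scenario the study evaluates.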

  11. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  12. The Analysis of Bankruptcy Risk Using the Normal Distribution Gauss-Laplace in Case of a Company is the Most Modern Romanian Sea-River Port on the Danube

    Directory of Open Access Journals (Sweden)

    Rodica Pripoaie

    2015-08-01

    Full Text Available This work applies the Gauss-Laplace normal distribution to the case of a company operating the most modern Romanian sea-river port on the Danube, a specialized service provider with a handling capacity of approximately 20,000,000 tons/year. The Gauss-Laplace normal distribution is the best-known and most widely used probability distribution, because it captures the evolution of economic and financial phenomena well: values cluster around the mean, which has the greatest frequency, with deviations more or less distant from the mean governed by the same standard deviation. It is noted that break-even analysis, although used in forecasting calculations, ignores the risk attached to operational decisions (deviations between forecasts and achievements), which may, in certain circumstances, strongly influence the activity of the company. This risk can be taken into account by carefully studying whether the evolution of turnover follows a law of probability. When no information on the probability law of turnover is available and there is no reason to expect one outcome more than another, then, following Laplace's principle, we consider the cases uniformly distributed and model turnover with a normal distribution.
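The kind of decision risk the abstract says break-even analysis ignores can be quantified directly once turnover is modelled as normal. A minimal sketch with purely illustrative figures (not the company's data):

```python
from scipy.stats import norm

# Purely illustrative figures (not the company's data): forecast turnover
# modelled as normal with mean 12.0 and standard deviation 1.5 (in millions
# of monetary units), against a break-even turnover of 10.0.
mean_turnover, sd_turnover, breakeven = 12.0, 1.5, 10.0

# Probability that turnover falls below break-even -- the decisional risk
# that a point-forecast profitability-threshold analysis ignores.
p_loss = norm.cdf(breakeven, loc=mean_turnover, scale=sd_turnover)
print(round(p_loss, 3))  # → 0.091
```

A point forecast of 12.0 alone would report "above break-even", while the distributional view quantifies a roughly 9% chance of operating at a loss.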

  13. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Science.gov (United States)

    Watabe, Tadashi; Naka, Sadahiro; Ikeda, Hayato; Horitsugi, Genki; Kanai, Yasukazu; Isohashi, Kayako; Ishibashi, Mana; Kato, Hiroki; Shimosegawa, Eku; Watabe, Hiroshi; Hatazawa, Jun

    2014-01-01

    Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain have not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  14. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Directory of Open Access Journals (Sweden)

    Tadashi Watabe

    Full Text Available PURPOSE: Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain have not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. METHODS: The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. RESULTS: The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. CONCLUSIONS: We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  15. Simulation study of pO2 distribution in induced tumour masses and normal tissues within a microcirculation environment.

    Science.gov (United States)

    Li, Mao; Li, Yan; Wen, Peng Paul

    2014-01-01

    The biological microenvironment is disrupted when tumour masses are introduced, because of the strong competition for oxygen. During the avascular growth period of tumours, pre-existing capillaries play a crucial role in supplying oxygen to both tumourous and healthy cells. Because the oxygen supply from capillaries is limited, healthy cells have to compete for oxygen with tumourous cells. In this study, an improved Krogh cylinder model, which is more realistic than the previously reported assumption that oxygen is homogeneously distributed in a microenvironment, is proposed to describe the diffusion of oxygen from a capillary to its surrounding environment. The capillary wall permeability is also taken into account. The simulation results show that when tumour masses are implanted at the upstream part of a capillary, followed by normal tissues, the whole normal tissue suffers from hypoxia. In contrast, when normal tissues are ahead of tumour masses, their pO2 is sufficient. In both situations, the pO2 in the whole normal tissue drops significantly due to the axial diffusion at the interface of normal tissues and tumourous cells. Because axial oxygen diffusion cannot supply the whole tumour mass, only those tumourous cells near the interface can be partially supplied, and have a small chance to survive.

  16. Spin fluctuations in liquid 3He: a strong-coupling calculation of T_c and the normal-state distribution function

    International Nuclear Information System (INIS)

    Fay, D.; Layzer, A.

    1975-01-01

    The Berk–Schrieffer method of strong-coupling superconductivity for nearly ferromagnetic systems is generalized to arbitrary L-state pairing and realistic (hard-core) potentials. Application to 3He yields a P-state transition but very low values for T_c and an unsatisfactory normal-state momentum distribution.

  17. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detecting and correcting for publication bias in meta-analysis focuses on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed- and random-effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
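The truncation idea can be sketched in a few lines: if only effects exceeding a cutoff are observed, the naive mean is biased upward, while maximizing the truncated-normal likelihood recovers the underlying mean. All values below are illustrative assumptions, not the paper's estimators:

```python
import numpy as np
from scipy.stats import truncnorm
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
# Assumed setup for illustration: studies with effect sizes below the
# cutoff c go unpublished, so the observed effects follow a normal
# distribution truncated below at c.
true_mu, sigma, c = 0.1, 1.0, 0.5
z = rng.normal(true_mu, sigma, 20000)
observed = z[z > c]                # the "published" effects only

def neg_loglik(mu):
    a = (c - mu) / sigma           # lower truncation point in standard units
    return -truncnorm.logpdf(observed, a, np.inf, loc=mu, scale=sigma).sum()

mu_hat = minimize_scalar(neg_loglik, bounds=(-3.0, 3.0), method="bounded").x
naive = observed.mean()            # biased upward by the truncation
print(round(naive, 2), round(mu_hat, 2))  # naive overshoots; MLE near 0.1
```

The paper's full methodology additionally estimates the truncation proportion and handles random-effects heterogeneity, which this one-parameter sketch omits.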

  18. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ2 null distribution. The asymptotic distribution of the

  19. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semilogarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
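The observation that the two distribution functions fit such data comparably can be reproduced on synthetic death times. A sketch (the data-generating values are assumptions, not the E. coli measurements):

```python
import numpy as np
from scipy.stats import weibull_min, lognorm

rng = np.random.default_rng(4)
# Hypothetical unimodal death times (illustrative, not the E. coli data):
# Weibull with shape 2 and scale 10.
t = rng.weibull(2.0, 2000) * 10.0

# Fit both candidate distributions to the same death times (location fixed
# at zero, matching the survival-time setting).
wb_shape, _, wb_scale = weibull_min.fit(t, floc=0)
ln_sigma, _, ln_scale = lognorm.fit(t, floc=0)

# As the abstract reports, the two fitted distributions have similar
# central tendencies even though their functional forms differ.
wb_mean = weibull_min(wb_shape, scale=wb_scale).mean()
ln_mean = lognorm(ln_sigma, scale=ln_scale).mean()
print(round(wb_mean, 1), round(ln_mean, 1))  # close to each other, near 9
```

Comparing the fitted means, modes and standard deviations in this way mirrors the paper's comparison of the two temporal distributions' characteristics.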

  20. [Calbindin and parvalbumin distribution in spinal cord of normal and rabies-infected mice].

    Science.gov (United States)

    Monroy-Gómez, Jeison; Torres-Fernández, Orlando

    2013-01-01

    Rabies is a fatal infectious disease of the nervous system; however, knowledge about the pathogenic neural mechanisms of rabies is scarce. In addition, there are few studies of rabies pathology of the spinal cord. The aim was to study the distribution of the calcium-binding proteins calbindin and parvalbumin and to assess the effect of rabies virus infection on their expression in the spinal cord of mice. MATERIALS AND METHODS: Mice were inoculated with rabies virus, by intracerebral or intramuscular route. The spinal cord was extracted to perform crosscuts, which were treated by immunohistochemistry with monoclonal antibodies to reveal the presence of the two proteins in normal and rabies-infected mice. We did qualitative and quantitative analyses of the immunoreactivity of the two proteins. Calbindin and parvalbumin showed differential distribution in the Rexed laminae. Rabies infection produced a decrease in the expression of calbindin. On the contrary, the infection caused an increased expression of parvalbumin. The effect of rabies infection on the expression of the two proteins was similar when comparing both routes of inoculation. The differential effect of rabies virus infection on the expression of calbindin and parvalbumin in the spinal cord of mice was similar to that previously reported for brain areas. This result suggests uniformity in the response to rabies infection throughout the central nervous system. This is an important contribution to the understanding of the pathogenesis of rabies.

  1. Distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbot, Scophthalmus Maximus

    Institute of Scientific and Technical Information of China (English)

    GUO Huarong; HUANG Bing; QI Fei; ZHANG Shicui

    2007-01-01

    The distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbots were examined with transmission electron microscopy (TEM). Three types of pigment cells (melanophores, iridophores and xanthophores) have been recognized in adult turbot skins. The skin color depends mainly on the amount and distribution of melanophores and iridophores, as xanthophores are quite rare. No pigment cells were found in the epidermis of the skins. In the pigmented ocular skin of the turbot, melanophores and iridophores are usually co-localized in the dermis. This is quite different from their distribution in larval skin. In the albino and white blind-side skins of adult turbots, however, only the iridophore monolayer remains, while the melanophore monolayer disappears. This cytological evidence explains why the albino adult turbot, unlike its larvae, can never regain its body color no matter what environmental and nutritional conditions are provided. Endocytosis is quite active in the cell membrane of the iridophore. This might be related to the formation of the reflective platelets and the stability of the iridophore.

  2. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several graphical tests based on normal probability plots and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
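
The idea of augmenting a normal probability plot with intervals can be sketched as follows. Caveat: this illustrative version uses pointwise 95% intervals derived from the Beta distribution of uniform order statistics, a simpler stand-in for the simultaneous 1-α intervals the paper constructs:

```python
# Normal probability plot with pointwise bands for each order statistic.
# Under H0 (normal population), the i-th uniform order statistic follows
# Beta(i, n - i + 1); transforming its quantiles through the normal
# inverse CDF gives bands for the standardized sample order statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=n))

i = np.arange(1, n + 1)
p = stats.beta.ppf(0.5, i, n - i + 1)        # median plotting positions
z = stats.norm.ppf(p)                        # theoretical normal quantiles

# Pointwise 95% bands for each plotted point
lo = stats.norm.ppf(stats.beta.ppf(0.025, i, n - i + 1))
hi = stats.norm.ppf(stats.beta.ppf(0.975, i, n - i + 1))

zx = (x - x.mean()) / x.std(ddof=1)          # standardized sample
outside = int(np.sum((zx < lo) | (zx > hi)))
r_corr = np.corrcoef(z, zx)[0, 1]            # probability-plot correlation

print(f"points outside pointwise bands: {outside} of {n}, PPCC={r_corr:.3f}")
```

For truly normal data nearly all points stay inside the bands and the probability-plot correlation is close to 1; the paper's simultaneous intervals are wider, so that the joint coverage (rather than the per-point coverage) equals 1-α.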

  3. Total number, distribution, and phenotype of cells expressing chondroitin sulphate proteoglycans in the normal human amygdala

    Science.gov (United States)

    Pantazopoulos, Harry; Murray, Elisabeth A.; Berretta, Sabina

    2009-01-01

    Chondroitin sulphate proteoglycans (CSPGs) are a key structural component of the brain extracellular matrix. They are involved in critical neurodevelopmental functions and are one of the main components of pericellular aggregates known as perineuronal nets. As a step toward investigating their functional and pathophysiological roles in the human amygdala, we assessed the pattern of CSPG expression in the normal human amygdala using Wisteria floribunda agglutinin (WFA) lectin-histochemistry. Total numbers of WFA-labeled elements were measured in the lateral (LN), basal (BN), accessory basal (ABN) and cortical (CO) nuclei of the amygdala from 15 normal adult human subjects. For interspecies qualitative comparison, we also investigated the pattern of WFA labeling in the amygdala of naïve rats (n=32) and rhesus monkeys (Macaca mulatta; n=6). In human amygdala, WFA lectin-histochemistry resulted in labeling of perineuronal nets and cells with clear glial morphology, while neurons did not show WFA labeling. Total numbers of WFA-labeled glial cells showed high interindividual variability. These cells aggregated in clusters with a consistent between-subjects spatial distribution. In a subset of human subjects (n=5), dual-color fluorescence using an antibody raised against glial fibrillary acidic protein (GFAP) and WFA showed that the majority (93.7%) of WFA-labeled glial cells correspond to astrocytes. In rat and monkey amygdala, WFA histochemistry labeled perineuronal nets, but not glial cells. These results suggest that astrocytes are the main cell type expressing CSPGs in the adult human amygdala. Their highly segregated distribution pattern suggests that these cells serve specialized functions within human amygdalar nuclei. PMID:18374308

  4. The normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man: evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[131I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The 'normal' adrenal glands were seldom seen and faintly imaged in 2% of patients at 24 hr after injection and in 16% at 48 hr, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 hr.

  5. Advection-diffusion model for normal grain growth and the stagnation of normal grain growth in thin films

    International Nuclear Information System (INIS)

    Lou, C.

    2002-01-01

    An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)

  6. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research

    International Nuclear Information System (INIS)

    Currie, L.A.

    2001-01-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Mueller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test - for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom. (orig.)

  7. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research.

    Science.gov (United States)

    Currie, L A

    2001-07-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Muller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test--for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom.

  8. Similar distributions of repaired sites in chromatin of normal and xeroderma pigmentosum variant cells damaged by ultraviolet light

    International Nuclear Information System (INIS)

    Cleaver, J.E.

    1979-01-01

    Excision repair of damage from ultraviolet light in both normal and xeroderma pigmentosum variant fibroblasts at early times after irradiation occurred preferentially in regions of DNA accessible to micrococcal nuclease digestion. These regions are predominantly the linker regions between nucleosomes in chromatin. The alterations reported at polymerization and ligation steps of excision repair in the variant are therefore not associated with changes in the relative distributions of repair sites in linker and core particle regions of DNA. (Auth.)

  9. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1978-01-01

    This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
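
The successive-approximations procedure the paper analyzes (its iteration with step-size 1) is the familiar EM fixed-point iteration for a normal mixture; a self-contained sketch on synthetic two-component data, with all settings illustrative:

```python
# EM iteration (step-size 1 in the paper's terms) for a two-component
# normal mixture: alternate posterior responsibilities (E-step) with
# weighted maximum-likelihood updates (M-step).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

# Deliberately rough initial guesses
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sd = np.array([1.5, 1.5])

for _ in range(200):
    # E-step: posterior responsibility of each component for each point
    dens = w[:, None] * stats.norm.pdf(x[None, :], mu[:, None], sd[:, None])
    r = dens / dens.sum(axis=0)
    # M-step: solve the likelihood equations with responsibilities fixed
    nk = r.sum(axis=1)
    w = nk / x.size
    mu = (r @ x) / nk
    sd = np.sqrt((r * (x[None, :] - mu[:, None]) ** 2).sum(axis=1) / nk)

order = np.argsort(mu)
print("weights:", np.round(w[order], 2), "means:", np.round(mu[order], 2))
```

With well-separated components like these, the iteration converges to estimates near the true parameters; the paper's result is that step-sizes anywhere in (0, 2) also converge locally, and the optimal step-size depends on how separated the component densities are.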

  10. Comparative pharmacokinetics and tissue distribution profiles of lignan components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Zheng, Tian-Hui; Tao, Yan-Yan; Liu, Cheng-Hai

    2015-05-26

    Fuzheng Huayu recipe (FZHY) is formulated on the basis of Chinese medicine theory for treating liver fibrosis. The aim was to illuminate the influence of the pathological state of liver fibrosis on the pharmacokinetics and tissue distribution profiles of lignan components from FZHY. Male Wistar rats were randomly divided into a normal group and a hepatic fibrosis group (induced by dimethylnitrosamine). Six lignan components were detected and quantified by ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) in the plasma and tissues of normal and hepatic fibrosis rats. A rapid, sensitive and convenient UHPLC-MS/MS method was successfully developed for the simultaneous determination of six lignan components in different rat biological samples. After oral administration of FZHY at a dose of 15 g/kg, the pharmacokinetic behaviors of schizandrin A (SIA), schizandrin B (SIB), schizandrin C (SIC), schisandrol A (SOA), schisandrol B (SOB) and schisantherin A (STA) were significantly changed in hepatic fibrosis rats compared with the normal rats, and their AUC(0-t) values were increased by 235.09%, 388.44%, 223.30%, 669.30%, 295.08% and 267.63%, respectively (P < 0.05). The tissue distribution results showed that the amounts of SIA, SIB, SOA and SOB were significantly increased in the heart, lung, spleen and kidney of hepatic fibrosis rats compared with normal rats at most time points (P < 0.05), indicating an altered tissue distribution of lignan components between normal and hepatic fibrosis rats. Hepatic fibrosis could thus alter the pharmacokinetics and tissue distribution properties of lignan components in rats after administration of FZHY. These results might be helpful in guiding the clinical application of this medicine. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Spatial Distribution of Iron Within the Normal Human Liver Using Dual-Source Dual-Energy CT Imaging.

    Science.gov (United States)

    Abadia, Andres F; Grant, Katharine L; Carey, Kathleen E; Bolch, Wesley E; Morin, Richard L

    2017-11-01

    Explore the potential of dual-source dual-energy (DSDE) computed tomography (CT) to retrospectively analyze the uniformity of iron distribution and establish iron concentration ranges and distribution patterns found in healthy livers. Ten mixtures consisting of an iron nitrate solution and deionized water were prepared in test tubes and scanned using a DSDE 128-slice CT system. Iron images were derived from a 3-material decomposition algorithm (optimized for the quantification of iron). A conversion factor (mg Fe/mL per Hounsfield unit) was calculated from this phantom study as the quotient of known tube concentrations and their corresponding CT values. Retrospective analysis was performed of patients who had undergone DSDE imaging for renal stones. Thirty-seven patients with normal liver function were randomly selected (mean age, 52.5 years). The examinations were processed for iron concentration. Multiple regions of interest were analyzed, and iron concentration (mg Fe/mL) and distribution was reported. The mean conversion factor obtained from the phantom study was 0.15 mg Fe/mL per Hounsfield unit. Whole-liver mean iron concentrations yielded a range of 0.0 to 2.91 mg Fe/mL, with 94.6% (35/37) of the patients exhibiting mean concentrations below 1.0 mg Fe/mL. The most important finding was that iron concentration was not uniform and patients exhibited regionally high concentrations (36/37). These regions of higher concentration were observed to be dominant in the middle-to-upper part of the liver (75%), medially (72.2%), and anteriorly (83.3%). Dual-source dual-energy CT can be used to assess the uniformity of iron distribution in healthy subjects. Applying similar techniques to unhealthy livers, future research may focus on the impact of hepatic iron content and distribution for noninvasive assessment in diseased subjects.
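
The phantom-derived conversion factor makes the step from iron-image CT value to iron concentration a simple multiplication; a minimal sketch using the mean factor reported in the abstract (0.15 mg Fe/mL per Hounsfield unit), with hypothetical region-of-interest values:

```python
# Convert iron-specific CT values (HU) to iron concentration (mg Fe/mL)
# using the mean conversion factor from the phantom study. The ROI values
# below are made up for illustration.

FACTOR_MG_FE_PER_ML_PER_HU = 0.15  # mean factor reported in the abstract

def iron_concentration(hu: float) -> float:
    """Iron concentration (mg Fe/mL) from an iron-image CT value (HU)."""
    return FACTOR_MG_FE_PER_ML_PER_HU * hu

# Hypothetical region-of-interest CT values within one liver
rois_hu = [2.0, 5.3, 12.1, 19.4]
print([round(iron_concentration(h), 2) for h in rois_hu])
```

Note that an iron-image value of about 19.4 HU corresponds to 2.91 mg Fe/mL, the upper end of the whole-liver range reported for these healthy subjects.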

  12. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models
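
Normality testing of regression error terms can be sketched with two classical tests applied to OLS residuals; this uses scipy's Shapiro-Wilk and Anderson-Darling implementations on synthetic data, not the RT class of robust tests the contribution introduces:

```python
# Fit a simple linear model by least squares, then test whether the
# residuals are consistent with normally distributed disturbances.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 200)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, x.size)   # normal disturbances

slope, intercept = np.polyfit(x, y, 1)             # OLS fit
resid = y - (intercept + slope * x)

sw_stat, sw_p = stats.shapiro(resid)
ad = stats.anderson(resid, dist='norm')            # critical values at 15/10/5/2.5/1%

print(f"Shapiro-Wilk: W={sw_stat:.3f}, p={sw_p:.3f}")
print(f"Anderson-Darling: A2={ad.statistic:.3f}, 5% critical={ad.critical_values[2]:.3f}")
```

With truly normal disturbances the Shapiro-Wilk p-value is typically well above common significance levels; heavy-tailed or skewed disturbances drive it toward zero, which is exactly the failure mode that motivates robust normality tests.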

  13. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary in order to draw inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  14. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. I - Pressure distribution

    Science.gov (United States)

    Messiter, A. F.

    1980-01-01

    Asymptotic solutions are derived for the pressure distribution in the interaction of a weak normal shock wave with a turbulent boundary layer. The undisturbed boundary layer is characterized by the law of the wall and the law of the wake for compressible flow. In the limiting case considered, for 'high' transonic speeds, the sonic line is very close to the wall. Comparisons with experiment are shown, with corrections included for the effect of longitudinal wall curvature and for the boundary-layer displacement effect in a circular pipe.

  15. Tritium distribution ratios between the 30% tributyl phosphate (TBP)-normal dodecane (nDD) organic phase and uranyl nitrate-nitric acid aqueous phase

    International Nuclear Information System (INIS)

    Fujine, Sachio; Uchiyama, Gunzou; Sugikawa, Susumu; Maeda, Mitsuru; Tsujino, Takeshi.

    1989-10-01

    Tritium distribution ratios between the organic and aqueous phases were measured for the system of 30% tributyl phosphate (TBP)-normal dodecane (nDD)/uranyl nitrate-nitric acid water. It was confirmed that tritium is extracted by TBP into the organic phase in both chemical forms of tritiated water (HTO) and tritiated nitric acid (TNO3). The value of the tritium distribution ratio ranged from 0.002 to 0.005 for the conditions of 0-6 mol/L nitric acid, 0.5-800 mCi/L tritium in the aqueous phase, and 0-125 g-U/L uranium in the organic phase. The isotopic distribution coefficient of tritium between the organic and aqueous phases was observed to be about 0.95. (author)

  16. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

    Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest of the data groups considered in this study
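
A comparable screening of candidate distributions can be sketched with maximum-likelihood fits and a Kolmogorov-Smirnov statistic standing in for probability paper and the W test; the data below are synthetic and skewed, as environmental surveillance data often are:

```python
# Fit three candidate distributions to positive, skewed data and rank
# them by the Kolmogorov-Smirnov distance between data and fitted CDF.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
data = rng.lognormal(mean=0.0, sigma=0.6, size=500)   # skewed, positive

fits = {
    "normal":    stats.norm(*stats.norm.fit(data)),
    "lognormal": stats.lognorm(*stats.lognorm.fit(data, floc=0)),
    "weibull":   stats.weibull_min(*stats.weibull_min.fit(data, floc=0)),
}
ks = {name: stats.kstest(data, dist.cdf).statistic for name, dist in fits.items()}
best = min(ks, key=ks.get)

for name, d in sorted(ks.items(), key=lambda kv: kv[1]):
    print(f"{name:9s} KS={d:.3f}")
```

For skewed data like these, the lognormal (and often the more flexible Weibull) fits clearly better than the normal, which matches the study's finding that normal statistics fit the fewest data groups.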

  17. Heterogeneous distribution of a diffusional tracer in the aortic wall of normal and atherosclerotic rabbits

    International Nuclear Information System (INIS)

    Tsutsui, H.; Tomoike, H.; Nakamura, M.

    1990-01-01

    Tracer distribution as an index of nutritional support across the thoracic and abdominal aortas in rabbits in the presence or absence of atherosclerotic lesions was evaluated using [14C]antipyrine, a metabolically inert, diffusible indicator. Intimal plaques were produced by endothelial balloon denudation of the thoracic aorta and a 1% cholesterol diet. After a steady intravenous infusion of 200 microCi of [14C]antipyrine for 60 seconds, the thoracic and abdominal aortas and the heart were excised, and autoradiograms of 20-micron-thick sections were quantified using microcomputer-aided densitometry. Regional radioactivity, with regional diffusional support as an index of nutritional flow estimated from the timed collections of arterial blood, was 367 and 421 nCi/g (82 and 106 ml/min per 100 g) in the thoracic aortic media of the normal and atherosclerotic rabbits, respectively. Radioactivity at the thickened intima was 179 nCi/g (p less than 0.01 versus media). The gruel was noted at a deeper site within the thickened intima, and diffusional support here was 110 nCi/g (p less than 0.01 versus the average radioactivity at the thickened intima). After ligating the intercostal arteries, regional tracer distribution in the media beneath the fibrofatty lesion, but not the plaque-free intima, was reduced to 46%. Thus, in the presence of advanced intimal thickening, the heterogeneous distribution of diffusional flow is prominent across the vessel wall, and abluminal routes are crucial to meet the increased demands of nutritional requirements

  18. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, 2

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1976-01-01

    The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.

  19. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process......

  20. Measurement of activity-weighted size distributions of radon decay products in a normally occupied home

    International Nuclear Information System (INIS)

    Hopke, P.K.; Wasiolek, P.; Montassier, N.; Cavallo, A.; Gadsby, K.; Socolow, R.

    1992-01-01

    In order to assess the exposure of individuals to indoor radioactivity arising from the decay of radon, an automated, semicontinuous graded screen array system was developed to permit the measurement of the activity-weighted size distributions of the radon progeny in homes. The system has been modified so that the electronics and sampling heads can be separated from the pump by approximately 15 m. The system was placed in the living room of a one-storey house with basement in Princeton, NJ and operated for 2 weeks while the house was occupied by the home owners in their normal manner. One of the house occupants was a cigarette smoker. Radon and potential alpha energy concentration (PAEC) measurements were also made, but condensation nuclei counts were not performed. PAEC values ranged from 23.4 to 461.6 mWL. In the measured activity size distributions, the amount of activity in the 0.5-1.5 nm size range can be considered to be the unattached fraction. The mean value for the 218Po unattached fraction is 0.217 with a range of 0.054-0.549. The median value for the unattached fraction of PAEC is 0.077 with a range of 0.022-0.178. (author)

  1. Finite-size effects in transcript sequencing count distribution: its power-law correction necessarily precedes downstream normalization and comparative analysis.

    Science.gov (United States)

    Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank

    2018-02-12

    signal-to-noise ratio by 50% and the statistical/detection sensitivity by as much as 30%, regardless of the downstream mapping and normalization methods. Most importantly, the power-law correction improves concordance in significant calls among different normalization methods of a data series by 22% on average. When presented with a higher sequencing depth (4 times difference), the improvement in concordance is asymmetrical (32% for the higher sequencing depth instance versus 13% for the lower instance) and demonstrates that the simple power-law correction can increase significant detection with higher sequencing depths. Finally, the correction dramatically enhances the statistical conclusions and elucidates the metastasis potential of the NUGC3 cell line against AGS in our dilution analysis. The finite-size effects due to undersampling generally plague transcript count data with reproducibility issues but can be minimized through a simple power-law correction of the count distribution. This distribution correction has direct implications for the biological interpretation of the study and the rigor of the scientific findings. This article was reviewed by Oliviero Carugo, Thomas Dandekar and Sandor Pongor.

  2. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

    Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
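
The equivalent uniform dose concept mentioned above reduces a whole normal-tissue dose distribution to a single value through a tissue-specific volume-effect parameter; a minimal sketch of the generalized EUD formula, with illustrative voxel doses and parameter values (not taken from the text):

```python
# Generalized equivalent uniform dose (gEUD) over equal-volume voxels:
# EUD = (mean of d_i^a)^(1/a), where a encodes the tissue's volume effect.
import numpy as np

def geud(dose_per_voxel: np.ndarray, a: float) -> float:
    """gEUD of a dose distribution; a is the volume-effect parameter."""
    d = np.asarray(dose_per_voxel, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))

dose = np.array([10.0, 20.0, 30.0, 60.0])   # Gy, hypothetical voxel doses

# Serial-like tissue (large a): the EUD is pulled toward the hottest region.
# Parallel-like tissue (a near 1): the EUD is close to the mean dose.
print(f"a=8 : EUD={geud(dose, 8):.1f} Gy")
print(f"a=1 : EUD={geud(dose, 1):.1f} Gy")
```

This makes the volume effect described in the abstract concrete: the same physical dose distribution evaluates very differently depending on whether the tissue responds serially (dominated by hot spots) or in parallel (dominated by the mean dose).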

  3. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB

    Science.gov (United States)

    Khazanov, G. V.; Gamayunov, K. V.

    2007-01-01

    We present the equatorial and bounce average pitch angle diffusion coefficients for scattering of relativistic electrons by the H+ mode of EMIC waves. Both the model (prescribed) and self consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low energy electrons (E less than 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch-angles, and despite their greatest contribution in case of field aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| greater than 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary, and increasing diffusion at small pitch angles by orders of magnitude.

  4. Financing options and economic impact: distributed generation using solar photovoltaic systems in Normal, Illinois

    Directory of Open Access Journals (Sweden)

    Jin H. Jo

    2016-04-01

    Full Text Available Due to increasing price volatility in fossil-fuel-produced energy, the demand for clean, renewable, and abundant energy is greater than in past years. Solar photovoltaic (PV) systems are well documented for their ability to produce electrical energy while also helping to mitigate the negative externalities associated with fossil fuel combustion. Prices for PV systems have decreased over the past few years; however, residential and commercial owners may still opt out of purchasing a system because of the overall price of a PV system installation. Determining optimal financing options for residential and small-scale purchasers is therefore a necessity. We report on payment methods currently used for distributed community solar projects throughout the US and suggest appropriate options for purchasers in Normal, Illinois given their economic status. We also examine the jobs and total economic impact of a PV system implementation in the case study area.

  5. Determination and correlation of spatial distribution of trace elements in normal and neoplastic breast tissues evaluated by μ-XRF

    International Nuclear Information System (INIS)

    Silva, M.P.; Oliveira, M.A.; Poletti, M.E.

    2012-01-01

    Full text: Some trace elements naturally present in breast tissues participate in a large number of biological processes, including activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and in the tumor progression of breast cancers. Concentrations of trace elements such as Ca, Fe, Cu and Zn, previously studied at the LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared with normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the μ-XRF technique. We analyzed 22 samples of normal and neoplastic (malignant and benign) breast tissues obtained from paraffin blocks available for study at the Department of Pathology, HC-FMRP/USP. From the blocks, a small fraction of material was removed and cut into 60-μm-thick histological sections with a microtome. The slices were placed in sample holders and covered with Ultralene film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45 degrees with respect to the incident beam on a table with 3 degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 μm microcapillary and the samples were fully scanned. At each step, a spectrum was acquired for 10 s. The fluorescence emitted by elements present in the sample was detected by a Si(Li) detector with an energy resolution of 165 eV at 5.9 keV, placed at 90 deg with respect to the incident beam. Results reveal that the trace element pairs Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by Spearman

  6. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
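
As a rough illustration of the reference-probe idea described above (not the published algorithm), the toy sketch below normalizes each probe's treatment signal against the k probes whose control response is most similar; the synthetic data, the similarity measure, and k are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n_probes, k = 1000, 50
probe_effect = rng.normal(10.0, 2.0, n_probes)      # non-uniform probe sensitivity
signal = np.zeros(n_probes)
signal[:100] = 3.0                                  # true enrichment at 100 loci
control = probe_effect + rng.normal(0.0, 0.5, n_probes)
treatment = probe_effect + signal + rng.normal(0.0, 0.5, n_probes)

normalized = np.empty(n_probes)
for i in range(n_probes):
    # reference set: the k probes whose control response is most similar to probe i
    ref = np.argsort(np.abs(control - control[i]))[1:k + 1]
    normalized[i] = treatment[i] - treatment[ref].mean()
```

After this reference-set correction the enriched probes stand out even though their raw signals are dominated by probe-to-probe sensitivity differences.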

  7. Global Bi-ventricular endocardial distribution of activation rate during long duration ventricular fibrillation in normal and heart failure canines.

    Science.gov (United States)

    Luo, Qingzhi; Jin, Qi; Zhang, Ning; Han, Yanxin; Wang, Yilong; Huang, Shangwei; Lin, Changjian; Ling, Tianyou; Chen, Kang; Pan, Wenqi; Wu, Liqun

    2017-04-13

    The objective of this study was to detect differences in the distribution of the left and right ventricular (LV and RV) activation rate (AR) during short-duration ventricular fibrillation (SDVF, <1 min) and long-duration ventricular fibrillation (LDVF, >1 min) in normal and heart failure (HF) canine hearts. Ventricular fibrillation (VF) was electrically induced in six healthy dogs (control group) and six dogs with right ventricular pacing-induced congestive HF (HF group). Two 64-electrode basket catheters deployed in the LV and RV were used for global endocardial electrical mapping. The AR of VF was estimated by fast Fourier transform analysis from each electrode. In the control group, the LV was activated faster than the RV in the first 20 s, after which there was no detectable difference in AR between them. When analyzing the distribution of the AR within the two ventricles at 3 min of LDVF, the posterior LV was activated fastest, while the anterior was slowest. In the HF group, a detectable AR gradient existed between the two ventricles within 3 min of VF, with the LV activating more quickly than the RV. When analyzing the distribution of the AR within the two ventricles at 3 min of LDVF, the septum of the LV was activated fastest, while the anterior was activated slowest. A global bi-ventricular endocardial AR gradient existed within the first 20 s of VF but disappeared during LDVF in healthy hearts. In HF hearts, however, the AR gradient was always observed in both SDVF and LDVF. The findings of this study suggest that LDVF in HF hearts can be maintained differently from normal hearts, which accordingly should lead to the development of different management strategies for LDVF resuscitation.

  8. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    Science.gov (United States)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109
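
The abstract does not reproduce the paper's quality criterion, but the general effect, covariance estimates improving with the number of training samples, can be illustrated with a stand-in Frobenius-norm error measure; the dimensionality and covariance matrix below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4                                                # number of spectral features
true_cov = np.full((p, p), 0.3) + np.eye(p) * 0.7    # positive definite, unit diagonal

def cov_error(n):
    """Frobenius-norm error of the sample covariance from n training samples
    (a stand-in quality measure; the paper defines its own criterion)."""
    x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
    return np.linalg.norm(np.cov(x, rowvar=False) - true_cov)

# Average the error over repetitions for a few training-set sizes
avg_err = {n: np.mean([cov_error(n) for _ in range(50)]) for n in (10, 40, 160, 640)}
```

The error shrinks roughly as the inverse square root of the sample count, which is why a quality threshold translates into a required number of training samples.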

  9. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...
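
As a quick illustration of how truncation reshapes the symmetric normal (one of the book's themes), SciPy's `truncnorm` gives the left- and right-truncated standard normal directly; note that its bounds are expressed in standard-deviation units of the untruncated parent.

```python
import numpy as np
from scipy import stats

# scipy's truncnorm expresses its bounds in standard deviations of the
# untruncated parent normal (here the standard normal).
left = stats.truncnorm(a=0.0, b=np.inf)      # left-truncated: support [0, inf)
right = stats.truncnorm(a=-np.inf, b=0.0)    # right-truncated: support (-inf, 0]

# Truncation skews the otherwise symmetric shape: E[Z | Z > 0] = sqrt(2/pi)
assert np.isclose(left.mean(), np.sqrt(2.0 / np.pi))
assert np.isclose(right.mean(), -np.sqrt(2.0 / np.pi))
```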

  10. A preliminary evaluation of myoelectrical energy distribution of the front neck muscles in pharyngeal phase during normal swallowing.

    Science.gov (United States)

    Mingxing Zhu; Wanzhang Yang; Samuel, Oluwarotimi Williams; Yun Xiang; Jianping Huang; Haiqing Zou; Guanglin Li

    2016-08-01

    The pharyngeal phase is a central stage of swallowing in which the food bolus passes from the oral cavity to the esophagus. Proper understanding of the muscular activities in the pharyngeal phase is useful for assessing swallowing function and the occurrence of dysphagia in humans. In this study, high-density (HD) surface electromyography (sEMG) was used to study the muscular activities in the pharyngeal phase during swallowing tasks involving three healthy male subjects. The root mean square (RMS) of the HD sEMG data was computed over a series of segmented windows as a measure of myoelectrical energy, and the RMS values of each window across all channels (16×5) formed a matrix. During the pharyngeal phase of swallowing, three of these matrices were chosen and normalized to obtain the HD energy maps and the statistical parameters. The maps across different viscosity levels showed the energy distribution of muscular activity on the left and right sides of the front neck muscles. In addition, the normalized average RMS (NARE) across different viscosity levels revealed a significant left-right correlation (r=0.868±0.629, p<0.05), with a stronger correlation when swallowing water. This pilot study suggests that HD sEMG is a potential tool for evaluating muscular activities in the pharyngeal phase during normal swallowing. It might also provide useful information for dysphagia diagnosis.

  11. Aerosol lung inhalation scintigraphy in normal subjects

    Energy Technology Data Exchange (ETDEWEB)

    Sui, Osamu; Shimazu, Hideki

    1985-03-01

    We previously reported basic and clinical evaluations of aerosol lung inhalation scintigraphy with 99mTc-millimicrosphere albumin (milli MISA) and concluded that aerosol inhalation scintigraphy with 99mTc-milli MISA is useful for routine examination. However, central airway deposition of aerosol particles was found not only in patients with chronic obstructive pulmonary disease (COPD) but also in normal subjects. We therefore performed aerosol inhalation scintigraphy in normal subjects and evaluated their scintigrams. The subjects had normal values of FEV1.0% (more than 70%) in lung function tests, no abnormal findings on chest X-ray films, and no symptoms or signs. Their aerosol inhalation scintigrams were classified into 3 patterns; type I: homogeneous distribution without central airway deposition; type II: homogeneous distribution with central airway deposition; type III: inhomogeneous distribution. These patterns were compared with lung function tests. There was no significant difference between type I and type II in lung function tests. Type III differed from types I and II in its inhomogeneous distribution. This finding showed no correlation with %VC, FEV1.0%, MMF, V̇50 or V̇50/V̇25, but good correlation with V̇25 in a maximum forced expiratory flow-volume curve. The flow-volume curve is one of the sensitive methods for early detection of COPD, so the inhomogeneous distribution of type III is considered to be due to small airway dysfunction.

  12. Stellar Distributions and NIR Colours of Normal Galaxies

    NARCIS (Netherlands)

    Peletier, R. F.; Grijs, R. de

    1997-01-01

    Abstract: We discuss some results of a morphological study of edge-on galaxies, based on optical and especially near-infrared surface photometry. We find that the vertical surface brightness distributions of galaxies are fitted very well by exponential profiles, much better than by isothermal profiles.

  13. The Semiparametric Normal Variance-Mean Mixture Model

    DEFF Research Database (Denmark)

    Korsholm, Lars

    1997-01-01

    We discuss the normal variance-mean mixture model from a semi-parametric point of view, i.e. we let the mixing distribution belong to a nonparametric family. The main results are consistency of the nonparametric maximum likelihood estimator in this case, and construction of an asymptotically normal and efficient estimator.

  14. Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.

    Science.gov (United States)

    Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.

    1998-01-01

    The normality of number distributions generated by various random-number generators was studied, focusing on when, and at what sample size, a random-number generator reaches a normal distribution. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)
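
A minimal sketch in the spirit of such a study: a classic approximate generator (the sum of twelve uniforms, minus six) is checked against a normality test at a moderate sample size. The generator choice and sample size here are illustrative assumptions, not the study's design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def sum12(size):
    """Classic approximate normal generator: sum of 12 U(0,1) draws, minus 6."""
    return rng.random((size, 12)).sum(axis=1) - 6.0

sample = sum12(5000)
stat, p = stats.normaltest(sample)   # D'Agostino-Pearson omnibus test

# The first two moments already match the standard normal closely at this n;
# the tails, however, are hard-truncated at +/-6, so goodness-of-fit tests
# gain power against this generator as the sample size grows.
```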

  15. Group normalization for genomic data.

    Directory of Open Access Journals (Sweden)

    Mahmoud Ghandi

    Full Text Available Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  16. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions

    International Nuclear Information System (INIS)

    Hassanein, A.; Konkashbaev, I.

    1999-01-01

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature, edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide insight into the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters

  17. Partial LVAD restores ventricular outputs and normalizes LV but not RV stress distributions in the acutely failing heart in silico

    OpenAIRE

    Sack, Kevin L.; Baillargeon, Brian; Acevedo-Bolton, Gabriel; Genet, Martin; Rebelo, Nuno; Kuhl, Ellen; Klein, Liviu; Weiselthaler, Georg M.; Burkhoff, Daniel; Franz, Thomas; Guccione, Julius M.

    2016-01-01

    Purpose: Heart failure is a worldwide epidemic that is unlikely to change as the population ages and life expectancy increases. We sought to detail significant recent improvements to the Dassault Systèmes Living Heart Model (LHM) and use the LHM to compute left ventricular (LV) and right ventricular (RV) myofiber stress distributions under the following 4 conditions: (1) normal cardiac function; (2) acute left heart failure (ALHF); (3) ALHF treated using an LV assist device (LVAD) flow rate o...

  18. Computer modeling the boron compound factor in normal brain tissue

    International Nuclear Information System (INIS)

    Gavin, P.R.; Huiskamp, R.; Wheeler, F.J.; Griebenow, M.L.

    1993-01-01

    The macroscopic distribution of borocaptate sodium (Na2B12H11SH or BSH) in normal tissues has been determined and can be accurately predicted from the blood concentration. The compound para-borono-phenylalanine (p-BPA) has also been studied in dogs and normal tissue distribution has been determined. The total physical dose required to reach a biological isoeffect appears to increase directly as the proportion of boron capture dose increases. This effect, together with knowledge of the macrodistribution, led to estimates of the influence of the microdistribution of the BSH compound. This paper reports a computer model that was used to predict the compound factor for BSH and p-BPA and, hence, the equivalent radiation in normal tissues. The compound factor would need to be calculated for other compounds with different distributions. This information is needed to design appropriate normal tissue tolerance studies for different organ systems and/or different boron compounds

  19. Absolute quantification of pharmacokinetic distribution of RES colloids in individuals with normal liver function

    International Nuclear Information System (INIS)

    Herzog, H.; Spohr, G.; Notohamiprodjo, G.; Feinendegen, L.E.

    1987-01-01

    Estimates of the radiation dose resulting from liver-spleen scintigraphy with 99mTc-labelled colloids are based on pharmacokinetic data mainly determined in animals. The aim of this study was to check the pharmacokinetic data by direct, absolute in vivo quantification in man. Liver and spleen activities were directly measured using a double-energy-window technique. Activities in other organs were quantified by conjugate whole-body scans. All measurement procedures were checked using the whole-body Alderson phantom. Pharmacokinetic data for sulphur colloid, tin colloid, human serum albumin (HSA) millimicrospheres, and phytate were obtained in 13 to 20 normal subjects for each type of colloid. Depending on the colloid type, liver uptake was between 54 and 75% of the total administered dose (TAD) and spleen uptake was 3.5 to 21% TAD. Activity measured in blood, urine, lung and thyroid proved to be far from negligible. The results of this work suggest a correction of the animal-based data on colloid distribution and radiation dose on the basis of direct measurement of absolute uptake in man. (author)

  20. On the maximum entropy distributions of inherently positive nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Taavitsainen, A., E-mail: aapo.taavitsainen@gmail.com; Vanhanen, R.

    2017-05-11

    The multivariate log-normal distribution is used by many authors and statistical uncertainty propagation programs for inherently positive quantities. Sometimes it is claimed that the log-normal distribution results from the maximum entropy principle, if only means, covariances and inherent positiveness of quantities are known or assumed to be known. In this article we show that this is not true. Assuming a constant prior distribution, the maximum entropy distribution is in fact a truncated multivariate normal distribution – whenever it exists. However, its practical application to multidimensional cases is hindered by lack of a method to compute its location and scale parameters from means and covariances. Therefore, regardless of its theoretical disadvantage, use of other distributions seems to be a practical necessity. - Highlights: • Statistical uncertainty propagation requires a sampling distribution. • The objective distribution of inherently positive quantities is determined. • The objectivity is based on the maximum entropy principle. • The maximum entropy distribution is the truncated normal distribution. • Applicability of log-normal or normal distribution approximation is limited.
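
The calibration difficulty the authors highlight can be seen even in one dimension: log-normal parameters follow in closed form from a target mean and variance, whereas naively reusing those moments as the location and scale of a truncated normal does not reproduce them. The numbers below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
m, v = 1.0, 0.25               # target mean and variance of a positive quantity

# Log-normal parameters follow in closed form from the target moments
s2 = np.log(1.0 + v / m**2)
mu = np.log(m) - s2 / 2.0
lognormal = rng.lognormal(mu, np.sqrt(s2), 100_000)

# Truncated normal via rejection sampling, naively reusing (m, sqrt(v)) as
# location and scale: truncation shifts the mean above m, illustrating the
# location/scale calibration problem noted in the abstract.
draws = rng.normal(m, np.sqrt(v), 200_000)
truncated = draws[draws > 0.0]
```

In higher dimensions the same mismatch must be solved jointly for all location and scale parameters, which is the missing method the abstract refers to.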

  1. powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks

    Science.gov (United States)

    Murray, Steven G.

    2018-05-01

    powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
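
The core trick behind log-normal mocks can be sketched without powerbox's API (which additionally shapes the target power spectrum in Fourier space): a Gaussian field is exponentiated into a strictly positive log-normal field. The grid size and white-noise field below are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(7)
# White Gaussian field on a 2-D grid; powerbox imprints the requested
# power spectrum in Fourier space before this step.
gauss = rng.standard_normal((64, 64))

# Exponentiate to a strictly positive log-normal field, then form the
# zero-mean overdensity delta = field / <field> - 1.
field = np.exp(gauss - 0.5)        # E[exp(N(0,1) - 1/2)] = 1
delta = field / field.mean() - 1.0
```

Unlike a Gaussian overdensity, the log-normal `delta` can never fall below -1, which is what makes it a physically sensible stand-in for a galaxy density field.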

  2. Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Directory of Open Access Journals (Sweden)

    Amin Zadeh Sarraf

    2013-09-01

    Full Text Available Purpose: Numerous companies expect their knowledge management (KM) to be performed effectively in order to leverage and transform knowledge into competitive advantages. However, this raises the critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selecting among a number of externally determined alternatives through distance measures. The entropy method is often used for assessing weights in the TOPSIS method; entropy in information theory is a criterion used for measuring the amount of disorder represented by a discrete probability distribution. To reduce employees' resistance to implementing a new strategy, it seems necessary to take all managers' opinions into account. The normal distribution, the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, considering six criteria for evaluating the alternatives, the most appropriate KM strategy to implement in our company was "Personalization". Research limitations/implications: In this research there are some assumptions that might affect the accuracy of the approach, such as the normal distribution of the sample and population. These assumptions can be changed in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the presented solution, the opinions of all managers are gathered and normalized using the standard normal distribution and the central limit theorem. Keywords: knowledge management; strategy; TOPSIS; normal distribution; entropy
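
A compact sketch of the entropy-weighted TOPSIS ranking described above, with an invented 3-strategy by 3-criterion decision matrix; treating all criteria as benefit criteria is a simplifying assumption (real TOPSIS also handles cost criteria, and the paper's statistical normalization step is omitted here).

```python
import numpy as np

# Invented decision matrix: rows = KM strategies, columns = benefit criteria
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 7.0, 7.0],
              [6.0, 8.0, 9.0]])

# Entropy weighting: treat columns as probability distributions, weight by disorder
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1.0 - E) / (1.0 - E).sum()

# TOPSIS: vector-normalize, weight, measure distances to the ideal solutions
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti_ideal, axis=1)
closeness = d_neg / (d_pos + d_neg)
ranking = np.argsort(-closeness)   # index of the best strategy comes first
```

The closeness coefficient lies strictly between 0 and 1 whenever no alternative coincides with an ideal solution, and the alternative closest to the ideal (and farthest from the anti-ideal) ranks first.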

  3. Pharmacokinetics of tritiated water in normal and dietary-induced obese rats

    International Nuclear Information System (INIS)

    Shum, L.Y.; Jusko, W.J.

    1986-01-01

    Tritiated water disposition was characterized in normal and dietary-induced obese rats to assess pharmacokinetic concerns in calculating water space and estimating body fat. A monoexponential decline in serum tritium activity was observed in both groups of rats, thus facilitating use of various computational methods. The volume of distribution and the total clearance of tritium in obese rats were larger than in normal rats because of the increased body weight. The values of water space (volume of distribution) estimated from moment analysis or dose divided by serum tritium activity at time zero (extrapolated) or at 2 hr were all similar. Thus, obesity does not alter the distribution equilibrium time and distribution pattern of tritium, and the conventional 2-hr single blood sampling after intravenous injection is adequate to estimate the water space of normal and obese rats
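
The computational methods mentioned above can be sketched for a monoexponential decline: log-linear regression recovers the rate constant and the extrapolated time-zero activity, from which the water space and total clearance follow. All numbers below are simulated for illustration, not the study's data.

```python
import numpy as np

dose = 1.0e6                                  # administered activity, arbitrary units
t = np.array([2.0, 6.0, 12.0, 24.0, 48.0])   # sampling times, h
c = 4000.0 * np.exp(-0.03 * t)                # simulated monoexponential serum activity

# Log-linear regression recovers the elimination constant k and extrapolated C0
k, log_c0 = np.polyfit(t, np.log(c), 1)
c0 = np.exp(log_c0)
v_d = dose / c0                               # water space (volume of distribution)
clearance = v_d * (-k)                        # total clearance

# The single 2-h sample shortcut discussed in the abstract approximates v_d
# when little elimination has occurred by 2 h
v_d_2h = dose / c[0]
```

With noiseless monoexponential data the regression recovers the simulated parameters exactly, and the 2-h shortcut lands within a few percent of the extrapolated volume.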

  4. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    International Nuclear Information System (INIS)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu; Hirunuma, Rieko; Enomoto, Shuichi

    2001-01-01

    Because vanadium ions such as vanadyl (VO2+) and vanadate (VO3-) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS)-treatment on the tissue distribution of trace vanadium (48V) and zinc (65Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  5. Immunolocalization of transforming growth factor alpha in normal human tissues

    DEFF Research Database (Denmark)

    Christensen, M E; Poulsen, Steen Seier

    1996-01-01

    anchorage-independent growth of normal cells and was, therefore, considered an "oncogenic" growth factor. Later, its immunohistochemical presence in normal human cells as well as its biological effects in normal human tissues were demonstrated. The aim of the present investigation was to elucidate the distribution of the growth factor in a broad spectrum of normal human tissues. Indirect immunoenzymatic staining methods were used. The polypeptide was detected with a polyclonal as well as a monoclonal antibody. The polyclonal and monoclonal antibodies demonstrated almost identical immunoreactivity. TGF-alpha was found to be widely distributed in cells of normal human tissues derived from all three germ layers, most often in differentiated cells. In epithelial cells, three different staining patterns were observed: either diffuse cytoplasmic, cytoplasmic in the basal parts of the cells, or distinctly...

  6. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available This study deals with an analysis of data aimed at improving the quality of statistical tools in the assembly processes of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are more and more approaches to handling non-normal data. The appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution against the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study processes the measured data of an airbag assembly in detail and aims to obtain normally distributed data to which statistical process control can be applied. The results of the contribution conclude in rejection of the null hypothesis (the measured variables do not follow the normal distribution), so it is necessary to work on data transformation, supported by Minitab 15. Since even this approach does not yield normally distributed data, a procedure should be proposed that leads to quality output of the whole statistical control of manufacturing processes.
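
A minimal sketch of the workflow described above, a normality test followed by a transformation, using SciPy's D'Agostino-Pearson test and a Box-Cox transform in place of the StatGraphics/Minitab tooling; the skewed sample is simulated, not the airbag-assembly data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Simulated right-skewed process characteristic (clearly non-normal)
raw = rng.lognormal(mean=0.0, sigma=0.8, size=500)

p_raw = stats.normaltest(raw).pvalue          # D'Agostino-Pearson omnibus test
transformed, lam = stats.boxcox(raw)          # ML estimate of the Box-Cox lambda
p_trans = stats.normaltest(transformed).pvalue
```

For genuinely log-normal-like data the fitted lambda sits near zero (an approximate log transform) and normality is no longer rejected; as the abstract notes, real process data may resist even this step.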

  7. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    Energy Technology Data Exchange (ETDEWEB)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu [Department of Analytical and Bioinorganic Chemistry, Kyoto Pharmaceutical University, Kyoto (Japan); Hirunuma, Rieko; Enomoto, Shuichi [Radioisotope Technology Division, Cyclotron Center, Institute of Physical and Chemical Research (RIKEN), Wako, Saitama (Japan)

    2001-05-01

    Because vanadium ions such as vanadyl (VO2+) and vanadate (VO3-) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS)-treatment on the tissue distribution of trace vanadium (48V) and zinc (65Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  8. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet energy dissipation obeys approximate log-normality, robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
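
A small numerical illustration of the two headline points, near log-normality of dissipation and domination of the budgets by a few high-dissipation locations, using a synthetic log-normal sample with invented parameters.

```python
import numpy as np

rng = np.random.default_rng(11)
# Synthetic "dissipation" sample: exactly log-normal, parameters invented
eps = rng.lognormal(mean=-1.0, sigma=1.5, size=200_000)

# Departures from log-normality appear as nonzero skewness / excess kurtosis
# of log(eps); both vanish for an exact log-normal.
z = np.log(eps)
z = (z - z.mean()) / z.std()
skew = np.mean(z**3)
excess_kurt = np.mean(z**4) - 3.0

# A few high-dissipation locations dominate the total: share of the top 1%
top_share = np.sort(eps)[-len(eps) // 100:].sum() / eps.sum()
```

With a log-standard-deviation of 1.5, the top one percent of locations carries a large fraction of the total, which is why sparse observations can badly misestimate integrated budgets.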

  9. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important, since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In the analysis of fracture mechanics the Weibull distribution is widely used, and the derived Weibull modulus is interpreted as a material constant. ... Crushing strength data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated.
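
The contrast between the two candidate models can be sketched numerically: even for normally distributed strengths, a Weibull modulus can be extracted from the standard linearized plot, which is why an explicit distributional check matters. The data below are simulated, not the paper's tablet measurements.

```python
import numpy as np

rng = np.random.default_rng(9)
# Simulated crushing strengths (N), drawn from a normal law as the paper
# reports for tablet data
strength = np.sort(rng.normal(100.0, 8.0, 200))

# Weibull modulus m from the standard linearization ln(-ln(1-F)) = m ln(x) + c
F = (np.arange(1, len(strength) + 1) - 0.5) / len(strength)  # plotting positions
y = np.log(-np.log(1.0 - F))
m, c = np.polyfit(np.log(strength), y, 1)
```

Even these normal data yield a finite, plausible-looking Weibull modulus (roughly tracking the inverse coefficient of variation), so the fitted modulus alone does not establish Weibull-distributed strengths.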

  10. UPLC-MS method for quantification of pterostilbene and its application to comparative study of bioavailability and tissue distribution in normal and Lewis lung carcinoma bearing mice.

    Science.gov (United States)

    Deng, Li; Li, Yongzhi; Zhang, Xinshi; Chen, Bo; Deng, Yulin; Li, Yujuan

    2015-10-10

    A UPLC-MS method was developed for the determination of pterostilbene (PTS) in plasma and tissues of mice. PTS was separated on an Agilent Zorbax XDB-C18 column (50 × 2.1 mm, 1.8 μm) with a gradient mobile phase at a flow rate of 0.2 ml/min. Detection was performed by negative-ion electrospray ionization in multiple reaction monitoring mode. The calibration curves of PTS in mouse plasma and tissues were linear over 1.0 to 5000 and 0.50 to 500 ng/ml, respectively (r(2)>0.9979), with lower limits of quantification (LLOQ) between 0.5 and 2.0 ng/ml. The accuracy and precision of the assay were satisfactory. The validated method was applied to the study of bioavailability and tissue distribution of PTS in normal and Lewis lung carcinoma (LLC) bearing mice. The bioavailability of PTS (doses of 14, 28 and 56 mg/kg) in normal mice was 11.9%, 13.9% and 26.4%, respectively, and the maximum tissue level (82.1 ± 14.2 μg/g) was found in the stomach (dose 28 mg/kg). The bioavailability, peak concentration (Cmax) and time to peak concentration (Tmax) of PTS were increased in LLC mice compared with normal mice. The results indicate that the UPLC-MS method is reliable and that the bioavailability and tissue distribution of PTS in normal and LLC mice are dramatically different. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Spatial distribution of cannabinoid receptor type 1 (CB1) in normal canine central and peripheral nervous system.

    Directory of Open Access Journals (Sweden)

    Jessica Freundt-Revilla

    Full Text Available The endocannabinoid system is a regulatory pathway consisting of two main types of cannabinoid receptors (CB1 and CB2) and their endogenous ligands, the endocannabinoids. The CB1 receptor is highly expressed in the central and peripheral nervous systems (PNS) in mammals and is involved in neuromodulatory functions. Since endocannabinoids were shown to be elevated in the cerebrospinal fluid of epileptic dogs, knowledge about the species-specific CB receptor expression in the nervous system is required. Therefore, we assessed the spatial distribution of CB1 receptors in the normal canine CNS and PNS. Immunohistochemistry of several regions of the brain, spinal cord and peripheral nerves from a healthy four-week-old puppy, three six-month-old dogs, and one ten-year-old dog revealed strong dot-like immunoreactivity in the neuropil of the cerebral cortex, Cornu Ammonis (CA) and dentate gyrus of the hippocampus, midbrain, cerebellum, medulla oblongata and grey matter of the spinal cord. Dense CB1 expression was found in fibres of the globus pallidus and substantia nigra surrounding immunonegative neurons. Astrocytes were consistently positive in all examined regions. CB1 labelled neurons and satellite cells of the dorsal root ganglia, and myelinating Schwann cells in the PNS. These results demonstrate for the first time the spatial distribution of CB1 receptors in the healthy canine CNS and PNS. These results can be used as a basis for further studies aiming to elucidate the physiological consequences of this particular anatomical and cellular distribution.

  12. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first concerns the continuous distributions and their relations. The second presents the discrete distributions. The third depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.

  13. The rank of a normally distributed matrix and positive definiteness of a noncentral Wishart distributed matrix

    NARCIS (Netherlands)

    Steerneman, A. G. M.; van Perlo-ten Kleij, Frederieke

    2008-01-01

    If X ∼ N_{n×k}(M, I_n ⊗ Σ), then S = X′X has the noncentral Wishart distribution W′_k(n, Σ; Λ), where Λ = M′M. Here Σ is allowed to be singular. It is well known that if Λ = 0, then S has a (central) Wishart distribution and S is positive definite with…

  14. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is solved by penalizing the likelihood function. In the Bayesian framework, this amounts to incorporating an inverted gamma prior into the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically ensures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test.
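The penalized M-step described above can be sketched as MAP-EM for a univariate normal mixture, where an inverted-gamma prior on each component variance keeps the update strictly positive. The hyperparameter values, component count, and simulated data below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def penalized_em(x, k=2, alpha=2.0, beta=0.1, iters=200):
    """MAP-EM for a univariate k-component normal mixture.

    An inverted-gamma(alpha, beta) prior on each component variance keeps
    the M-step variance update bounded away from zero, removing the
    likelihood singularities (hyperparameters here are illustrative).
    """
    n = len(x)
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 1.0) / (k + 1.0))  # spread-out init
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        # M-step; the prior shifts the variance update to
        # (S_j + 2*beta) / (n_j + 2*(alpha + 1)), which is strictly positive.
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        sk = (r * (x[:, None] - mu) ** 2).sum(axis=0)
        var = (sk + 2.0 * beta) / (nk + 2.0 * (alpha + 1.0))
    return w, mu, var

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 0.5, 500)])
weights, means, variances = penalized_em(data)
```

Even if a component shrinks onto a single point, the `2*beta` term in the numerator prevents its variance estimate from collapsing to zero.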

  15. X-ray emission from normal galaxies

    International Nuclear Information System (INIS)

    Speybroeck, L. van; Bechtold, J.

    1981-01-01

    A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)

  16. The clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast

    International Nuclear Information System (INIS)

    Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung

    2006-01-01

    We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From April 2003 to February 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue and isoechoic subcutaneous fat tissue) at the sites of clinical concern, any change in imaging and/or physical examination results at follow-up, and biopsy results where available. This study period was chosen to allow a follow-up period of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years), and 62 of the 107 patients (58%) were between 41 and 50 years old. The most common location of the palpable abnormalities was the upper outer portion of the breast (45%), and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed a similar distribution across all types of sonographic echo patterns. Twenty-three patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that unnecessary biopsies can be avoided in women with palpable abnormalities when both mammography and ultrasonography show normal tissue, but this study was limited by its small sample size. Therefore, a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings.

  17. Data Normalization to Accelerate Training for Linear Neural Net to Predict Tropical Cyclone Tracks

    Directory of Open Access Journals (Sweden)

    Jian Jin

    2015-01-01

    Full Text Available When a pure linear neural network (PLNN) is used to predict tropical cyclone tracks (TCTs) in the South China Sea, whether the data are normalized or not greatly affects the training process. In this paper, the min.-max. method and the normal distribution method, instead of the standard normal distribution, are applied to TCT data before modeling. We propose experimental schemes in which, with the min.-max. method, the min.-max. value pair of each variable is mapped to (−1, 1) and (0, 1); with the normal distribution method, each variable's mean and standard deviation pair is set to (0, 1) and (100, 1). We present the following results: (1) data scaled to similar intervals have similar effects, no matter whether the min.-max. or the normal distribution method is used; (2) mapping data to around 0 gains much faster training speed than mapping them to intervals far away from 0 or using unnormalized raw data, although all of them can approach the same lower error level after a certain number of steps, as seen from their training error curves. This could be useful for deciding on a data normalization method when a PLNN is used individually.
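The four normalization schemes compared above can be sketched as column-wise transforms; the sample data below are an assumption for illustration:

```python
import numpy as np

def min_max(x, lo=-1.0, hi=1.0):
    """Map each column's (min, max) to (lo, hi)."""
    xmin, xmax = x.min(axis=0), x.max(axis=0)
    return lo + (x - xmin) * (hi - lo) / (xmax - xmin)

def to_normal(x, mean=0.0, std=1.0):
    """Shift and scale each column to the given mean and standard deviation."""
    return mean + std * (x - x.mean(axis=0)) / x.std(axis=0)

rng = np.random.default_rng(0)
raw = rng.uniform(100, 1000, size=(50, 3))  # unnormalized TCT-like variables

a = min_max(raw)                 # min.-max. mapped to (-1, 1)
b = min_max(raw, 0.0, 1.0)       # min.-max. mapped to (0, 1)
c = to_normal(raw)               # mean/std pair set to (0, 1)
d = to_normal(raw, 100.0, 1.0)   # mean/std pair set to (100, 1)
```

Schemes `a`, `b` and `c` place the data around 0, while `d` reproduces the far-from-0 case that trains more slowly in the study.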

  18. Distribution of age at menopause in two Danish samples

    DEFF Research Database (Denmark)

    Boldsen, J L; Jeune, B

    1990-01-01

    We analyzed the distribution of reported age at natural menopause in two random samples of Danish women (n = 176 and n = 150) to determine the shape of the distribution and to disclose any possible trends in the distribution parameters. It was necessary to correct the frequencies of the reported ages for the effect of differing ages at reporting. The corrected distribution of age at menopause differs from the normal distribution in the same way in both samples. Both distributions could be described by a mixture of two normal distributions. It appears that most of the parameters of the normal distribution mixtures remain unchanged over a 50-year time lag. The position of the distribution, that is, the mean age at menopause, however, increases slightly but significantly.

  19. Subchondral bone density distribution of the talus in clinically normal Labrador Retrievers.

    Science.gov (United States)

    Dingemanse, W; Müller-Gerbl, M; Jonkers, I; Vander Sloten, J; van Bree, H; Gielen, I

    2016-03-15

    Bones continually adapt their morphology to their load-bearing function. At the level of the subchondral bone, the density distribution is highly correlated with the loading distribution of the joint; therefore, subchondral bone density distribution can be used to study joint biomechanics non-invasively. In addition, physiological and pathological joint loading is an important aspect of orthopaedic disease, and research focusing on joint biomechanics will benefit veterinary orthopaedics. This study was conducted to evaluate the density distribution in the subchondral bone of the canine talus, as a parameter reflecting long-term joint loading in the tarsocrural joint. Two main density maxima were found, one proximally on the medial trochlear ridge and one distally on the lateral trochlear ridge. All joints showed very similar density distribution patterns, and no significant differences were found in the localisation of the density maxima between left and right limbs or between dogs. Based on the density distribution, the lateral trochlear ridge is most likely subjected to the highest loads within the tarsocrural joint. The joint loading distribution is very similar between dogs of the same breed. In addition, the joint loading distribution supports previous suggestions of the important role of biomechanics in the development of OC lesions in the tarsus. Important benefits of computed tomographic osteoabsorptiometry (CTOAM), i.e. the possibility of in vivo imaging and temporal evaluation, make this technique a valuable addition to the field of veterinary orthopaedic research.

  20. Vibrational Spectra and Potential Energy Distributions of Normal Modes of N,N'-Ethylenebis(p-toluenesulfonamide)

    International Nuclear Information System (INIS)

    Alyar, S.

    2008-01-01

    N-substituted sulfonamides are well known for their diuretic, antidiabetic, antibacterial, antifungal and anticancer activities, and are widely used in the therapy of patients. These important bioactive properties are strongly affected by the special features of the -CH2-SO2-NR- linker and intramolecular motion. Thus, studies of the energetic and spatial properties of N-substituted sulfonamides are of great importance to improve our understanding of their biological activities and enhance our ability to predict new drugs. Density functional theory at the B3LYP/6-31G(d,p) level has been applied to obtain the vibrational force field for the most stable conformation of N,N'-ethylenebis(p-toluenesulfonamide) (ptsen), which has a sulfonamide moiety. The results of these calculations have been compared with spectroscopic data to verify the accuracy of the calculation and the applicability of the DFT approach to ptsen. Additionally, complete normal coordinate analyses with scaled quantum mechanical (SQM) force fields were performed to derive the potential energy distributions (PED).

  1. Multivariate phase type distributions - Applications and parameter estimation

    DEFF Research Database (Denmark)

    Meisch, David

    The best known univariate probability distribution is the normal distribution. It is used throughout the literature in a broad field of applications. In cases where it is not sensible to use the normal distribution, alternative distributions are at hand and well understood, many of these belonging… and statistical inference, is the multivariate normal distribution. Unfortunately, little is known about the general class of multivariate phase type distributions. Considering the results concerning parameter estimation and inference theory for univariate phase type distributions, the class of multivariate… projects and depend on reliable cost estimates. The Successive Principle is a group analysis method primarily used for analyzing medium to large projects in relation to cost or duration. We believe that the mathematical modeling used in the Successive Principle can be improved. We suggested a novel…

  2. Normal Anti-Invariant Submanifolds of Paraquaternionic Kähler Manifolds

    Directory of Open Access Journals (Sweden)

    Novac-Claudiu Chiriac

    2006-12-01

    Full Text Available We introduce normal anti-invariant submanifolds of paraquaternionic Kähler manifolds and study the geometric structures induced on them. We obtain necessary and sufficient conditions for the integrability of the distributions defined on a normal anti-invariant submanifold. Also, we present characterizations of local (global) anti-invariant products.

  3. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model.

    Directory of Open Access Journals (Sweden)

    Habib Baghirov

    Full Text Available The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma.

  4. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model

    Science.gov (United States)

    Baghirov, Habib; Snipstad, Sofie; Sulheim, Einar; Berg, Sigrid; Hansen, Rune; Thorsen, Frits; Mørch, Yrr; Åslund, Andreas K. O.

    2018-01-01

    The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma. PMID:29338016

  5. Comparison of the procedures of Fleishman and Ramberg et al. for generating non-normal data in simulation studies

    Directory of Open Access Journals (Sweden)

    Rebecca Bendayan

    2014-01-01

    Full Text Available Simulation techniques must be able to generate the types of distributions most commonly encountered in real data, for example, non-normal distributions. Two recognized procedures for generating non-normal data are Fleishman's linear transformation method and the method proposed by Ramberg et al., which is based on a generalization of the Tukey lambda distribution. This study compares these procedures in terms of the extent to which the distributions they generate fit their respective theoretical models, and it also examines the number of simulations needed to achieve this fit. To this end, the paper considers, in addition to the normal distribution, a series of non-normal distributions that are commonly found in real data, and then analyses fit according to the extent to which normality is violated and the number of simulations performed. The results show that the two data generation procedures behave similarly. As the degree of contamination of the theoretical distribution increases, so does the number of simulations required to ensure a good fit to the generated data. The two procedures generate more accurate normal and non-normal distributions when at least 7000 simulations are performed, although when the degree of contamination is severe (with values of skewness and kurtosis of 2 and 6, respectively) it is advisable to perform 15000 simulations.
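The Ramberg et al. approach generates non-normal data by inverse transform from a generalized (Tukey) lambda quantile function. A minimal sketch; the λ parameter values below are illustrative assumptions, not those used in the study:

```python
import numpy as np

def rs_lambda_sample(n, lam1, lam2, lam3, lam4, seed=0):
    """Ramberg-Schmeiser generalized lambda sampler via inverse transform:
    Q(u) = lam1 + (u**lam3 - (1 - u)**lam4) / lam2, with u ~ Uniform(0, 1).
    """
    u = np.random.default_rng(seed).uniform(size=n)
    return lam1 + (u ** lam3 - (1.0 - u) ** lam4) / lam2

# Illustrative (assumed) parameter values giving an asymmetric shape.
x = rs_lambda_sample(100_000, lam1=0.0, lam2=0.2, lam3=0.1, lam4=0.01)
```

With `lam2 > 0` and positive shape parameters the quantile function is strictly increasing, so it defines a valid distribution; varying λ3 and λ4 independently tunes the skewness and tail weight of the generated data.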

  6. On the generation of log-Levy distributions and extreme randomness

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2011-01-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Levy distributions. The log-Levy distributions are the Levy counterparts of the log-normal distribution; they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Levy distributions emerge universally: the former in the case of a deterministic underlying setting, and the latter in the case of a stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot's extreme randomness. (paper)
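The emergence of log-normality from multiplicative processes plus the CLT can be illustrated with a short simulation; the uniform factors and sample sizes are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Multiplicative growth: a product of many positive i.i.d. factors.
# Its logarithm is a sum of i.i.d. terms, so by the CLT the product is
# approximately log-normally distributed.
factors = rng.uniform(0.5, 1.5, size=(20_000, 200))
products = factors.prod(axis=1)
logs = np.log(products)

# Moment checks: for a normal distribution, both should be near zero.
z = (logs - logs.mean()) / logs.std()
log_skew = (z ** 3).mean()
log_excess_kurt = (z ** 4).mean() - 3.0
```

The same construction with heavy-tailed log-factors (outside the CLT's finite-variance setting) would instead push the product toward a log-Levy law.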

  7. Log-normality of indoor radon data in the Walloon region of Belgium

    International Nuclear Information System (INIS)

    Cinelli, Giorgia; Tondeur, François

    2015-01-01

    The deviations of the distribution of Belgian indoor radon data from the log-normal trend are examined. Simulated data are generated to provide a theoretical frame for understanding these deviations. It is shown that the 3-component structure of indoor radon (radon from subsoil, outdoor air and building materials) generates deviations in the low- and high-concentration tails, but this low-C trend can be almost completely compensated by the effect of measurement uncertainties and by possible small errors in background subtraction. The predicted low-C and high-C deviations are well observed in the Belgian data, when considering the global distribution of all data. The agreement with the log-normal model is improved when considering data organised in homogeneous geological groups. As the deviation from log-normality is often due to the low-C tail for which there is no interest, it is proposed to use the log-normal fit limited to the high-C half of the distribution. With this prescription, the vast majority of the geological groups of data are compatible with the log-normal model, the remaining deviations being mostly due to a few outliers, and rarely to a “fat tail”. With very few exceptions, the log-normal modelling of the high-concentration part of indoor radon data is expected to give reasonable results, provided that the data are organised in homogeneous geological groups. - Highlights: • Deviations of the distribution of Belgian indoor Rn data from the log-normal trend. • 3-component structure of indoor Rn: subsoil, outdoor air and building materials. • Simulated data generated to provide a theoretical frame for understanding deviations. • Data organised in homogeneous geological groups; better agreement with the log-normal
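The proposed prescription of fitting only the high-concentration part of the distribution can be sketched by matching upper quantiles of a log-normal; the quantile choices and the simulated radon-like data below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def fit_lognormal_upper(x):
    """Estimate (mu, sigma) of a log-normal from the upper half only,
    by quantile matching: for ln X ~ N(mu, sigma), ln q_p = mu + sigma*z_p,
    so ln q50 = mu and ln q90 = mu + 1.2816*sigma."""
    q50, q90 = np.quantile(x, [0.5, 0.9])
    mu = np.log(q50)
    sigma = (np.log(q90) - mu) / 1.2815515655446004  # z at p = 0.9
    return mu, sigma

rng = np.random.default_rng(0)
# Simulated indoor-radon-like concentrations (parameters assumed).
radon_like = rng.lognormal(mean=4.0, sigma=0.7, size=20_000)
mu_hat, sigma_hat = fit_lognormal_upper(radon_like)
```

Because only quantiles at and above the median enter the fit, deviations confined to the low-concentration tail (measurement uncertainty, background-subtraction errors) do not bias the estimates.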

  8. The distribution of interlaboratory comparison data

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    2008-01-01

    The distribution of mutually consistent results from interlaboratory comparisons is expected to be leptokurtic, and readers are warned against accepting conclusions based on simulations assuming normality.
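A leptokurtic distribution has positive excess kurtosis, i.e. heavier tails than the normal. A quick numerical illustration; the Laplace sample is an illustrative stand-in for heavy-tailed comparison data, not the interlaboratory data of the paper:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis; approximately 0 for normally distributed data."""
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

rng = np.random.default_rng(0)
gauss = rng.normal(size=100_000)
laplace = rng.laplace(size=100_000)  # leptokurtic: excess kurtosis = 3

k_gauss = excess_kurtosis(gauss)
k_laplace = excess_kurtosis(laplace)
```

A simulation that draws from `gauss` instead of a heavy-tailed model will understate the frequency of extreme deviations, which is exactly the warning the abstract makes.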

  9. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2013-01-01

    Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but an abortive colony that fails to continue to grow remains poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on log-linear or log-log plots. By applying a simple model of branching processes to the linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit demonstrated better performance in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components in the early (probability over 5 generations, whereas the abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
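The branching-process picture of colony growth with reproductive cell death can be sketched as a Galton-Watson simulation; the death probability, generation count, and number of colonies below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def grow_colony(p_death, generations, rng):
    """Galton-Watson-style branching: at each generation every cell either
    dies (probability p_death) or divides into two (probability 1 - p_death).
    Returns the final cell count."""
    cells = 1
    for _ in range(generations):
        if cells == 0:
            break
        divides = rng.binomial(cells, 1.0 - p_death)
        cells = 2 * divides  # dying cells leave no offspring
    return cells

rng = np.random.default_rng(0)
# Illustrative (assumed) per-generation reproductive-cell-death probability.
sizes = np.array([grow_colony(0.2, 10, rng) for _ in range(5_000)])

abortive_fraction = np.mean(sizes <= 15)   # the study's <= 15-cell cutoff
surviving_fraction = 1.0 - abortive_fraction
```

Raising `p_death` (mimicking a higher dose) increases the abortive fraction, reproducing the qualitative dose dependence described in the abstract.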

  10. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution.

    Science.gov (United States)

    Lo, Kenneth; Gottardo, Raphael

    2012-01-01

    Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, this provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components.
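The Box-Cox transformation used above introduces skewness handling by power-transforming the data; a minimal sketch of the transform itself, where the log-normal test data and the λ = 0 choice are illustrative assumptions:

```python
import numpy as np

def box_cox(x, lmbda):
    """Box-Cox transform: (x**lmbda - 1) / lmbda for lmbda != 0, log(x) at 0.
    Defined for positive x."""
    x = np.asarray(x, dtype=float)
    if lmbda == 0:
        return np.log(x)
    return (x ** lmbda - 1.0) / lmbda

rng = np.random.default_rng(0)
skewed = rng.lognormal(mean=0.0, sigma=0.5, size=50_000)
# lmbda = 0 maps log-normal data to normal data, removing the skew.
transformed = box_cox(skewed, 0.0)

raw_skew = ((skewed - skewed.mean()) ** 3).mean() / skewed.std() ** 3
new_skew = ((transformed - transformed.mean()) ** 3).mean() / transformed.std() ** 3
```

In the mixture model described above, λ is estimated per component alongside the t-distribution parameters, so each cluster can be symmetrized with its own transform.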

  11. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in the social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.

  12. A NEW STATISTICAL PERSPECTIVE TO THE COSMIC VOID DISTRIBUTION

    International Nuclear Information System (INIS)

    Pycke, J-R; Russell, E.

    2016-01-01

    In this study, we obtain the size distribution of voids as a three-parameter redshift-independent log-normal void probability function (VPF) directly from the Cosmic Void Catalog (CVC). Although many statistical models of void distributions are based on the counts in randomly placed cells, the log-normal VPF that we obtain here is independent of the shape of the voids due to the parameter-free void finder of the CVC. We use three void populations drawn from the CVC generated by the Halo Occupation Distribution (HOD) Mocks, which are tuned to three mock SDSS samples, to investigate the void distribution statistically and to examine the effects of the environments on the size distribution. As a result, it is shown that the void size distributions obtained from the HOD Mock samples are well described by the three-parameter log-normal distribution. In addition, we find that there may be a relation between the hierarchical formation, skewness, and kurtosis of the log-normal distribution for each catalog. We also show that the shape of the three-parameter distribution from the samples is strikingly similar to the galaxy log-normal mass distribution obtained from numerical studies. This similarity between the void size and galaxy mass distributions may indicate evidence of nonlinear mechanisms affecting both voids and galaxies, such as large-scale accretion and tidal effects. Considering the fact that in this study all voids are generated by galaxy mocks and show hierarchical structures at different levels, it may be possible that the same nonlinear mechanisms of mass distribution affect the void size distribution.
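For reference, the skewness and excess kurtosis of a log-normal distribution have closed forms that depend only on the shape parameter σ, which is what makes them useful summary statistics for comparing catalogs. A small sketch with illustrative σ values:

```python
import numpy as np

def lognormal_skew_kurt(sigma):
    """Closed-form skewness and excess kurtosis of a log-normal distribution.
    Both depend only on sigma (not on mu)."""
    w = np.exp(sigma ** 2)
    skew = (w + 2.0) * np.sqrt(w - 1.0)
    excess_kurt = w ** 4 + 2.0 * w ** 3 + 3.0 * w ** 2 - 6.0
    return skew, excess_kurt

# Illustrative shape parameter; both statistics grow rapidly with sigma.
s07, k07 = lognormal_skew_kurt(0.7)
```

For small σ these reduce to skewness ≈ 3σ and excess kurtosis ≈ 16σ², so a nearly symmetric fitted VPF implies a small shape parameter.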

  13. STOCHASTIC PRICING MODEL FOR THE REAL ESTATE MARKET: FORMATION OF LOG-NORMAL GENERAL POPULATION

    Directory of Open Access Journals (Sweden)

    Oleg V. Rusakov

    2015-01-01

    Full Text Available We construct a stochastic model of real estate pricing. The method of price construction is based on a sequential comparison of the supply prices. We prove that under standard assumptions imposed upon the comparison coefficients there exists a unique non-degenerate limit in distribution, and this limit has a log-normal distribution. We verify the accordance of empirical price distributions with the theoretically obtained log-normal distribution using extensive statistical data on real estate prices from Saint-Petersburg (Russia). For establishing this accordance we apply the efficient and sensitive Kolmogorov-Smirnov goodness-of-fit test. Basing on “The Russian Federal Estimation Standard N2”, we conclude that the most probable price, i.e. the mode of the distribution, is correctly and uniquely defined under the log-normal approximation. Since the mean value of a log-normal distribution exceeds the mode (the most probable value), it follows that prices valued by the mathematical expectation are systematically overstated.
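The closing observation, that the mean of a log-normal exceeds its mode, follows directly from the closed-form expressions for the mode, median, and mean; a small sketch with illustrative (assumed) parameters:

```python
import numpy as np

mu, sigma = 0.5, 0.6  # illustrative log-scale location and shape parameters

mode = np.exp(mu - sigma ** 2)         # most probable price
median = np.exp(mu)
mean = np.exp(mu + sigma ** 2 / 2.0)   # exceeds mode and median for sigma > 0

# Empirical check against simulated "prices".
rng = np.random.default_rng(0)
prices = rng.lognormal(mu, sigma, size=200_000)
```

The ordering mean > median > mode holds for every σ > 0, which is why valuing by the mathematical expectation systematically overstates the most probable price.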

  14. Fluid Distribution Pattern in Adult-Onset Congenital, Idiopathic, and Secondary Normal-Pressure Hydrocephalus: Implications for Clinical Care.

    Science.gov (United States)

    Yamada, Shigeki; Ishikawa, Masatsune; Yamamoto, Kazuo

    2017-01-01

    Despite growing evidence on idiopathic normal-pressure hydrocephalus (NPH), clinical care for idiopathic NPH remains controversial. This continuing divergence of viewpoints may be due to confusion between the classifications of idiopathic and adult-onset congenital NPH. To clarify the classification of NPH, we propose that adult-onset congenital NPH be explicitly distinguished from idiopathic and secondary NPH. On the basis of conventional CT or MRI, idiopathic NPH was defined as narrow sulci at the high convexity concurrent with enlargement of the ventricles, basal cistern, and Sylvian fissure, whereas adult-onset congenital NPH was defined as huge ventricles without high-convexity tightness. We compared clinical characteristics and cerebrospinal fluid distribution among 85 patients diagnosed with idiopathic NPH, 17 patients with secondary NPH, and 7 patients with adult-onset congenital NPH. All patients underwent 3-T MRI examinations and tap tests. The volumes of the ventricles and subarachnoid spaces were measured on a 3D workstation using T2-weighted 3D sequences. The mean intracranial volume of the patients with adult-onset congenital NPH was almost 100 mL larger than the volumes of the patients with idiopathic and secondary NPH. Compared with the patients with idiopathic or secondary NPH, patients with adult-onset congenital NPH exhibited larger ventricles but normal-sized subarachnoid spaces. The mean volume ratio of the high-convexity subarachnoid space was significantly smaller in idiopathic NPH than in adult-onset congenital NPH, whereas the mean volume ratio of the basal cistern and Sylvian fissure in idiopathic NPH was more than twice that in adult-onset congenital NPH. The symptoms of gait disturbance, cognitive impairment, and urinary incontinence tended to progress more slowly in patients with adult-onset congenital NPH than in patients with idiopathic NPH. Cerebrospinal fluid distributions and

  15. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
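The workflow of fitting candidate distributions and applying the W test can be sketched on synthetic data (the surveillance measurements themselves are not reproduced; the log-normal choice below is an assumption for illustration):

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for one nuclide-medium-year-location data group,
# assumed log-normal here purely for illustration.
data = stats.lognorm.rvs(0.8, scale=1.0, size=500,
                         random_state=np.random.default_rng(2))

# Shapiro-Wilk W test: the raw values should fail normality,
# while their logarithms should look much more normal.
w_raw, p_raw = stats.shapiro(data)
w_log, p_log = stats.shapiro(np.log(data))

# Maximum-likelihood fits of the candidate distributions from the study.
norm_params = stats.norm.fit(data)
lognorm_params = stats.lognorm.fit(data, floc=0)
weibull_params = stats.weibull_min.fit(data, floc=0)
print(p_raw, p_log)
```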

  16. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2010-12-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.

  17. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Genton, Marc G.

    2010-01-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.

  18. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. Continuous exposure at this limit would correspond to an annual death rate comparable with 'high-risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report, UNSCEAR defined a reference dose distribution for purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality normally observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an ''effective dose limit'', and such a model fits the observed departure from log-normality in actual dose distributions very well. (author)
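One way to realize a log-normal-like distribution with an ''effective dose limit'' L (an illustrative assumption, not necessarily the report's exact parameterization) is to take y = dL/(L - d) as log-normal, which keeps every dose below L while leaving low doses nearly log-normal:

```python
import numpy as np
from scipy import stats

# Hedged sketch of a bounded dose model: assume the transformed variable
# y = d*L/(L - d) is log-normal, so d = L*y/(L + y) never reaches L.
# The functional form and all numbers are assumptions for illustration.
L = 50.0                               # effective annual dose limit, mSv
y = stats.lognorm.rvs(1.0, scale=5.0, size=10_000,
                      random_state=np.random.default_rng(3))
d = L * y / (L + y)

# No simulated worker exceeds the limit; low doses (y << L) satisfy
# d ~ y, so the lower tail stays approximately log-normal.
print(d.max(), d.mean())
```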

  19. Combining counts and incidence data: an efficient approach for estimating the log-normal species abundance distribution and diversity indices.

    Science.gov (United States)

    Bellier, Edwige; Grøtan, Vidar; Engen, Steinar; Schartau, Ann Kristin; Diserud, Ola H; Finstad, Anders G

    2012-10-01

    Obtaining accurate estimates of diversity indices is difficult because the number of species encountered in a sample increases with sampling intensity. We introduce a novel method that requires that the presence of species in a sample to be assessed while the counts of the number of individuals per species are only required for just a small part of the sample. To account for species included as incidence data in the species abundance distribution, we modify the likelihood function of the classical Poisson log-normal distribution. Using simulated community assemblages, we contrast diversity estimates based on a community sample, a subsample randomly extracted from the community sample, and a mixture sample where incidence data are added to a subsample. We show that the mixture sampling approach provides more accurate estimates than the subsample and at little extra cost. Diversity indices estimated from a freshwater zooplankton community sampled using the mixture approach show the same pattern of results as the simulation study. Our method efficiently increases the accuracy of diversity estimates and comprehension of the left tail of the species abundance distribution. We show how to choose the scale of sample size needed for a compromise between information gained, accuracy of the estimates and cost expended when assessing biological diversity. The sample size estimates are obtained from key community characteristics, such as the expected number of species in the community, the expected number of individuals in a sample and the evenness of the community.
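The classical Poisson log-normal likelihood that the authors modify can be sketched by numerically integrating over the latent log-normal abundance (the parameter values are illustrative assumptions):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def poisson_lognormal_pmf(n, mu, sigma):
    """P(N = n) when N | lam ~ Poisson(lam) and log(lam) ~ Normal(mu, sigma)."""
    integrand = lambda lam: (stats.poisson.pmf(n, lam)
                             * stats.lognorm.pdf(lam, sigma, scale=np.exp(mu)))
    value, _ = quad(integrand, 0, np.inf)
    return value

# The pmf over a generous range of counts should sum to ~1.
probs = [poisson_lognormal_pmf(n, mu=0.0, sigma=1.0) for n in range(200)]
total = sum(probs)
print(total)
```

The paper's modification replaces this count likelihood by a presence/absence term (1 minus the zero-class probability) for species recorded only as incidence data.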

  20. Distribution Functions of Sizes and Fluxes Determined from Supra-Arcade Downflows

    Science.gov (United States)

    McKenzie, D.; Savage, S.

    2011-01-01

    The frequency distributions of sizes and fluxes of supra-arcade downflows (SADs) provide information about the process of their creation. For example, a fractal creation process may be expected to yield a power-law distribution of sizes and/or fluxes. We examine 120 cross-sectional areas and magnetic flux estimates found by Savage & McKenzie for SADs, and find that (1) the areas are consistent with a log-normal distribution and (2) the fluxes are consistent with both a log-normal and an exponential distribution. Neither set of measurements is compatible with a power-law distribution nor a normal distribution. As a demonstration of the applicability of these findings to improved understanding of reconnection, we consider a simple SAD growth scenario with minimal assumptions, capable of producing a log-normal distribution.
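Distinguishing a log-normal from a power-law (Pareto) fit can be sketched with maximum-likelihood fits and Kolmogorov-Smirnov statistics on synthetic areas standing in for the 120 SAD measurements (all numbers are assumptions):

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for SAD cross-sectional areas (the real data are the
# 120 Savage & McKenzie measurements, not reproduced here).
areas = stats.lognorm.rvs(0.9, scale=5.0, size=120,
                          random_state=np.random.default_rng(4))

def ks_for(dist, data, **fit_kwargs):
    """Fit a distribution by maximum likelihood and return its KS statistic."""
    params = dist.fit(data, **fit_kwargs)
    return stats.kstest(data, dist.name, args=params).statistic

ks_lognorm = ks_for(stats.lognorm, areas, floc=0)
ks_pareto = ks_for(stats.pareto, areas, floc=0, fscale=areas.min())
print(ks_lognorm, ks_pareto)
```

For log-normal-like data the Pareto fit should show a markedly larger KS statistic, mirroring the paper's rejection of the power law.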

  1. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000), based on four independent metrics: sales or revenues, profits, assets, and market value. Each of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% for sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of the firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee are well fitted by quasi-exponential curves for the four metrics.
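A two-class sample and a Hill-type maximum-likelihood estimate of the Pareto tail exponent can be sketched as follows (the mixture weights, scales, and cutoff are illustrative assumptions, not the G-2000 values):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic per-employee "wealth": exponential bulk plus Pareto tail,
# mimicking the two-class structure described for the G-2000 data.
bulk = rng.exponential(scale=1.0, size=670)      # quasi-exponential class
tail = (rng.pareto(a=1.5, size=330) + 1) * 3.0   # classical Pareto, xmin = 3
wealth = np.concatenate([bulk, tail])

# Hill-type maximum-likelihood estimate of the Pareto exponent above
# a chosen cutoff xmin (an assumed threshold).
xmin = 3.0
top = wealth[wealth >= xmin]
alpha_hat = len(top) / np.sum(np.log(top / xmin))
print(alpha_hat)
```

Note that `(rng.pareto(a) + 1) * m` is the documented NumPy recipe for drawing a classical Pareto sample with scale m from the Lomax generator.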

  2. A new normalization method based on electrical field lines for electrical capacitance tomography

    International Nuclear Information System (INIS)

    Zhang, L F; Wang, H X

    2009-01-01

    Electrical capacitance tomography (ECT) is considered to be one of the most promising process tomography techniques. Image reconstruction for ECT is an inverse problem: finding the spatially distributed permittivities in a pipe. Usually, the capacitance measurements obtained from the ECT system are normalized between the high- and low-permittivity calibration values for image reconstruction. The parallel normalization model is commonly used during the normalization process; it assumes the materials are distributed in parallel, so the normalized capacitance is a linear function of the measured capacitance. A more recently used model is the series normalization model, which makes the normalized capacitance a nonlinear function of the measured capacitance. The newest model is based on electrical field centre lines (EFCL) and is a mixture of the two normalization models. A multi-threshold method for this model is presented in this paper. Sensitivity matrices based on the different normalization models were obtained, and image reconstruction was carried out accordingly. Simulation results indicate that reconstructed images of higher quality can be obtained with the presented model
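The parallel and series models can be written down directly; a minimal sketch with illustrative calibration values (the formulas are the standard ECT normalization models; the numbers are assumptions):

```python
def normalize_parallel(cm, cl, ch):
    # Parallel model: materials assumed in parallel, so the normalized
    # capacitance is a linear function of the measured capacitance cm.
    return (cm - cl) / (ch - cl)

def normalize_series(cm, cl, ch):
    # Series model: materials assumed in series, so the normalization is
    # linear in the reciprocal capacitances, i.e. nonlinear in cm itself.
    return (1.0 / cm - 1.0 / cl) / (1.0 / ch - 1.0 / cl)

cl, ch = 2.0, 10.0   # low/high-permittivity calibration capacitances (assumed)
cm = 6.0             # one measured capacitance, illustrative value
print(normalize_parallel(cm, cl, ch), normalize_series(cm, cl, ch))
```

Both models map cl to 0 and ch to 1; they disagree in between, which is why the choice of model changes the sensitivity matrix and the reconstructed image.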

  3. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...

  4. Bimodal distribution of the magnetic dipole moment in nanoparticles with a monomodal distribution of the physical size

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2015-01-01

    High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal distribution of the magnetic dipole moment. Here, we test this assumption for different types of superparamagnetic iron oxide nanoparticles in the 5–20 nm range, by multimodal fitting of magnetization curves using the MINORIM inversion method. The particles are studied while in dilute colloidal dispersion in a liquid, thereby preventing hysteresis and diminishing the effects of magnetic anisotropy on the interpretation of the magnetization curves. For two different types of well crystallized particles, the magnetic distribution is indeed log-normal, as expected from the physical size distribution. However, two other types of particles, with twinning defects or inhomogeneous oxide phases, are found to have a bimodal magnetic distribution. Our qualitative explanation is that relatively low fields are sufficient to begin aligning the particles in the liquid on the basis of their net dipole moment, whereas higher fields are required to align the smaller domains or less magnetic phases inside the particles. - Highlights: • Multimodal fits of dilute ferrofluids reveal when the particles are multidomain. • No a priori shape of the distribution is assumed by the MINORIM inversion method. • Well crystallized particles have log-normal TEM and magnetic size distributions. • Defective particles can combine a monomodal size and a bimodal dipole moment

  5. Neutron and PIMC determination of the longitudinal momentum distribution of HCP, BCC and normal liquid 4He

    International Nuclear Information System (INIS)

    Blasdell, R.C.; Ceperley, D.M.; Simmons, R.O.

    1993-07-01

    Deep inelastic neutron scattering has been used to measure the neutron Compton profile (NCP) of a series of condensed 4He samples at densities from 28.8 atoms/nm3 (essentially the minimum possible density in the solid phase) up to 39.8 atoms/nm3, using a chopper spectrometer at the Argonne National Laboratory Intense Pulsed Neutron Source. At the lowest density, the NCP was measured along an isochore through the hcp, bcc, and normal liquid phases. Average atomic kinetic energies are extracted from each of the data sets and are compared to both published and new path-integral Monte Carlo (PIMC) calculations as well as other theoretical predictions. In this preliminary analysis of the data, account is taken of the effects of instrumental resolution, multiple scattering, and final-state interactions. Both our measurements and the PIMC theory show that there are only small differences in the kinetic energy and longitudinal momentum distribution of isochoric helium samples, regardless of their phase or crystal structure

  6. Dose distribution following selective internal radiation therapy

    International Nuclear Information System (INIS)

    Fox, R.A.; Klemp, P.F.; Egan, G.; Mina, L.L.; Burton, M.A.; Gray, B.N.

    1991-01-01

    Selective Internal Radiation Therapy is the intrahepatic arterial injection of microspheres labelled with 90Y. The microspheres lodge in the precapillary circulation of the tumor, resulting in internal radiation therapy. The activity of 90Y injected is managed by successive administrations of labelled microspheres, probing the liver with a calibrated beta probe after each injection to assess the dose to the superficial layers of normal tissue. Predicted doses of 75 Gy have been delivered without subsequent evidence of radiation damage to normal cells. This contrasts with the complications resulting from doses in excess of 30 Gy delivered by external beam radiotherapy. Detailed analysis of the microsphere distribution in a cubic centimeter of normal liver, and calculation of the dose on a fine 3-dimensional grid, has shown that the finite size and distribution of the microspheres create a highly heterogeneous dose pattern. It has been shown that a third of normal liver will receive less than 33.7% of the dose predicted by assuming a homogeneous distribution of 90Y

  7. Organ distribution of 111In-oxine labeled lymphocytes in normal subjects and in patients with chronic lymphocytic leukemia and malignant lymphoma

    International Nuclear Information System (INIS)

    Matsuda, Shin; Uchida, Tatsumi; Yui, Tokuo; Kariyone, Shigeo

    1982-01-01

    T and B lymphocyte survival and organ distribution were studied by using 111In-oxine labeled autologous lymphocytes in 3 normal subjects, 3 patients with chronic lymphocytic leukemia (CLL) and 9 with malignant lymphoma (ML). Disappearance curves of the labeled lymphocytes showed two exponential components in all cases. The half time of the first component was within 1 hour in all cases. That of the second one was 50.7 ± 6.4 hours for all lymphocytes, 52.0 ± 5.5 hours for T lymphocytes and 31.6 ± 4.9 hours for B lymphocytes in normal subjects; 192.6 hours for T-CLL and 57.7 ± 46.9 hours for B-CLL; and 60.2 ± 30.7 hours for the T cell type of malignant lymphoma (T-ML) and 63.7 ± 24.5 hours for the B cell type of malignant lymphoma (B-ML). These data suggest that the all-lymphocyte disappearance curve chiefly reflected the T lymphocyte disappearance curve, and that the half time of B lymphocytes was shorter than that of T lymphocytes. In T-CLL, the half time of the second component was extremely prolonged in comparison with that of normal T lymphocytes. The labeled cells accumulated in the lungs, spleen and liver immediately after the infusion, then most remarkably in the spleen 1 hour after the infusion in all cases. Radioactivity over the bone marrow was observed from 1 hour in all cases, and that over the lymph nodes was first noticed 18 hours after the infusion in T-CLL and T-ML and 68 hours in B-CLL, but was not noticed in normal subjects and B-ML. The recovery of labeled cells in the blood was 28.5 ± 7.9% for all lymphocytes, 19.7 ± 1.9% for T lymphocytes and 11.0 ± 5.1% for B lymphocytes in normal subjects, 25.8 ± 1.6% for CLL, and 17.6 ± 11.0% for T-ML and 7.7 ± 5.2% for B-ML, respectively. (J.P.N.)

  8. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed distributions'', in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results, tested with another 250 normal and 250 abnormal sets, show improved testing performance with 90% sensitivity and 89% specificity. (author)

  9. Research on Normal Human Plantar Pressure Test

    Directory of Open Access Journals (Sweden)

    Liu Xi Yang

    2016-01-01

    Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 SCM are used to design an insole pressure collection system, and LabVIEW is used to build the HMI for data acquisition. A quantity of normal human foot pressure data was collected, and the pressure distribution over five stages of the swing phase during walking was statistically analyzed. The grid closeness degree is used for plantar pressure distribution pattern recognition, and the algorithm simulation and experimental results demonstrate that this method is feasible.

  10. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility. In this paper, we empirically investigated the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish-time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of the subset of finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners in the eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens again in the last course of the middle stage, a transition from the Gaussian back to a log-normal distribution follows at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
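The reported log-normal-to-Gaussian transition can be made concrete with a skewness check on synthetic split velocities (all parameters are assumptions; the actual marathon records are not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Stand-in "split velocities" (km per 5-min unit, arbitrary): log-normal
# for the early courses, Gaussian for the middle courses, mimicking the
# transition reported for the New York City marathon splits.
early = stats.lognorm.rvs(0.3, scale=3.0, size=5000, random_state=rng)
middle = stats.norm.rvs(loc=3.0, scale=0.4, size=5000, random_state=rng)

# Log-normal samples are right-skewed; Gaussian samples are not, so sample
# skewness is a simple diagnostic for which regime a split is in.
print(stats.skew(early), stats.skew(middle))
```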

  11. Site-dependent distribution of macrophages in normal human extraocular muscles

    NARCIS (Netherlands)

    Schmidt, E. D.; van der Gaag, R.; Mourits, M. P.; Koornneef, L.

    1993-01-01

    PURPOSE: Clinical data indicate that extraocular muscles have different susceptibilities for some orbital immune disorders depending on their anatomic location. The resident immunocompetent cells may be important mediators in the local pathogenesis of such disorders so the distribution of these

  12. Analysis of the normal optical, Michel and molecular potentials on ...

    Indian Academy of Sciences (India)

    Journal of physics, June 2016, pp. 1275–1286. Analysis of the normal ... the levels are obtained for the three optical potentials to estimate the quality ... The experimental angular distribution data for the 40Ca(6Li, d)44Ti reaction ... analysed using the normal optical, Michel and molecular potentials within the framework.

  13. On the distribution of DDO galaxies

    International Nuclear Information System (INIS)

    Sharp, N.A.; Jones, B.J.T.; Jones, J.E.

    1978-01-01

    The distribution of DDO galaxies on the sky and their relationship to normal galaxies have been examined. The results appear to contradict the universality of the luminosity function for galaxies. They also indicate that DDO galaxies are preferentially found in the vicinity of normal galaxies, but not uniformly in that they tend to avoid clusters. This may be due to the dependence of distribution upon morphological type. (author)

  14. PHARMACOKINETIC VARIATIONS OF OFLOXACIN IN NORMAL AND FEBRILE RABBITS

    Directory of Open Access Journals (Sweden)

    M. AHMAD, H. RAZA, G. MURTAZA AND N. AKHTAR

    2008-12-01

    Full Text Available The influence of experimentally Escherichia coli-induced fever (EEIF) on the pharmacokinetics of ofloxacin was evaluated. Ofloxacin was administered at 20 mg.kg-1 body weight intravenously to a group of eight healthy rabbits, and these results were compared to values in the same eight rabbits with EEIF. Pharmacokinetic parameters of ofloxacin in normal and febrile rabbits were determined using a two-compartment open kinetic model. Peak plasma level (Cmax) and area under the plasma concentration-time curve (AUC0-∞) in normal and febrile rabbits did not differ (P>0.05). However, the area under the first moment of the plasma concentration-time curve (AUMC0-∞) in febrile rabbits was significantly (P<0.05) higher than that in normal rabbits. Mean values for the elimination rate constant (Ke), elimination half-life (t1/2β) and apparent volume of distribution (Vd) were significantly (P<0.05) lower in febrile rabbits compared to normal rabbits, while mean residence time (MRT) and total body clearance (Cl) of ofloxacin did not show any significant difference between normal and febrile rabbits. The clinical significance of the above results relates to the changes in the volume of distribution and elimination half-life, which indicate an altered steady state in the febrile condition; hence, an adjustment of the dosage regimen in EEIF is required.
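The clearance and distribution quantities reported above are related by standard non-compartmental formulas (Cl = Dose/AUC for an IV bolus, MRT = AUMC/AUC, Vss = Cl x MRT); a minimal sketch with illustrative numbers, not the rabbits' actual values:

```python
# Standard non-compartmental pharmacokinetic relations for an IV bolus.
# All numbers below are illustrative assumptions.
dose = 20.0     # mg/kg, IV bolus
auc = 40.0      # area under concentration-time curve, mg*h/L
aumc = 160.0    # area under first-moment curve, mg*h^2/L

cl = dose / auc      # total body clearance, L/h/kg
mrt = aumc / auc     # mean residence time, h
vss = cl * mrt       # steady-state volume of distribution, L/kg
print(cl, mrt, vss)
```

A fever-induced drop in Vd with unchanged Cl, as reported, shifts Ke = Cl/Vd upward and shortens t1/2 = ln(2)/Ke, which is the altered steady state the authors describe.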

  15. Determining prescription durations based on the parametric waiting time distribution

    DEFF Research Database (Denmark)

    Støvring, Henrik; Pottegård, Anton; Hallas, Jesper

    2016-01-01

    two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users...... in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies......-Normal). When the IAD consisted of a mixture of two Log-Normal distributions, but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide...

  16. Apparent Transition in the Human Height Distribution Caused by Age-Dependent Variation during Puberty Period

    Science.gov (United States)

    Iwata, Takaki; Yamazaki, Yoshihiro; Kuninaka, Hiroto

    2013-08-01

    In this study, we examine the validity of the transition of the human height distribution from the log-normal distribution to the normal distribution during puberty, as suggested in an earlier study [Kuninaka et al.: J. Phys. Soc. Jpn. 78 (2009) 125001]. Our data analysis reveals that, in late puberty, the variation in height decreases as children grow. Thus, the classification of a height dataset by age at this stage leads us to analyze a mixture of distributions with larger means and smaller variations. This mixture distribution has a negative skewness and is consequently closer to the normal distribution than to the log-normal distribution. The opposite case occurs in early puberty and the mixture distribution is positively skewed, which resembles the log-normal distribution rather than the normal distribution. Thus, this scenario mimics the transition during puberty. Additionally, our scenario is realized through a numerical simulation based on a statistical model. The present study does not support the transition suggested by the earlier study.
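The pooling argument can be demonstrated numerically: mixing equal-sized normal sub-populations whose spread shrinks as the mean grows yields negative skew (closer to normal), while spread growing with the mean yields positive skew (log-normal-like). All means and standard deviations below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def mixed_heights(means, sds, n=20_000):
    # Pool equally sized normal sub-populations (one per age) into a
    # single "classified by age" height sample.
    return np.concatenate([rng.normal(m, s, n) for m, s in zip(means, sds)])

# Late puberty: means rise while spreads shrink -> negative skew.
late = mixed_heights([150, 160, 168, 172], [8, 6, 4, 3])
# Early puberty: spreads grow with the mean -> positive skew.
early = mixed_heights([120, 126, 133, 141], [3, 4, 6, 8])

print(stats.skew(late), stats.skew(early))
```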

  17. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    International Nuclear Information System (INIS)

    Smallidge, S.T.; Baker, T.T.; VanLeeuwen, D.; Gould, W.R.; Thompson, B.C.

    2010-01-01

    Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.

  18. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

    The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ2 and the unit normal deviate tests. However, the 50 equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...

  19. Microscopic prediction of speech intelligibility in spatially distributed speech-shaped noise for normal-hearing listeners.

    Science.gov (United States)

    Geravanchizadeh, Masoud; Fallah, Ali

    2015-12-01

    A binaural and psychoacoustically motivated intelligibility model, based on a well-known monaural microscopic model, is proposed. This model simulates a phoneme recognition task in the presence of spatially distributed speech-shaped noise in anechoic scenarios. In the proposed model, binaural advantage effects are considered by generating a feature vector for a dynamic-time-warping speech recognizer. This vector consists of three subvectors: two monaural subvectors modeling better-ear hearing, and a binaural subvector simulating the binaural unmasking effect. The binaural unit of the model is based on equalization-cancellation theory. The model operates blindly, meaning that separate recordings of speech and noise are not required for its predictions. Speech intelligibility tests were conducted with 12 normal-hearing listeners by collecting speech reception thresholds (SRTs) in the presence of single and multiple sources of speech-shaped noise. Comparison of the model predictions with the measured binaural SRTs, and with the predictions of a macroscopic binaural model called extended equalization-cancellation, shows that this approach predicts intelligibility in anechoic scenarios with good precision. The square of the correlation coefficient (r²) and the mean absolute error between the model predictions and the measurements are 0.98 and 0.62 dB, respectively.

  20. Deformation associated with continental normal faults

    Science.gov (United States)

    Resor, Phillip G.

Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation similar to those observed by interferometric synthetic aperture radar (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ~20 km from the fault surface trace, while the folds in the western Grand Canyon extend only 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates the advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  1. Local charge nonequilibrium and anomalous energy dependence of normalized moments in narrow rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

From the study of even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows, we propose a simple picture for charge correlation with nonzero correlation length and calculate the multiplicity distributions and the normalized moments in different rapidity windows at different energies. The results explain the experimentally observed coincidence and separation of even and odd distributions, as well as the anomalous energy dependence of normalized moments in narrow rapidity windows. The reason for the separation of even-odd distributions, appearing first at large multiplicities, is shown to be energy conservation. The special role of no-particle events in narrow rapidity windows is pointed out.

  2. The skin immune system (SIS): distribution and immunophenotype of lymphocyte subpopulations in normal human skin

    NARCIS (Netherlands)

    Bos, J. D.; Zonneveld, I.; Das, P. K.; Krieg, S. R.; van der Loos, C. M.; Kapsenberg, M. L.

    1987-01-01

    The complexity of immune response-associated cells present in normal human skin was recently redefined as the skin immune system (SIS). In the present study, the exact immunophenotypes of lymphocyte subpopulations with their localizations in normal human skin were determined quantitatively. B cells

  3. Unit Root Testing and Estimation in Nonlinear ESTAR Models with Normal and Non-Normal Errors.

    Directory of Open Access Journals (Sweden)

    Umair Khalil

Exponential Smooth Transition Autoregressive (ESTAR) models can capture non-linear adjustment of deviations from equilibrium conditions, which may explain the economic behavior of many variables that appear non-stationary from a linear viewpoint. Many researchers employ the Kapetanios test, which has a unit root as the null and a stationary nonlinear model as the alternative. However, this test statistic is based on the assumption of normally distributed errors in the DGP. Cook analyzed the size of this nonlinear unit root test in the presence of a heavy-tailed innovation process and obtained critical values for both the finite-variance and infinite-variance cases. However, the test statistics of Cook are oversized. Researchers have found that using conventional tests is dangerous, though the best performance among these is achieved by a HCCME. The oversizing of LM tests can be reduced by employing fixed-design wild bootstrap remedies, which provide a valuable alternative to the conventional tests. In this paper the size of the Kapetanios test statistic employing heteroscedasticity-consistent covariance matrices has been derived, and the results are reported for various sample sizes in which size distortion is reduced. The properties of estimates of ESTAR models have been investigated when errors are assumed non-normal. We compare the results obtained through nonlinear least squares fitting with those of quantile regression fitting in the presence of outliers, with the error distribution taken to be a t-distribution for various sample sizes.

  4. Competition between clonal plasma cells and normal cells for potentially overlapping bone marrow niches is associated with a progressively altered cellular distribution in MGUS vs myeloma.

    Science.gov (United States)

    Paiva, B; Pérez-Andrés, M; Vídriales, M-B; Almeida, J; de las Heras, N; Mateos, M-V; López-Corral, L; Gutiérrez, N C; Blanco, J; Oriol, A; Hernández, M T; de Arriba, F; de Coca, A G; Terol, M-J; de la Rubia, J; González, Y; Martín, A; Sureda, A; Schmidt-Hieber, M; Schmitz, A; Johnsen, H E; Lahuerta, J-J; Bladé, J; San-Miguel, J F; Orfao, A

    2011-04-01

Disappearance of normal bone marrow (BM) plasma cells (PC) predicts malignant transformation of monoclonal gammopathy of undetermined significance (MGUS) and smoldering myeloma (SMM) into symptomatic multiple myeloma (MM). The homing, behavior and survival of normal PC, but also of CD34(+) hematopoietic stem cells (HSC), B-cell precursors, and clonal PC, largely depend on their interaction with stromal cell-derived factor-1 (SDF-1) expressing, potentially overlapping BM stromal cell niches. Here, we investigate the distribution, phenotypic characteristics and competitive migration capacity of these cell populations in patients with MGUS, SMM and MM vs healthy adults (HA) aged >60 years. Our results show that BM and peripheral blood (PB) clonal PC progressively increase from MGUS to MM, the latter showing a slightly more immature immunophenotype. Of note, this increased number of clonal PC is associated with progressive depletion of normal PC, B-cell precursors and CD34(+) HSC in the BM, with a parallel increase in PB. In an ex vivo model, normal PC, B-cell precursors and CD34(+) HSC from MGUS and SMM, but not MM patients, were able to abrogate the migration of clonal PC towards serial concentrations of SDF-1. Overall, our results show that progressive competition for and replacement of normal BM cells by clonal PC is associated with more advanced disease in patients with MGUS, SMM and MM.

  5. A closer look at the effect of preliminary goodness-of-fit testing for normality for the one-sample t-test.

    Science.gov (United States)

    Rochon, Justine; Kieser, Meinhard

    2011-11-01

Student's one-sample t-test is a commonly used method when inference about the population mean is made. As advocated in textbooks and articles, the assumption of normality is often checked by a preliminary goodness-of-fit (GOF) test. In a paper recently published by Schucany and Ng it was shown that, for the uniform distribution, screening of samples by a pretest for normality leads to a more conservative conditional Type I error rate than application of the one-sample t-test without preliminary GOF test. In contrast, for the exponential distribution, the conditional level is even more elevated than the Type I error rate of the t-test without pretest. We examine the reasons behind these characteristics. In a simulation study, samples drawn from the exponential, lognormal, uniform, Student's t-distribution with 2 degrees of freedom (t(2)) and the standard normal distribution that had passed normality screening, as well as the ingredients of the test statistics calculated from these samples, are investigated. For non-normal distributions, we found that preliminary testing for normality may change the distribution of means and standard deviations of the selected samples as well as the correlation between them (if the underlying distribution is non-symmetric), thus leading to altered distributions of the resulting test statistics. It is shown that for skewed distributions the excess in Type I error rate may be even more pronounced when testing one-sided hypotheses. ©2010 The British Psychological Society.
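The two-stage procedure examined in this study (normality pretest, then t-test) can be simulated directly. A minimal sketch, assuming an exponential population for which the null hypothesis about the mean is true; the sample size, α, and number of replications are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, alpha, n_sim = 20, 0.05, 2000

passed = rejected = 0
for _ in range(n_sim):
    # non-normal population (exponential) for which H0: mean == 1 is true
    x = rng.exponential(scale=1.0, size=n)
    if stats.shapiro(x).pvalue > alpha:            # sample passes the GOF pretest
        passed += 1
        if stats.ttest_1samp(x, popmean=1.0).pvalue < alpha:
            rejected += 1                          # false rejection of a true H0

# conditional Type I error rate of the t-test, given a passed normality pretest
conditional_type1 = rejected / passed
```

Comparing `conditional_type1` with the nominal α reproduces the kind of conditional-level distortion the paper analyses for skewed populations.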

  6. A Post-Truncation Parameterization of Truncated Normal Technical Inefficiency

    OpenAIRE

    Christine Amsler; Peter Schmidt; Wen-Jen Tsay

    2013-01-01

    In this paper we consider a stochastic frontier model in which the distribution of technical inefficiency is truncated normal. In standard notation, technical inefficiency u is distributed as N^+ (μ,σ^2). This distribution is affected by some environmental variables z that may or may not affect the level of the frontier but that do affect the shortfall of output from the frontier. We will distinguish the pre-truncation mean (μ) and variance (σ^2) from the post-truncation mean μ_*=E(u) and var...
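The pre- vs post-truncation distinction described above can be illustrated numerically with scipy's truncated normal; the parameter values below are arbitrary assumptions:

```python
import numpy as np
from scipy import stats

mu, sigma = 0.5, 1.0          # pre-truncation mean and s.d. of N+(mu, sigma^2)
a = (0.0 - mu) / sigma        # lower truncation point (zero) in standardized units
u = stats.truncnorm(a, np.inf, loc=mu, scale=sigma)

# post-truncation mean mu_* = E(u) and variance differ from (mu, sigma^2)
post_mean, post_var = u.mean(), u.var()

# closed form for the truncated-below normal: E(u) = mu + sigma * phi(a) / (1 - Phi(a))
closed_form = mu + sigma * stats.norm.pdf(a) / stats.norm.sf(a)
```

Truncation from below raises the mean (post_mean > mu) and shrinks the variance (post_var < sigma²), which is exactly why the paper's distinction between the two parameterizations matters.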

  7. Are your covariates under control? How normalization can re-introduce covariate effects.

    Science.gov (United States)

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing Type I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
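The effect studied here can be reproduced with a small simulation. A sketch under assumed conditions: the data-generating model, the Blom-type rank offset, and the sample size are illustrative choices, not the study's own design:

```python
import numpy as np
from scipy import stats

def rank_int(x):
    # rank-based inverse normal transformation (Blom-type offset c = 3/8)
    ranks = stats.rankdata(x)
    return stats.norm.ppf((ranks - 0.375) / (len(x) + 0.25))

rng = np.random.default_rng(2)
n = 2000
covariate = rng.normal(size=n)
# skewed dependent variable whose shape depends on the covariate
y = np.exp(covariate + rng.normal(size=n))

# common (problematic) order: adjust for the covariate first, then INT the residuals
slope, intercept = np.polyfit(covariate, y, 1)
resid = y - (slope * covariate + intercept)
r_raw = np.corrcoef(resid, covariate)[0, 1]                 # ~0 by OLS construction
r_reintroduced = np.corrcoef(rank_int(resid), covariate)[0, 1]
```

The raw residuals are linearly uncorrelated with the covariate by construction, but because rank-INT is a nonlinear monotone transform, `r_reintroduced` is generally no longer zero, which is the re-introduction effect the paper warns about.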

  8. Quasi-normal modes from non-commutative matrix dynamics

    Science.gov (United States)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real-time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single-trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.
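Isolating the lowest quasi-normal mode from a relaxing observable, as described above, amounts to fitting a damped oscillation to the signal. A generic sketch (the signal, frequencies, and noise level are assumptions, not the BMN values):

```python
import numpy as np
from scipy.optimize import curve_fit

# damped-oscillation ansatz for the lowest quasi-normal mode
def qnm(t, amp, gamma, omega, phase):
    return amp * np.exp(-gamma * t) * np.cos(omega * t + phase)

t = np.linspace(0.0, 20.0, 400)
rng = np.random.default_rng(8)
# synthetic relaxation signal: one quasi-normal mode plus small noise
signal = qnm(t, 1.0, 0.15, 2.0, 0.3) + 0.01 * rng.normal(size=t.size)

# least-squares fit recovers the complex frequency (omega, gamma) of the mode
popt, _ = curve_fit(qnm, t, signal, p0=[1.0, 0.1, 1.8, 0.0])
amp_fit, gamma_fit, omega_fit, phase_fit = popt
```

The fitted pair (omega_fit, gamma_fit) plays the role of the real and imaginary parts of the quasi-normal frequency, which the authors then track as a function of the energy.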

  9. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study.

    Science.gov (United States)

    Seed, Mike; van Amerom, Joshua F P; Yoo, Shi-Joon; Al Nafisi, Bahiyah; Grosse-Wortmann, Lars; Jaeggi, Edgar; Jansz, Michael S; Macgowan, Christopher K

    2012-11-26

We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30-39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540 ± 101, main pulmonary artery (MPA) 327 ± 68, ascending aorta (AAo) 198 ± 38, superior vena cava (SVC) 147 ± 46, ductus arteriosus (DA) 220 ± 39, pulmonary blood flow (PBF) 106 ± 59, descending aorta (DAo) 273 ± 85, umbilical vein (UV) 160 ± 62, foramen ovale (FO) 107 ± 54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60 ± 4, AAo 37 ± 4, SVC 28 ± 7, DA 41 ± 8, PBF 19 ± 10, DAo 50 ± 12, UV 30 ± 9, FO 21 ± 12. This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.
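As a rough consistency check, the reported percentages of CVO can be recomputed from the group-mean flows. Note the reported percentages are means of per-fetus ratios, so ratios of group means agree only approximately:

```python
# group-mean flows in ml/min/kg from the abstract above
flows = {"MPA": 327, "AAo": 198, "SVC": 147, "DA": 220,
         "PBF": 106, "DAo": 273, "UV": 160, "FO": 107}
cvo = 540  # combined ventricular output

# each vessel's flow as a rounded percentage of the CVO
pct = {vessel: round(100 * flow / cvo) for vessel, flow in flows.items()}
```

These recomputed values land within a point or two of the reported percentages (e.g. MPA ≈ 61 vs reported 60), as expected given the ratio-of-means vs mean-of-ratios distinction.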

  11. Evaluation of distribution patterns and decision of distribution coefficients of trace elements in high-purity aluminium by INAA

    International Nuclear Information System (INIS)

    Hayakawa, Yasuhiro; Suzuki, Shogo; Hirai, Shoji

    1986-01-01

Recently, high-purity aluminium has been used in semiconductor devices and similar applications. It is required both that trace impurities be reduced and that their content be quantitatively evaluated. In this study, the distribution patterns of many trace impurities in 99.999 % aluminium ingots, purified using a normal freezing method, were evaluated by INAA. The effective distribution coefficient k for each detected element was calculated using the theoretical distribution equation of the normal freezing method. As a result, the elements with k < 1 included La, Sm, U and Th, while the element with k > 1 was Hf. In particular, La, Sm, U and Th could be effectively purified, but Sc and Hf could scarcely be purified. Furthermore, it was found that slower freezing gave an effective distribution coefficient closer to the equilibrium distribution coefficient, and that the effective distribution coefficient became smaller with larger atomic radius. (author)
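The theoretical distribution equation for normal freezing referred to above is commonly written as the Pfann/Scheil relation C_s(g) = k·C0·(1−g)^(k−1), where g is the frozen fraction. A sketch evaluating it for an assumed k << 1 (values illustrative, not from the study):

```python
import numpy as np

def normal_freezing_profile(k, c0, g):
    """Pfann/Scheil relation for solute concentration in the solid after
    normal freezing: C_s(g) = k * C0 * (1 - g)**(k - 1), where g is the
    frozen fraction and k the effective distribution coefficient."""
    return k * c0 * (1.0 - g) ** (k - 1.0)

g = np.linspace(0.0, 0.9, 10)
# k << 1: the impurity is rejected into the melt and swept towards the last-frozen end
profile_k_small = normal_freezing_profile(0.1, 1.0, g)
```

For k < 1 the profile rises monotonically along the ingot (effective purification of the first-frozen portion), while k near 1, as reconstructed for Sc, leaves the impurity nearly uniform.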

  12. Normal modes of vibration in nickel

    Energy Technology Data Exchange (ETDEWEB)

    Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B

    1964-07-01

The frequency/wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296°K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)

  13. Is Middle-Upper Arm Circumference “normally” distributed? Secondary data analysis of 852 nutrition surveys

    Directory of Open Access Journals (Sweden)

    Severine Frison

    2016-05-01

Background: Wasting is a major public health issue throughout the developing world. Of the estimated 6.9 million annual deaths among children under five, over 800,000 (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH, but there are no equivalent studies on the distribution of MUAC. Methods: This secondary data analysis assesses the normality of the MUAC distributions of 852 cross-sectional nutrition survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. Results: The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different from that of the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised", and 59.7 % after LOESS. Box-Cox power transformation had similar results, with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing increased that proportion to 82.4 and 82.7
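The three normality tests used in this analysis have direct scipy counterparts (scipy's kurtosistest implements the Anscombe-Glynn statistic). A sketch on synthetic MUAC-like values; the numbers are simulated, not survey data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic stand-in for MUAC measurements in mm (location/scale are assumptions)
muac = rng.normal(loc=145.0, scale=12.0, size=400)

w, p_shapiro = stats.shapiro(muac)         # overall departure from normality
z_skew, p_skew = stats.skewtest(muac)      # D'Agostino skewness test
z_kurt, p_kurt = stats.kurtosistest(muac)  # Anscombe-Glynn kurtosis test
```

Running the battery per survey, as the study does, flags each dataset as skewed, heavy-/light-tailed, or globally non-normal before any smoothing or Box-Cox step is attempted.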

  14. On Introducing Asymmetry into Circular Distributions

    Directory of Open Access Journals (Sweden)

    Dale Umbach

    2012-07-01

We give a brief history of the results which led to the introduction of asymmetry into symmetric circular distributions. This is followed by the presentation of another method of introducing asymmetry. Some properties of the induced distributions are studied. Finally, this new distribution is shown to be a reasonable fit to the Jander ant data as presented in Fisher (1993).

  15. Exponential model normalization for electrical capacitance tomography with external electrodes under gap permittivity conditions

    International Nuclear Information System (INIS)

    Baidillah, Marlin R; Takei, Masahiro

    2017-01-01

A nonlinear normalization model, called the exponential model, for electrical capacitance tomography (ECT) with external electrodes under gap permittivity conditions has been developed. The exponential model normalization is proposed based on the inherently nonlinear relationship between the mixture permittivity and the measured capacitance due to the gap permittivity of the inner wall. The parameters of the exponential equation are derived using an exponential fitting curve based on simulation, and a scaling function is added to adjust for the experimental system conditions. The exponential model normalization was applied to two-dimensional low- and high-contrast dielectric distribution phantoms in simulation and experimental studies. The proposed normalization model has been compared with other normalization models, i.e. the Parallel, Series, Maxwell and Böttcher models. Based on the comparison of image reconstruction results, the exponential model reliably predicts the nonlinear normalization of measured capacitance for both low- and high-contrast dielectric distributions. (paper)

  16. Approach of the value of an annuity when non-central moments of the capitalization factor are known: an R application with interest rates following normal and beta distributions

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    2015-07-01

This paper proposes an expression for the value of an annuity with payments of 1 unit each when the interest rate is random. To attain this objective, we proceed on the assumption that the non-central moments of the capitalization factor are known. Specifically, to calculate the value of these annuities, we propose two different expressions. First, we suppose that the random interest rate is normally distributed; then, we assume that it follows the beta distribution. A practical application of these two methodologies is also implemented using the R statistical software.
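The moment-based idea can be checked by Monte Carlo: for n unit payments, E[a_n] = Σ_{t=1..n} E[v^t], the sum of the first n non-central moments of the random discount factor v = (1+i)^(−1). A sketch assuming a normally distributed rate (the 3 % / 1 % parameters and the clipping are illustrative assumptions, and Python stands in for the paper's R implementation):

```python
import numpy as np

rng = np.random.default_rng(4)
n_payments = 10

# interest rate assumed ~ N(3%, 1%); clipped to stay positive (illustrative only)
i = np.clip(rng.normal(0.03, 0.01, size=200_000), 1e-6, None)
v = 1.0 / (1.0 + i)   # random discount factor (reciprocal of the capitalization factor)

# E[a_n] = sum of the first n non-central moments of v
moments = [np.mean(v ** t) for t in range(1, n_payments + 1)]
expected_annuity = sum(moments)

# deterministic annuity at the mean rate, for comparison
a_det = (1.0 - 1.03 ** -n_payments) / 0.03
```

Since each (1+i)^(−t) is convex in i, Jensen's inequality makes `expected_annuity` slightly exceed the deterministic value at the mean rate, which is the effect the moment expressions capture analytically.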

  17. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. The similarity of the equivalent model for inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation ...

  18. Normal human bone marrow and its variations in MRI

    International Nuclear Information System (INIS)

    Vahlensieck, M.; Schmidt, H.M.

    2000-01-01

The physiology and age-dependent changes of human bone marrow are described. The resulting normal distribution patterns of active and inactive bone marrow, including the various contrasts on different MR sequences, are discussed. (orig.) [de]

  19. Back to Normal! Gaussianizing posterior distributions for cosmological probes

    Science.gov (United States)

    Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.

    2014-05-01

    We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
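The core Gaussianization step can be sketched in one dimension with scipy's maximum-likelihood Box-Cox search; the lognormal "posterior sample" below is a stand-in assumption for an MCMC chain:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# skewed, unimodal stand-in for a 1-D marginal posterior sample (e.g. an MCMC chain)
sample = rng.lognormal(mean=0.0, sigma=0.5, size=5000)

# maximum-likelihood search for the Gaussianizing Box-Cox exponent
transformed, lam = stats.boxcox(sample)

# judge the quality of the transformation a posteriori with a normality test
p_before = stats.normaltest(sample).pvalue
p_after = stats.normaltest(transformed).pvalue
```

For lognormal input the optimal exponent sits near λ = 0 (the log transform), and the normality test p-value improves sharply after transformation, mirroring the paper's a-posteriori quality checks.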

  20. Prenatal ultrasonographic findings of multicystic dysplastic kidney: Emphasis on cyst distribution

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Min Hoan; Cho, Jeong Yeon [Samsung Cheil Hospital, Sungkunkwan University school of Medicine, Seoul (Korea, Republic of)

    2003-09-15

To characterize the ultrasonographic findings of multicystic dysplastic kidney on prenatal ultrasonography (US), with special emphasis on the distribution of cysts. From January 1998 to March 2003, the medical records of sixty-two subjects with multicystic dysplastic kidney diagnosed on prenatal US examination were retrospectively reviewed, and forty-three patients confirmed either by pathology or postnatal follow-up US were selected for this study. US assessment included the time of diagnosis, laterality, size of the multicystic dysplastic and contralateral normal kidneys, distribution of cysts and associated anomalies. The distribution of cysts was categorized as subcapsular or random, and interobserver agreement was determined using cross-table analysis. The largest multicystic and contralateral normal longitudinal diameters were measured, and the data were plotted on a normal reference chart. Multicystic dysplastic kidney was left-sided in 55.8%, right-sided in 34.8% and bilateral in 9.3%. Subcapsular distribution of cysts was observed in 68.2% (n=15) for radiologist 1 and 59.1% (n=13) for radiologist 2, showing excellent interobserver agreement (k=0.697). The longitudinal diameter of the multicystic dysplastic kidney was above the 95th percentile in 68%, whereas the diameter of the contralateral normal kidney was normal in 70% of cases. Fetal karyotyping was performed in 18 cases, including 2 cases with associated major anomalies, and was normal in all. On prenatal US, subcapsular distribution of cysts in multicystic dysplastic kidney is more common than random distribution. This characteristic distribution of cysts may be helpful in the prenatal diagnosis of multicystic dysplastic kidney.

  2. Three-dimensional finite analysis of acetabular contact pressure and contact area during normal walking.

    Science.gov (United States)

    Wang, Guangye; Huang, Wenjun; Song, Qi; Liang, Jinfeng

    2017-11-01

This study aims to analyze the contact areas and pressure distributions between the femoral head and acetabulum during normal walking using a three-dimensional finite element model (3D-FEM). Computed tomography (CT) scanning technology and a computer image processing system were used to establish the 3D-FEM. The acetabular model was used to simulate the pressures during 32 consecutive normal walking phases, and the contact areas at the different phases were calculated. The distribution of the pressure peak values during the 32 consecutive normal walking phases was bimodal, reaching its peak (4.2 MPa) at the initial phase, where the contact area was significantly higher than at the stepping phase. The sites that remained in contact were concentrated on the acetabular top and leaned inwards, while the anterior and posterior acetabular horns showed no pressure concentration. The pressure distributions of the acetabular cartilage at different phases were significantly different: the zone of increased pressure at the support phase was distributed at the acetabular top area, while that at the stepping phase was distributed on the inside of the acetabular cartilage. The zones of increased contact pressure and the distributions of acetabular contact areas are of importance to clinical research and may indicate inductive factors of acetabular osteoarthritis. Copyright © 2016. Published by Elsevier Taiwan.

  3. Normal zone soliton in large composite superconductors

    International Nuclear Information System (INIS)

    Kupferman, R.; Mints, R.G.; Ben-Jacob, E.

    1992-01-01

The study of normal zones of finite size (normal domains) in superconductors has continuously been a subject of interest in the field of applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable, so that if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found: the existence of stable propagating normal solitons. The formation of these propagating domains was shown to be a result of the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and current redistribution in the conductor, and showed the existence of stable propagating normal domains. They compared the velocity of normal domain propagation with the experimental data, obtaining reasonable agreement. Dresner presented an analytical method to solve this problem if the time dependence of the Joule power is given. He performed explicit calculations of normal domain velocity assuming that the Joule power decays exponentially during the process of current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and current density distributions along the conductor. Numerical simulations of the equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
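The temperature half of such a two-equation system can be sketched with explicit finite differences: 1-D heat diffusion plus an exponentially decaying Joule source active where the conductor is normal, in the spirit of Dresner's assumption. Every coefficient, the source form, and the threshold below are illustrative assumptions, not the authors' model:

```python
import numpy as np

# explicit finite differences for a 1-D heat equation with a decaying Joule source,
# a crude sketch of the temperature equation alone (dimensionless units assumed)
nx, nt = 200, 2000
dx, dt = 0.1, 0.002          # dt * kappa / dx**2 = 0.2 < 0.5, so the scheme is stable
kappa, tau = 1.0, 0.5        # thermal diffusivity; Joule-power decay time
T = np.zeros(nx)
T[90:110] = 2.0              # initial normal (hot) zone above the assumed threshold T = 1

for step in range(nt):
    t = step * dt
    # Joule heating, decaying exponentially during current redistribution,
    # applied only where the conductor is in the normal state (T > 1)
    source = np.where(T > 1.0, np.exp(-t / tau), 0.0)
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * (kappa * lap + source - 0.5 * T)   # diffusion + heating - cooling

normal_width = np.sum(T > 1.0) * dx   # extent of the surviving normal zone
```

Tracking `normal_width` (or the position of the T = 1 front) over time is the discrete analogue of measuring the domain's propagation velocity.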

  4. The self-normalized Donsker theorem revisited

    OpenAIRE

    Parczewski, Peter

    2016-01-01

We extend the Poincaré–Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space $D([0,1])$. This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to $\ell_p$-norms are given.
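    The sphere picture behind the Poincaré–Borel lemma can be sketched numerically. This is our illustration, not the paper's construction: a uniform point on the sphere of radius $\sqrt{n}$ is obtained by normalizing a Gaussian vector, and its rescaled partial sums trace an approximate Brownian path.

    ```python
    import numpy as np

    # A uniform point on the (n-1)-sphere of radius sqrt(n) is a normalized
    # standard Gaussian vector; its partial sums approximate Brownian motion
    # at times t = k/n (parameter choices are ours).
    rng = np.random.default_rng(1)
    n = 10_000
    g = rng.standard_normal(n)
    x = np.sqrt(n) * g / np.linalg.norm(g)   # uniform on the sphere, |x|^2 = n
    path = np.cumsum(x) / np.sqrt(n)         # approximate B(t) at t = k/n
    ```

    The coordinates of x are the "simple functionals" in question: individually they are nearly independent N(0,1) variables for large n, which is why the path converges weakly to Brownian motion.
    
    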

  5. Neuronal variability during handwriting: lognormal distribution.

    Directory of Open Access Journals (Sweden)

    Valery I Rupasov

Full Text Available We examined time-dependent statistical properties of electromyographic (EMG) signals recorded from intrinsic hand muscles during handwriting. Our analysis showed that trial-to-trial neuronal variability of EMG signals is well described by the lognormal distribution, clearly distinguished from the Gaussian (normal) distribution. This finding indicates that EMG formation cannot be described by a conventional model in which the signal is normally distributed because it is composed of the summation of many random sources. We found that the variability of temporal parameters of handwriting--handwriting duration and response time--is also well described by a lognormal distribution. Although the exact mechanism of lognormal statistics remains an open question, the results obtained should significantly impact experimental research, theoretical modeling and bioengineering applications of motor networks. In particular, our results suggest that accounting for the lognormal distribution of EMGs can improve biomimetic systems that strive to reproduce EMG signals in artificial actuators.
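    The lognormal-versus-normal comparison can be reproduced on synthetic data. The sketch below uses simulated amplitudes, not the paper's EMG recordings, and compares fitted normal and lognormal models with the Kolmogorov-Smirnov test.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for trial-to-trial amplitude variability (assumed
    # lognormal parameters, not estimates from real EMG data).
    rng = np.random.default_rng(0)
    amps = rng.lognormal(mean=0.0, sigma=0.8, size=2000)

    # Fit both candidate models and compare KS goodness-of-fit p-values.
    mu, sd = stats.norm.fit(amps)
    p_norm = stats.kstest(amps, 'norm', args=(mu, sd)).pvalue
    shape, loc, scale = stats.lognorm.fit(amps, floc=0)
    p_lognorm = stats.kstest(amps, 'lognorm', args=(shape, loc, scale)).pvalue
    ```

    On data of this kind the fitted normal is decisively rejected while the lognormal is not, mirroring the distinction the abstract draws.
    
    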

  6. Modeling and forecasting foreign exchange daily closing prices with normal inverse Gaussian

    Science.gov (United States)

    Teneng, Dean

    2013-09-01

We fit the normal inverse Gaussian (NIG) distribution to foreign exchange closing prices using the open-source software package R and select the best models by the strategy proposed by Käärik and Umbleja (2011). We observe that daily closing prices (12/04/2008 - 07/08/2012) of CHF/JPY, AUD/JPY, GBP/JPY, NZD/USD, QAR/CHF, QAR/EUR, SAR/CHF, SAR/EUR, TND/CHF and TND/EUR are excellent fits, while EGP/EUR and EUR/GBP are good fits with Kolmogorov-Smirnov test p-values of 0.062 and 0.08, respectively. It was impossible to estimate the normal inverse Gaussian parameters for JPY/CHF (by maximum likelihood; a computational problem), yet CHF/JPY was an excellent fit. Thus, while the stochastic properties of an exchange rate can be completely modeled with a probability distribution in one direction, it may be impossible in the other. We also demonstrate that foreign exchange closing prices can be forecasted with the normal inverse Gaussian (NIG) Lévy process, both in cases where the daily closing prices can and cannot be modeled by the NIG distribution.
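    The paper works in R; as a rough Python illustration, SciPy exposes the same four-parameter NIG family. The parameters below are illustrative, not estimates from the FX series used in the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Normal inverse Gaussian with assumed tail-heaviness a and asymmetry b
    # (requires a > |b|); loc and scale shift and stretch the distribution.
    a, b = 2.0, 0.5
    nig = stats.norminvgauss(a, b, loc=0.0, scale=1.0)
    sample = nig.rvs(size=1500, random_state=42)

    # A KS test against the generating distribution should not reject.
    p_value = stats.kstest(sample, nig.cdf).pvalue
    ```

    In practice one would fit a, b, loc and scale by maximum likelihood per currency pair; as the abstract notes, that optimization can fail for some series.
    
    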

  7. Extravascular transport in normal and tumor tissues.

    Science.gov (United States)

    Jain, R K; Gerlowski, L E

    1986-01-01

The transport characteristics of the normal and tumor tissue extravascular space provide the basis for the determination of the optimal dosage and schedule regimes of various pharmacological agents in detection and treatment of cancer. In order for the drug to reach the cellular space where most therapeutic action takes place, several transport steps must first occur: (1) tissue perfusion; (2) permeation across the capillary wall; (3) transport through interstitial space; and (4) transport across the cell membrane. Any of these steps, including intracellular events such as metabolism, can be the rate-limiting step to uptake of the drug, and these rate-limiting steps may be different in normal and tumor tissues. This review examines these transport limitations, first from an experimental point of view and then from a modeling point of view. Various types of experimental tumor models which have been used in animals to represent human tumors are discussed. Then, mathematical models of extravascular transport are discussed from the perspective of two approaches: compartmental and distributed. Compartmental models lump one or more sections of a tissue or body into a "compartment" to describe the time course of disposition of a substance. These models contain "effective" parameters which represent the entire compartment. Distributed models consider the structural and morphological aspects of the tissue to determine the transport properties of that tissue. These distributed models describe both the temporal and spatial distribution of a substance in tissues. Each of these modeling techniques is described in detail with applications for cancer detection and treatment in mind.

  8. Normal values of regional left ventricular myocardial thickness, mass and distribution-assessed by 320-detector computed tomography angiography in the Copenhagen General Population Study

    DEFF Research Database (Denmark)

    Hindsø, Louise; Fuchs, Andreas; Kühl, Jørgen Tobias

    2017-01-01

    regional normal reference values of the left ventricle. The aim of this study was to derive reference values of regional LV myocardial thickness (LVMT) and mass (LVMM) from a healthy study group of the general population using cardiac computed tomography angiography (CCTA). We wanted to introduce LV...... myocardial distribution (LVMD) as a measure of regional variation of the LVMT. Moreover, we wanted to determine whether these parameters varied between men and women. We studied 568 (181 men; 32%) adults, free of cardiovascular disease and risk factors, who underwent 320-detector CCTA. Mean age was 55 (range...... 40-84) years. Regional LVMT and LVMM were measured, according to the American Heart Association's 17 segment model, using semi-automatic software. Mean LVMT were 6.6 mm for men and 5.4 mm for women (p normal LV was thickest in the basal septum (segment 3; men = 8.3 mm; women = 7.2 mm...

  9. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

Full Text Available Interesting properties and propositions in many branches of science, such as economics, have been obtained according to the property of the cumulative distribution function of a random variable as a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice differentiable real-valued function g whose domain is an interval on the extended real line. A function g is said to be log-concave on the interval (a,b) if the function ln(g) is concave on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b). Earlier authors have obtained log-concavity for distributions such as normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, found the log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. For the Burr family these cases have been calculated, even for each distribution that belongs to the Burr family. Also, log-concavity results have been obtained for distributions such as generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions.

  10. The stochastic distribution of available coefficient of friction on quarry tiles for human locomotion.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    The available coefficient of friction (ACOF) for human locomotion is the maximum coefficient of friction that can be supported without a slip at the shoe and floor interface. A statistical model was introduced to estimate the probability of slip by comparing the ACOF with the required coefficient of friction, assuming that both coefficients have stochastic distributions. This paper presents an investigation of the stochastic distributions of the ACOF of quarry tiles under dry, water and glycerol conditions. One hundred friction measurements were performed on a walkway under the surface conditions of dry, water and 45% glycerol concentration. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF appears to fit the normal and log-normal distributions better than the Weibull distribution for the water and glycerol conditions. However, no match was found between the distribution of ACOF under the dry condition and any of the three continuous distributions evaluated. Based on limited data, a normal distribution might be more appropriate due to its simplicity, practicality and familiarity among the three distributions evaluated.
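    The three-way goodness-of-fit comparison described above can be sketched directly. The ACOF values below are simulated stand-ins (an assumed wet-tile range), not the hundred measured values from the study.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated ACOF measurements (assumed mean/spread, not the paper's data).
    rng = np.random.default_rng(7)
    acof = rng.normal(loc=0.45, scale=0.05, size=100)

    # Fit each candidate family, then compare Kolmogorov-Smirnov p-values.
    candidates = {
        'norm':        stats.norm.fit(acof),
        'lognorm':     stats.lognorm.fit(acof, floc=0),
        'weibull_min': stats.weibull_min.fit(acof, floc=0),
    }
    pvals = {name: stats.kstest(acof, name, args=params).pvalue
             for name, params in candidates.items()}
    best = max(pvals, key=pvals.get)
    ```

    Note that with parameters estimated from the same data, these KS p-values are anti-conservative; the paper's preference for the normal on practical grounds (simplicity, familiarity) is a separate judgment from the raw p-value ranking.
    
    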

  11. Statistical Evidence for the Preference of Frailty Distributions with Regularly-Varying-at-Zero Densities

    DEFF Research Database (Denmark)

    Missov, Trifon I.; Schöley, Jonas

    to this criterion admissible distributions are, for example, the gamma, the beta, the truncated normal, the log-logistic and the Weibull, while distributions like the log-normal and the inverse Gaussian do not satisfy this condition. In this article we show that models with admissible frailty distributions...... and a Gompertz baseline provide a better fit to adult human mortality data than the corresponding models with non-admissible frailty distributions. We implement estimation procedures for mixture models with a Gompertz baseline and frailty that follows a gamma, truncated normal, log-normal, or inverse Gaussian...

  12. A non-Gaussian multivariate distribution with all lower-dimensional Gaussians and related families

    KAUST Repository

    Dutta, Subhajit

    2014-07-28

Several fascinating examples of non-Gaussian bivariate distributions whose marginal distribution functions are Gaussian have been proposed in the literature. These examples often clarify several properties associated with the normal distribution. In this paper, we generalize this result in the sense that we construct a p-dimensional distribution for which any proper subset of its components has the Gaussian distribution. However, the joint p-dimensional distribution is inconsistent with the distribution of these subsets because it is not Gaussian. We study the probabilistic properties of this non-Gaussian multivariate distribution in detail. Interestingly, several popular tests of multivariate normality fail to identify this p-dimensional distribution as non-Gaussian. We further extend our construction to a class of elliptically contoured distributions as well as skewed distributions arising from selections, for instance the multivariate skew-normal distribution.
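    The flavour of such constructions is captured by the classic bivariate counterexample, which is our illustration and much simpler than the paper's p-dimensional family: take X standard normal and S an independent random sign, and set Y = S·X. Both marginals are exactly N(0,1), yet (X, Y) is not bivariate normal, since X + Y equals 0 with probability 1/2, which no Gaussian vector allows.

    ```python
    import numpy as np

    # Classic counterexample: Gaussian marginals, non-Gaussian joint.
    rng = np.random.default_rng(3)
    n = 4000
    x = rng.standard_normal(n)
    s = rng.choice([-1.0, 1.0], size=n)
    y = s * x                      # marginal of y is still N(0,1) by symmetry

    # Evidence of non-joint-normality: about half the sums are exactly zero
    # (whenever s = -1, y = -x exactly, so x + y = 0 in floating point too).
    frac_zero = np.mean(x + y == 0.0)
    ```

    A genuinely bivariate normal (X, Y) would make X + Y a continuous normal variable, so a point mass at zero of weight 1/2 is impossible.
    
    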

  13. A non-Gaussian multivariate distribution with all lower-dimensional Gaussians and related families

    KAUST Repository

    Dutta, Subhajit; Genton, Marc G.

    2014-01-01

Several fascinating examples of non-Gaussian bivariate distributions whose marginal distribution functions are Gaussian have been proposed in the literature. These examples often clarify several properties associated with the normal distribution. In this paper, we generalize this result in the sense that we construct a p-dimensional distribution for which any proper subset of its components has the Gaussian distribution. However, the joint p-dimensional distribution is inconsistent with the distribution of these subsets because it is not Gaussian. We study the probabilistic properties of this non-Gaussian multivariate distribution in detail. Interestingly, several popular tests of multivariate normality fail to identify this p-dimensional distribution as non-Gaussian. We further extend our construction to a class of elliptically contoured distributions as well as skewed distributions arising from selections, for instance the multivariate skew-normal distribution.

  14. Generalised extreme value distributions provide a natural hypothesis for the shape of seed mass distributions.

    Directory of Open Access Journals (Sweden)

    Will Edwards

Full Text Available Among co-occurring species, values for functionally important plant traits span orders of magnitude, are uni-modal, and generally positively skewed. Such data are usually log-transformed "for normality" but no convincing mechanistic explanation for a log-normal expectation exists. Here we propose a hypothesis for the distribution of seed masses based on generalised extreme value distributions (GEVs, a class of probability distributions used in climatology to characterise the impact of event magnitudes and frequencies; events that impose strong directional selection on biological traits. In tests involving datasets from 34 locations across the globe, GEVs described log10 seed mass distributions as well as or better than conventional normalising statistics in 79% of cases, and revealed a systematic tendency for an overabundance of small seed sizes associated with low latitudes. GEVs characterise disturbance events experienced in a location to which individual species' life histories could respond, providing a natural, biological explanation for trait expression that is lacking from all previous hypotheses attempting to describe trait distributions in multispecies assemblages. We suggest that GEVs could provide a mechanistic explanation for plant trait distributions and potentially link biology and climatology under a single paradigm.
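    A GEV-versus-normal model comparison of the kind reported can be sketched with SciPy. The "log seed masses" below are simulated from a skewed GEV (assumed shape parameter), not drawn from any of the paper's 34 datasets, and the models are compared by AIC rather than the paper's exact criterion.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated log-scale seed masses from a skewed GEV (scipy's c = -0.2
    # corresponds to a heavy upper tail); parameters are assumptions.
    rng = np.random.default_rng(11)
    log_mass = stats.genextreme.rvs(c=-0.2, loc=0.0, scale=1.0,
                                    size=800, random_state=rng)

    # Fit both candidates and compare via AIC = 2k - 2 ln L.
    gev_params = stats.genextreme.fit(log_mass)
    norm_params = stats.norm.fit(log_mass)
    aic_gev = 2 * 3 - 2 * np.sum(stats.genextreme.logpdf(log_mass, *gev_params))
    aic_norm = 2 * 2 - 2 * np.sum(stats.norm.logpdf(log_mass, *norm_params))
    ```

    The GEV's extra shape parameter lets it capture the residual skew that log-transformation leaves behind, which is the paper's central point.
    
    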

  15. Bimodal distribution of the magnetic dipole moment in nanoparticles with a monomodal distribution of the physical size

    NARCIS (Netherlands)

    van Rijssel, Jozef; Kuipers, Bonny W M; Erne, Ben

    2015-01-01

    High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal

  16. GC-Content Normalization for RNA-Seq Data

    Science.gov (United States)

    2011-01-01

    Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
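    The EDASeq package implements regression-based within-lane corrections; a crude bin-median version conveys the idea. Everything below is simulated (gene counts, GC fractions, and the exponential bias model are our assumptions, not the package's method or data).

    ```python
    import numpy as np

    # Simulate counts with a sample-specific GC-content bias.
    rng = np.random.default_rng(5)
    n_genes = 5000
    gc = rng.uniform(0.3, 0.7, n_genes)          # per-gene GC fraction
    true_expr = rng.gamma(2.0, 50.0, n_genes)
    bias = np.exp(2.0 * (gc - 0.5))              # assumed lane-specific effect
    counts = rng.poisson(true_expr * bias)

    # Bin-median stand-in for within-lane GC normalization: rescale each
    # count by the median of its GC decile, restoring the global median.
    edges = np.quantile(gc, np.linspace(0, 1, 11))
    idx = np.clip(np.digitize(gc, edges) - 1, 0, 9)
    bin_med = np.array([np.median(counts[idx == k]) for k in range(10)])
    normalized = counts / bin_med[idx] * np.median(counts)

    # The GC-count correlation should shrink after normalization.
    r_before = np.corrcoef(gc, counts)[0, 1]
    r_after = np.corrcoef(gc, normalized)[0, 1]
    ```

    EDASeq's full-quantile and loess approaches are smoother than this decile scheme, but the diagnostic (correlation of counts with GC before and after) is the same kind of check the paper advocates.
    
    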

  17. Prediction of the filtrate particle size distribution from the pore size distribution in membrane filtration: Numerical correlations from computer simulations

    Science.gov (United States)

    Marrufo-Hernández, Norma Alejandra; Hernández-Guerrero, Maribel; Nápoles-Duarte, José Manuel; Palomares-Báez, Juan Pedro; Chávez-Rojo, Marco Antonio

    2018-03-01

We present a computational model that describes the diffusion of a hard-sphere colloidal fluid through a membrane. The membrane matrix is modeled as a series of flat parallel planes with circular pores of different sizes and random spatial distribution. This model was employed to determine how the size distribution of the colloidal filtrate depends on the size distributions of both the particles in the feed and the pores of the membrane, as well as to describe the filtration kinetics. A Brownian dynamics simulation study considering normal distributions was developed in order to determine empirical correlations between the parameters that characterize these distributions. The model can also be extended to other distributions such as log-normal. This study could, therefore, facilitate the selection of membranes for industrial or scientific filtration processes once the size distribution of the feed is known and the expected characteristics in the filtrate have been defined.
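    The geometric sieving part of such a model can be sketched with a simple Monte Carlo. This is our simplification: each particle must fit through one randomly drawn pore per plane, with normal size distributions for both, and no Brownian dynamics, so the kinetics are not modeled.

    ```python
    import numpy as np

    # Feed particle diameters and per-plane pore diameters, both normal
    # (all means/spreads in arbitrary units are assumptions).
    rng = np.random.default_rng(2)
    n_particles, n_planes = 20_000, 5
    feed = rng.normal(1.0, 0.25, n_particles)
    feed = feed[feed > 0]                        # discard unphysical sizes

    # One random pore encountered per plane; a particle passes only if it is
    # smaller than every pore on its path.
    pores = rng.normal(1.1, 0.2, size=(feed.size, n_planes))
    passes = (feed[:, None] < pores).all(axis=1)
    filtrate = feed[passes]

    # Sieving biases the filtrate toward small sizes.
    mean_shift = feed.mean() - filtrate.mean()
    ```

    Repeating this over grids of feed and pore parameters is one way to build the kind of empirical correlations between feed, pore and filtrate distributions that the paper reports.
    
    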

  18. Nonstationary ARCH and GARCH with t-distributed Innovations

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard; Rahbek, Anders

Consistency and asymptotic normality are established for the maximum likelihood estimators in the nonstationary ARCH and GARCH models with general t-distributed innovations. The results hold for joint estimation of (G)ARCH effects and the degrees of freedom parameter parametrizing the t-distribution. With T denoting sample size, classic square-root T-convergence is shown to hold, with closed-form expressions for the multivariate covariances.

  19. Effects of normalization on quantitative traits in association test

    Science.gov (United States)

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
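    The rank-based transformation evaluated above is commonly implemented as a rank-based inverse normal transform. The sketch below uses the Blom offset (c = 3/8) and simulated trait values; the paper's exact variant may use a different offset.

    ```python
    import numpy as np
    from scipy import stats

    def rank_inverse_normal(x, c=3.0 / 8.0):
        """Map values to normal quantiles of their offset ranks (Blom, c=3/8)."""
        x = np.asarray(x, dtype=float)
        ranks = stats.rankdata(x)            # average ranks for ties
        return stats.norm.ppf((ranks - c) / (len(x) - 2 * c + 1))

    # A heavily right-skewed trait becomes symmetric after transformation,
    # while the ordering of individuals is preserved exactly.
    rng = np.random.default_rng(9)
    trait = rng.exponential(1.0, size=1000)
    z = rank_inverse_normal(trait)
    ```

    Because the transform depends only on ranks, it preserves Spearman correlations with genotypes while enforcing marginal normality, which is why it performs consistently across the trait types in the simulations.
    
    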

  20. Effects of normalization on quantitative traits in association test

    Directory of Open Access Journals (Sweden)

    Yap Von Bing

    2009-12-01

    Full Text Available Abstract Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest.

  1. Normality of raw data in general linear models: The most widespread myth in statistics

    Science.gov (United States)

    Kery, Marc; Hatfield, Jeff S.

    2003-01-01

    In years of statistical consulting for ecologists and wildlife biologists, by far the most common misconception we have come across has been the one about normality in general linear models. These comprise a very large part of the statistical models used in ecology and include t tests, simple and multiple linear regression, polynomial regression, and analysis of variance (ANOVA) and covariance (ANCOVA). There is a widely held belief that the normality assumption pertains to the raw data rather than to the model residuals. We suspect that this error may also occur in countless published studies, whenever the normality assumption is tested prior to analysis. This may lead to the use of nonparametric alternatives (if there are any), when parametric tests would indeed be appropriate, or to use of transformations of raw data, which may introduce hidden assumptions such as multiplicative effects on the natural scale in the case of log-transformed data. Our aim here is to dispel this myth. We very briefly describe relevant theory for two cases of general linear models to show that the residuals need to be normally distributed if tests requiring normality are to be used, such as t and F tests. We then give two examples demonstrating that the distribution of the response variable may be nonnormal, and yet the residuals are well behaved. We do not go into the issue of how to test normality; instead we display the distributions of response variables and residuals graphically.
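    The point is easy to demonstrate by simulation. In the sketch below (our example, not the paper's), a one-way design with two far-apart group means produces a strongly bimodal response, yet the residuals are exactly standard normal by construction, so a normality test on the raw data rejects while the residuals pass.

    ```python
    import numpy as np
    from scipy import stats

    # Two groups with very different means; residuals are N(0,1) by design.
    rng = np.random.default_rng(4)
    n = 300
    group = np.repeat([0, 1], n)
    y = np.array([0.0, 8.0])[group] + rng.standard_normal(2 * n)

    # Residuals from the one-way model: subtract each group's mean.
    group_means = np.array([y[group == g].mean() for g in (0, 1)])
    residuals = y - group_means[group]

    p_raw = stats.shapiro(y).pvalue            # bimodal: normality rejected
    p_resid = stats.shapiro(residuals).pvalue  # residuals: well behaved
    ```

    Testing normality on y here would wrongly push an analyst toward nonparametric methods or transformations, exactly the error the article describes.
    
    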

  2. Confidence Limits for the Indirect Effect: Distribution of the Product and Resampling Methods

    Science.gov (United States)

    MacKinnon, David P.; Lockwood, Chondra M.; Williams, Jason

    2010-01-01

    The most commonly used method to test an indirect effect is to divide the estimate of the indirect effect by its standard error and compare the resulting z statistic with a critical value from the standard normal distribution. Confidence limits for the indirect effect are also typically based on critical values from the standard normal distribution. This article uses a simulation study to demonstrate that confidence limits are imbalanced because the distribution of the indirect effect is normal only in special cases. Two alternatives for improving the performance of confidence limits for the indirect effect are evaluated: (a) a method based on the distribution of the product of two normal random variables, and (b) resampling methods. In Study 1, confidence limits based on the distribution of the product are more accurate than methods based on an assumed normal distribution but confidence limits are still imbalanced. Study 2 demonstrates that more accurate confidence limits are obtained using resampling methods, with the bias-corrected bootstrap the best method overall. PMID:20157642
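    The distribution-of-the-product idea can be approximated by Monte Carlo. The path estimates and standard errors below are toy numbers of our choosing, not Study 1's settings, and the percentile interval over simulated products stands in for the analytic product distribution.

    ```python
    import numpy as np

    # Toy path estimates: a (X -> mediator) and b (mediator -> Y).
    rng = np.random.default_rng(6)
    a_hat, se_a = 0.40, 0.10
    b_hat, se_b = 0.30, 0.12

    # Simulate the product of the two (assumed normal) estimators and take
    # percentile limits; the product distribution is skewed, so the interval
    # is asymmetric around a_hat * b_hat.
    draws = rng.normal(a_hat, se_a, 100_000) * rng.normal(b_hat, se_b, 100_000)
    lo, hi = np.percentile(draws, [2.5, 97.5])

    # Symmetric normal-theory interval using the first-order (Sobel) SE.
    se_sobel = np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
    norm_lo, norm_hi = a_hat * b_hat - 1.96 * se_sobel, a_hat * b_hat + 1.96 * se_sobel
    ```

    Comparing (lo, hi) with (norm_lo, norm_hi) shows the imbalance the article documents; resampling methods such as the bias-corrected bootstrap refine this further by using the data rather than assumed sampling distributions.
    
    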

  3. Distributional Properties of Order Statistics and Record Statistics

    Directory of Open Access Journals (Sweden)

    Abdul Hamid Khan

    2012-07-01

Full Text Available Distributional properties of the order statistics, upper and lower records have been utilized to characterize distributions of interest. Further, one-sided random dilation and contraction are utilized to obtain the distribution of non-adjacent order statistics, and their important deductions are also discussed.

  4. Normal Spin Asymmetries in Elastic Electron-Proton Scattering

    International Nuclear Information System (INIS)

    M. Gorchtein; P.A.M. Guichon; M. Vanderhaeghen

    2004-01-01

    We discuss the two-photon exchange contribution to observables which involve lepton helicity flip in elastic lepton-nucleon scattering. This contribution is accessed through the single spin asymmetry for a lepton beam polarized normal to the scattering plane. We estimate this beam normal spin asymmetry at large momentum transfer using a parton model and we express the corresponding amplitude in terms of generalized parton distributions. We further discuss this observable in the quasi-RCS kinematics which may be dominant at certain kinematical conditions and find it to be governed by the photon helicity-flip RCS amplitudes

  5. Normal Spin Asymmetries in Elastic Electron-Proton Scattering

    International Nuclear Information System (INIS)

    Gorchtein, M.; Guichon, P.A.M.; Vanderhaeghen, M.

    2005-01-01

    We discuss the two-photon exchange contribution to observables which involve lepton helicity flip in elastic lepton-nucleon scattering. This contribution is accessed through the single spin asymmetry for a lepton beam polarized normal to the scattering plane. We estimate this beam normal spin asymmetry at large momentum transfer using a parton model and we express the corresponding amplitude in terms of generalized parton distributions. We further discuss this observable in the quasi-RCS kinematics which may be dominant at certain kinematical conditions and find it to be governed by the photon helicity-flip RCS amplitudes

  6. A skewed distribution with asset pricing applications

    NARCIS (Netherlands)

    de Roon, Frans; Karehnke, P.

    2017-01-01

    Recent research has identified skewness and downside risk as one of the most important features of risk. We present a new distribution which makes modeling skewed risks no more difficult than normally distributed (symmetric) risks. Our distribution is a combination of the “downside” and “upside”

  7. A One-Sample Test for Normality with Kernel Methods

    OpenAIRE

Kellner, Jérémie; Celisse, Alain

    2015-01-01

    We propose a new one-sample test for normality in a Reproducing Kernel Hilbert Space (RKHS). Namely, we test the null-hypothesis of belonging to a given family of Gaussian distributions. Hence our procedure may be applied either to test data for normality or to test parameters (mean and covariance) if data are assumed Gaussian. Our test is based on the same principle as the MMD (Maximum Mean Discrepancy) which is usually used for two-sample tests such as homogeneity or independence testing. O...

  8. Studies on the distribution of hematopoietic bone marrow by bone marrow scintigraphy, 2. The bone marrow distribution in leukemia

    Energy Technology Data Exchange (ETDEWEB)

    Fujimori, K [Kyoto Univ. (Japan). Faculty of Medicine

    1976-04-01

Distribution of the leukemic marrow was investigated in 42 cases by bone marrow scintigraphy using 99mTc sulfur colloid, in association with clinical findings and ferrokinetic studies, in order to clarify hematopoietic function in leukemia: 17 of chronic myelogenous leukemia, 3 of lymphatic leukemia, 2 of monocytic leukemia, 7 of atypical leukemia and one of erythroleukemia. The 12 cases of acute myelogenous leukemia were classified into 3 types, A, B and C. Type A showed a distribution similar to that obtained with normal controls; ferrokinetic studies, however, indicated complete absence of erythropoiesis. Type B showed complete lack of 99mTc activity in the usual marrow sites, although ferrokinetic data showed normal erythropoietic function. Type C showed abnormal concentration of 99mTc sulfur colloid in the tibiae. The 17 cases of chronic myelogenous leukemia showed reduced 99mTc activity in the usual marrow sites and remarkably expanded marrow extending into the distal femurs, proximal and distal tibiae and bones of the feet. The 2 acute lymphatic leukemia patients showed complete absence of 99mTc activity, while the one chronic case showed an almost normal distribution. Monocytic leukemia showed decreased marrow distribution in the sternum and vertebrae. Of the 6 atypical leukemias, one showed an almost normal distribution; the others, including a case of hypoplastic leukemia, demonstrated marrow extension similar to that observed in chronic myelogenous leukemia or monocytic leukemia. Erythroleukemia showed increased concentration of 99mTc activity in the usual marrow sites and marked marrow expansion throughout all long bones. These results suggest a discrepancy between bone marrow distribution and hematopoietic function in the cases of acute myelogenous leukemia.

  9. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts

    International Nuclear Information System (INIS)

    Sakashita, Tetsuya; Kobayashi, Yasuhiko; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Saito, Kimiaki

    2014-01-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/μm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log–log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. All together, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE. (author)
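    The colony-scoring idea lends itself to a minimal Galton-Watson sketch. The per-generation death probability q, the generation cutoff and the scoring rule below are illustrative choices, not parameters fitted to the carbon-ion data.

    ```python
    import numpy as np

    # Minimal branching process: each cell divides with probability 1 - q or
    # undergoes reproductive cell death (RCD) with probability q, per
    # generation. Colonies extinct within the horizon are scored "abortive"
    # and their total cell production recorded.
    rng = np.random.default_rng(8)
    q, generations, trials = 0.35, 10, 3000

    abortive_sizes = []        # cells ever produced by colonies that died out
    n_surviving = 0
    for _ in range(trials):
        alive, total = 1, 1
        for _ in range(generations):
            if alive == 0:
                break
            divides = rng.binomial(alive, 1.0 - q)  # dividers this generation
            total += divides                        # each division adds one cell
            alive = 2 * divides                     # non-dividers die (RCD)
        if alive == 0:
            abortive_sizes.append(total)
        else:
            n_surviving += 1
    ```

    A log-log histogram of abortive_sizes reproduces the qualitative linear trend the authors observe, and n_surviving / trials plays the role of the surviving fraction; fitting q (or a generation-dependent RCD probability) to data is the step the paper's Monte Carlo performs.
    
    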

  10. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    Science.gov (United States)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. Near log-normality and rather intermittent behavior of the flame-length ratio suggest similarity with dissipation rate quantities, which stimulates multifractal analysis. Currently at Indian Institute of Science, India.

  11. On the fission gas release from oxide fuels during normal grain growth

    International Nuclear Information System (INIS)

    Paraschiv, M.C.; Paraschiv, A.; Glodeanu, F.

    1997-01-01

    A mathematical formalism is proposed for calculating the fission gas release from oxide fuels, considering an arbitrary distribution of fuel grain sizes with only a zero boundary condition for gas diffusion at the grain boundary. It is also proved that the grain volume distribution function need not be considered for fission-product diffusion when grain-boundary gas re-solution is taken into account, provided the thermodynamic forces on grain boundaries depend only on time. In order to highlight the effect of normal grain growth on fission gas release from oxide fuels, Hillert's theory and Lifshitz and Slyozov's theory were selected. The latter was used to give an adequate treatment of normal grain growth for diffusion-controlled grain-boundary movement in oxide fuels. It is shown that during fuel irradiation the asymptotic form of the grain volume distribution functions given by the Hillert and Lifshitz-Slyozov models can be maintained, but the grain growth rate constant itself becomes time dependent. Experimental results have been used to correlate the two theoretical models of normal grain growth with the fission gas release from oxide fuels. (orig.)

  12. Exciting Normal Distribution

    Science.gov (United States)

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…

  13. Interpretation of the log-normal behaviour of uranium grades in intrusive rocks

    International Nuclear Information System (INIS)

    Valencia, Jacinto; Palacios, Andres; Maguina, Jose

    2015-01-01

    We discuss the analysis and processing of uranium grades obtained from an intrusive rock by the gamma spectrometry method; a better correlation between uranium and thorium results when the logarithm of these analyses is used, represented in a thorium/uranium diagram that gives a better response. This is because the log-normal distribution expresses more closely the spatial distribution of uranium in a mineral deposit. The representation of a normal distribution and of a log-normal distribution is shown. The interpretative part explains, by means of diagrams, the behaviour of the thorium/uranium ratio and its relation to potassium, from direct measurements of grades obtained in the field at sampling points along a section of the San Ramon granite (SR) and of the volcanic Mitu Group (GM), where the granite of this unit has been identified as a source of uranium. (author)

  14. Multiple imputation in the presence of non-normal data.

    Science.gov (United States)

    Lee, Katherine J; Carlin, John B

    2017-02-20

    Multiple imputation (MI) is becoming increasingly popular for handling missing data. Standard approaches for MI assume normality for continuous variables (conditionally on the other variables in the imputation model). However, it is unclear how to impute non-normally distributed continuous variables. Using simulation and a case study, we compared various transformations applied prior to imputation, including a novel non-parametric transformation, to imputation on the raw scale and using predictive mean matching (PMM) when imputing non-normal data. We generated data from a range of non-normal distributions, and set 50% to missing completely at random or missing at random. We then imputed missing values on the raw scale, following a zero-skewness log, Box-Cox or non-parametric transformation and using PMM with both type 1 and 2 matching. We compared inferences regarding the marginal mean of the incomplete variable and the association with a fully observed outcome. We also compared results from these approaches in the analysis of depression and anxiety symptoms in parents of very preterm compared with term-born infants. The results provide novel empirical evidence that the decision regarding how to impute a non-normal variable should be based on the nature of the relationship between the variables of interest. If the relationship is linear in the untransformed scale, transformation can introduce bias irrespective of the transformation used. However, if the relationship is non-linear, it may be important to transform the variable to accurately capture this relationship. A useful alternative is to impute the variable using PMM with type 1 matching. Copyright © 2016 John Wiley & Sons, Ltd.
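Type-1 predictive mean matching, one of the approaches compared above, can be sketched in a few lines: match on predicted values from a linear model, then draw the *observed* value of a nearby donor, so imputed values stay on the data's (possibly skewed) scale. The data and function names below are illustrative, not from the paper.

```python
import random

random.seed(2)

def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x on the complete cases.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def pmm_impute(obs_x, obs_y, miss_x, k=5):
    """Type-1 PMM (sketch): match each incomplete case to the k complete
    cases with the closest predicted values, then copy one donor's
    observed y, so imputations are always real observations."""
    a, b = fit_line(obs_x, obs_y)
    out = []
    for x in miss_x:
        pred = a + b * x
        donors = sorted(range(len(obs_x)),
                        key=lambda i: abs(a + b * obs_x[i] - pred))[:k]
        out.append(obs_y[random.choice(donors)])
    return out

# Skewed (squared) outcome, with y missing for 100 additional cases.
obs_x = [random.gauss(0, 1) for _ in range(200)]
obs_y = [(x + random.gauss(0, 0.5)) ** 2 for x in obs_x]
imputed = pmm_impute(obs_x, obs_y, [random.gauss(0, 1) for _ in range(100)])
```

Because every imputed value is a real observation, the approach cannot produce impossible values (here, negative ones) the way imputation on a transformed scale can.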

  15. Statistical distributions of optimal global alignment scores of random protein sequences

    Directory of Open Access Journals (Sweden)

    Tang Jiaowei

    2005-10-01

    Background: The inference of homology from statistically significant sequence similarity is a central issue in sequence alignment. So far the statistical distribution function underlying optimal global alignments has not been completely determined. Results: In this study, random and real but unrelated sequences prepared in six different ways were selected as reference datasets to obtain their respective statistical distributions of global alignment scores. All alignments were carried out with the Needleman-Wunsch algorithm, and optimal scores were fitted to the Gumbel, normal and gamma distributions respectively. The three-parameter gamma distribution performs best as the theoretical distribution function of global alignment scores, as it agrees well with the empirical distribution of alignment scores. The normal distribution also agrees well with the score frequencies when the shape parameter of the gamma distribution is sufficiently large, since in that regime the normal distribution is a good approximation of the gamma distribution. Conclusion: We have shown that the optimal global alignment scores of random protein sequences fit the three-parameter gamma distribution function. This should be useful for inferring homology between sequences whose relationship is unknown, through evaluation of gamma distribution significance between sequences.
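A quick way to see how a gamma fit to score data works is the method of moments for the two-parameter gamma (the paper fits the three-parameter version; the location shift is omitted here for brevity). Synthetic scores stand in for real Needleman-Wunsch outputs, with invented shape and scale values.

```python
import random
import statistics

random.seed(3)
# Synthetic stand-in for optimal global alignment scores: gamma with
# shape k_true and scale theta_true (values invented for illustration).
k_true, theta_true = 9.0, 4.0
scores = [random.gammavariate(k_true, theta_true) for _ in range(20000)]

# Method-of-moments fit of the two-parameter gamma:
#   mean = k * theta,  variance = k * theta**2
m = statistics.fmean(scores)
v = statistics.pvariance(scores)
k_hat = m * m / v
theta_hat = v / m
```

The abstract's observation about the normal approximation follows directly: a gamma with large shape k has skewness 2/sqrt(k), which vanishes as k grows.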

  16. A heteroscedastic generalized linear model with a non-normal speed factor for responses and response times.

    Science.gov (United States)

    Molenaar, Dylan; Bolsinova, Maria

    2017-05-01

    In generalized linear modelling of responses and response times, the observed response time variables are commonly transformed to make their distribution approximately normal. A normal distribution for the transformed response times is desirable as it justifies the linearity and homoscedasticity assumptions in the underlying linear model. Past research has, however, shown that the transformed response times are not always normal. Models have been developed to accommodate this violation. In the present study, we propose a modelling approach for responses and response times to test and model non-normality in the transformed response times. Most importantly, we distinguish between non-normality due to heteroscedastic residual variances, and non-normality due to a skewed speed factor. In a simulation study, we establish parameter recovery and the power to separate both effects. In addition, we apply the model to a real data set. © 2017 The Authors. British Journal of Mathematical and Statistical Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  17. Exploring the Potential Use of the Birnbaum-Saunders Distribution in Inventory Management

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2015-01-01

    Choosing a suitable demand distribution during lead time is an important issue in inventory models. Much research has explored the advantage of following a distributional assumption different from normality. The Birnbaum-Saunders (BS) distribution is a probabilistic model that has its genesis in engineering but is also being widely applied to other fields including business, industry, and management. We conduct numerical experiments using the R statistical software to assess the adequacy of the BS distribution against the normal and gamma distributions in light of the traditional lot size-reorder point inventory model, known as (Q, r). The BS distribution is well known to be robust to extreme values; indeed, results indicate that it is a more adequate assumption under higher values of the lead-time demand coefficient of variation, thus outperforming the gamma and normal assumptions.
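The BS distribution is easy to sample via its defining normal representation, which is one way to explore its behaviour as a lead-time demand model. A hedged sketch follows: the parameter values are invented, and the quantile-based reorder point is our illustration, not the paper's procedure.

```python
import math
import random
import statistics

random.seed(4)

def rbs(alpha, beta):
    """One Birnbaum-Saunders draw via its normal representation:
    T = beta * (a + sqrt(a**2 + 1))**2 with a = alpha*Z/2, Z ~ N(0, 1)."""
    a = alpha * random.gauss(0.0, 1.0) / 2.0
    return beta * (a + math.sqrt(a * a + 1.0)) ** 2

# Invented lead-time demand parameters: shape alpha = 0.5, median beta = 100.
demand = [rbs(0.5, 100.0) for _ in range(20000)]
median_demand = statistics.median(demand)  # should sit near beta

# Illustrative reorder point r for a ~95% cycle service level, read off
# the empirical lead-time demand distribution.
r = sorted(demand)[int(0.95 * len(demand))]
```

Because the BS density is strictly positive and right-skewed, such a reorder point sits well above the median, unlike under a symmetric normal assumption with the same centre.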

  18. The distribution choice for the threshold of solid state relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    Either the normal distribution or the Weibull distribution can be accepted as the sampling distribution of the threshold of a solid-state relay. Using the goodness-of-fit method, the bootstrap method, and the Bayesian method, the Weibull distribution is ultimately chosen. (authors)
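A goodness-of-fit comparison of the kind described can be sketched with the one-sample Kolmogorov-Smirnov statistic. The Weibull data and all parameter values below are invented stand-ins for measured relay thresholds.

```python
import math
import random

random.seed(18)

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: largest gap between the
    empirical CDF and a candidate CDF (smaller = better fit)."""
    xs = sorted(sample)
    n = len(xs)
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(xs))

# Hypothetical thresholds drawn from a Weibull(shape=1.5, scale=10)
# via inverse-CDF sampling.
data = [10.0 * (-math.log(1.0 - random.random())) ** (1.0 / 1.5)
        for _ in range(2000)]

def weibull_cdf(x):
    return 1.0 - math.exp(-((x / 10.0) ** 1.5))

# Normal candidate with moment-matched mean and standard deviation.
m = sum(data) / len(data)
s = (sum((x - m) ** 2 for x in data) / (len(data) - 1)) ** 0.5

def normal_cdf(x):
    return 0.5 * math.erfc((m - x) / (s * math.sqrt(2.0)))

d_weibull = ks_statistic(data, weibull_cdf)
d_normal = ks_statistic(data, normal_cdf)  # expected to fit worse here
```

For skewed, strictly positive threshold data like this, the Weibull candidate achieves the smaller KS distance, mirroring the choice reported in the abstract.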

  19. The Distribution of the Sum of Signed Ranks

    Science.gov (United States)

    Albright, Brian

    2012-01-01

    We describe the calculation of the distribution of the sum of signed ranks and develop an exact recursive algorithm for the distribution as well as an approximation of the distribution using the normal. The results have applications to the non-parametric Wilcoxon signed-rank test.
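The exact recursive algorithm mentioned in the abstract is the classical convolution over ranks: each rank r contributes either 0 or r to the sum of positive ranks W+, giving the recursion c_n(w) = c_{n-1}(w) + c_{n-1}(w - n). A small self-contained implementation:

```python
def signed_rank_counts(n):
    """Exact null distribution of the Wilcoxon signed-rank statistic W+
    (sum of ranks with positive sign) for sample size n, via the
    recursion c_n(w) = c_{n-1}(w) + c_{n-1}(w - n)."""
    counts = [1]  # n = 0: only W+ = 0
    for rank in range(1, n + 1):
        new = [0] * (len(counts) + rank)
        for w, c in enumerate(counts):
            new[w] += c          # rank gets a negative sign
            new[w + rank] += c   # rank gets a positive sign
        counts = new
    return counts  # counts[w] / 2**n is P(W+ = w)

counts = signed_rank_counts(4)
total = sum(counts)  # 2**4 equally likely sign assignments
```

The distribution is symmetric about its mean n(n+1)/4, which is what makes the normal approximation discussed in the article work well for moderate n.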

  20. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    International Nuclear Information System (INIS)

    Gigase, Yves

    2007-01-01

    Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, particularly in decision processes where the uncertainty on the amount of activity is considered important, such as probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
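The elegance of the log-normal approach comes from closure under multiplication: a product of independent log-normal factors is again log-normal, with log-means adding and log-variances adding in quadrature. A minimal sketch, loosely echoing the scaling factor method; the two factors and all numbers are invented for illustration.

```python
import math

# If a characteristic is a product of independent log-normal factors with
# log-means mu_i and log-sds sigma_i, the product is log-normal with
# mu = sum(mu_i) and sigma**2 = sum(sigma_i**2).
factors = [
    (math.log(40.0), 0.30),  # e.g. a measured key-nuclide activity
    (math.log(0.05), 0.80),  # e.g. a scaling factor to a hard-to-measure nuclide
]
mu = sum(m for m, s in factors)
sigma = math.sqrt(sum(s * s for m, s in factors))

median = math.exp(mu)  # geometric mean of the product
interval_95 = (math.exp(mu - 1.96 * sigma), math.exp(mu + 1.96 * sigma))
```

The resulting interval is multiplicative (asymmetric around the median), which is exactly the kind of uncertainty interval that "makes sense" for strictly positive quantities such as activities.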

  1. Air Distribution Effectiveness for Residential Mechanical Ventilation: Simulation and Comparison of Normalized Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Petithuguenin, T.D.P.; Sherman, M.H.

    2009-05-01

    The purpose of ventilation is to dilute indoor contaminants that an occupant is exposed to. Even when providing the same nominal rate of outdoor air, different ventilation systems may distribute air in different ways, affecting occupants' exposure to household contaminants. Exposure ultimately depends on the home being considered, on source disposition and strength, on occupants' behavior, on the ventilation strategy, and on operation of forced air heating and cooling systems. In any multi-zone environment, dilution rates and source strengths may be different in every zone and change in time, resulting in exposure being tied to occupancy patterns. This paper will report on simulations that compare ventilation systems by assessing their impact on exposure by examining common house geometries, contaminant generation profiles, and occupancy scenarios. These simulations take into account the unsteady, occupancy-tied aspects of ventilation such as bathroom and kitchen exhaust fans. As most US homes have central HVAC systems, the simulation results will be used to make appropriate recommendations and adjustments for distribution and mixing to residential ventilation standards such as ASHRAE Standard 62.2. This paper will also report on work being done to model multizone airflow systems that are unsteady, and elaborates the concept of a distribution matrix. It will examine several metrics for evaluating the effect of air distribution on exposure to pollutants, based on previous work by Sherman et al. (2006).

  2. Distribution and migration of aftershocks of the 2010 Mw 7.4 Ogasawara Islands intraplate normal-faulting earthquake related to a fracture zone in the Pacific plate

    Science.gov (United States)

    Obana, Koichiro; Takahashi, Tsutomu; No, Tetsuo; Kaiho, Yuka; Kodaira, Shuichi; Yamashita, Mikiya; Sato, Takeshi; Nakamura, Takeshi

    2014-04-01

    We describe the aftershocks of a Mw 7.4 intraplate normal-faulting earthquake that occurred 150 km east of the Ogasawara (Bonin) Islands, Japan, on 21 December 2010. It occurred beneath the outer trench slope of the Izu-Ogasawara trench, where the Pacific plate subducts beneath the Philippine Sea plate. Aftershock observations using ocean bottom seismographs (OBSs) began soon after the earthquake, and multichannel seismic reflection surveys were conducted across the aftershock area. Aftershocks were distributed in a NW-SE belt 140 km long, oblique to the N-S trench axis. They formed three subparallel lineations along a fracture zone in the Pacific plate. The OBS observations combined with data from stations on Chichi-jima and Haha-jima Islands revealed a migration of the aftershock activity. Activity in the first hour, which likely outlines the main shock rupture, was limited to an 80-km-long area in the central part of the subsequent aftershock area. The first-hour activity occurred mainly around, and appears to have been influenced by, nearby large seamounts and an oceanic plateau, such as the Ogasawara Plateau and the Uyeda Ridge. Over the following days, the aftershocks expanded beyond or into these seamounts and plateau. The aftershock distribution and migration suggest that crustal heterogeneities related to a fracture zone and large seamounts and oceanic plateau in the incoming Pacific plate affected the rupture of the main shock. Such preexisting structures may influence intraplate normal-faulting earthquakes in other regions of plate flexure prior to subduction.

  3. Effect of Smart Meter Measurements Data On Distribution State Estimation

    DEFF Research Database (Denmark)

    Pokhrel, Basanta Raj; Nainar, Karthikeyan; Bak-Jensen, Birgitte

    2018-01-01

    Smart distribution grids with renewable-energy-based generators and demand response resources (DRR) require accurate state estimators for real-time control. Distribution grid state estimators are normally based on accumulated smart meter measurements. However, an increase of measurements in the physical grid can place significant stress not only on the communication infrastructure but also on the control algorithms. This paper aims to propose a methodology to analyze the real-time smart meter data needed from low-voltage distribution grids and their applicability in distribution state estimation...

  4. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
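The univariate building block, modelling threshold exceedances with a GPD, can be sketched with a method-of-moments fit. The paper's high-dimensional extension is beyond this snippet; the data are synthetic and the parameters invented.

```python
import random
import statistics

random.seed(7)

def rgpd(xi, sigma):
    # Inverse-CDF draw from the generalised Pareto distribution (xi != 0).
    u = 1.0 - random.random()  # uniform on (0, 1]
    return sigma * (u ** -xi - 1.0) / xi

# Synthetic threshold exceedances of a novelty score (invented parameters).
xi_true, sigma_true = 0.1, 1.0
exceedances = [rgpd(xi_true, sigma_true) for _ in range(50000)]

# Method-of-moments fit, valid for xi < 1/2:
#   mean = sigma / (1 - xi),  var = sigma**2 / ((1 - xi)**2 * (1 - 2*xi))
m = statistics.fmean(exceedances)
v = statistics.pvariance(exceedances)
xi_hat = 0.5 * (1.0 - m * m / v)
sigma_hat = 0.5 * m * (m * m / v + 1.0)
```

The fitted tail then supplies calibrated probabilities for scores beyond the threshold, which is where the "normal"/"abnormal" decision boundary lives.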

  5. Log-normal spray drop distribution...analyzed by two new computer programs

    Science.gov (United States)

    Gerald S. Walton

    1968-01-01

    Results of U.S. Forest Service research on chemical insecticides suggest that large drops are not as effective as small drops in carrying insecticides to target insects. Two new computer programs have been written to analyze size distribution properties of drops from spray nozzles. Coded in Fortran IV, the programs have been tested on both the CDC 6400 and the IBM 7094...

  6. Anatomic distribution of culprit lesions in patients with non-ST-segment elevation myocardial infarction and normal ECG.

    Science.gov (United States)

    Moustafa, Abdelmoniem; Abi-Saleh, Bernard; El-Baba, Mohammad; Hamoui, Omar; AlJaroudi, Wael

    2016-02-01

    In patients presenting with non-ST-elevation myocardial infarction (NSTEMI), left anterior descending (LAD) coronary artery and three-vessel disease are the most commonly encountered culprit lesions in the presence of ST depression, while one third of patients with left circumflex (LCX) artery-related infarction have a normal ECG. We sought to determine the predictors of the presence of a culprit lesion in NSTEMI patients based on ECG, echocardiographic, and clinical characteristics. Patients admitted to the coronary care unit with the diagnosis of NSTEMI between June 2012 and December 2013 were retrospectively identified. The admission ECG was interpreted by an electrophysiologist blinded to the result of the coronary angiogram. Patients were dichotomized into either a normal or an abnormal ECG group. The primary endpoint was presence of a culprit lesion. Secondary endpoints included length of stay, re-hospitalization within 60 days, and in-hospital mortality. A total of 118 patients were identified: 47 with normal and 71 with abnormal ECG. At least one culprit lesion was identified in 101 patients (86%), significantly more often among those with abnormal ECG (91.5% vs. 76.6%, P=0.041). The LAD was the most frequently detected culprit lesion in both groups. There was a higher incidence of two- and three-vessel disease in the abnormal ECG group (P=0.041). On the other hand, there was a trend toward higher LCX involvement (25% vs. 13.8%, P=0.18) and more normal coronary arteries in the normal ECG group (23.4% vs. 8.5%, P=0.041). On multivariate analysis, prior history of coronary artery disease (CAD) [odds ratio (OR) 6.4 (0.8-52)], male gender [OR 5.0 (1.5-17)], and abnormal admission ECG [OR 3.6 (1.12-12)] were independent predictors of a culprit lesion. There was no difference in secondary endpoints between those with normal and abnormal ECG. Among patients presenting with NSTEMI, prior history of CAD, male gender, and abnormal admission ECG were independent predictors of a culprit lesion.

  7. Normal lymphographic findings and diagnostic errors in the retroperitoneal space

    International Nuclear Information System (INIS)

    Klein, U.; Heinze, H.G.

    1980-01-01

    Diagnostic errors in the lymphograms of 194 normal patients are shown to be due to topographical variations of the lymph system, influx and distribution of contrast media in the lymph ducts and nodes, and degenerative changes caused by old age. (orig.) [de

  8. Likelihood Inference of Nonlinear Models Based on a Class of Flexible Skewed Distributions

    Directory of Open Access Journals (Sweden)

    Xuedong Chen

    2014-01-01

    This paper deals with likelihood inference for nonlinear models with a flexible skew-t-normal (FSTN) distribution, which is proposed within a general framework of flexible skew-symmetric (FSS) distributions by combining with the skew-t-normal (STN) distribution. In comparison with common skewed distributions such as the skew normal (SN) and skew-t (ST), as well as scale mixtures of skew normal (SMSN), the FSTN distribution can accommodate more flexibility and robustness in the presence of skewed, heavy-tailed, and especially multimodal outcomes. However, for this distribution the usual approach of maximum likelihood estimation based on the EM algorithm becomes unavailable, and an alternative is to return to the original Newton-Raphson type method. In order to improve the estimation, as well as the procedures for confidence estimation and hypothesis testing for the parameters of interest, a modified Newton-Raphson iterative algorithm is presented in this paper, based on the profile likelihood for nonlinear regression models with the FSTN distribution; the confidence interval and hypothesis test are then also developed. Furthermore, a real example and a simulation are conducted to demonstrate the usefulness and superiority of our approach.

  9. Normalized mutual information based PET-MR registration using K-Means clustering and shading correction

    NARCIS (Netherlands)

    Knops, Z.F.; Maintz, J.B.A.; Viergever, M.A.; Pluim, J.P.W.; Gee, J.C.; Maintz, J.B.A.; Vannier, M.W.

    2003-01-01

    A method for the efficient re-binning and shading based correction of intensity distributions of the images prior to normalized mutual information based registration is presented. Our intensity distribution re-binning method is based on the K-means clustering algorithm as opposed to the generally

  10. C4.4A gene ablation is compatible with normal epidermal development and causes modest overt phenotypes

    DEFF Research Database (Denmark)

    Kriegbaum, Mette Camilla; Jacobsen, Benedikte; Füchtbauer, Annette

    2016-01-01

    of C4.4A in normal physiology and cancer progression. The unchallenged C4.4A-deficient mice were viable, fertile, born in a normal Mendelian distribution and, surprisingly, displayed normal development of squamous epithelia. The C4.4A-deficient mice were, nonetheless, significantly lighter than...

  11. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
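The gap between Gaussian and heavy-tailed expectations quoted above ("5σ disagreements up to five orders of magnitude more frequent") is easy to reproduce for the extreme case of a Cauchy (Student's t with one degree of freedom) versus a normal:

```python
import math

def normal_two_sided_tail(k):
    # P(|Z| > k) for a standard normal, via the complementary error function.
    return math.erfc(k / math.sqrt(2.0))

def cauchy_two_sided_tail(k):
    # P(|X| > k) for a standard Cauchy (Student's t with 1 degree of freedom).
    return 1.0 - (2.0 / math.pi) * math.atan(k)

# How much more often does a "5 sigma" outlier occur under the heavy tail?
ratio_5sigma = cauchy_two_sided_tail(5.0) / normal_two_sided_tail(5.0)
```

The normal tail probability at 5σ is below 10⁻⁶, while the Cauchy tail is about 0.126, a ratio exceeding 10⁵, in line with the article's finding.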

  12. Fractionation in normal tissues: the (α/β)eff concept can account for dose heterogeneity and volume effects.

    Science.gov (United States)

    Hoffmann, Aswin L; Nahum, Alan E

    2013-10-07

    The simple Linear-Quadratic (LQ)-based Withers iso-effect formula (WIF) is widely used in external-beam radiotherapy to derive a new tumour dose prescription such that there is normal-tissue (NT) iso-effect when changing the fraction size and/or number. However, as conventionally applied, the WIF is invalid unless the normal-tissue response is solely determined by the tumour dose. We propose a generalized WIF (gWIF) which retains the tumour prescription dose, but replaces the intrinsic fractionation sensitivity measure (α/β) by a new concept, the normal-tissue effective fractionation sensitivity, [Formula: see text], which takes into account both the dose heterogeneity in, and the volume effect of, the late-responding normal-tissue in question. Closed-form analytical expressions for [Formula: see text] ensuring exact normal-tissue iso-effect are derived for: (i) uniform dose, and (ii) arbitrary dose distributions with volume-effect parameter n = 1 from the normal-tissue dose-volume histogram. For arbitrary dose distributions and arbitrary n, a numerical solution for [Formula: see text] exhibits a weak dependence on the number of fractions. As n is increased, [Formula: see text] increases from its intrinsic value at n = 0 (100% serial normal-tissue) to values close to or even exceeding the tumour (α/β) at n = 1 (100% parallel normal-tissue), with the highest values of [Formula: see text] corresponding to the most conformal dose distributions. Applications of this new concept to inverse planning and to highly conformal modalities are discussed, as is the effect of possible deviations from LQ behaviour at large fraction sizes.
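For reference, the simple LQ-based WIF that the paper generalises converts between fraction sizes as follows. This is the standard textbook relation; the numbers below are a generic example, not values from the paper.

```python
# Standard LQ iso-effect: total doses D1 (fraction size d1) and D2 (d2) are
# iso-effective for a tissue with fractionation sensitivity ab = alpha/beta
# when D1 * (1 + d1/ab) == D2 * (1 + d2/ab).
def isoeffective_total_dose(D1, d1, d2, ab):
    """Total dose at fraction size d2 matching the effect of D1 at d1."""
    return D1 * (d1 + ab) / (d2 + ab)

# Generic example: 60 Gy in 2 Gy fractions, late-responding tissue with
# alpha/beta = 3 Gy, converted to 3 Gy fractions.
D2 = isoeffective_total_dose(60.0, 2.0, 3.0, 3.0)  # 60 * 5/6 = 50 Gy
```

The paper's contribution is precisely that the (α/β) entering this formula should be replaced by an effective value reflecting the normal-tissue dose distribution and volume effect, rather than the intrinsic one.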

  13. Tumour control probability (TCP) for non-uniform activity distribution in radionuclide therapy

    International Nuclear Information System (INIS)

    Uusijaervi, Helena; Bernhardt, Peter; Forssell-Aronsson, Eva

    2008-01-01

    Non-uniform radionuclide distribution in tumours will lead to a non-uniform absorbed dose. The aim of this study was to investigate how tumour control probability (TCP) depends on the radionuclide distribution in the tumour, both macroscopically and at the subcellular level. The absorbed dose in the cell nuclei of tumours was calculated for 90Y, 177Lu, 103mRh and 211At. The radionuclides were uniformly distributed within the subcellular compartment, and they were uniformly, normally or log-normally distributed among the cells in the tumour. When all cells contain the same amount of activity, the cumulated activities required for TCP = 0.99 (Ã(TCP=0.99)) were 1.5-2 and 2-3 times higher when the activity was distributed on the cell membrane compared to in the cell nucleus for 103mRh and 211At, respectively. TCP for 90Y was not affected by different radionuclide distributions, whereas for 177Lu it was slightly affected when the radionuclide was in the nucleus. TCP for 103mRh and 211At were affected by different radionuclide distributions to a great extent when the radionuclides were in the cell nucleus, and to lesser extents when the radionuclides were distributed on the cell membrane or in the cytoplasm. When the activity was distributed in the nucleus, Ã(TCP=0.99) increased as the activity distribution became more heterogeneous for 103mRh and 211At, and the increase was large when the activity was normally distributed compared to log-normally distributed. When the activity was distributed on the cell membrane, Ã(TCP=0.99) was not affected for 103mRh and 211At as the activity distribution became more heterogeneous. Ã(TCP=0.99) for 90Y and 177Lu were not affected by different activity distributions, neither macroscopic nor subcellular.
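The qualitative effect reported here, that dose heterogeneity lowers TCP for a fixed mean dose, follows from the Poisson TCP model. Below is a toy sketch: the cell number, doses and exponential kill model are invented for illustration and are not the paper's dosimetry.

```python
import math
import random

random.seed(10)

def tcp(survival_probs):
    # Poisson TCP: exp(-expected number of surviving clonogenic cells).
    return math.exp(-sum(survival_probs))

# Toy numbers (invented): 10_000 cells, per-cell dose either uniform or
# log-normal with the same mean, and exponential cell kill S(d) = exp(-d/d0).
n_cells, mean_dose, d0 = 10_000, 10.0, 1.0
uniform_doses = [mean_dose] * n_cells
sigma = 0.5  # log-sd; mu set so the log-normal mean equals mean_dose
mu = math.log(mean_dose) - sigma ** 2 / 2
lognormal_doses = [math.exp(random.gauss(mu, sigma)) for _ in range(n_cells)]

tcp_uniform = tcp(math.exp(-d / d0) for d in uniform_doses)
tcp_lognormal = tcp(math.exp(-d / d0) for d in lognormal_doses)
# Heterogeneity leaves more under-dosed (surviving) cells, so TCP drops.
```

By Jensen's inequality the mean survival under a heterogeneous dose always exceeds the survival at the mean dose, which is why a higher cumulated activity is needed to restore TCP = 0.99.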

  14. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  15. Diffusion Strategy-Based Distributed Operation of Microgrids Using Multiagent System

    Directory of Open Access Journals (Sweden)

    Van-Hai Bui

    2017-07-01

    In distributed operation, each unit is operated by its local controller instead of a centralized controller, which allows actions to be based on local rather than global information. Most distributed solutions implement the consensus method; however, the convergence time of the consensus method is quite long, while the diffusion strategy includes a stochastic gradient term and can reach convergence much faster. Therefore, in this paper, a diffusion strategy-based distributed operation of microgrids (MGs) using a multiagent system is proposed for both normal and emergency operation modes. In normal operation, the MG system is operated by a central controller instead of the distributed controller to minimize the operation cost. If any event (fault) occurs in the system, the MG system can be divided into two parts to isolate the faulty region. In this case, the MG system changes to emergency operation mode. The normal part is rescheduled by the central controller, while the isolated part schedules its resources in a distributed manner, carrying out distributed communication using diffusion between neighboring agents for optimal operation. The proposed method enables peer-to-peer communication among the agents without the need for a centralized controller, and simultaneously performs resource optimization. Simulation results show that the system can be operated economically in both normal and emergency operation modes.
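The diffusion strategy's adapt-then-combine step can be illustrated on a toy resource-sharing problem: each agent takes a local gradient step, then averages with its neighbours. Everything below (network, cost functions, step size) is an invented minimal example, not the paper's microgrid model.

```python
# Each agent i holds a local quadratic cost J_i(w) = (w - t_i)**2; the global
# optimum is the mean of the t_i. Adapt-then-combine (ATC) diffusion:
#   adapt:   psi_i = w_i - mu * grad J_i(w_i)
#   combine: w_i   = average of psi_j over agent i's neighbourhood
targets = [2.0, 4.0, 6.0, 8.0]  # local optima t_i (invented)
neighbours = {0: [3, 0, 1], 1: [0, 1, 2], 2: [1, 2, 3], 3: [2, 3, 0]}  # ring
w = [0.0, 0.0, 0.0, 0.0]
mu = 0.1  # gradient step size

for _ in range(300):
    psi = [w[i] - mu * 2.0 * (w[i] - targets[i]) for i in range(4)]
    w = [sum(psi[j] for j in neighbours[i]) / len(neighbours[i])
         for i in range(4)]

# With a constant step size the agents agree only up to a small bias,
# clustering around the global optimum sum(targets) / 4 = 5.0.
```

The stochastic gradient term in the adapt step is what lets diffusion track the optimum while information spreads, rather than first averaging and only then optimizing as plain consensus does.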

  16. Role of the normal gut microbiota.

    Science.gov (United States)

    Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D

    2015-08-07

    The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum throughout an individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their functions and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of the structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan based or meat based); and (3) use of antibiotics or antibiotic-like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in a reservoir of organisms with a multidrug-resistant gene pool.

  17. Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2012-02-27

    The entropy and mutual information index are important concepts developed by Shannon in the context of information theory. They have been widely studied in the case of the multivariate normal distribution. We first extend these tools to the full symmetric class of multivariate elliptical distributions and then to the more flexible families of multivariate skew-elliptical distributions. We study in detail the cases of the multivariate skew-normal and skew-t distributions. We apply our findings to the optimal design of an ozone monitoring station network in Santiago de Chile. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
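
For the multivariate normal baseline the Shannon (differential) entropy has the closed form H = ½ ln{(2πe)^d det Σ}, and the mutual information between two components follows by differencing. A minimal check for d = 2, with an arbitrary illustrative covariance matrix:

```python
import math

def mvn_entropy_2d(s11, s22, s12):
    """Differential entropy of a bivariate normal with covariance [[s11, s12], [s12, s22]]."""
    det = s11 * s22 - s12 * s12
    return 0.5 * math.log((2 * math.pi * math.e) ** 2 * det)

h_indep = mvn_entropy_2d(1.0, 1.0, 0.0)   # independent components
h_corr = mvn_entropy_2d(1.0, 1.0, 0.5)    # correlation 0.5

# Mutual information between the two components: I = H(X) + H(Y) - H(X, Y),
# which here equals h_indep - h_corr = -0.5 * ln(1 - rho^2).
mi = h_indep - h_corr
print(round(h_corr, 4), round(mi, 4))
```

Correlation lowers the joint entropy relative to the independent case, and the gap is exactly the mutual information.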

  19. Confidence Intervals for Asbestos Fiber Counts: Approximate Negative Binomial Distribution.

    Science.gov (United States)

    Bartley, David; Slaven, James; Harper, Martin

    2017-03-01

    The negative binomial distribution is adopted for analyzing asbestos fiber counts so as to account for both the sampling errors in capturing only a finite number of fibers and the inevitable human variation in identifying and counting sampled fibers. A simple approximation to this distribution is developed for the derivation of quantiles and approximate confidence limits. The success of the approximation depends critically on the use of Stirling's expansion to sufficient order, on exact normalization of the approximating distribution, on reasonable perturbation of quantities from the normal distribution, and on accurately approximating sums by inverse-trapezoidal integration. Accuracy of the approximation developed is checked through simulation and also by comparison to traditional approximate confidence intervals in the specific case that the negative binomial distribution approaches the Poisson distribution. The resulting statistics are shown to relate directly to early research into the accuracy of asbestos sampling and analysis. Uncertainty in estimating mean asbestos fiber concentrations given only a single count is derived. Decision limits (limits of detection) and detection limits are considered for controlling false-positive and false-negative detection assertions and are compared to traditional limits computed assuming normal distributions. Published by Oxford University Press on behalf of the British Occupational Hygiene Society 2017.
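
As a point of comparison for the approximation described above, exact negative binomial quantiles can be obtained by summing the probability mass function directly. The dispersion and mean values below are illustrative choices, not figures from the paper.

```python
import math

def nbinom_quantile(q, r, mean):
    """Smallest k with P(K <= k) >= q for a negative binomial with integer
    dispersion r and the given mean, so that p = r / (r + mean)."""
    p = r / (r + mean)
    cdf, k = 0.0, 0
    while True:
        cdf += math.comb(k + r - 1, k) * p ** r * (1.0 - p) ** k
        if cdf >= q:
            return k
        k += 1

# Illustrative 95% interval for a mean count of 10 with dispersion r = 5.
lo = nbinom_quantile(0.025, 5, 10.0)
hi = nbinom_quantile(0.975, 5, 10.0)
print(lo, hi)
```

The interval is visibly wider than a Poisson interval at the same mean would be, reflecting the extra-Poisson variation the negative binomial is adopted to capture.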

  20. Stochastic distribution of the required coefficient of friction for level walking--an in-depth study.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2012-01-01

    This study investigated the stochastic distribution of the required coefficient of friction (RCOF) which is a critical element for estimating slip probability. Fifty participants walked under four walking conditions. The results of the Kolmogorov-Smirnov two-sample test indicate that 76% of the RCOF data showed a difference in distribution between both feet for the same participant under each walking condition; the data from both feet were kept separate. The results of the Kolmogorov-Smirnov goodness-of-fit test indicate that most of the distribution of the RCOF appears to have a good match with the normal (85.5%), log-normal (84.5%) and Weibull distributions (81.5%). However, approximately 7.75% of the cases did not have a match with any of these distributions. It is reasonable to use the normal distribution for representation of the RCOF distribution due to its simplicity and familiarity, but each foot had a different distribution from the other foot in 76% of cases. The stochastic distribution of the required coefficient of friction (RCOF) was investigated for use in a statistical model to improve the estimate of slip probability in risk assessment. The results indicate that 85.5% of the distribution of the RCOF appears to have a good match with the normal distribution.
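
A one-sample Kolmogorov-Smirnov goodness-of-fit check of the kind used here can be sketched with the standard library. The sample below is synthetic RCOF-like data, and the fit uses simple moment estimates rather than the study's measurements.

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(1)
# Synthetic friction-coefficient sample (assumed mean 0.20, SD 0.03).
sample = sorted(random.gauss(0.20, 0.03) for _ in range(500))

fitted = NormalDist(mean(sample), stdev(sample))

# KS statistic: largest gap between the empirical and the fitted normal CDF.
n = len(sample)
d_stat = max(
    max(abs((i + 1) / n - fitted.cdf(x)), abs(i / n - fitted.cdf(x)))
    for i, x in enumerate(sample)
)
print(round(d_stat, 4))
```

For truly normal data the statistic stays small; a poor fit (e.g. strongly skewed RCOF data) would push it above the critical value for the chosen significance level.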

  1. The pharmacokinetics, distribution and degradation of human recombinant interleukin 1 beta in normal rats

    DEFF Research Database (Denmark)

    Wogensen, L D; Welinder, B; Hejnaes, K R

    1991-01-01

    We studied the half-lives of the distribution (T1/2 alpha) and elimination (T1/2 beta) phases of human recombinant interleukin 1 beta (rIL-1 beta), and its tissue distribution and cellular localization, by means of mono-labelled, biologically active 125I-rIL-1 beta. After intravenous (i.v.) injection, 125I-rIL-1 beta was eliminated from the circulation with a T1/2 alpha of 2.9 min and a T1/2 beta of 41.1 min. The central and peripheral volumes of distribution were 20.7 and 19.1 ml/rat, respectively, and the metabolic clearance rate was 16.9 ml/min/kg. The kidney and liver showed the highest accumulation of tracer, and autoradiography demonstrated…

  2. Stochastic frontier model approach for measuring stock market efficiency with different distributions.

    Science.gov (United States)

    Hasan, Md Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md Azizul

    2012-01-01

    The stock market is considered essential for economic growth and is expected to contribute to improved productivity. An efficient pricing mechanism of the stock market can be a driving force for channeling savings into profitable investments and thus facilitating optimal allocation of capital. This study investigated the technical efficiency of selected groups of companies in the Bangladesh stock market, that is, the Dhaka Stock Exchange (DSE), using the stochastic frontier production function approach. For this, the authors considered the Cobb-Douglas stochastic frontier in which the technical inefficiency effects are defined by a model with two distributional assumptions. Truncated normal and half-normal distributions were used in the model, and both time-variant and time-invariant inefficiency effects were estimated. The results reveal that technical efficiency decreased gradually over the reference period and that the truncated normal distribution is preferable to the half-normal distribution for technical inefficiency effects. In the time-varying environment, the value of technical efficiency was high for the investment group and low for the bank group compared with the other groups in the DSE market under both distributions, whereas in the time-invariant situation it was high for the investment group but low for the ceramic group.

  4. Positioning effects on lung ventilation in older normal subjects: a technegas study

    International Nuclear Information System (INIS)

    Krieg, S.; McCarren, B.; Alison, J.; Cowell, S.F.; Leiper, C.; Bankstown-Lidcombe Hospital, Sydney, NSW; El Zein, H.

    2002-01-01

    Full text: While the effects of positioning on the distribution of ventilation in the lungs of younger subjects have been relatively well investigated, this is not so in the older age group. Known age-associated changes in the respiratory system are proposed to alter the distribution of ventilation in the lungs of older people. The aim of the present study was therefore to determine the effects of positioning on the distribution of ventilation in the lungs of older normal subjects. The distribution of ventilation in upright sitting and right side lying was measured in ten subjects using Technegas lung ventilation during tidal breathing. In the upright sitting position, ventilation was preferentially distributed to the middle and basal regions (dependent regions). In right side lying, ventilation was preferentially distributed to the right lung (dependent region). These results suggest that the preferential distribution of ventilation to the dependent lung regions in older subjects is mainly due to the gravity-dependent gradient in pleural pressure. It is proposed that this distribution may partly result from loss of elasticity in the lungs with ageing. Predominantly, the distribution of ventilation in the lungs of older normal subjects in our study is similar to that previously described in younger subjects (Amis et al., 1984; Kaneko et al., 1966; Milic-Emili et al., 1966). This suggests that a similar pleural pressure gradient may exist in the lungs of older and younger subjects. This is an important implication, as the majority of patients that physiotherapists treat with cardiopulmonary dysfunction are in the older age group. Further research is required to determine the effects of positioning on the distribution of ventilation in older patients with cardiopulmonary dysfunction to enable direct clinical implications to be made. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  5. A note on inconsistent families of discrete multivariate distributions

    KAUST Repository

    Ghosh, Sugata; Dutta, Subhajit; Genton, Marc G.

    2017-01-01

    We construct a d-dimensional discrete multivariate distribution for which any proper subset of its components belongs to a specific family of distributions, while the joint d-dimensional distribution fails to belong to that family; in other words, it is ‘inconsistent’ with the distribution of these subsets. We also address preservation of this ‘inconsistency’ property for the symmetric binomial distribution, and some discrete distributions arising from the multivariate discrete normal distribution.

  7. Forecasting Value-at-Risk under Different Distributional Assumptions

    Directory of Open Access Journals (Sweden)

    Manuela Braione

    2016-01-01

    Full Text Available Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR. We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student, Multivariate Exponential Power and their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy-tails and skewness in the distributional assumption with the skew-Student outperforming the others across all tests and confidence levels.
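
The effect of the tail assumption on parametric VaR can be illustrated with two zero-mean, unit-variance distributions. The Laplace stands in here for a generic fat-tailed alternative (the paper's skew-Student is not available in the standard library), so the numbers are a sketch, not a reproduction of the study.

```python
import math
from statistics import NormalDist

alpha = 0.01  # 1% VaR level

# Normal VaR for standardized returns: minus the alpha-quantile.
var_normal = -NormalDist(0.0, 1.0).inv_cdf(alpha)

# A unit-variance Laplace has scale b = 1/sqrt(2); for alpha < 0.5 its
# lower-tail quantile is b * ln(2 * alpha).
b = 1.0 / math.sqrt(2.0)
var_laplace = -b * math.log(2.0 * alpha)

print(round(var_normal, 3), round(var_laplace, 3))
```

Even with identical variance, the heavier-tailed model demands a larger capital buffer at the 1% level, which is exactly why the distributional assumption matters for VaR backtests.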

  8. Pharmacodynamic and pharmacokinetic studies and prostatic tissue distribution of fosfomycin tromethamine in bacterial prostatitis or normal rats.

    Science.gov (United States)

    Fan, L; Shang, X; Zhu, J; Ma, B; Zhang, Q

    2018-05-02

    In this study, we assessed the therapeutic effects of fosfomycin tromethamine (FT) in a bacterial prostatitis (BP) rat model. The BP model was induced by Escherichia coli and was confirmed after 7 days microbiologically and histologically. Twenty-five BP rats were then randomly divided into five treatment groups: model group, positive group, FT-3 day group, FT-7 day group and FT-14 day group. Ventral lobes of the prostate were removed from all animals, and serum samples were collected at the end of the experiments. Microbiological cultures and histological findings of the prostate samples demonstrated reduced bacterial growth and improved inflammatory responses in the FT-treatment groups compared with the model group, indicating that FT showed good antibacterial effects against prostatic infection induced by E. coli. Moreover, plasma pharmacokinetics and prostatic distribution of fosfomycin were studied and compared in BP and normal rats. The concentrations of fosfomycin in samples were analysed by liquid chromatography-tandem mass spectrometry. There were no differences in plasma pharmacokinetic parameters between the two groups, but significantly higher penetration of fosfomycin into prostatic tissues was found in BP rats. We therefore suggest that FT has a good therapeutic effect on BP and might be used in treating diseases of the male reproductive system. © 2018 Blackwell Verlag GmbH.

  9. Rare Earth Elements Distribution in Beryl

    International Nuclear Information System (INIS)

    El Gawish, H.K.; Nada, N.; Ghaly, W.A.; Helal, A.I.

    2012-01-01

    The laser ablation method is applied with a double-focusing inductively coupled plasma mass spectrometer to determine the rare earth element distribution in selected beryl samples. White, green and blue beryl samples were selected from the Egyptian Eastern Desert. Chondrite-normalized rare earth element distribution plots for the selected beryl samples are investigated.

  10. Handbook of exponential and related distributions for engineers and scientists

    CERN Document Server

    Pal, Nabendu; Lim, Wooi K

    2005-01-01

    The normal distribution is widely known and used by scientists and engineers. However, there are many cases when the normal distribution is not appropriate, due to the data being skewed. Rather than leaving you to search through journal articles, advanced theoretical monographs, or introductory texts for alternative distributions, the Handbook of Exponential and Related Distributions for Engineers and Scientists provides a concise, carefully selected presentation of the properties and principles of selected distributions that are most useful for application in the sciences and engineering. The book begins with all the basic mathematical and statistical background necessary to select the correct distribution to model real-world data sets. This includes inference, decision theory, and computational aspects including the popular Bootstrap method. The authors then examine four skewed distributions in detail: exponential, gamma, Weibull, and extreme value. For each one, they discuss general properties and applicability…

  11. An analysis of longitudinal data with nonignorable dropout using the truncated multivariate normal distribution

    NARCIS (Netherlands)

    Jolani, Shahab

    2014-01-01

    For a vector of multivariate normal when some elements, but not necessarily all, are truncated, we derive the moment generating function and obtain expressions for the first two moments involving the multivariate hazard gradient. To show one of many applications of these moments, we then extend the

  12. Hemoglobin levels in normal Filipino pregnant women.

    Science.gov (United States)

    Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P

    1981-09-01

    The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily, and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, in one group of pregnant women the prevalence of anemia during the last two trimesters was found lower than that obtained when WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels nor on the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml which indicate poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores using serum ferritin levels as criterion for the latter.
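
Using the mean minus one standard deviation as the lower limit of normal implicitly flags a fixed share of a Gaussian population. A quick check of that share (the mean and SD below are placeholders, not the study's trimester estimates):

```python
from statistics import NormalDist

mu, sd = 12.4, 1.0                  # illustrative trimester mean and SD (gm/dl)
cutoff = mu - sd                    # the "mean minus one SD" rule
frac_below = NormalDist(mu, sd).cdf(cutoff)
print(round(frac_below, 4))         # share of a healthy Gaussian population flagged
```

Whatever the mean and SD, about 15.9% of a healthy population falls below a mean-minus-one-SD cutoff, which is why such a limit yields a lower anemia prevalence than stricter criteria.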

  13. Comparative tissue distribution profiles of five major bio-active components in normal and blood deficiency rats after oral administration of Danggui Buxue Decoction by UPLC-TQ/MS.

    Science.gov (United States)

    Shi, Xuqin; Tang, Yuping; Zhu, Huaxu; Li, Weixia; Li, Zhenhao; Li, Wei; Duan, Jin-ao

    2014-01-01

    Astragali Radix (AR) and Angelicae Sinensis Radix (ASR) are frequently combined and used in China as a herbal pair called Danggui Buxue Decoction (DBD) for the treatment of blood deficiency syndrome, such as women's ailments. This study investigates the tissue distribution profiles of five major bio-active constituents of DBD (ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV) after oral administration of DBD in blood deficiency rats, and compares them with those in normal rats. Blood deficiency was induced in rats by bleeding from the orbit at a dosage of 5.0 mL kg(-1) every day over an experimental period of 12 days. On the final day of the experimental period, both normal and blood deficiency rats were orally administered DBD, and tissue samples were collected at different time points. Ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV in the different tissues were detected simultaneously by UPLC-TQ/MS, and histograms were drawn. The results showed the overall trends C(liver) > C(kidney) > C(heart) > C(spleen) > C(lung) and C(C-30 min) > C(M-30 min) > C(M-60 min) > C(C-5 min) > C(M-5 min) > C(C-60 min) > C(M-240 min) > C(C-240 min). The contents of the detected compounds were higher in liver than in the other tissues in both normal and blood deficiency rats. Compared with normal rats, some of the compound contents in blood deficiency rats' tissues at different time points differed significantly (P […] tissue distribution investigation in blood deficiency animals which was conducted by bleeding. The results demonstrated that the five DBD components showed obvious differences between normal and blood deficiency rats in some organs and at some time points, suggesting that the blood flow and perfusion rate of the organs are altered in blood deficiency animals. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Dosimetric precision requirements and quantities for characterizing the response of tumors and normal tissues

    Energy Technology Data Exchange (ETDEWEB)

    Brahme, A [Karolinska Inst., Stockholm (Sweden). Dept. of Radiation Physics

    1996-08-01

    Based on simple radiobiological models, the effect of the distribution of absorbed dose in therapy beams on the radiation response of tumor and normal tissue volumes is investigated. Under the assumption that the dose variation in the treated volume is small, it is shown that the response of the tissue to radiation is determined mainly by the mean dose to the tumor or normal tissue volume in question. Quantitative expressions are also given for the increased probability of normal tissue complications and the decreased probability of tumor control as a function of increasing dose variations around the mean dose level to these tissues. When the dose variations are large, the minimum tumor dose (to cm³-size volumes) will generally be better related to tumor control, and the highest dose to significant portions of normal tissue correlates best to complications. In order not to lose more than one out of 20 curable patients (95% of the highest possible treatment outcome), the required accuracy in the dose distribution delivered to the target volume should be 2.5% (1σ) for a mean dose response gradient γ in the range 2-3. For more steeply responding tumors and normal tissues, even stricter requirements may be desirable. (author). 15 refs, 6 figs.
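
The quoted 2.5% figure follows from a first-order sensitivity argument: near the steep part of the dose-response curve, a relative dose error ΔD/D shifts the response probability by roughly γ · ΔD/D, so a tolerated response loss divided by γ gives the required dose accuracy. A back-of-envelope sketch of this relation:

```python
def required_dose_accuracy(max_response_loss, gamma):
    """First-order estimate: response probability change ≈ gamma * (relative dose error),
    so the tolerated relative dose error is the response-loss budget divided by gamma."""
    return max_response_loss / gamma

# Losing at most 1 in 20 curable patients (5 percentage points) with gamma = 2:
print(required_dose_accuracy(0.05, 2.0))   # i.e. 2.5% (1 sigma)
```

For a steeper response (larger γ) the same 5% loss budget translates into a tighter dose-accuracy requirement, matching the paper's closing remark.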

  15. The triangular density to approximate the normal density: decision rules-of-thumb

    International Nuclear Information System (INIS)

    Scherer, William T.; Pomroy, Thomas A.; Fuller, Douglas N.

    2003-01-01

    In this paper we explore the approximation of the normal density function with the triangular density function, a density function that has extensive use in risk analysis. Such an approximation generates a simple piecewise-linear density function and a piecewise-quadratic distribution function that can be easily manipulated mathematically and that produces surprisingly accurate performance under many instances. This mathematical tractability proves useful when it enables closed-form solutions not otherwise possible, as with problems involving the embedded use of the normal density. For benchmarking purposes we compare the basic triangular approximation with two flared triangular distributions and with two simple uniform approximations; however, throughout the paper our focus is on using the triangular density to approximate the normal for reasons of parsimony. We also investigate the logical extensions of using a non-symmetric triangular density to approximate a lognormal density. Several issues associated with using a triangular density as a substitute for the normal and lognormal densities are discussed, and we explore the resulting numerical approximation errors for the normal case. Finally, we present several examples that highlight simple decision rules-of-thumb that the use of the approximation generates. Such rules-of-thumb, which are useful in risk and reliability analysis and general business analysis, can be difficult or impossible to extract without the use of approximations. These examples include uses of the approximation in generating random deviates, uses in mixture models for risk analysis, and an illustrative decision analysis problem. It is our belief that this exploratory look at the triangular approximation to the normal will provoke other practitioners to explore its possible use in various domains and applications
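
A variance-matched symmetric triangular density on [μ − a, μ + a] has variance a²/6, so choosing a = σ√6 lines the two densities up. The sketch below numerically scans the pointwise density gap for this basic symmetric case (the paper's flared variants are not reproduced here).

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def triangular_pdf(x, mu=0.0, sigma=1.0):
    # Half-width a = sigma * sqrt(6) matches the normal's variance (a^2 / 6 = sigma^2).
    a = sigma * math.sqrt(6.0)
    return max(0.0, (a - abs(x - mu)) / a ** 2)

# Largest pointwise density gap over a grid spanning +/- 4 sigma.
max_gap = max(
    abs(normal_pdf(x) - triangular_pdf(x))
    for x in (i / 100.0 for i in range(-400, 401))
)
print(round(max_gap, 4))
```

The maximum density discrepancy stays near 0.03 for a unit-variance match, which gives a feel for why the triangular substitute performs "surprisingly accurately" in many closed-form calculations.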

  16. Overview Report: Normal and Emergency Operation Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.

    2011-05-01

    This is an overview report to document and illustrate methods used in a project entitled “Normal and Emergency Operations Visualization” for a utility company, conducted in the 2009-2010 timeframe with funding from the utility company and the U.S. Department of Energy. The original final report (about 180 pages) for the project is not available for distribution because it alludes to findings that assessed the design of an operational system that contained proprietary information; this abridged version contains descriptions of methods and some findings to illustrate the approach used, while avoiding discussion of sensitive or proprietary information. The client has approved this abridged version of the report for unlimited distribution to give researchers and collaborators the benefit of reviewing the research concepts and methods that were applied in this study.

  17. Distribution characteristics of stock market liquidity

    Science.gov (United States)

    Luo, Jiawen; Chen, Langnan; Liu, Hao

    2013-12-01

    We examine the distribution characteristics of stock market liquidity by employing the generalized additive models for location, scale and shape (GAMLSS) model and three-minute frequency data from Chinese stock markets. We find that the BCPE distribution within the GAMLSS framework fits the distributions of stock market liquidity well with the diagnosis test. We also find that the stock market index exhibits a significant impact on the distributions of stock market liquidity. The stock market liquidity usually exhibits a positive skewness, but a normal distribution at a low level of stock market index and a high-peak and fat-tail shape at a high level of stock market index.

  18. Black-Litterman model on non-normal stock return (Case study four banks at LQ-45 stock index)

    Science.gov (United States)

    Mahrivandi, Rizki; Noviyanti, Lienda; Setyanto, Gatot Riwi

    2017-03-01

    The formation of an optimal portfolio is a method that can help investors minimize risks and optimize profitability. One model for the optimal portfolio is the Black-Litterman (BL) model. The BL model can incorporate historical data and the views of investors to form a new prediction about the return of the portfolio as a basis for preparing the asset weighting models. The BL model has two fundamental problems: the assumption of normality, and the estimation of parameters in the Bayesian prior framework when market returns do not come from a normal distribution. This study provides an alternative solution in which the BL model is built on stock returns and investor views from non-normal distributions.
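
In the single-asset case the BL posterior mean reduces to a precision-weighted average of the equilibrium return and the investor view, which makes the blending mechanism easy to see. All numbers below are illustrative assumptions, not values from the study.

```python
def bl_posterior_mean(pi, tau_sigma2, view, omega):
    """Scalar Black-Litterman posterior mean: a precision-weighted blend of the
    equilibrium return pi (prior variance tau * sigma^2) and a single investor
    view (view variance omega)."""
    w_prior = 1.0 / tau_sigma2
    w_view = 1.0 / omega
    return (w_prior * pi + w_view * view) / (w_prior + w_view)

# Equilibrium return 6%, bearish view of 2%, equal prior and view precision:
mu_bl = bl_posterior_mean(pi=0.06, tau_sigma2=0.0025, view=0.02, omega=0.0025)
print(mu_bl)
```

With equal precisions the posterior sits halfway between prior and view; shrinking omega (a more confident view) pulls it toward the view. The non-normal extension the paper studies changes the distributions behind these two inputs, not the blending logic itself.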

  19. Robustness to non-normality of common tests for the many-sample location problem

    Directory of Open Access Journals (Sweden)

    Azmeri Khan

    2003-01-01

    Full Text Available This paper studies the effect of deviating from the normal distribution assumption when considering the power of two many-sample location test procedures: ANOVA (parametric and Kruskal-Wallis (non-parametric. Power functions for these tests under various conditions are produced using simulation, where the simulated data are produced using MacGillivray and Cannon's [10] recently suggested g-and-k distribution. This distribution can provide data with selected amounts of skewness and kurtosis by varying two nearly independent parameters.
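
The g-and-k family is defined through its quantile function rather than a density, which is what makes it convenient for simulating data with chosen skewness (g) and kurtosis (k). A sketch with the conventional c = 0.8, showing that g = k = 0 recovers the normal:

```python
import math
from statistics import NormalDist

def gk_quantile(u, a=0.0, b=1.0, g=0.0, k=0.0, c=0.8):
    """Quantile function of the g-and-k distribution: g controls skewness,
    k controls kurtosis; g = k = 0 gives the N(a, b^2) quantile."""
    z = NormalDist().inv_cdf(u)
    # (1 - e^{-gz}) / (1 + e^{-gz}) is tanh(g z / 2), the skewness term.
    skew = (1.0 - math.exp(-g * z)) / (1.0 + math.exp(-g * z)) if g != 0.0 else 0.0
    return a + b * (1.0 + c * skew) * (1.0 + z * z) ** k * z

print(round(gk_quantile(0.975), 3))                        # normal case
print(gk_quantile(0.9, g=0.5, k=0.2), gk_quantile(0.9))    # skewed, heavier tail
```

Feeding uniform draws through `gk_quantile` generates samples by inversion, which is how simulation studies of this kind produce data with selected amounts of skewness and kurtosis.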

  20. Point processes and the position distribution of infinite boson systems

    International Nuclear Information System (INIS)

    Fichtner, K.H.; Freudenberg, W.

    1987-01-01

    It is shown that to each locally normal state of a boson system one can associate a point process that can be interpreted as the position distribution of the state. The point process contains all information one can get by position measurements and is determined by the latter. On the other hand, to each so-called Σ/sup c/-point process Q they relate a locally normal state with position distribution Q

  1. Comparison of plantar pressure distribution in subjects with normal and flat feet during gait DOI: 10.5007/1980-0037.2010v12n4p290

    Directory of Open Access Journals (Sweden)

    Patrik Felipe Nazario

    2010-01-01

    Full Text Available The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch, measured by the height of the navicular bone in a static situation, and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg and 175 ± 4 cm) participated in the study. The Novel Emed-AT System was used for the acquisition of plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load) at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982) was used to assess the height of the navicular bone for classification of the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the mid-foot region for all variables studied, with higher mean values in subjects with flat feet. There were also significant differences in contact area, relative load, peak pressure, and mean pressure between groups. The present study demonstrates the importance of paying attention to subjects with flat feet, because changes in plantar pressure distribution are associated with discomfort and injuries.
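
The Mann-Whitney U comparison used here is simple to compute directly from the two groups' measurements. The toy numbers below are hypothetical mid-foot contact areas, not the study's data.

```python
def mann_whitney_u(xs, ys):
    """U statistic for sample xs against ys (ties counted as 1/2)."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

flat = [310, 355, 402, 388, 421]    # hypothetical mid-foot contact areas, flat feet
arch = [205, 240, 262, 231, 255]    # hypothetical values, normal arch
u1 = mann_whitney_u(flat, arch)
u2 = mann_whitney_u(arch, flat)
print(u1, u2)   # u1 + u2 always equals len(flat) * len(arch)
```

A U of 25 out of a possible 25 means every flat-foot value exceeds every normal-arch value, the most extreme separation the test can register for these sample sizes.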

  2. Modeling of parallel-plate regenerators with non-uniform plate distributions

    DEFF Research Database (Denmark)

    Jensen, Jesper Buch; Engelbrecht, Kurt; Bahl, Christian Robert Haffenden

    2010-01-01

    plate spacing distributions are presented in order to understand the impact of spacing non-uniformity. Simulations of more realistic distributions where the plate spacings follow normal distributions are then discussed in order to describe the deviation of the performance of a regenerator relative...

  3. Temporal Statistical Analysis of Degree Distributions in an Undirected Landline Phone Call Network Graph Series

    Directory of Open Access Journals (Sweden)

    Orgeta Gjermëni

    2017-10-01

    Full Text Available This article aims to provide new results about the intraday degree sequence distribution considering phone call network graph evolution in time. More specifically, it tackles the following problem. Given a large amount of landline phone call data records, what is the best way to summarize the distinct number of calling partners per client per day? In order to answer this question, a series of undirected phone call network graphs is constructed based on data from a local telecommunication source in Albania. All network graphs of the series are simplified. Further, a longitudinal temporal study is made on this network graph series with respect to the degree distributions. Power-law and log-normal distribution fittings on the degree sequence are compared on each of the network graphs of the series. The maximum likelihood method is used to estimate the parameters of the distributions, and a Kolmogorov–Smirnov test associated with a p-value is used to identify the plausible models. A direct distribution comparison is made through a Vuong test in the case that both distributions are plausible. A further goal was to describe the shape of the parameter distributions. A Shapiro-Wilk test is used to test the normality of the data, and measures of shape are used to characterize the distributions' shape. Study findings suggest that the log-normal distribution better models the intraday degree sequence data of the network graphs. It is not possible to say that the distributions of the log-normal parameters are normal.
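
    The fit-then-test pipeline described above can be sketched with SciPy. The degree data here are synthetic, and note one caveat: a KS test against parameters fitted from the same data yields only an approximate p-value (the study's procedure is more careful):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic stand-in for one day's degree sequence (distinct calling partners).
    degrees = rng.lognormal(mean=1.0, sigma=0.6, size=2000)

    # Maximum-likelihood log-normal fit (SciPy: shape = sigma, scale = exp(mu)).
    shape, loc, scale = stats.lognorm.fit(degrees, floc=0)

    # Kolmogorov-Smirnov goodness-of-fit statistic against the fitted model.
    ks_stat, p_value = stats.kstest(degrees, "lognorm", args=(shape, loc, scale))
    ```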

  4. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem | K - λM | = 0 involving two real, symmetric, positive definite matrices K , M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ =ϕT Mϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K , M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but which has in turn interesting theoretical implications.
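
    The conventional route the paper contrasts itself with — solve the generalized eigenproblem, then divide each mode by the square root of its modal mass ϕᵀMϕ — can be sketched with NumPy (the K, M matrices here are small illustrative examples):

    ```python
    import numpy as np

    # Small symmetric positive-definite pair for K phi = lambda M phi.
    K = np.array([[4.0, -1.0], [-1.0, 3.0]])
    M = np.array([[2.0, 0.2], [0.2, 1.0]])

    # Reduce to the standard problem (M^-1 K) phi = lambda phi; for an SPD pair
    # the eigenvalues are real and the eigenvectors M-orthogonal.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(M, K))

    # Explicit mass normalization: divide each mode by sqrt(phi^T M phi).
    modes = np.empty_like(eigvecs)
    for j in range(eigvecs.shape[1]):
        phi = eigvecs[:, j]
        modes[:, j] = phi / np.sqrt(phi @ M @ phi)

    # Each normalized mode now has unit modal mass.
    modal_masses = np.array([modes[:, j] @ M @ modes[:, j] for j in range(2)])
    ```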

  5. Distributed hierarchical control architecture for integrating smart grid assets during normal and disrupted operations

    Science.gov (United States)

    Kalsi, Karan; Fuller, Jason C.; Somani, Abhishek; Pratt, Robert G.; Chassin, David P.; Hammerstrom, Donald J.

    2017-09-12

    Disclosed herein are representative embodiments of methods, apparatus, and systems for facilitating operation and control of a resource distribution system (such as a power grid). Among the disclosed embodiments is a distributed hierarchical control architecture (DHCA) that enables smart grid assets to effectively contribute to grid operations in a controllable manner, while helping to ensure system stability and equitably rewarding their contribution. Embodiments of the disclosed architecture can help unify the dispatch of these resources to provide both market-based and balancing services.

  6. Estimating statistical uncertainty of Monte Carlo efficiency-gain in the context of a correlated sampling Monte Carlo code for brachytherapy treatment planning with non-normal dose distribution.

    Science.gov (United States)

    Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr

    2012-01-01

    Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap based algorithm was used to simulate the probability distribution of the efficiency gain estimates and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
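
    A shortest-95%-interval bootstrap of the kind described can be sketched as follows. The efficiency-gain values are simulated, not from the paper; the shortest interval is found by sliding a window over the sorted bootstrap replicates:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical per-run efficiency-gain estimates (simulated stand-ins).
    gains = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=200)

    # Bootstrap the sampling distribution of the mean efficiency gain.
    boot = np.array([rng.choice(gains, size=gains.size, replace=True).mean()
                     for _ in range(5000)])

    # Shortest 95% interval: the narrowest window covering 95% of the replicates.
    boot.sort()
    k = int(np.ceil(0.95 * boot.size))
    widths = boot[k - 1:] - boot[:boot.size - k + 1]
    i = int(np.argmin(widths))
    ci_low, ci_high = boot[i], boot[i + k - 1]
    ```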

  7. On the distribution of the stochastic component in SUE traffic assignment models

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker

    1997-01-01

    The paper discusses the use of different distributions of the stochastic component in SUE. A main conclusion is that they generally gave reasonably similar results, except for the LogNormal distribution, whose use is discouraged. However, in cases with low link-costs (e.g. in dense urban areas, ramps...... and modelling of intersections and interchanges), distributions with long tails (Gumbel and Normal) gave biased results compared with the Rectangular distribution. The Triangular distribution gave results somewhere in between. Besides giving the most reasonable results, the Rectangular distribution is the most...... calculation effective. All distributions gave a unique solution at link level after a sufficiently large number of iterations (up to 1,000 at full-scale networks), while the usual aggregated measures of convergence converged quite fast (under 50 iterations). The tests also showed that the distributions must

  8. Light-cone distribution amplitudes of the baryon octet

    International Nuclear Information System (INIS)

    Bali, Gunnar S.; Braun, Vladimir M.; Göckeler, Meinulf; Gruber, Michael; Hutzler, Fabian; Schäfer, Andreas; Schiel, Rainer W.; Simeth, Jakob; Söldner, Wolfgang; Sternbeck, Andre; Wein, Philipp

    2016-01-01

    We present results of the first ab initio lattice QCD calculation of the normalization constants and first moments of the leading twist distribution amplitudes of the full baryon octet, corresponding to the small transverse distance limit of the associated S-wave light-cone wave functions. The P-wave (higher twist) normalization constants are evaluated as well. The calculation is done using N_f=2+1 flavors of dynamical (clover) fermions on lattices of different volumes and pion masses down to 222 MeV. Significant SU(3) flavor symmetry violation effects in the shape of the distribution amplitudes are observed.

  9. Light-cone distribution amplitudes of the baryon octet

    Energy Technology Data Exchange (ETDEWEB)

    Bali, Gunnar S. [Institut für Theoretische Physik, Universität Regensburg,Universitätsstraße 31, D-93040 Regensburg (Germany); Department of Theoretical Physics, Tata Institute of Fundamental Research,Homi Bhabha Road, Mumbai 400005 (India); Braun, Vladimir M.; Göckeler, Meinulf; Gruber, Michael; Hutzler, Fabian; Schäfer, Andreas; Schiel, Rainer W.; Simeth, Jakob; Söldner, Wolfgang [Institut für Theoretische Physik, Universität Regensburg,Universitätsstraße 31, D-93040 Regensburg (Germany); Sternbeck, Andre [Theoretisch-Physikalisches Institut, Friedrich-Schiller-Universität Jena,Max-Wien-Platz 1, D-07743 Jena (Germany); Wein, Philipp [Institut für Theoretische Physik, Universität Regensburg,Universitätsstraße 31, D-93040 Regensburg (Germany)

    2016-02-10

    We present results of the first ab initio lattice QCD calculation of the normalization constants and first moments of the leading twist distribution amplitudes of the full baryon octet, corresponding to the small transverse distance limit of the associated S-wave light-cone wave functions. The P-wave (higher twist) normalization constants are evaluated as well. The calculation is done using N_f=2+1 flavors of dynamical (clover) fermions on lattices of different volumes and pion masses down to 222 MeV. Significant SU(3) flavor symmetry violation effects in the shape of the distribution amplitudes are observed.

  10. Comparison of GLUT1, GLUT3, and GLUT4 mRNA and the subcellular distribution of their proteins in normal human muscle

    Science.gov (United States)

    Stuart, C. A.; Wen, G.; Gustafson, W. C.; Thompson, E. A.

    2000-01-01

    Basal, "insulin-independent" glucose uptake into skeletal muscle is provided by glucose transporters positioned at the plasma membrane. The relative amount of the three glucose transporters expressed in muscle has not been previously quantified. Using a combination of qualitative and quantitative ribonuclease protection assay (RPA) methods, we found in normal human muscle that GLUT1, GLUT3, and GLUT4 mRNA were expressed at 90 ± 10, 46 ± 4, and 156 ± 12 copies/ng RNA, respectively. Muscle was fractionated by DNase digestion and differential sedimentation into membrane fractions enriched in plasma membranes (PM) or low-density microsomes (LDM). GLUT1 and GLUT4 proteins were distributed 57% to 67% in LDM, whereas GLUT3 protein was at least 88% in the PM-enriched fractions. These data suggest that basal glucose uptake into resting human muscle could be provided in part by each of these three isoforms.

  11. Rhythm-based heartbeat duration normalization for atrial fibrillation detection.

    Science.gov (United States)

    Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim

    2016-05-01

    Screening for atrial fibrillation (AF) in high-risk patients, including all patients aged 65 years and older, is important for prevention of the risk of stroke. Different technologies, such as modified blood pressure monitors, single-lead ECG-based finger probes, and smartphones using plethysmogram signals, have been emerging for this purpose. All these technologies use irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat durations, we estimate the probable rhythm of the majority of heartbeats and normalize the duration of all heartbeats in the window based on that rhythm, so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms on the same scale. Irregularity is measured by the entropy of the distribution of the normalized durations. We then classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performance using duration with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with the results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization was able to improve the performance of AF detection, and this was consistent over a wide range of sensitivity and specificity for different thresholds. Detection accuracy was also computed for equal rates of sensitivity and specificity for different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy, which is more than a 4% improvement compared to AF detection without normalization. The proposed normalization
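
    The normalize-then-measure-entropy idea can be sketched in a toy form. The rhythm estimate (the median beat duration) and the fixed entropy threshold below are simplifying assumptions for illustration, not the paper's algorithm:

    ```python
    import numpy as np

    def irregularity(rr, bins=16):
        """Entropy (nats) of the distribution of rhythm-normalized beat durations."""
        rr = np.asarray(rr, dtype=float)
        normalized = rr / np.median(rr)   # median as a crude estimate of the dominant rhythm
        hist, _ = np.histogram(normalized, bins=bins, range=(0.0, 2.0))
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    rng = np.random.default_rng(3)
    regular = 0.8 + 0.01 * rng.standard_normal(120)    # steady sinus-like rhythm (s)
    irregular = rng.uniform(0.4, 1.4, size=120)        # irregularly irregular, AF-like (s)

    # Classify each window by thresholding its entropy (threshold is illustrative).
    is_af_regular = irregularity(regular) > 1.5
    is_af_irregular = irregularity(irregular) > 1.5
    ```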

  12. A new stochastic algorithm for inversion of dust aerosol size distribution

    Science.gov (United States)

    Wang, Li; Li, Feng; Yang, Ma-ying

    2015-08-01

    Dust aerosol size distribution is an important source of information about atmospheric aerosols, and it can be determined from multiwavelength extinction measurements. This paper describes a stochastic inverse technique based on artificial bee colony (ABC) algorithm to invert the dust aerosol size distribution by light extinction method. The direct problems for the size distribution of water drop and dust particle, which are the main elements of atmospheric aerosols, are solved by the Mie theory and the Lambert-Beer Law in multispectral region. And then, the parameters of three widely used functions, i.e. the log normal distribution (L-N), the Junge distribution (J-J), and the normal distribution (N-N), which can provide the most useful representation of aerosol size distributions, are inversed by the ABC algorithm in the dependent model. Numerical results show that the ABC algorithm can be successfully applied to recover the aerosol size distribution with high feasibility and reliability even in the presence of random noise.
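
    The full inversion couples Mie theory with the ABC search; as a much smaller illustration of the L-N (log-normal) family it uses, the parameters can be recovered by direct maximum likelihood from sampled particle sizes — for the log-normal this is just the mean and standard deviation of the log-sizes. All values here are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    # Sample radii (micrometres) from an assumed log-normal size distribution.
    mu_true, sigma_true = np.log(0.5), 0.4
    radii = rng.lognormal(mean=mu_true, sigma=sigma_true, size=5000)

    # MLE for the log-normal: sample mean and std of the log-transformed sizes.
    log_r = np.log(radii)
    mu_hat = log_r.mean()
    sigma_hat = log_r.std(ddof=0)
    ```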

  13. SYVAC3 parameter distribution package

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T; Skeet, A

    1995-01-01

    SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes a software object type (a generalization of a data type) called Parameter Distribution. This object type is used in SYVAC3, and can also be used independently. Parameter Distribution has the following subtypes: beta distribution; binomial distribution; constant distribution; lognormal distribution; loguniform distribution; normal distribution; piecewise uniform distribution; triangular distribution; and uniform distribution. Some of these distributions can be altered by correlating two parameter distribution objects. This report provides complete specifications for parameter distributions, and also explains how to use them. It should meet the needs of casual users, reviewers, and programmers who wish to add their own subtypes. (author). 30 refs., 75 tabs., 56 figs.
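
    A toy analogue of the report's Parameter Distribution subtypes, assuming hypothetical parameter values: each subtype is represented by a sampler, and a systems-variability run draws every parameter once per simulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical mini-analogue of a few Parameter Distribution subtypes.
    samplers = {
        "constant":   lambda n: np.full(n, 7.0),
        "uniform":    lambda n: rng.uniform(0.0, 1.0, n),
        "normal":     lambda n: rng.normal(0.0, 1.0, n),
        "lognormal":  lambda n: rng.lognormal(0.0, 0.5, n),
        "triangular": lambda n: rng.triangular(0.0, 0.5, 1.0, n),
    }

    # Systems variability analysis in miniature: many simulations, each drawing
    # every parameter from its distribution.
    n_runs = 1000
    draws = {name: sample(n_runs) for name, sample in samplers.items()}
    ```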

  14. Problems in ensuring normal actuation of pneumatic drilling machines during their manufacture

    Energy Technology Data Exchange (ETDEWEB)

    Vasilev, V

    1982-01-01

    The problems of ensuring normal actuation and reliable operation of pneumatic drilling machines during their manufacturing are examined. The modes of operation are cited with which the normal actuation must be tested. Oscillograms are used to examine the characteristic cases of completion of machines with the impossibility of actuation during operation for one cycle or with an interruption in the mode, along with methods for eliminating these deficiencies. The effect on the actuation process of factors such as the type of rock being drilled, the change in the position of the machine, the axial force, the design of the air distribution device and the impact mechanism, the relationship between the surface areas of the distribution device and the travel of the distributor, the choking of the power and so on is studied. Different methods for completing design are analyzed.

  15. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  16. Normal SPECT thallium-201 bull's-eye display: gender differences

    International Nuclear Information System (INIS)

    Eisner, R.L.; Tamas, M.J.; Cloninger, K.

    1988-01-01

    The bull's-eye technique synthesizes three-dimensional information from single photon emission computed tomographic 201Tl images into two dimensions so that a patient's data can be compared quantitatively against a normal file. To characterize the normal database and to clarify differences between males and females, clinical data and exercise electrocardiography were used to identify 50 males and 50 females with less than 5% probability of coronary artery disease. Results show inhomogeneity of the 201Tl distributions at stress and delay: septal to lateral wall count ratios are less than 1.0 in both females and males; anterior to inferior wall count ratios are greater than 1.0 in males but are approximately equal to 1.0 in females. Washout rate is faster in females than in males at the same peak exercise heart rate and systolic blood pressure, despite lower exercise time. These important differences suggest that quantitative analysis of single photon emission computed tomographic 201Tl images requires gender-matched normal files.

  17. Reappraisal of the reference dose distribution in the UNSCEAR 1977 report

    International Nuclear Information System (INIS)

    Kumazawa, Shigeru

    2008-01-01

    This paper provides an update of the reference dose distribution proposed by G.A.M. Webb and D. Beninson in Annex E to the UNSCEAR 1977 Report. To demonstrate compliance with regulatory obligations regarding doses to individuals, they defined it with the following properties: 1) the distribution of annual doses is log-normal; 2) the mean of the annual dose distribution is 5 mGy (10% of the ICRP 1977 dose limit); 3) the proportion of workers exceeding 50 mGy is 0.1%. The concept of the reference dose distribution is still important for understanding the inherent variation of individual doses to workers controlled by source-related and individual-related efforts of best dose reduction. In commercial nuclear power plants, the dose distribution departs further from the log-normal due to the stronger ALARA efforts and the revised dose limits. The monitored workers show an annual mean of about 1 mSv, with far less than 0.1% of workers above 20 mSv. The updated models of the dose distribution consist of the log-normal (no feedback on dose X), ln(X) ∼ N(μ, σ²); the hybrid log-normal (feedback on higher X by ρ), hyb(ρX) = ρX + ln(ρX) ∼ N(μ, σ²); the hybrid S_B (feedback on the higher dose quotient X/(D−X), not close to D, by ρ), hyb[ρX/(D−X)] ∼ N(μ, σ²); and Johnson's S_B (limited to D), ln[X/(D−X)] ∼ N(μ, σ²). These models allow interpreting the degree of dose control, including dose constraints/limits, relative to the reference distribution. Some of the distributions are examined to characterize the variation of doses to members of the public with uncertainty. (author)
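
    The hybrid log-normal transform used above, hyb(y) = y + ln(y) with y = ρX, behaves like ln(y) for small y (log-normal regime, weak feedback) and like y itself for large y (normal regime, strong feedback). A quick numerical check of these two limits (the probe values are arbitrary):

    ```python
    import numpy as np

    def hyb(y):
        """Hybrid log-normal transform: hyb(y) = y + ln(y), for y = rho * X > 0."""
        return y + np.log(y)

    # Small y: the ln term dominates, so hyb(y) ~ ln(y) (log-normal-like).
    # Large y: the linear term dominates, so hyb(y) ~ y (normal-like).
    y_small, y_large = 1e-4, 1e4
    small_ratio = abs(np.log(y_small)) / abs(hyb(y_small))   # ~1 when ln dominates
    large_ratio = y_large / hyb(y_large)                     # ~1 when y dominates
    ```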

  18. The impact of sample non-normality on ANOVA and alternative methods.

    Science.gov (United States)

    Lantz, Björn

    2013-05-01

    In this journal, Zimmerman (2004, 2011) has discussed preliminary tests that researchers often use to choose an appropriate method for comparing locations when the assumption of normality is doubtful. The conceptual problem with this approach is that such a two-stage process makes both the power and the significance of the entire procedure uncertain, as type I and type II errors are possible at both stages. A type I error at the first stage, for example, will obviously increase the probability of a type II error at the second stage. Based on the idea of Schmider et al. (2010), which proposes that simulated sets of sample data be ranked with respect to their degree of normality, this paper investigates the relationship between population non-normality and sample non-normality with respect to the performance of the ANOVA, Brown-Forsythe test, Welch test, and Kruskal-Wallis test when used with different distributions, sample sizes, and effect sizes. The overall conclusion is that the Kruskal-Wallis test is considerably less sensitive to the degree of sample normality when populations are distinctly non-normal and should therefore be the primary tool used to compare locations when it is known that populations are not at least approximately normal. © 2012 The British Psychological Society.
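
    The comparison the study runs can be reproduced in miniature with SciPy. The exponential populations, sample sizes, and location shift below are arbitrary illustrative choices:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # Three groups from a distinctly non-normal (exponential) population,
    # with a location shift in the third group.
    g1 = rng.exponential(1.0, 40)
    g2 = rng.exponential(1.0, 40)
    g3 = rng.exponential(1.0, 40) + 1.0

    # Parametric one-way ANOVA versus the rank-based Kruskal-Wallis test.
    f_stat, p_anova = stats.f_oneway(g1, g2, g3)
    h_stat, p_kw = stats.kruskal(g1, g2, g3)
    ```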

  19. Interlayer material transport during layer-normal shortening. Part I. The model

    NARCIS (Netherlands)

    Molen, I. van der

    1985-01-01

    To analyse mass-transfer during deformation, the case is considered of a multilayer experiencing a layer-normal shortening that is volume constant on the scale of many layers. Strain rate is homogeneously distributed on the layer-scale if diffusion is absent; when transport of matter between the

  20. Selection and Pairing of ’Normal’ Rhesus Monkeys (Macaca mulatta) for Research.

    Science.gov (United States)

    1978-11-08

    week intervals. Fecal bacteriological cultures did not detect any Salmonella or Shigella carriers in the population. The male monkeys ranged in age... Special Report 78-6: Selection and Pairing of "Normal" Rhesus Monkeys (Macaca mulatta) for Research. Matthew J. Kessler, James L. Kupper, James D... Approved for public release; distribution unlimited.

  1. The price momentum of stock in distribution

    Science.gov (United States)

    Liu, Haijun; Wang, Longfei

    2018-02-01

    In this paper, a new momentum of a stock in distribution is proposed and applied in real investment. Firstly, assuming that a stock behaves as a multi-particle system, its share-exchange distribution and cost distribution are introduced. Secondly, an estimation of the share-exchange distribution is given with daily transaction data by the 3σ rule of the normal distribution. Meanwhile, an iterative method is given to estimate the cost distribution. Based on the cost distribution, a new momentum is proposed for the stock system. Thirdly, an empirical test is given to compare the new momentum with others by a contrarian strategy. The result shows that the new one outperforms the others in many cases. Furthermore, the entropy of a stock is introduced according to its cost distribution.

  2. Adaptive Variance Scaling in Continuous Multi-Objective Estimation-of-Distribution Algorithms

    NARCIS (Netherlands)

    P.A.N. Bosman (Peter); D. Thierens (Dirk)

    2007-01-01

    Recent research into single-objective continuous Estimation-of-Distribution Algorithms (EDAs) has shown that when maximum-likelihood estimations are used for parametric distributions such as the normal distribution, the EDA can easily suffer from premature convergence. In this paper we

  3. Score distributions in information retrieval

    NARCIS (Netherlands)

    Arampatzis, A.; Robertson, S.; Kamps, J.

    2009-01-01

    We review the history of modeling score distributions, focusing on the mixture of normal-exponential by investigating the theoretical as well as the empirical evidence supporting its use. We discuss previously suggested conditions which valid binary mixture models should satisfy, such as the

  4. Distribution Development for STORM Ingestion Input Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fulton, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    The Sandia-developed Transport of Radioactive Materials (STORM) code suite is used as part of the Radioisotope Power System Launch Safety (RPSLS) program to perform statistical modeling of the consequences due to release of radioactive material given a launch accident. As part of this modeling, STORM samples input parameters from probability distributions, with some parameters treated as constants. This report describes the work done to convert four of these constant inputs (Consumption Rate, Average Crop Yield, Cropland to Landuse Database Ratio, and Crop Uptake Factor) to sampled values. Consumption Rate changed from a constant value of 557.68 kg/yr to a normal distribution with a mean of 102.96 kg/yr and a standard deviation of 2.65 kg/yr. Meanwhile, Average Crop Yield changed from a constant value of 3.783 kg edible/m² to a normal distribution with a mean of 3.23 kg edible/m² and a standard deviation of 0.442 kg edible/m². The Cropland to Landuse Database Ratio changed from a constant value of 0.0996 (9.96%) to a normal distribution with a mean value of 0.0312 (3.12%) and a standard deviation of 0.00292 (0.29%). Finally, the Crop Uptake Factor changed from a constant value of 6.37e-4 (Bq_crop/kg)/(Bq_soil/kg) to a lognormal distribution with a geometric mean value of 3.38e-4 (Bq_crop/kg)/(Bq_soil/kg) and a standard deviation value of 3.33 (Bq_crop/kg)/(Bq_soil/kg).
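
    Sampling the converted inputs can be sketched with NumPy. One assumption is made for the log-normal case: the reported 3.38e-4 is treated as the geometric mean and 3.33 as a geometric standard deviation, so the underlying normal has mean ln(gm) and sigma ln(gsd):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Normal inputs, using the report's means and standard deviations.
    consumption = rng.normal(102.96, 2.65, n)        # kg / yr
    crop_yield = rng.normal(3.23, 0.442, n)          # kg edible / m^2
    cropland_ratio = rng.normal(0.0312, 0.00292, n)  # dimensionless fraction

    # Log-normal crop uptake factor; gm/gsd interpretation is an assumption.
    gm, gsd = 3.38e-4, 3.33
    uptake = rng.lognormal(np.log(gm), np.log(gsd), n)  # (Bq_crop/kg)/(Bq_soil/kg)
    ```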

  5. New large-deviation local theorems for sums of independent and identically distributed random vectors when the limit distribution is α-stable

    OpenAIRE

    Nagaev, Alexander; Zaigraev, Alexander

    2005-01-01

    A class of absolutely continuous distributions in Rd is considered. Each distribution belongs to the domain of normal attraction of an α-stable law. The limit law is characterized by a spectral measure which is absolutely continuous with respect to the spherical Lebesgue measure. The large-deviation problem for sums of independent and identically distributed random vectors when the underlying distribution belongs to that class is studied. At the focus of attention are the deviations in the di...

  6. Thermal dosimetry studies of ultrasonically induced hyperthermia in normal dog brain and in experimental brain tumors

    International Nuclear Information System (INIS)

    Britt, R.H.; Pounds, D.W.; Stuart, J.S.; Lyons, B.E.; Saxer, E.L.

    1984-01-01

    In a series of 16 acute experiments on pentobarbital-anesthetized dogs, thermal distributions generated by ultrasonic heating using a 1 MHz PZT transducer were compared with intensity distributions mapped in a test tank. Relatively flat distributions from 1 to 3 cm have been mapped in normal dog brain using "shaped" intensity distributions generated from ultrasonic emission patterns which are formed by the interaction between compressional, transverse and flexural modes activated within the crystal. In contrast, these same intensity distributions generated marked temperature variations in 3 malignant brain tumors, presumably due to variations in tumor blood flow. The results of this study suggest that a practical clinical system for uniform heating of large tumor volumes with varying volumes and geometries is not an achievable goal. The authors' laboratory is developing a scanning ultrasonic rapid hyperthermia treatment system which will be able to sequentially heat small volumes of tumor tissue either to temperatures which will sterilize the tumor or to a more conventional thermal dose. Time-temperature studies of the threshold for thermal damage in normal dog brain are currently in progress

  7. Tumor and normal tissue responses to fractioned non-uniform dose delivery

    Energy Technology Data Exchange (ETDEWEB)

    Kaellman, P; Aegren, A; Brahme, A [Karolinska Inst., Stockholm (Sweden). Dept. of Radiation Physics

    1996-08-01

    The volume dependence of the radiation response of a tumor is straightforward to quantify because it depends primarily on the eradication of all its clonogenic cells. A tumor therefore has a parallel organization, as any surviving clonogen can in principle repopulate the tumor. The difficulty with the response of the tumor is instead to know the density and sensitivity distribution of the most resistant clonogenic cells. The increase in the 50% tumor control dose and the decrease in the maximum normalized slope of the dose response relation, γ, in the presence of small compartments of resistant tumor cells have therefore been quantified to describe their influence on the dose response relation. Injury to normal tissue is a much more complex and gradual process. It depends on earlier effects induced long before depletion of the differentiated and clonogenic cells, which in addition may have a complex structural and functional organization. The volume dependence of the dose response relation of normal tissues is therefore described here by the relative seriality, s, of the infrastructure of the organ. The model can also be generalized to describe the response of heterogeneous tissues to non-uniform dose distributions. The new model is compared with clinical and experimental data on normal tissue response, and shows good agreement both with regard to the shape of the dose response relation and the volume dependence of the isoeffect dose. The responses of tumors and normal tissues are quantified for arbitrary dose fractionations using the linear quadratic cell survival parameters α and β. The parameters of the dose response relation are derived both for a constant dose per fraction and a constant number of dose fractions, thus in the latter case accounting also for non-uniform dose delivery. (author). 26 refs, 4 figs.
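
    For a uniform dose, the α/β machinery referenced above reduces to the familiar linear-quadratic surviving fraction exp[−n(αd + βd²)] after n fractions of dose d. A sketch with illustrative tumour-like parameters (the α, β values and schedules are assumed, not from the paper):

    ```python
    import numpy as np

    def surviving_fraction(d, n, alpha, beta):
        """Linear-quadratic cell survival after n fractions of dose d (Gy)."""
        return np.exp(-n * (alpha * d + beta * d**2))

    # Illustrative parameters with alpha/beta = 10 Gy (tumour-like).
    alpha, beta = 0.3, 0.03   # Gy^-1, Gy^-2
    sf_conventional = surviving_fraction(2.0, 30, alpha, beta)   # 30 x 2 Gy
    sf_hypofrac = surviving_fraction(6.0, 10, alpha, beta)       # 10 x 6 Gy
    ```

    Note how the quadratic term makes the hypofractionated schedule more cell-killing per unit physical dose, which is why fractionation enters the dose response relations derived in the paper.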

  8. Modeling a Distribution of Mortgage Credit Losses

    Czech Academy of Sciences Publication Activity Database

    Gapko, Petr; Šmíd, Martin

    2010-01-01

    Roč. 23, č. 23 (2010), s. 1-23 R&D Projects: GA ČR GA402/09/0965; GA ČR GD402/09/H045 Grant - others:Univerzita Karlova - GAUK(CZ) 46108 Institutional research plan: CEZ:AV0Z10750506 Keywords : Credit Risk * Mortgage * Delinquency Rate * Generalized Hyperbolic Distribution * Normal Distribution Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/gapko-modeling a distribution of mortgage credit losses-ies wp.pdf

  9. A spatial scan statistic for survival data based on Weibull distribution.

    Science.gov (United States)

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.

  10. Development of distributed target

    CERN Document Server

    Yu Hai Jun; Li Qin; Zhou Fu Xin; Shi Jin Shui; Ma Bing; Chen Nan; Jing Xiao Bing

    2002-01-01

    A linear induction accelerator is expected to generate small-diameter X-ray spots with high intensity. The interaction of the electron beam with plasmas generated at the X-ray converter makes the spot on the target grow with time and degrades the X-ray dose and the imaging resolving power. A distributed target was developed which has about 24 pieces of thin 0.05 mm tantalum films distributed over 1 cm. By spreading the target material over a large volume, this structure decreases the energy deposition per unit volume and hence reduces the temperature of the target surface, which in turn delays the initial plasma formation and reduces its expansion velocity. A comparison and analysis of the two target structures is presented using numerical calculation and experiments; the results show that the X-ray dose and normalized angular distribution of the two are basically the same, while the surface of the distributed target is not destroyed like that of the previous block target

  11. Expression of periglandular tenascin-C and basement membrane laminin in normal prostate, benign prostatic hyperplasia and prostate carcinoma

    NARCIS (Netherlands)

    Xue, Y.; Li, J.; Latijnhouwers, M. A.; Smedts, F.; Umbas, R.; Aalders, T. W.; Debruyne, F. M.; de la Rosette, J. J.; Schalken, J. A.

    1998-01-01

    To evaluate the structural relationship of the distribution between tenascin (tenascin-C, an extra-cellular matrix glycoprotein involved in stromal-epithelial interactions in both normal and pathological conditions) and laminin, an important component of the basement membrane, in normal and

  12. Robustness to non-normality of various tests for the one-sample location problem

    Directory of Open Access Journals (Sweden)

    Michelle K. McDougall

    2004-01-01

    Full Text Available This paper studies the effect of the normal distribution assumption on the power and size of the sign test, Wilcoxon's signed rank test and the t-test when used in one-sample location problems. Power functions for these tests under various skewness and kurtosis conditions are produced for several sample sizes from simulated data using the g-and-k distribution of MacGillivray and Cannon [5].
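The kind of robustness comparison the paper reports can be sketched with a small simulation. The g-and-k distribution of MacGillivray and Cannon used in the paper is not available in SciPy, so this hedged stand-in substitutes a heavy-tailed Student-t environment and checks only the empirical size of the three tests:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def empirical_size(sampler, n=20, reps=2000, alpha=0.05):
    """Empirical type-I error of the t-test, Wilcoxon signed-rank test
    and sign test when the true location is 0 (H0 true)."""
    rej = np.zeros(3)
    for _ in range(reps):
        x = sampler(n)
        rej[0] += stats.ttest_1samp(x, 0.0).pvalue < alpha
        rej[1] += stats.wilcoxon(x).pvalue < alpha
        k = int((x > 0).sum())              # sign test via binomial count
        rej[2] += stats.binomtest(k, n, 0.5).pvalue < alpha
    return rej / reps

# Symmetric but heavy-tailed environment: all three tests should stay
# near the nominal 5% level
size_t3 = empirical_size(lambda n: rng.standard_t(df=3, size=n))
```

Under skewed environments (the paper's g-and-k settings) the three tests target different location notions, which is where their sizes and powers diverge.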

  13. Potential clinical impact of normal-tissue intrinsic radiosensitivity testing

    International Nuclear Information System (INIS)

    Bentzen, Soeren M.

    1997-01-01

    A critical appraisal is given of the possible benefit from a reliable pre-treatment knowledge of individual normal-tissue sensitivity to radiotherapy. The considerations are in part, but not exclusively, based on the recent experience with in vitro colony-forming assays of the surviving fraction at 2 Gy, the SF2. Three strategies are reviewed: (1) to screen for rare cases with extreme radiosensitivity, so-called over-reactors, and treat these with reduced total dose; (2) to identify the sensitive tail of the distribution of 'normal' radiosensitivities, refer these patients to other treatment, and escalate the dose to the remaining patients; or (3) to individualize dose prescriptions based on individual radiosensitivity, i.e. treating to isoeffect rather than to a specific dose-fractionation schedule. It is shown that these strategies will have a small, if any, impact on routine radiotherapy. Screening for over-reactors is hampered by their low prevalence among otherwise unselected patients, which leads to a low positive predictive value of in vitro radiosensitivity assays. It is argued that this problem may persist even if the noise on current assays could be reduced to (the unrealistic value of) zero, simply because of the large biological variation in SF2. Removing the sensitive tail of the patient population will only have a minor effect on the dose that could be delivered to the remaining patients, because of the sigmoid shape of empirical dose-response relationships. Finally, individualizing dose prescriptions based exclusively on information from a normal-tissue radiosensitivity assay leads to a nearly symmetrical distribution of dose changes that would produce a very small gain, or even a loss, of tumor control probability if implemented in the clinic. From a theoretical point of view, other strategies could be devised and some of these are considered in this review. Right now the most promising clinical use of in vitro radiosensitivity

  14. Multiple-indicator dilution technique for characterization of normal and retrograde flow in once-through rat liver perfusions

    International Nuclear Information System (INIS)

    St-Pierre, M.V.; Schwab, A.J.; Goresky, C.A.; Lee, W.F.; Pang, K.S.

    1989-01-01

    The technique of normal and retrograde rat liver perfusion has been widely used to probe zonal differences in drug-metabolizing activities. The validity of this approach mandates the same tissue spaces being accessed by substrates during both normal and retrograde perfusions. Using the multiple-indicator dilution technique, we presently examine the extent to which retrograde perfusion alters the spaces accessible to noneliminated references. A bolus dose of 51Cr-labeled red blood cells, 125I-albumin, 14C-sucrose and 3H2O was injected into the portal (normal) or hepatic (retrograde) vein of rat livers perfused at 10 ml per min per liver. The outflow perfusate was serially collected over 220 sec to characterize the transit times and the distribution spaces of the labels. During retrograde perfusion, red blood cells, albumin and sucrose profiles peaked later and lower than during normal perfusion, whereas the water curves were similar. The transit times of red blood cells, albumin and sucrose were longer (p less than 0.005), whereas those for water did not change. Consequently, retrograde flow resulted in significantly larger sinusoidal blood volumes (45%), albumin Disse space (42%) and sucrose Disse space (25%) than during normal flow, whereas the distribution spaces for total and intracellular water remained unaltered. The distension of the vascular tree was confirmed by electron microscopy, by which occasional isolated foci of widened intercellular recesses and spaces of Disse were observed. Cellular ultrastructure was otherwise unchanged, and there was no difference found between normal and retrograde perfusion for bile flow rates, AST release, perfusion pressure, oxygen consumption and metabolic removal of ethanol, a substrate with flow-limited distribution, which equilibrates rapidly with cell water (hepatic extraction ratios were virtually identical: normal vs. retrograde, 0.50 vs. 0.48 at 6 to 7.4 mM input concentration)

  15. Cerebral perfusion inhomogeneity in normal volunteers

    International Nuclear Information System (INIS)

    Gruenwald, S.M.; Larcos, G.

    1998-01-01

    Full text: In the interpretation of cerebral perfusion scans, it is important to know the normal variation in perfusion which may occur between the cerebral hemispheres. For this reason 24 normal volunteers with no neurological or psychiatric history, and who were on no medications, had 99mTc-HMPAO brain SPECT studies using a single-headed gamma camera computer system. Oblique, coronal and sagittal images were reviewed separately by two experienced observers and any differences were resolved by consensus. Semi-quantitation was performed by summing two adjacent oblique slices and drawing right and left mirror-image ROIs corresponding to the mid-section level of the anterior and posterior frontal lobes, anterior and posterior parietal lobes, temporal lobes and cerebellum. From the mean counts per pixel, right:left ROI ratios and ROI:cerebellar ratios were calculated. On qualitative review 6/24 subjects had mild asymmetry in tracer distribution between the right and left cerebral lobes. Semi-quantitation revealed a 5-10% difference in counts between right and left ROIs in 12/24 subjects, and an additional three subjects had a 10-20% difference in counts between the right and left temporal lobes. This study demonstrates the presence of mild asymmetry of cerebral perfusion in a significant minority of normal subjects

  16. Fitting the Statistical Distribution for Daily Rainfall in Ibadan, Based ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    This paper presents several types of statistical distributions to describe rainfall distribution in the Ibadan metropolis over a period of 30 years. The exponential, gamma, normal and Poisson distributions are compared to identify the optimal model for daily rainfall amount based on data recorded at rain ...

  17. A GMM-Based Test for Normal Disturbances of the Heckman Sample Selection Model

    Directory of Open Access Journals (Sweden)

    Michael Pfaffermayr

    2014-10-01

    Full Text Available The Heckman sample selection model relies on the assumption of normal and homoskedastic disturbances. However, before considering more general, alternative semiparametric models that do not need the normality assumption, it seems useful to test this assumption. Following Meijer and Wansbeek (2007), the present contribution derives a GMM-based pseudo-score LM test of whether the third and fourth moments of the disturbances of the outcome equation of the Heckman model conform to those implied by the truncated normal distribution. The test is easy to calculate and in Monte Carlo simulations it shows good performance for sample sizes of 1000 or larger.

  18. Empirical evaluation of data normalization methods for molecular classification.

    Science.gov (United States)

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers, an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated on independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.

  19. A Box-Cox normal model for response times.

    Science.gov (United States)

    Klein Entink, R H; van der Linden, W J; Fox, J-P

    2009-11-01

    The log-transform has been a convenient choice in response time modelling on test items. However, motivated by a dataset of the Medical College Admission Test where the lognormal model violated the normality assumption, the possibilities of the broader class of Box-Cox transformations for response time modelling are investigated. After an introduction and an outline of a broader framework for analysing responses and response times simultaneously, the performance of a Box-Cox normal model for describing response times is investigated using simulation studies and a real data example. A transformation-invariant implementation of the deviance information criterion (DIC) is developed that allows for comparing model fit between models with different transformation parameters. Showing an enhanced description of the shape of the response time distributions, its application in an educational measurement context is discussed at length.
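The Box-Cox idea can be sketched with SciPy. This is a rough, frequentist stand-in for the authors' Bayesian model, run on invented response-time data: estimate the transformation parameter by maximum likelihood and compare normality before and after.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical response times in seconds: positive and right-skewed
rt = rng.gamma(shape=2.0, scale=15.0, size=500) + 5.0

# Maximum-likelihood Box-Cox transform; lmbda = 0 recovers the
# log-transform as a special case
transformed, lam = stats.boxcox(rt)

# Shapiro-Wilk normality check before and after (a larger p-value means
# less evidence against normality)
p_raw = stats.shapiro(rt).pvalue
p_transformed = stats.shapiro(transformed).pvalue
```

If the estimated `lam` lands far from 0, the data themselves argue against the plain log-transform, which is the paper's motivation for the broader Box-Cox family.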

  20. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs

  1. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s{sub a} can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s{sub a}. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.
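The central claim of this record, that inter-patient spread in radiosensitivity flattens the population dose-response curve, can be sketched with a Monte Carlo average of the Poisson TCP. All numbers below (clonogen count, mean and spread of the sensitivity parameter) are invented for illustration.

```python
import numpy as np

def tcp_population(doses, n_clonogens, alpha_mean, alpha_sd, m=4000, seed=2):
    """Poisson TCP, exp(-N*exp(-alpha*D)), averaged over a normal
    inter-patient distribution of alpha (non-positive draws discarded)."""
    alphas = np.random.default_rng(seed).normal(alpha_mean, alpha_sd, m)
    alphas = alphas[alphas > 0]
    surviving = n_clonogens * np.exp(-np.outer(alphas, doses))
    return np.exp(-surviving).mean(axis=0)

doses = np.linspace(0.0, 100.0, 201)
tcp_homogeneous = tcp_population(doses, 1e7, 0.35, 1e-9)   # ~no spread
tcp_heterogeneous = tcp_population(doses, 1e7, 0.35, 0.08)

# The maximum slope of the dose-response curve (and hence its
# normalized slope) is smaller for the heterogeneous population
slope_hom = np.max(np.gradient(tcp_homogeneous, doses))
slope_het = np.max(np.gradient(tcp_heterogeneous, doses))
```

The homogeneous curve is an unrealistically steep step; averaging over the sensitivity distribution stretches it into the shallow sigmoid seen clinically.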

  2. Opsoclonus-induced occipital deactivation with a region-specific distribution

    NARCIS (Netherlands)

    de Jong, BM; van Weerden, TW; Haaxma, R

    The cerebral distribution of 2-[18F]fluoro 2-deoxy-D-glucose (FDG) uptake in a patient with opsoclonus was measured by positron emission tomography (PET) and subsequently compared with the distribution in ten normal subjects. Statistical parametric mapping (SPM) revealed a decreased occipital FDG

  3. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. However, normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study jointly fuses images of RX applied to normalized and unnormalized imagery and has a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable depending on the relative number of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations also generate variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, normalizing the correlation coefficient and least squares yields new transforms related to canonical correlation analysis (CCA) and a normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. Only RX detection applied to the unnormalized difference imagery provided adequate change detection performance.

  4. The Czech Wage Distribution and the Minimum Wage Impacts: the Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Kateřina Duspivová

    2013-06-01

    Full Text Available A well-fitting wage distribution is a crucial precondition for economic modeling of the labour market processes. In the first part, this paper provides the evidence that, as for wages in the Czech Republic, the most often used log-normal distribution failed and the best-fitting one is the Dagum distribution. Then we investigate the role of wage distribution in the process of the economic modeling. By way of an example of the minimum wage impacts on the Czech labour market, we examine the response of Meyer and Wise's (1983) model to the Dagum and log-normal distributions. The results suggest that the wage distribution has important implications for the effects of the minimum wage on the shape of the lower tail of the measured wage distribution and is thus an important feature for interpreting the effects of minimum wages.

  5. Comparison of plantar pressure distribution in subjects with normal and flat feet during gait

    Directory of Open Access Journals (Sweden)

    Aluisio Otavio Vargas Avila

    2010-06-01

    Full Text Available The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch, measured by the height of the navicular bone in a static situation, and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg and 175 ± 4 cm) participated in the study. The Novel Emed-AT System was used for the acquisition of plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load) at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982) was used to assess the height of the navicular bone for classification of the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the mid-foot region for all variables studied, with higher mean values in subjects with flat feet. There were also significant differences in contact area, relative load, peak pressure, and mean pressure between groups. The present study demonstrates the importance of paying attention to subjects with flat feet because changes in plantar pressure distribution are associated with discomfort and injuries.

  6. Study of trace element metabolism in normal and cancerous mice using multitracer technique

    International Nuclear Information System (INIS)

    Wang Xiao; Kong Fuquan; Zhao Kui; Zhang Xiang; Qin Zhi

    2008-01-01

    A radioactive multitracer solution of 24 elements, i.e. Be, Na, K, Rb, Mg, Ca, Sr, Ga, As, Sc, V, Cr, Mn, Co, Fe, Zn, Y, Zr, Mo, Nb, Tc, Ru, Ag and In, was obtained from the nuclear reaction of 25 MeV/u 40Ar + Se followed by a series of chemical processes. The multitracer solution was orally administered to normal and muscular tumour-bearing male Balb/c mice. Urine and faeces samples of the mice were collected. The two groups of mice were sacrificed after 96 h. The uptake of 17 elements (Na, Rb, Ga, As, Sc, V, Cr, Mn, Co, Fe, Zn, Y, Zr, Tc, Ru, Ag and In) was simultaneously detected in normal mice, while 15 elements (Na, Rb, Ga, Sc, V, Cr, Mn, Co, Fe, Y, Zr, Tc, Ru, Ag and In) were simultaneously detected in tumour-bearing mice. Our results indicate that the majority of the detected elements were distributed in liver, kidney, pelt and tumour, while a small fraction of the biotrace elements was distributed in heart and spleen. Higher concentrations of Fe, Na and Mn were detected in heart or kidney of normal mice. Na, Mn, Fe and Co showed better absorption in most tissues of the normal mice, except for Na and Mn in heart. (authors)

  7. Individual vision and peak distribution in collective actions

    Science.gov (United States)

    Lu, Peng

    2017-06-01

    People make decisions on whether to participate in collective actions as participants or to abstain as free riders, with heterogeneous visions. Besides the utility heterogeneity and cost heterogeneity, this work includes and investigates the effect of vision heterogeneity by constructing a decision model, i.e. a revised peak model of participants. In this model, potential participants make decisions under the joint influence of utility, cost, and vision heterogeneities. The outcomes of simulations indicate that vision heterogeneity reduces the values of peaks, and the relative variance of peaks is stable. Under normal distributions of vision heterogeneity and other factors, the peaks of participants are normally distributed as well. Therefore, it is necessary to predict distribution traits of peaks based on distribution traits of related factors such as vision heterogeneity. We predict the distribution of peaks with parameters of both mean and standard deviation, which provides confidence intervals and robust predictions of peaks. Besides, we validate the peak model via the Yuyuan Incident, a real case in China (2014), and the model works well in explaining the dynamics and predicting the peak of the real case.

  8. EMPIRICAL DISTRIBUTION OF STOCK RETURNS OF SOUTHEAST EUROPEAN EMERGING MARKETS

    Directory of Open Access Journals (Sweden)

    Aleksandar Naumoski

    2017-06-01

    Full Text Available The assumption that equity returns follow the normal distribution, most commonly made in financial economics theory and applications, is strongly rejected by empirical evidence presented in this paper. As has been found in many other studies, we confirm that stock returns follow a leptokurtic distribution with a skewness that in most of the Southeast European (SEE) markets is negative. This paper investigates further whether there is any distribution that may be considered an optimal fit for stock returns in the SEE region. Using daily, weekly and monthly data samples for a period of five years from ten Southeast European emerging countries, we applied the Anderson-Darling test of goodness-of-fit. We strongly rejected the aforementioned assumption of normality for all considered data samples and found that the daily stock returns are best fitted by the Johnson SU distribution, whereas for the weekly and monthly stock returns there was not one predominant distribution, but many distributions that can be considered a best fit.
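The workflow described above can be sketched with SciPy on synthetic returns (the sample below is invented, leptokurtic and left-skewed; it is not SEE market data): reject normality with Anderson-Darling, then fit the four-parameter Johnson SU.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic daily returns: heavy tails (Student-t) plus a left-skewing
# exponential component; purely illustrative
r = 0.007 * rng.standard_t(4, size=2000) - 0.3 * rng.exponential(0.01, size=2000)

# Anderson-Darling test against the normal distribution; comparing the
# statistic with the last critical value tests at the 1% level
ad = stats.anderson(r, dist='norm')
reject_normal = ad.statistic > ad.critical_values[-1]

# Fit a Johnson SU and measure the remaining distance to the data
# (a rough check only, since the parameters were fitted to this sample)
params = stats.johnsonsu.fit(r)
ks_stat = stats.kstest(r, 'johnsonsu', args=params).statistic
```

The flexible four-parameter Johnson SU absorbs both the skewness and the heavy tails that the normal fit cannot, which is the paper's reason for preferring it on daily data.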

  9. Properties of magnetocaloric materials with a distribution of Curie temperatures

    DEFF Research Database (Denmark)

    Bahl, Christian Robert Haffenden; Bjørk, Rasmus; Smith, Anders

    2012-01-01

    The magnetocaloric properties of inhomogeneous ferromagnets that contain distributions of Curie temperatures are considered as a function of the width of such a distribution. Assuming a normal distribution of the Curie temperature, the average adiabatic temperature change, ΔTad, the isothermal...... of the distribution, explaining the observed mismatch of peak temperatures reported in experiments. Also, the field dependence of ΔTad and Δs is found to depend on the width of the distribution....

  10. A Generic Procedure for BRDF Normalization of Remotely Sensed Data

    Energy Technology Data Exchange (ETDEWEB)

    D. Yuan

    2003-04-01

    A generic procedure for Bidirectional Reflectance Distribution Function (BRDF) normalization for airborne multispectral images has been developed and implemented as an add-on module of ENVI at the U.S. Department of Energy's Remote Sensing Laboratory. The main advantage of this procedure is that it does not require multiple image acquisitions over the same area for establishing empirical BRDF functions.

  11. A Multi-Resolution Spatial Model for Large Datasets Based on the Skew-t Distribution

    KAUST Repository

    Tagle, Felipe

    2017-12-06

    Large, non-Gaussian spatial datasets pose a considerable modeling challenge as the dependence structure implied by the model needs to be captured at different scales, while retaining feasible inference. Skew-normal and skew-t distributions have only recently begun to appear in the spatial statistics literature, without much consideration, however, for the ability to capture dependence at multiple resolutions and simultaneously achieve feasible inference for increasingly large datasets. This article presents the first multi-resolution spatial model inspired by the skew-t distribution, where a large-scale effect follows a multivariate normal distribution and the fine-scale effects follow a multivariate skew-normal distribution. The resulting marginal distribution for each region is skew-t, thereby allowing for greater flexibility in capturing skewness and heavy tails characterizing many environmental datasets. Likelihood-based inference is performed using a Monte Carlo EM algorithm. The model is applied as a stochastic generator of daily wind speeds over Saudi Arabia.
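The layered construction can be mimicked in a crude one-dimensional sketch. This is not the article's exact skew-t construction, only an illustration of how a Gaussian large-scale effect plus skew-normal fine-scale effects produce a skewed marginal:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_regions, n_per_region = 50, 200

# Large-scale effect: one Gaussian draw shared by all sites in a region
large_scale = rng.normal(0.0, 1.0, size=n_regions).repeat(n_per_region)

# Fine-scale effects: independent skew-normal draws at each site
fine_scale = stats.skewnorm.rvs(a=5.0, size=n_regions * n_per_region,
                                random_state=rng)

y = large_scale + fine_scale
sample_skew = stats.skew(y)  # positive: the marginal is right-skewed
```

The skew-t marginal of the article additionally requires a variance-mixing step; the sketch stops at the skew-normal layer, which already breaks the symmetry a purely Gaussian hierarchy would impose.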

  12. Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum

    Directory of Open Access Journals (Sweden)

    Farshid eSepehrband

    2016-05-01

    Full Text Available Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters, which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as the location and scale of the mode and the profile of the distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log-normal, log-logistic and Birnbaum-Saunders distributions.
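The model-comparison exercise can be sketched with SciPy on synthetic diameters. The sample is drawn from a generalized extreme value law, so the GEV "winning" is built in by construction; the point is the AIC ranking mechanics, not the neuroanatomy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic "axon diameters" (micrometres): right-skewed, heavy-tailed
d = stats.genextreme.rvs(c=-0.2, loc=0.6, scale=0.2, size=2000,
                         random_state=rng)
d = d[d > 0]  # keep the sample non-negative

candidates = {
    'gamma': stats.gamma,
    'lognorm': stats.lognorm,
    'fisk': stats.fisk,          # log-logistic
    'gev': stats.genextreme,
}

aics = {}
for name, dist in candidates.items():
    params = dist.fit(d)                     # generic maximum likelihood
    loglik = dist.logpdf(d, *params).sum()
    aics[name] = 2 * len(params) - 2 * loglik

best = min(aics, key=aics.get)
```

On real electron-microscopy data the same ranking loop would decide between gamma, GEV and the other candidates named in the abstract.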

  13. Transponder-aided joint calibration and synchronization compensation for distributed radar systems.

    Science.gov (United States)

    Wang, Wen-Qin

    2015-01-01

    High-precision radiometric calibration and synchronization compensation must be provided for distributed radar systems due to their separate transmitters and receivers. This paper proposes transponder-aided joint radiometric calibration, motion compensation and synchronization for distributed radar remote sensing. As the transponder signal can be separated from the normal radar returns, it is used to calibrate the distributed radar for radiometry. Meanwhile, the distributed radar motion compensation and synchronization compensation algorithms are presented by utilizing the transponder signals. This method requires no hardware modifications to either the normal radar transmitter or receiver and no change to the operating pulse repetition frequency (PRF). The distributed radar radiometric calibration and synchronization compensation require only one transponder, but the motion compensation requires six transponders because there are six independent variables in the distributed radar geometry. Furthermore, a maximum likelihood method is used to estimate the transponder signal parameters. The proposed methods are verified by simulation results.

  14. Conormal distributions in the Shubin calculus of pseudodifferential operators

    Science.gov (United States)

    Cappiello, Marco; Schulz, René; Wahlberg, Patrik

    2018-02-01

    We characterize the Schwartz kernels of pseudodifferential operators of Shubin type by means of a Fourier-Bros-Iagolnitzer transform. Based on this, we introduce as a generalization a new class of tempered distributions called Shubin conormal distributions. We study their transformation behavior, normal forms, and microlocal properties.

  15. Uncertainty Evaluation for SMART Synthesized Power Distribution

    International Nuclear Information System (INIS)

    Cho, J. Y.; Song, J. S.; Lee, C. C.; Park, S. Y.; Kim, K. Y.; Lee, K. H.

    2010-07-01

    This report performs the uncertainty analysis for the SMART synthesis power distribution generated by a SSUN (SMART core SUpporting system coupled by Nuclear design code) code. SSUN runs coupled with the MASTER neutronics code and generates the core 3-D synthesis power distribution by using DPCM3D. The MASTER code plays a role to provide the DPCM3D constants to the SSUN code for the current core states. The uncertainties evaluated in this report are the form of 95%/95% probability/confidence one-sided tolerance limits and can be used in conjunction with Technical Specification limits on these quantities to establish appropriate LCO (Limiting Conditions of Operation) and LSSS (Limiting Safety System Settings) limits. This report is applicable to SMART nuclear reactor using fixed rhodium detector systems. The unknown true power distribution should be given for the uncertainty evaluation of the synthesis power distribution. This report produces virtual distributions for the true power distribution by imposing the CASMO-3/MASTER uncertainty to the MASTER power distribution. Detector signals are generated from these virtual distribution and the DPCM3D constants are from the MASTER power distribution. The SSUN code synthesizes the core 3-D power distribution by using these detector signals and the DPCM3D constants. The following summarizes the uncertainty evaluation procedure for the synthesis power distribution. (1) Generation of 3-D power distribution by MASTER -> Determination of the DPCM3D constants. (2) Generation of virtual power distribution (assumed to be true power distribution) -> Generation of detector signals. (3) Generation of synthesis power distribution. (4) Uncertainty evaluation for the synthesis power distribution. Chi-Square normality test rejects the hypothesis of normal distribution for the synthesis power error distribution. Therefore, the KRUSKAL WALLIS test and the non-parametric statistics are used for data pooling and the tolerance limits. 
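
    The non-parametric one-sided tolerance limit used in the record above can be sketched generically (this is the classic Wilks order-statistic approach, not the SSUN/MASTER procedure itself; function names are illustrative):

```python
def min_samples_one_sided(gamma=0.95, confidence=0.95):
    """Smallest n such that the sample maximum is a valid one-sided
    upper tolerance limit covering a fraction `gamma` of the population
    with the given confidence, i.e. 1 - gamma**n >= confidence."""
    n = 1
    while 1.0 - gamma ** n < confidence:
        n += 1
    return n

def upper_tolerance_limit(samples, gamma=0.95, confidence=0.95):
    """Nonparametric one-sided upper tolerance limit: the largest order
    statistic, valid when len(samples) >= min_samples_one_sided()."""
    if len(samples) < min_samples_one_sided(gamma, confidence):
        raise ValueError("too few samples for the requested coverage")
    return max(samples)
```

    For 95%/95%, `min_samples_one_sided()` returns the familiar Wilks sample size of 59.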

  16. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    Gupta, S.S.; Panchapakesan, S.

    1975-01-01

    A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure

  17. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    Science.gov (United States)

    Fowler, Mike S; Ruokolainen, Lasse

    2013-01-01

    The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical feedback) models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample skewness and kurtosis and decreasing mean kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations.
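
    The AR(1) coloured-noise generator discussed above can be sketched as follows, assuming the common unit-variance parameterization (the lag-1 autocorrelation κ sets the colour; the normalization factor and function name are my own):

```python
import random

def ar1_series(n, kappa, seed=None):
    """AR(1) noise: x[t] = kappa*x[t-1] + sqrt(1-kappa^2)*eps[t].
    kappa > 0: red (slow fluctuations), kappa < 0: blue, kappa = 0: white.
    The sqrt(1-kappa^2) factor keeps the stationary variance at 1."""
    rng = random.Random(seed)
    scale = (1.0 - kappa * kappa) ** 0.5
    x, out = 0.0, []
    for _ in range(n):
        x = kappa * x + scale * rng.gauss(0.0, 1.0)
        out.append(x)
    return out
```

    The sample skewness and kurtosis of finite realisations of such series grow more variable as |κ| approaches 1, which is the distribution-shape effect the authors control for with spectral mimicry.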

  18. The pharmacokinetics, distribution and degradation of human recombinant interleukin 1 beta in normal rats

    DEFF Research Database (Denmark)

    Reimers, J; Wogensen, L D; Welinder, B

    1991-01-01

    Based upon in vivo rat experiments it was recently suggested that interleukin 1 in the circulation may be implicated in the initial events of beta-cell destruction leading to insulin-dependent diabetes mellitus (IDDM) in humans. The aim of the present study was to estimate the half-lives of the distribution (T1/2 alpha) and elimination phases (T1/2 beta) of human recombinant interleukin 1 beta (rIL-1 beta), and its tissue distribution and cellular localization, by means of mono-labelled, biologically active 125I-rIL-1 beta. After intravenous (i.v.) injection, 125I-rIL-1 beta was eliminated from ... i.v., intraperitoneal (i.p.) and subcutaneous (s.c.) injections, as demonstrated by high performance size exclusion chromatography, trichloracetic acid precipitation and SDS-PAGE until 5 h after tracer injection. Pre-treatment with 'cold' rIL-1 beta enhanced degradation of a subsequent injection of tracer. The route ...

  19. Comparisons of pharmacokinetic and tissue distribution profile of four major bioactive components after oral administration of Xiang-Fu-Si-Wu Decoction effective fraction in normal and dysmenorrheal symptom rats.

    Science.gov (United States)

    Liu, Pei; Li, Wei; Li, Zhen-hao; Qian, Da-wei; Guo, Jian-ming; Shang, Er-xin; Su, Shu-lan; Tang, Yu-ping; Duan, Jin-ao

    2014-07-03

    Xiang-Fu-Si-Wu Decoction (XFSWD) has been widely used to treat primary dysmenorrhea in clinical practice for hundreds of years and has shown great efficacy. One fraction of XFSWD, an elution product obtained by macroporous adsorption resin from the aqueous extract solution with 60% ethanol (XFSWE), showed a great analgesic effect. The present study was conducted to investigate the possible pharmacokinetic and tissue distribution profiles of four major bioactive constituents (berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine) after oral administration of XFSWE in dysmenorrheal symptom rats, and to compare the difference between normal and dysmenorrheal symptom rats. Estradiol benzoate and oxytocin were used to produce the dysmenorrheal symptom rat model. The experimental period was seven days. On the final day of the experimental period, both normal and dysmenorrheal symptom rats were orally administered XFSWE, and then blood and tissue samples were collected at different time points. Berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine in blood and tissue samples were determined by LC-MS/MS. Pharmacokinetic parameters were calculated from the plasma concentration-time data using non-compartmental methods. The differences in pharmacokinetic parameters among groups were tested by one-way analysis of variance (ANOVA). There were statistically significant differences (P < 0.05) in pharmacokinetic parameters between normal and dysmenorrheal symptom rats orally administered the same dosage of XFSWE. In the tissue distribution study, the results showed that the overall trend was C(Spleen)>C(Liver)>C(Kidney)>C(Uterus)>C(Heart)>C(Lung)>C(Ovary)>C(Brain)>C(Thymus), C(M-60 min)>C(M-120 min)>C(M-30 min)>C(C-60 min)>C(C-120 min)>C(C-30 min). The contents of protopine in liver, spleen and uterus were higher than in other tissues of dysmenorrheal symptom rats. Compared to normal rats, the contents of some of the compounds in dysmenorrheal symptom rats' tissues at different time points showed significant differences.

  20. β class II tubulin predominates in normal and tumor breast tissues

    International Nuclear Information System (INIS)

    Dozier, James H; Hiser, Laree; Davis, Jennifer A; Thomas, Nancy Stubbs; Tucci, Michelle A; Benghuzzi, Hamed A; Frankfurter, Anthony; Correia, John J; Lobert, Sharon

    2003-01-01

    Antimitotic chemotherapeutic agents target tubulin, the major protein in mitotic spindles. Tubulin isotype composition is thought to be both diagnostic of tumor progression and a determinant of the cellular response to chemotherapy. This implies that there is a difference in isotype composition between normal and tumor tissues. To determine whether such a difference occurs in breast tissues, total tubulin was fractionated from lysates of paired normal and tumor breast tissues, and the amounts of β-tubulin classes I + IV, II, and III were measured by competitive enzyme-linked immunosorbent assay (ELISA). Only primary tumor tissues, before chemotherapy, were examined. Her2/neu protein amplification occurs in about 30% of breast tumors and is considered a marker for poor prognosis. To gain insight into whether tubulin isotype levels might be correlated with prognosis, ELISAs were used to quantify Her2/neu protein levels in these tissues. β-Tubulin isotype distributions in normal and tumor breast tissues were similar. The most abundant β-tubulin isotypes in these tissues were β-tubulin classes II and I + IV. Her2/neu levels in tumor tissues were 5–30-fold those in normal tissues, although there was no correlation between the Her2/neu biomarker and tubulin isotype levels. These results suggest that tubulin isotype levels, alone or in combination with Her2/neu protein levels, might not be diagnostic of tumorigenesis in breast cancer. However, the presence of a broad distribution of these tubulin isotypes (for example, 40–75% β-tubulin class II) in breast tissue, in conjunction with other factors, might still be relevant to disease progression and cellular response to antimitotic drugs

  1. [A study on the distribution of the consumption of tobacco and alcohol].

    Science.gov (United States)

    Damiani, P; Masse, H; Aubenque, M

    1983-01-01

    An analysis of the distribution of tobacco consumption and alcohol-related mortality in France by sex and department is presented for the population aged 45 to 64. It is shown that the "population can be decomposed into two sets such that, for each of them, tobacco and alcohol consumption distributions are log-normal. [It is suggested] that consumption is normal for one set and an endogenous predisposition for the other." (summary in ENG) excerpt

  2. Normal urinary albumin excretion in recently diagnosed type 1 diabetic patients

    DEFF Research Database (Denmark)

    Lind, B; Jensen, T; Feldt-Rasmussen, B

    1989-01-01

    ... of diabetes. Urinary albumin excretion (median and 95% confidence interval) was similar in the diabetic patients and normal control subjects (8 (6-11) vs 8 (6-11) mg 24-h-1, NS). Four diabetic patients had urinary albumin excretion in the microalbuminuric range of 30-300 mg 24-h-1. There was no significant difference between the two groups in urinary excretion of retinol binding protein. The distribution among the individuals of both urinary proteins was positively skewed and similar in the two groups. In conclusion, no significant differences in the urinary excretion of albumin and retinol binding protein were found between recently diagnosed Type 1 diabetic patients and normal subjects.

  3. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    Directory of Open Access Journals (Sweden)

    Shan Yang

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. The similarity of the equivalent model for inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. An integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation is then proposed. The proposed method lets the inverter-based distributed generation be represented as an Iθ bus, which makes it suitable for calculating the power flow of a distribution network with current-limited inverter-based distributed generation. The low voltage ride-through capability of inverter-based distributed generation can be considered as well. Finally, tests of power flow and short circuit current calculation are performed on a 33-bus distribution network. The results from the proposed method are contrasted with those of the traditional method and the simulation method, which verifies the effectiveness of the integrated method suggested in this paper.

  4. A micromechanical study of porous composites under longitudinal shear and transverse normal loading

    DEFF Research Database (Denmark)

    Ashouri Vajari, Danial

    2015-01-01

    The mechanical response of porous unidirectional composites under transverse normal and longitudinal shear loading is studied using finite element analysis. The 3D model includes a discrete and random distribution of fibers and voids. The micromechanical failure mechanisms are taken into account. Finally, the computational prediction of the porous composite in the transverse normal-longitudinal shear stress space is obtained and compared with Puck's model. The results show that both interfaces with low fracture toughness and microvoids with even a small void volume fraction can significantly reduce...

  5. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy including quantum gravimeters and gradientometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field represented in the Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid of which mass and quadrupole momentum are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters.

  6. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo [Ajou University School of Medicine, Suwon (Korea, Republic of); Lee, Jae Sung [Seoul national University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen in visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited a psychiatry clinic for evaluation of ADHD. Their brain SPECT revealed a normal rCBF pattern in visual analysis and they were diagnosed clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala, brain stem, both cerebellar hemispheres, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri revealed significant hyperperfusion in the children. These results demonstrate that SPM can reveal more precise anatomical differences not seen in visual analysis.

  7. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo; Lee, Jae Sung

    2002-01-01

    This study compared rCBF patterns in normal adults and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution patterns not seen in visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 years) and 6 normal control children (M:F=4:2, 10.5±3.1 y) who visited a psychiatry clinic for evaluation of ADHD. Their brain SPECT revealed a normal rCBF pattern in visual analysis and they were diagnosed clinically normal. Using the SPM method, we compared the normal adult group's SPECT images with those of the 6 normal children and measured the extent of the areas with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angular gyri, both postcentral gyri, both superior frontal gyri, and both superior parietal lobes showed significant hyperperfusion in the normal adult group compared with the normal children group. The areas of the left amygdala, brain stem, both cerebellar hemispheres, left globus pallidus, both hippocampal formations, both parahippocampal gyri, both thalami, both unci, and both lateral and medial occipitotemporal gyri revealed significant hyperperfusion in the children. These results demonstrate that SPM can reveal more precise anatomical differences not seen in visual analysis.

  8. Mass distributions in disk galaxies

    NARCIS (Netherlands)

    Martinsson, Thomas; Verheijen, Marc; Bershady, Matthew; Westfall, Kyle; Andersen, David; Swaters, Rob

    We present results on luminous and dark matter mass distributions in disk galaxies from the DiskMass Survey. As expected for normal disk galaxies, stars dominate the baryonic mass budget in the inner region of the disk; however, at about four optical scale lengths (hR ) the atomic gas starts to

  9. Bayesian Option Pricing Using Mixed Normal Heteroskedasticity Models

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars Peter

    While stochastic volatility models improve on the option pricing error when compared to the Black-Scholes-Merton model, mispricings remain. This paper uses mixed normal heteroskedasticity models to price options. Our model allows for significant negative skewness and time varying higher order moments of the risk neutral distribution. Parameter inference using Gibbs sampling is explained and we detail how to compute risk neutral predictive densities taking into account parameter uncertainty. When forecasting out-of-sample options on the S&P 500 index, substantial improvements are found compared...

  10. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
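
    The Kolmogorov-Smirnov comparison used in this study can be sketched against a normal reference with only the standard library (a generic illustration, not the authors' code; testing against log-normal or Weibull would swap in the corresponding CDF):

```python
import math

def ks_statistic_normal(samples, mu, sigma):
    """One-sample Kolmogorov-Smirnov statistic D against a Normal(mu, sigma)
    reference: the largest gap between the empirical CDF and the model CDF,
    checked on both sides of each jump."""
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d
```

    The statistic is then compared with a critical value (dependent on n) to accept or reject the candidate distribution.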

  11. Effects of network topology on wealth distributions

    International Nuclear Information System (INIS)

    Garlaschelli, Diego; Loffredo, Maria I

    2008-01-01

    We focus on the problem of how the wealth is distributed among the units of a networked economic system. We first review the empirical results documenting that in many economies the wealth distribution is described by a combination of the log-normal and power-law behaviours. We then focus on the Bouchaud-Mezard model of wealth exchange, describing an economy of interacting agents connected through an exchange network. We report analytical and numerical results showing that the system self-organizes towards a stationary state whose associated wealth distribution depends crucially on the underlying interaction network. In particular, we show that if the network displays a homogeneous density of links, the wealth distribution displays either the log-normal or the power-law form. This means that the first-order topological properties alone (such as the scale-free property) are not enough to explain the emergence of the empirically observed mixed form of the wealth distribution. In order to reproduce this nontrivial pattern, the network has to be heterogeneously divided into regions with a variable density of links. We show new results detailing how this effect is related to the higher-order correlation properties of the underlying network. In particular, we analyse assortativity by degree and the pairwise wealth correlations, and discuss the effects that these properties have on each other

  12. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    As we know, normalization is a pre-processing stage for many types of problem statement. Normalization plays an especially important role in fields such as soft computing and cloud computing, for manipulating data by scaling its range down or up before it is used at a further stage. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. By referring to these normalization techniques we are ...
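
    The three techniques named in the abstract can be sketched as follows (a minimal illustration; the function names are my own):

```python
def min_max(xs, lo=0.0, hi=1.0):
    """Min-Max normalization: linearly rescale values to [lo, hi]."""
    a, b = min(xs), max(xs)
    return [lo + (x - a) * (hi - lo) / (b - a) for x in xs]

def z_score(xs):
    """Z-score normalization: zero mean, unit (population) std deviation."""
    n = len(xs)
    mu = sum(xs) / n
    sd = (sum((x - mu) ** 2 for x in xs) / n) ** 0.5
    return [(x - mu) / sd for x in xs]

def decimal_scaling(xs):
    """Decimal scaling: divide by 10**j so every |value| falls below 1."""
    j = 0
    while max(abs(x) for x in xs) / (10 ** j) >= 1.0:
        j += 1
    return [x / 10 ** j for x in xs]
```
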

  13. Modeling a Distribution of Mortgage Credit Losses

    Czech Academy of Sciences Publication Activity Database

    Gapko, Petr; Šmíd, Martin

    2012-01-01

    Roč. 60, č. 10 (2012), s. 1005-1023 ISSN 0013-3035 R&D Projects: GA ČR GD402/09/H045; GA ČR(CZ) GBP402/12/G097 Grant - others:Univerzita Karlova(CZ) 46108 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : credit risk * mortgage * delinquency rate * generalized hyperbolic distribution * normal distribution Subject RIV: AH - Economics Impact factor: 0.194, year: 2012 http://library.utia.cas.cz/separaty/2013/E/smid-modeling a distribution of mortgage credit losses.pdf

  14. Lognormal Behavior of the Size Distributions of Animation Characters

    Science.gov (United States)

    Yamamoto, Ken

    This study investigates the statistical property of the character sizes of animation, superhero series, and video game. By using online databases of Pokémon (video game) and Power Rangers (superhero series), the height and weight distributions are constructed, and we find that the weight distributions of Pokémon and Zords (robots in Power Rangers) follow the lognormal distribution in common. For the theoretical mechanism of this lognormal behavior, the combination of the normal distribution and the Weber-Fechner law is proposed.
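
    The proposed mechanism, a normal distribution on a logarithmic perceptual scale per the Weber-Fechner law, implies lognormal sizes: if ln W is normal, W is lognormal. A minimal sketch (parameter names are illustrative):

```python
import math
import random

def lognormal_sample(n, mu, sigma, seed=None):
    """Draw lognormal variates by exponentiating Normal(mu, sigma)
    draws: the log-scale mean mu is the log of the median weight."""
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]
```

    Taking logarithms of such a sample recovers the underlying normal distribution, which is how a lognormal fit to size data is usually checked.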

  15. Normal 201Tl SPECT myocardial scintigram - a basis for quantification of tomographic data using 180° rotation

    International Nuclear Information System (INIS)

    Bock, A.; Frenzel, R.; Meyer, M.; Pink, V.; Reisinger, I.; Strangfeld, D.

    1990-01-01

    Using 180° rotation for 201Tl myocardial perfusion SPECT, the influence of different myocardial wall thickness, depth-dependent resolution, attenuation and other factors is the reason for about 30% inhomogeneity from the anterolateral to the posteroseptal myocardium in the polar coordinate Bull's eye. It is very difficult for a visual observer to assess patients without knowledge of the corresponding normal distribution. We studied 25 normal males and calculated pixelwise vitality and washout rate between stress, 2-hr and 3.5-hr delay to obtain a normal Bull's eye reference databank. Before a patient's data can be analysed it is necessary to multiply them by the inverse normal Bull's eye. (author)

  16. A simple method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation

    International Nuclear Information System (INIS)

    Begnozzi, L.; Gentile, F.P.; Di Nallo, A.M.; Chiatti, L.; Zicari, C.; Consorti, R.; Benassi, M.

    1994-01-01

    Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning, they can be used in statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived by combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in the case of homogeneous or partial brain irradiation, has been considered. For the same total or partial volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction sizes of 1.5 Gy and of 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and α/β value has also been simulated: considering α/β=1.6 Gy or α/β=4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.)
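
    The normalized total dose used in place of the total dose can be sketched from the standard linear-quadratic relation NTD = D·(d + α/β)/(2 Gy + α/β), where D is the total dose and d the dose per fraction (a generic LQ formula offered for illustration, not code from the report):

```python
def normalized_total_dose(total_dose, dose_per_fraction, alpha_beta):
    """Normalized total dose (dose in 2 Gy fractions biologically
    equivalent to the given schedule, per the linear-quadratic model):
    NTD = D * (d + alpha/beta) / (2 + alpha/beta), all doses in Gy."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)
```

    For example, 60 Gy delivered in 3 Gy fractions with α/β = 1.6 Gy corresponds to an NTD of about 76.7 Gy, illustrating the fraction-size effect the report simulates.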

  17. A new technique for the use of microspheres for the study of intra-cortical distribution of renal blood flow. Results for a normal and sodium overloaded rat. Report of internship performed in the Laboratoire de Physiologie Physico-Chimique (C.E.N. Saclay)

    International Nuclear Information System (INIS)

    Poujeol, P.

    1972-06-01

    This academic work reports the simultaneous study, on the same kidney, of the distribution of glomerular filtration rates and the distribution of blood flow in the renal cortex. The author combined the technique of perfusion of sodium 14C-ferrocyanide, which allows the measurement of individual glomerular filtration rates, with a technique based on the use of microspheres, which allows the assessment of blood flow distribution in the glomeruli of different nephron classes. Experiments were performed on a normal rat and on a rat submitted to a chronic NaCl overload

  18. Optimal power flow for distribution networks with distributed generation

    Directory of Open Access Journals (Sweden)

    Radosavljević Jordan

    2015-01-01

    This paper presents a genetic algorithm (GA) based approach for the solution of the optimal power flow (OPF) in distribution networks with distributed generation (DG) units, including fuel cells, micro turbines, diesel generators, photovoltaic systems and wind turbines. The OPF is formulated as a nonlinear multi-objective optimization problem with equality and inequality constraints. Due to the stochastic nature of energy produced from renewable sources, i.e. wind turbines and photovoltaic systems, as well as load uncertainties, a probabilistic algorithm is introduced in the OPF analysis. The Weibull and normal distributions are employed to model the input random variables, namely the wind speed, solar irradiance and load power. The 2m+1 point estimate method and the Gram-Charlier expansion theory are used to obtain the statistical moments and the probability density functions (PDFs) of the OPF results. The proposed approach is examined and tested on a modified IEEE 34 node test feeder with five different integrated DG units. The obtained results prove the efficiency of the proposed approach to solve both deterministic and probabilistic OPF problems for different forms of the multi-objective function. As such, it can serve as a useful decision-making support tool for distribution network operators. [Projekat Ministarstva nauke Republike Srbije, br. TR33046]
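
    Drawing the Weibull and normal input samples for a probabilistic power-flow study can be sketched with the standard library (the parameter values and names here are illustrative, not taken from the paper, and the 2m+1 point estimate method itself is not shown):

```python
import random

def sample_opf_inputs(n, wind_scale=8.0, wind_shape=2.0,
                      load_mean=1.0, load_sd=0.05, seed=None):
    """Monte Carlo inputs for a probabilistic power-flow study:
    Weibull-distributed wind speeds (m/s) and normally distributed
    load multipliers, truncated at zero to stay physical."""
    rng = random.Random(seed)
    wind = [rng.weibullvariate(wind_scale, wind_shape) for _ in range(n)]
    load = [max(0.0, rng.gauss(load_mean, load_sd)) for _ in range(n)]
    return wind, load
```

    Each sampled (wind, load) pair would then feed one deterministic power-flow solve, from which the output moments and PDFs are assembled.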

  19. Neutron dosimetry and spectrometry with Bonner spheres. Working out a log-normal reference matrix

    International Nuclear Information System (INIS)

    Zaborowski, Henrick.

    1981-11-01

    From the experimental and theoretical studies made on the Bonner sphere system with a 6LiI(Eu) crystal and with a miniaturized 3He counter we obtain the normalized energy response functions R*sub(i)(E). This normalization is obtained by the mathematization of the resolution function R*(i,E) under the log-normal distribution hypothesis for monoenergetic neutrons, presented in April 1976 at the International Symposium on Californium-252. The fit of the log-normal hypothesis to the experimental and theoretical data is very satisfactory. The tabulated parameter values allow a precise interpolation, at all energies between 0.4 eV and 15 MeV and for all sphere diameters between 2 and 12 inches, of the discretized R*sub(ij) reference matrix for applications to neutron dosimetry and spectrometry
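
    A log-normal response (resolution) function of energy can be sketched as follows; the peak-normalized form and the parameter names are assumptions for illustration, not the report's R*sub(i)(E) tabulation:

```python
import math

def lognormal_response(e, e0, s):
    """Peak-normalized log-normal response at energy e (same units as
    the centre energy e0), with log-space width s: the response is a
    Gaussian in ln(e), so it is symmetric on a logarithmic energy axis."""
    return math.exp(-(math.log(e / e0)) ** 2 / (2.0 * s ** 2))
```

    Evaluating such a curve per sphere on an energy grid between 0.4 eV and 15 MeV is what yields a discretized response matrix of the kind described.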

  20. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  1. A log-sinh transformation for data normalization and variance stabilization

    Science.gov (United States)

    Wang, Q. J.; Shrestha, D. L.; Robertson, D. E.; Pokhrel, P.

    2012-05-01

    When quantifying model prediction uncertainty, it is statistically convenient to represent model errors that are normally distributed with a constant variance. The Box-Cox transformation is the most widely used technique to normalize data and stabilize variance, but it is not without limitations. In this paper, a log-sinh transformation is derived based on a pattern of errors commonly seen in hydrological model predictions. It is suited to applications where prediction variables are positively skewed and the spread of errors is seen to first increase rapidly, then slowly, and eventually approach a constant as the prediction variable becomes greater. The log-sinh transformation is applied in two case studies, and the results are compared with one- and two-parameter Box-Cox transformations.
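
    The transform at the heart of the paper can be sketched in a few lines. The form below, z = (1/b)·ln sinh(a + b·y), follows the published log-sinh definition; the parameter values and the round-trip check are illustrative, not taken from the case studies.

```python
import math

def log_sinh(y, a, b):
    """Log-sinh transform of Wang et al. (2012): z = (1/b) * ln(sinh(a + b*y)).

    For large a + b*y, sinh grows like exp/2, so z ~ y + const (spread kept);
    for small arguments sinh is nearly linear, so z ~ (1/b) * ln(a + b*y),
    a log-like compression of the lower tail. Requires a + b*y > 0.
    """
    return math.log(math.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    """Inverse transform: y = (asinh(exp(b*z)) - a) / b."""
    return (math.asinh(math.exp(b * z)) - a) / b
```

    A quick round trip with illustrative parameters a = 0.1, b = 0.5 recovers the original value: `inv_log_sinh(log_sinh(2.0, 0.1, 0.5), 0.1, 0.5)` returns 2.0 up to floating-point error.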

  2. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
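
    As a toy illustration of the Bayesian interpretation described above (not the Los Alamos code itself), one can combine a Gaussian measurement likelihood with a log-normal prior on a numerical grid. All parameter values here are hypothetical.

```python
import math

def posterior_mean(measurement, meas_sd, prior_mu, prior_sd, n=2000):
    """Grid approximation to the posterior mean of a true (positive) result,
    given one measurement with Gaussian counting error and a log-normal
    prior -- a toy version of the Bayesian bioassay interpretation."""
    grid = [10 ** (-3 + 6 * i / (n - 1)) for i in range(n)]  # log-spaced 1e-3..1e3
    num, den = 0.0, 0.0
    for i in range(1, n):
        t, dt = grid[i], grid[i] - grid[i - 1]
        like = math.exp(-0.5 * ((measurement - t) / meas_sd) ** 2)
        prior = math.exp(-0.5 * ((math.log(t) - prior_mu) / prior_sd) ** 2) / t
        w = like * prior * dt
        num += t * w
        den += w
    return num / den

# A measurement at the noise level vs. one well above it (hypothetical units):
pm_low = posterior_mean(0.0, 1.0, 0.0, 1.0)
pm_high = posterior_mean(5.0, 1.0, 0.0, 1.0)
```

    With a null measurement the posterior mean stays small and positive (the prior keeps the true result above zero), while a clear positive measurement dominates the prior.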

  3. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  4. Distributed gas sensing with optical fibre photothermal interferometry.

    Science.gov (United States)

    Lin, Yuechuan; Liu, Fei; He, Xiangge; Jin, Wei; Zhang, Min; Yang, Fan; Ho, Hoi Lut; Tan, Yanzhen; Gu, Lijuan

    2017-12-11

    We report the first distributed optical fibre trace-gas detection system based on photothermal interferometry (PTI) in a hollow-core photonic bandgap fibre (HC-PBF). Absorption of a modulated pump propagating in the gas-filled HC-PBF generates distributed phase modulation along the fibre, which is detected by a dual-pulse heterodyne phase-sensitive optical time-domain reflectometry (OTDR) system. A quasi-distributed sensing experiment with two 28-meter-long HC-PBF sensing sections connected by single-mode transmission fibres demonstrated a limit of detection (LOD) of ∼10 ppb acetylene with a pump power level of 55 mW and an effective noise bandwidth (ENBW) of 0.01 Hz, corresponding to a normalized detection limit of 5.5 ppb⋅W/Hz. A distributed sensing experiment over a 200-meter-long sensing cable made of serially connected HC-PBFs demonstrated a LOD of ∼5 ppm with 62.5 mW peak pump power and 11.8 Hz ENBW, or a normalized detection limit of 312 ppb⋅W/Hz. The spatial resolution of the current distributed detection system is limited to ∼30 m, but it could be reduced to 1 meter or less by optimizing the phase detection system.

  5. On the issues of probability distribution of GPS carrier phase observations

    Science.gov (United States)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In common practice the observables related to the Global Positioning System (GPS) are assumed to follow a Gauss-Laplace normal distribution. Actually, full knowledge of the observables' distribution is not required for parameter estimation by means of the least-squares algorithm, which is based on the functional relation between observations and unknown parameters as well as the associated variance-covariance matrix. However, the probability distribution of GPS observations plays a key role in procedures for quality control (e.g. outlier and cycle slip detection, ambiguity resolution) and in reliability-related assessments of the estimation results. Under non-ideal observation conditions with respect to the factors impacting GPS data quality, for example multipath effects and atmospheric delays, the validity of the normal distribution postulate of GPS observations is in doubt. This paper presents a detailed analysis of the distribution properties of GPS carrier phase observations using double difference residuals. For this purpose 1-Hz observation data from the permanent SAPOS

  6. An evaluation of two-channel ChIP-on-chip and DNA methylation microarray normalization strategies

    Science.gov (United States)

    2012-01-01

    Background The combination of chromatin immunoprecipitation with two-channel microarray technology enables genome-wide mapping of binding sites of DNA-interacting proteins (ChIP-on-chip) or sites with methylated CpG di-nucleotides (DNA methylation microarray). These powerful tools are the gateway to understanding gene transcription regulation. Since the goals of such studies, the sample preparation procedures, the microarray content and study design are all different from transcriptomics microarrays, the data pre-processing strategies traditionally applied to transcriptomics microarrays may not be appropriate. Particularly, the main challenge of the normalization of "regulation microarrays" is (i) to make the data of individual microarrays quantitatively comparable and (ii) to keep the signals of the enriched probes, representing DNA sequences from the precipitate, as distinguishable as possible from the signals of the un-enriched probes, representing DNA sequences largely absent from the precipitate. Results We compare several widely used normalization approaches (VSN, LOWESS, quantile, T-quantile, Tukey's biweight scaling, Peng's method) applied to a selection of regulation microarray datasets, ranging from DNA methylation to transcription factor binding and histone modification studies. Through comparison of the data distributions of control probes and gene promoter probes before and after normalization, and assessment of the power to identify known enriched genomic regions after normalization, we demonstrate that there are clear differences in performance between normalization procedures. Conclusion T-quantile normalization applied separately on the channels and Tukey's biweight scaling outperform other methods in terms of the conservation of enriched and un-enriched signal separation, as well as in identification of genomic regions known to be enriched. T-quantile normalization is preferable as it additionally improves comparability between microarrays. In
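
    For reference, plain quantile normalization, one of the strategies compared above, can be sketched as follows. The T-quantile and Tukey's biweight variants that the study recommends are modifications not shown here; the data are made up.

```python
def quantile_normalize(arrays):
    """Plain quantile normalization: force every array to share the same
    empirical distribution, namely the mean of the sorted values across
    arrays. `arrays` is a list of equal-length lists of intensities."""
    n = len(arrays[0])
    sorted_cols = [sorted(a) for a in arrays]
    # Reference distribution: mean of the k-th smallest value across arrays.
    ref = [sum(col[k] for col in sorted_cols) / len(arrays) for k in range(n)]
    out = []
    for a in arrays:
        # Rank each entry within its own array, then substitute the
        # reference value of the same rank.
        order = sorted(range(n), key=lambda i: a[i])
        normed = [0.0] * n
        for rank, idx in enumerate(order):
            normed[idx] = ref[rank]
        out.append(normed)
    return out

normed = quantile_normalize([[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]])
```

    After normalization both toy arrays contain exactly the values 1.5, 3.5 and 5.5, in the rank order of their original entries, which is why this method can blur the enriched/un-enriched separation the abstract discusses.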

  7. Probability distributions of placental morphological measurements and origins of variability of placental shapes.

    Science.gov (United States)

    Yampolsky, M; Salafia, C M; Shlakhter, O

    2013-06-01

    While the mean shape of the human placenta is round with a centrally inserted umbilical cord, significant deviations from this ideal are fairly common, and may be clinically meaningful. Traditionally, they are explained by trophotropism. We have proposed a hypothesis explaining typical variations in placental shape by randomly determined fluctuations in the growth process of the vascular tree. It has been recently reported that umbilical cord displacement in a birth cohort has a log-normal probability distribution, which indicates that the displacement between an initial point of origin and the centroid of the mature shape is a result of accumulation of random fluctuations of the dynamic growth of the placenta. To confirm this, we investigate statistical distributions of other features of placental morphology. In a cohort of 1023 births at term, digital photographs of placentas were recorded at delivery. Excluding cases with velamentous cord insertion or missing clinical data left 1001 (97.8%) cases for which placental surface morphology features were measured. Best-fit statistical distributions for them were obtained using EasyFit. The best-fit distributions of umbilical cord displacement, placental disk diameter, area, perimeter, and maximal radius calculated from the cord insertion point are of heavy-tailed type, similar in shape to log-normal distributions. This is consistent with a stochastic origin of deviations of placental shape from normal. Deviations of placental shape descriptors from average have heavy-tailed distributions similar in shape to log-normal. This evidence points away from trophotropism, and towards a spontaneous stochastic evolution of the variants of placental surface shape features. Copyright © 2013 Elsevier Ltd. All rights reserved.
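
    The distribution-fitting step (done with EasyFit in the study) amounts to maximum-likelihood estimation; for a log-normal candidate this reduces to the mean and standard deviation of the log-data. The sample below is synthetic, not the birth-cohort data.

```python
import math
import random
import statistics

def fit_lognormal(xs):
    """MLE for a log-normal: mu and sigma are the mean and (population)
    standard deviation of the log-transformed data."""
    logs = [math.log(x) for x in xs]
    mu = statistics.fmean(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

# Synthetic heavy-tailed measurements standing in for, e.g., cord displacement:
random.seed(1)
sample = [random.lognormvariate(1.0, 0.4) for _ in range(20000)]
mu_hat, sig_hat = fit_lognormal(sample)
```

    With 20,000 draws the recovered parameters land within a couple of percent of the generating values (mu = 1.0, sigma = 0.4).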

  8. The Best and the Rest: Revisiting the Norm of Normality of Individual Performance

    Science.gov (United States)

    O'Boyle, Ernest, Jr.; Aguinis, Herman

    2012-01-01

    We revisit a long-held assumption in human resource management, organizational behavior, and industrial and organizational psychology that individual performance follows a Gaussian (normal) distribution. We conducted 5 studies involving 198 samples including 633,263 researchers, entertainers, politicians, and amateur and professional athletes.…

  9. Consumption of 125I labelled fibrinogen in normal subjects

    International Nuclear Information System (INIS)

    Langer, B.; Camargo, E.E.; Reis, J.M.M. dos; Carvalho, N.; Leao, L.E.P.

    1978-01-01

    The metabolism of iodine-125 labelled human fibrinogen is studied using three different preparations of the radiopharmaceutical (0.9, 1.3 and 1.84 iodine atoms per fibrinogen molecule) in 19 normal subjects. An aliquot of 40 μCi of fibrinogen-125I is injected into each subject under normal dietary conditions, and blood samples are withdrawn at 30, 60, 180, 360 and 720 minutes after the injection and, thereafter, one daily sample during 10 days. The compartmental distribution of the tracer is defined by plotting plasma and serum sample counts on semilogarithmic graph paper. A rapid phase and 3 compartments are obtained. A 'rapid' consumption half-life and a 'real' consumption half-life are defined. The fibrinogen clottability is followed up to the last blood sample by checking the ratios of serum and plasma radioactivities.

  10. Monitoring device for the power distribution within a nuclear reactor core

    International Nuclear Information System (INIS)

    Tanzawa, Tomio; Kumanomido, Hironori; Toyoshi, Isamu.

    1986-01-01

    Purpose: To provide a monitoring device for the power distribution in the reactor core that calculates the power distribution based on measurements by instruments disposed within the core of BWR type reactors. Constitution: The power distribution monitoring device comprises a signal correcting device, a signal normalizing device and a power distribution calculating device, in which the power distribution calculating device consists of an average power calculating device for four fuel assemblies and an average power calculating device for individual fuel assemblies. Gamma-ray signals corrected by the signal correcting device and signals from neutron detectors are input to the signal normalizing device, both of which are calibrated to determine the axial gamma-ray signal distribution in the central water gap region, with four fuel assemblies as the unit. The average power of the four fuel assemblies is input to the fuel assembly average power calculating device and allocated to each fuel assembly's average power, thereby attaining the purpose. Further, thermal restriction values are calculated, helping to secure fuel integrity. (Kamimura, M.)

  11. Operator-normalized quantum arrival times in the presence of interactions

    International Nuclear Information System (INIS)

    Hegerfeldt, G.C.; Seidel, D.; Muga, J.G.; Navarro, B.

    2004-01-01

    We model ideal arrival-time measurements for free quantum particles and for particles subject to an external interaction by means of a narrow and weak absorbing potential. This approach is related to the operational approach of measuring the first photon emitted from a two-level atom illuminated by a laser. By operator normalizing the resulting time-of-arrival distribution, a distribution is obtained which for freely moving particles not only recovers the axiomatically derived distribution of Kijowski for states with purely positive momenta but is also applicable to general momentum components. For particles interacting with a square barrier the mean arrival time and corresponding 'tunneling time' obtained at the transmission side of the barrier become independent of the barrier width (Hartman effect) for arbitrarily wide barriers, i.e., without the transition to the ultraopaque, classical-like regime dominated by wave packet components above the barrier

  12. Absolute-magnitude distributions of supernovae

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, Dean; Wright, John [Department of Physics, Xavier University of Louisiana, New Orleans, LA 70125 (United States); Jenkins III, Robert L. [Applied Physics Department, Richard Stockton College, Galloway, NJ 08205 (United States); Maddox, Larry, E-mail: drichar7@xula.edu [Department of Chemistry and Physics, Southeastern Louisiana University, Hammond, LA 70402 (United States)

    2014-05-01

    The absolute-magnitude distributions of seven supernova (SN) types are presented. The data used here were primarily taken from the Asiago Supernova Catalogue, but were supplemented with additional data. We accounted for both foreground and host-galaxy extinction. A bootstrap method is used to correct the samples for Malmquist bias. Separately, we generate volume-limited samples, restricted to events within 100 Mpc. We find that the superluminous events (M_B < –21) make up only about 0.1% of all SNe in the bias-corrected sample. The subluminous events (M_B > –15) make up about 3%. The normal Ia distribution was the brightest with a mean absolute blue magnitude of –19.25. The IIP distribution was the dimmest at –16.75.

  13. Option Pricing with Asymmetric Heteroskedastic Normal Mixture Models

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars

    This paper uses asymmetric heteroskedastic normal mixture models to fit return data and to price options. The models can be estimated straightforwardly by maximum likelihood, have high statistical fit when used on S&P 500 index return data, and allow for substantial negative skewness and time-varying higher order moments of the risk neutral distribution. When forecasting out-of-sample a large set of index options between 1996 and 2009, substantial improvements are found compared to several benchmark models in terms of dollar losses and the ability to explain the smirk in implied volatilities ...

  14. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian systems by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  15. Influence of the statistical distribution of bioassay measurement errors on the intake estimation

    International Nuclear Information System (INIS)

    Lee, T. Y; Kim, J. K

    2006-01-01

    The purpose of this study is to provide guidance for selecting an error distribution, by analyzing the influence of the statistical distribution assumed for bioassay measurement errors on the intake estimate. For this purpose, intakes were estimated by the maximum likelihood method for cases where the error distribution is assumed to be normal or lognormal, and the estimated intakes under the two assumptions were compared. According to the results of this study, when measurement results for lung retention are somewhat greater than the limit of detection, the distribution type has negligible influence on the results. In contrast, for measurements of the daily excretion rate, the results obtained under the lognormal assumption were 10% higher than those obtained under the normal assumption. In view of these facts, where the uncertainty is governed by counting statistics, the distribution type has no influence on the intake estimate; where other uncertainty components predominate, it is clearly desirable to estimate the intake assuming a lognormal distribution.
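
    The contrast between the two error models can be sketched directly: under normal errors with constant variance the maximum-likelihood intake is a least-squares ratio, while under lognormal errors it is the geometric mean of per-sample ratios. The measurement values and excretion fractions below are hypothetical, chosen only to show that the two estimates differ.

```python
import math

# Hypothetical daily excretion measurements m_i and excretion fractions r_i
# predicted per unit intake (both made up for illustration).
m = [1.2, 0.8, 0.5, 0.35]
r = [0.010, 0.007, 0.005, 0.003]

# Normal errors with constant variance: the MLE of intake is the
# ordinary least-squares slope of m against r.
I_normal = sum(mi * ri for mi, ri in zip(m, r)) / sum(ri * ri for ri in r)

# Lognormal errors: the MLE minimises squared log-residuals, i.e. it is the
# geometric mean of the per-sample ratios m_i / r_i.
I_lognormal = math.exp(sum(math.log(mi / ri) for mi, ri in zip(m, r)) / len(m))
```

    Even with these four well-behaved samples the two estimates differ by a few percent; the gap grows as measurements approach the detection limit, which is the regime the study examines.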

  16. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can provide detailed information on the collision cascade in the interior of targets. This report presents a compilation of data on the angular distribution of sputtered atoms from monatomic solids, at normal and oblique incidence, for various combinations of incident ions and target atoms. (K.I.)

  17. AC distribution system for TFTR pulsed loads

    International Nuclear Information System (INIS)

    Carroll, R.F.; Ramakrishnan, S.; Lemmon, G.N.; Moo, W.I.

    1977-01-01

    This paper outlines the AC distribution system associated with the Tokamak Fusion Test Reactor and discusses the significant areas related to design, protection, and equipment selection, particularly where there is a departure from normal utility and industrial applications

  18. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus is a brain disorder ... Normal pressure hydrocephalus occurs when excess cerebrospinal fluid ...

  19. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adapted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
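
    The parametric-bootstrap idea can be sketched as follows, with independent normals and without the calibration step the paper adds to improve accuracy; all parameter values are made up.

```python
import random
import statistics

random.seed(7)

# Hypothetical paired data; X and Y are taken as independent normals for this
# sketch (the paper treats the general bivariate-normal case).
x = [random.gauss(10.0, 1.0) for _ in range(50)]
y = [random.gauss(5.0, 0.5) for _ in range(50)]

# Fit the marginal normal models.
mx, sx = statistics.fmean(x), statistics.stdev(x)
my, sy = statistics.fmean(y), statistics.stdev(y)

# Parametric bootstrap sample of the ratio X/Y drawn from the fitted model.
B = 20000
ratios = sorted(random.gauss(mx, sx) / random.gauss(my, sy) for _ in range(B))

# Nonparametric upper limit taken at the 95th-percentile order statistic;
# the paper's bootstrap calibration of this cut-off is omitted here.
upper_95 = ratios[int(0.95 * B)]
```

    With the true ratio centred near 2, the uncalibrated upper limit lands a little above 2; the calibration step adjusts the order-statistic index so the limit attains its nominal coverage.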

  20. Efeitos da transformação de uma variável com distribuição normal em sua inversa sobre os parâmetros de sua distribuição usando técnicas de Monte Carlo Effects of transforming a normally distributed variable into its inverse on parameters of the distribution using Monte Carlo techniques

    Directory of Open Access Journals (Sweden)

    Mirella Leme Franco Geraldini Sirol

    2006-06-01

    Full Text Available Four simulation studies were conducted to verify the distribution of the inverse of a normally distributed variable as a function of different variances, means, truncation points and sample sizes. The simulated variables were GMD, normally distributed and representing average daily gain, and DIAS, obtained as the inverse of GMD and representing days to reach a fixed body weight. In all studies the SAS® (1990) system was used for data simulation and subsequent analysis of the results. The sample means of DIAS depended on the standard deviations used in the simulation. Regression analyses showed a reduction in the mean and standard deviation of DIAS as the mean of GMD increased. Including a truncation point between 10 and 25% of the mean of GMD reduced the mean of GMD and increased that of DIAS when the coefficient of variation of GMD exceeded 25%. The effect of group size on the means of GMD and DIAS was not significant, but the mean sample standard deviation and CV of GMD increased with group size. Because of the dependence between the mean and the standard deviation, and the variation observed in the standard deviations of DIAS as a function of group size, using DIAS as a selection criterion may reduce accuracy. Therefore, to replace GMD with DIAS, an analysis method robust enough to remove the heterogeneity of variance is required.
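
    The core simulation, inverting a truncated normal variable and watching how the mean of the inverse depends on the spread, can be sketched without SAS; the truncation point and parameters below are illustrative, not those of the study.

```python
import random

random.seed(11)

def mean_inverse(mu, sd, n=200000):
    """Average of DIAS = 1/GMD for GMD ~ Normal(mu, sd), keeping only draws
    above a truncation point (here 5% of the mean), as when a truncation
    point is imposed on the gain variable."""
    total = count = 0
    while count < n:
        g = random.gauss(mu, sd)
        if g > 0.05 * mu:
            total += 1.0 / g
            count += 1
    return total / count

m_small = mean_inverse(1.0, 0.1)  # low-variance gain
m_big = mean_inverse(1.0, 0.3)    # high-variance gain
```

    By Jensen's inequality E[1/X] > 1/E[X], so both means exceed 1, and the gap widens with the standard deviation of GMD, which is the dependence between mean and spread that the abstract reports for DIAS.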

  1. Nevanlinna theory, normal families, and algebraic differential equations

    CERN Document Server

    Steinmetz, Norbert

    2017-01-01

    This book offers a modern introduction to Nevanlinna theory and its intricate relation to the theory of normal families, algebraic functions, asymptotic series, and algebraic differential equations. Following a comprehensive treatment of Nevanlinna’s theory of value distribution, the author presents advances made since Hayman’s work on the value distribution of differential polynomials and illustrates how value- and pair-sharing problems are linked to algebraic curves and Briot–Bouquet differential equations. In addition to discussing classical applications of Nevanlinna theory, the book outlines state-of-the-art research, such as the effect of the Yosida and Zalcman–Pang method of re-scaling to algebraic differential equations, and presents the Painlevé–Yosida theorem, which relates Painlevé transcendents and solutions to selected 2D Hamiltonian systems to certain Yosida classes of meromorphic functions. Aimed at graduate students interested in recent developments in the field and researchers wor...

  2. Best Statistical Distribution of flood variables for Johor River in Malaysia

    Science.gov (United States)

    Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.

    2012-12-01

    A complex flood event is always characterized by a few characteristics such as flood peak, flood volume, and flood duration, which might be mutually correlated. This study explored the statistical distribution of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years. The data were analysed based on water year (July-June). Five distributions, namely Log-Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of all three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distribution of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, and the Kolmogorov-Smirnov test suggested that GEV is the best for peakflow. The results of this research can be used to improve flood frequency analysis.
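
    The goodness-of-fit comparison can be illustrated with a hand-rolled one-sample Kolmogorov-Smirnov statistic on synthetic peakflows; the study's Anderson-Darling test and its Generalized Pareto/GEV fits are omitted, and the data below are made up.

```python
import math
import random
import statistics

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ks_stat(data, cdf):
    """One-sample Kolmogorov-Smirnov statistic: largest gap between the
    empirical CDF and the candidate CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max(abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
               for i, x in enumerate(xs))

# Synthetic positively skewed peakflows (log-normal by construction).
random.seed(3)
peaks = [random.lognormvariate(5.0, 0.6) for _ in range(400)]

mu, sd = statistics.fmean(peaks), statistics.stdev(peaks)
logs = [math.log(p) for p in peaks]
lmu, lsd = statistics.fmean(logs), statistics.stdev(logs)

d_normal = ks_stat(peaks, lambda v: norm_cdf((v - mu) / sd))
d_lognormal = ks_stat(peaks, lambda v: norm_cdf((math.log(v) - lmu) / lsd))
```

    The skewed sample yields a visibly smaller KS statistic for the log-normal fit than for the normal fit, which is the kind of ranking the study performs across its five candidate distributions.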

  3. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Tie-Jiun [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Gao, Jun [INPAC, Shanghai Key Laboratory for Particle Physics and Cosmology,Department of Physics and Astronomy, Shanghai Jiao-Tong University, Shanghai 200240 (China); High Energy Physics Division, Argonne National Laboratory,Argonne, Illinois, 60439 (United States); Huston, Joey [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Nadolsky, Pavel [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Schmidt, Carl; Stump, Daniel [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Wang, Bo-Ting; Xie, Ke Ping [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Dulat, Sayipjamal [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); School of Physics Science and Technology, Xinjiang University,Urumqi, Xinjiang 830046 (China); Center for Theoretical Physics, Xinjiang University,Urumqi, Xinjiang 830046 (China); Pumplin, Jon; Yuan, C.P. [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States)

    2017-03-20

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A correction is proposed to address a bias in asymmetric uncertainties introduced by the Taylor series approximation. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
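
    The Hessian-to-replicas conversion can be sketched for a single observable: each replica displaces the central value along the Hessian eigenvector directions with independent standard-normal weights. This shows only the symmetric Gaussian variant; the paper's handling of asymmetric errors, PDF positivity, and the log-normal and Watt-Thorne sampling procedures is not reproduced, and the numbers are hypothetical.

```python
import random

def mc_replicas(f0, f_plus, f_minus, n_rep, seed=0):
    """Monte-Carlo replicas of an observable from Hessian error sets.

    f0                   : central value
    f_plus[i], f_minus[i]: values on the +/- displacements along Hessian
                           eigenvector direction i
    Each replica shifts f0 by sum_i 0.5*(f_plus[i]-f_minus[i])*R_i with
    R_i ~ N(0, 1) -- the symmetric-Gaussian version of the sampling.
    """
    rng = random.Random(seed)
    reps = []
    for _ in range(n_rep):
        shift = sum(0.5 * (p - m) * rng.gauss(0.0, 1.0)
                    for p, m in zip(f_plus, f_minus))
        reps.append(f0 + shift)
    return reps

# Two hypothetical eigenvector directions with half-widths 0.09 and 0.03:
reps = mc_replicas(1.0, [1.10, 1.04], [0.92, 0.98], 50000)
mean = sum(reps) / len(reps)
sd = (sum((r - mean) ** 2 for r in reps) / len(reps)) ** 0.5
```

    The replica ensemble reproduces the central value and the quadrature sum of the symmetric Hessian uncertainties, here sqrt(0.09² + 0.03²) ≈ 0.095.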

  4. The use of biological isodoses ''IsobioGy 2'' for evaluation of tumour and normal tissues response for fractionated irradiation

    International Nuclear Information System (INIS)

    Maciejewski, B.; Skolyszewski, J.; Majewski, S.; Lobodziec, W.; Jedynak, T.; Slosarek, K.

    1988-01-01

    Divergences between physical and biological dose distributions were analysed using the linear-quadratic model. It was found that small variations in the physical dose distribution and differences in normal tissue sensitivity to changes in dose per fraction, expressed by the α/β value, can cause a large difference between physical and biological doses. This difference increases significantly when one field instead of two is treated daily. If there is not enough separation between treated fields, the biological dose may increase dramatically. The use of biological 'isobioGy 2' isodoses, instead of physical isodoses, can provide important information on the biological effect in tumour or normal tissue and may diminish the risk of giving too high a dose to normal tissue and too low a dose to the tumour. 6 figs., 13 refs. (author)
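
    The linear-quadratic bookkeeping behind 2-Gy-equivalent ('isobioGy 2') isodoses can be sketched with the standard isoeffect formula EQD2 = D·(d + α/β)/(2 + α/β); the doses and α/β values below are textbook-style examples, not taken from the paper.

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equieffective dose in 2-Gy fractions from the linear-quadratic model:
    EQD2 = D * (d + a/b) / (2 + a/b), with D the total physical dose and
    d the dose per fraction (all in Gy)."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# The same physical schedule (60 Gy in 3-Gy fractions) read through two
# different alpha/beta values:
tumour = eqd2(60.0, 3.0, 10.0)       # a/b = 10 Gy, typical tumour
late_tissue = eqd2(60.0, 3.0, 3.0)   # a/b = 3 Gy, late-reacting normal tissue
```

    The identical physical dose of 60 Gy is worth 65 Gy in 2-Gy equivalents for the tumour but 72 Gy for late-reacting normal tissue, which is exactly the physical-versus-biological divergence the abstract describes.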

  5. The quotient of normal random variables and application to asset price fat tails

    Science.gov (United States)

    Caginalp, Carey; Caginalp, Gunduz

    2018-06-01

    The quotient of random variables with normal distributions is examined and proven to have power law decay, with density f(x) ≃ f₀x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [-1, 1). For ρ = -1 we obtain a particularly simple closed-form solution for all x ∈ ℝ. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price change model, we prove that the relative price change has a density that decays with an x⁻² power law. Various parameter limits are established.
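
    The x⁻² tail is easy to check by simulation in the zero-mean, uncorrelated special case, where the quotient of two standard normals is exactly Cauchy distributed:

```python
import random

random.seed(42)
N = 200000
# Ratio of two independent standard normals: in this special case the exact
# law is Cauchy, whose density decays as x^-2.
z = [random.gauss(0, 1) / random.gauss(0, 1) for _ in range(N)]

def tail_frac(t):
    """Empirical P(|Z| > t)."""
    return sum(1 for v in z if abs(v) > t) / N

# A density f(x) ~ f0 * x^-2 implies t * P(|Z| > t) -> 2*f0 as t grows;
# for the Cauchy, f0 = 1/pi, so the limit is 2/pi ~ 0.637.
est = [t * tail_frac(t) for t in (5.0, 10.0, 20.0)]
```

    The products t·P(|Z| > t) stabilise near 2/π ≈ 0.637 rather than collapsing to zero, which is the signature of the x⁻² power-law tail (a normal tail would make these products vanish rapidly).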

  6. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Directory of Open Access Journals (Sweden)

    Wesley K Thompson

    2015-12-01

    Full Text Available Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the

  7. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Science.gov (United States)

    Thompson, Wesley K; Wang, Yunpeng; Schork, Andrew J; Witoelar, Aree; Zuber, Verena; Xu, Shujing; Werge, Thomas; Holland, Dominic; Andreassen, Ole A; Dale, Anders M

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the implications of
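
    The two-component model described above is straightforward to prototype. Below is a minimal sketch, assuming illustrative parameter values (pi1, sigma0, sigma1 are hypothetical, not estimates from the paper): null z-scores are drawn from N(0, 1), non-null z-scores from a zero-mean normal with larger variance, and the local false discovery rate follows from Bayes' rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not estimates from the paper)
pi1 = 0.02      # proportion of non-null loci
sigma0 = 1.0    # null test statistics ~ N(0, sigma0^2)
sigma1 = 2.5    # non-null test statistics ~ N(0, sigma1^2), sigma1 > sigma0

# Simulate GWAS-like z-scores from the scale mixture of two zero-mean normals
n = 100_000
is_nonnull = rng.random(n) < pi1
z = rng.normal(0.0, np.where(is_nonnull, sigma1, sigma0))

def norm_pdf(x, s):
    """Density of N(0, s^2) at x."""
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def local_fdr(z_obs):
    """Posterior probability that an observed statistic is null (Bayes' rule)."""
    f0 = (1.0 - pi1) * norm_pdf(z_obs, sigma0)
    f1 = pi1 * norm_pdf(z_obs, sigma1)
    return f0 / (f0 + f1)
```

    Near z = 0 the local false discovery rate is close to one, while far out in the tails it drops toward zero, which is what makes thresholding on significance informative under this model.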

  8. Distribution and diversity of flora and fauna in International Institute ...

    African Journals Online (AJOL)

    Distribution and diversity of flora and fauna in International Institute Of Tropical Agriculture (IITA) forest and nature reserve, Ibadan, Oyo state, Nigeria. ... was also sighted in the study site and several other birds which normally winter around the lake. Key words: Ecology, Distribution, Diversity, Forest, Nature Reserve, IITA, ...

  9. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    Directory of Open Access Journals (Sweden)

    Mike S Fowler

    Full Text Available The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments, and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical) feedback models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample skewness and kurtosis and decreasing mean kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations. We
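
    The AR(1) construction discussed in the abstract can be sketched as follows; alpha is an illustrative autoregression coefficient (alpha > 0 reddens the spectrum, alpha = 0 gives white noise), and each series is standardized so that colours are compared at equal variance, as the distribution-shape argument requires.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1_series(n, alpha, rng):
    """Coloured noise via AR(1): x[t] = alpha * x[t-1] + e[t], e[t] ~ N(0, 1).
    alpha > 0 reddens the spectrum; alpha = 0 gives white noise."""
    e = rng.normal(size=n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = alpha * x[t - 1] + e[t]
    # Standardize so series of different colour are compared at equal variance
    return (x - x.mean()) / x.std()

white = ar1_series(10_000, 0.0, rng)
red = ar1_series(10_000, 0.8, rng)

def sample_kurtosis(x):
    """Fourth standardized sample moment (about 3.0 for a normal sample)."""
    return float(((x - x.mean()) ** 4).mean() / x.var() ** 2)
```

    Comparing sample_kurtosis (and the analogous sample skewness) across many replicate series at each alpha is one way to see the finite-time-scale departures from normality that the paper describes.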

  10. Simultaneous distribution of AC and DC power

    Science.gov (United States)

    Polese, Luigi Gentile

    2015-09-15

    A system and method for the transport and distribution of both AC (alternating current) power and DC (direct current) power over wiring infrastructure normally used for distributing AC power only, for example, residential and/or commercial buildings' electrical wires is disclosed and taught. The system and method permits the combining of AC and DC power sources and the simultaneous distribution of the resulting power over the same wiring. At the utilization site a complementary device permits the separation of the DC power from the AC power and their reconstruction, for use in conventional AC-only and DC-only devices.

  11. Football goal distributions and extremal statistics

    Science.gov (United States)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
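
    The comparison above — an empirical tail against a fitted Poisson — can be sketched as follows, using made-up goal counts rather than the paper's data; the Poisson MLE is simply the sample mean, and a heavy tail shows up as empirical mass at high scores far exceeding the Poisson prediction.

```python
import math

import numpy as np

# Made-up goal counts for illustration (not the paper's data): mostly low
# scores, with a few rare blowouts supplying a heavy tail
goals = np.array([0] * 80 + [1] * 95 + [2] * 60 + [3] * 30
                 + [4] * 12 + [5] * 5 + [7] * 2 + [10] * 1)

lam = goals.mean()  # the Poisson MLE is just the sample mean

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i)
                     for i in range(k))

# Heavy tail: the empirical mass at high scores exceeds the Poisson prediction
empirical_tail = float((goals >= 7).mean())
poisson_tail = poisson_sf(7, lam)
```

    On data like these, empirical_tail exceeds poisson_tail by an order of magnitude, which is the kind of discrepancy that motivates the extremal-statistics fit in the paper.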

  12. Multivariate log-skew-elliptical distributions with applications to precipitation data

    KAUST Repository

    Marchenko, Yulia V.

    2009-07-13

    We introduce a family of multivariate log-skew-elliptical distributions, extending the list of multivariate distributions with positive support. We investigate their probabilistic properties such as stochastic representations, marginal and conditional distributions, and existence of moments, as well as inferential properties. We demonstrate, for example, that as for the log-t distribution, the positive moments of the log-skew-t distribution do not exist. Our emphasis is on two special cases, the log-skew-normal and log-skew-t distributions, which we use to analyze US national (univariate) and regional (multivariate) monthly precipitation data. © 2009 John Wiley & Sons, Ltd.
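
    A log-skew-normal variate, one of the two special cases emphasized above, can be sampled by exponentiating a skew-normal. Below is a minimal sketch assuming Azzalini's additive representation of the skew-normal; xi, omega, and alpha are illustrative parameters, not values fitted to the precipitation data.

```python
import numpy as np

rng = np.random.default_rng(2)

def skew_normal(n, alpha, rng):
    """Azzalini skew-normal draws via the additive representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1 independent N(0, 1)."""
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    u0 = np.abs(rng.normal(size=n))
    u1 = rng.normal(size=n)
    return delta * u0 + np.sqrt(1.0 - delta ** 2) * u1

# Illustrative location/scale/shape (assumed, not fitted to precipitation data)
xi, omega, alpha = 0.0, 0.5, 3.0

# Log-skew-normal: exponentiate a skew-normal; support is strictly positive
y = np.exp(xi + omega * skew_normal(100_000, alpha, rng))
```

    Unlike the log-skew-t discussed in the abstract, the log-skew-normal has finite positive moments, so sample means of draws like y are stable.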

  13. Multivariate log-skew-elliptical distributions with applications to precipitation data

    KAUST Repository

    Marchenko, Yulia V.; Genton, Marc G.

    2009-01-01

    We introduce a family of multivariate log-skew-elliptical distributions, extending the list of multivariate distributions with positive support. We investigate their probabilistic properties such as stochastic representations, marginal and conditional distributions, and existence of moments, as well as inferential properties. We demonstrate, for example, that as for the log-t distribution, the positive moments of the log-skew-t distribution do not exist. Our emphasis is on two special cases, the log-skew-normal and log-skew-t distributions, which we use to analyze US national (univariate) and regional (multivariate) monthly precipitation data. © 2009 John Wiley & Sons, Ltd.

  14. Investigation on 3D dose distribution in digital breast tomosynthesis

    Science.gov (United States)

    Masi, M.

    2017-03-01

    Monte Carlo calculations for dosimetry in digital breast tomosynthesis (DBT) require experimental validation. We measured the 3D dose distribution in a breast phantom in a DBT scan, using XR-QA2 radiochromic films. We positioned film pieces at the entrance surface, at the bottom surface and at four depths between adjacent slabs in the 5-slab, 5-cm-thick phantom simulating a compressed breast with 50% glandular fraction. We irradiated the phantom at 40 kV (half value layer 1.1 mm Al) for three angular tilts of the beam central axis (±25° and 0°, normal incidence). We determined the transverse and longitudinal distributions of the average dose in the phantom (in terms of air kerma normalized to the entrance air kerma), showing the angular dependence of the depth-resolved 3D dose distributions. In transverse planes the maximum dose variations were between 5.0% and 14.8% for normal incidence, and 8.6% from the central to the tilted view. In the direction of the beam axis, the dose decreases by up to about 71% from the entrance to the exit value. The estimated backscatter fraction was between 3% and 8%.

  15. Studies on the distribution of hematopoietic bone marrow by bone marrow scintigraphy, 2

    International Nuclear Information System (INIS)

    Fujimori, Katsuhiko

    1976-01-01

    Distribution of the leukemic marrow was investigated in 42 cases by bone marrow scintigraphy using 99mTc sulfur colloid in association with clinical findings and ferrokinetics studies in order to clarify hematopoietic function in leukemia. The 42 cases comprised 12 acute myelogenous, 17 chronic myelogenous, 3 lymphatic, 2 monocytic and 7 atypical leukemias, and one erythroleukemia. The 12 acute myelogenous leukemias were classified into 3 types A, B and C. Type A showed a distribution similar to that obtained with normal controls; ferrokinetics studies, however, indicated complete absence of erythropoiesis. Type B showed complete lack of 99mTc activity in usual marrow sites, although ferrokinetics data showed normal erythropoietic function. Type C showed abnormal concentration of 99mTc sulfur colloid in the tibiae. The 17 chronic myelogenous leukemias showed reduced 99mTc activity in usual marrow sites and remarkably expanded marrow extending into the distal femurs, proximal and distal tibiae and bones of the feet. 2 acute lymphatic leukemia patients showed complete absence of 99mTc activity; the one chronic case showed almost normal distribution. Monocytic leukemia showed decreased marrow distribution in the sternum and vertebrae. Of 6 atypical leukemias one showed almost normal distribution; the others, including a case with hypoplastic leukemia, demonstrated marrow extension similar to that observed in chronic myelogenous or monocytic leukemia. Erythroleukemia showed increased concentrations of 99mTc activity in the usual marrow sites and marked marrow expansion throughout all long bones. These results suggest a discrepancy between bone marrow distribution and hematopoietic function in cases of acute myelogenous leukemia. (J.P.N.)

  16. Radioactivities evaluation code system for high temperature gas cooled reactors during normal operation

    International Nuclear Information System (INIS)

    Ogura, Kenji; Morimoto, Toshio; Suzuki, Katsuo.

    1979-01-01

    A radioactivity evaluation code system for high temperature gas-cooled reactors during normal operation was developed to study the behavior of fission products (FP) in the plants. The system consists of a code for the calculation of FP diffusion in fuel (FIPERX), a code for FP deposition in the primary cooling system (PLATO), a code for FP transfer and emission in nuclear power plants (FIPPI-2), and a code for the exposure dose due to emitted FPs (FEDOSE). The FIPERX code calculates the time evolution of the FP concentration distribution, the FP flow distribution, the FP partial pressure distribution, and the FP emission rate into the coolant. The amount and distribution of FP deposition in the primary cooling system can be evaluated by the PLATO code. The FIPPI-2 code can be used to estimate the amount of FPs in nuclear power plants and the amount of FPs emitted from the plants. The exposure dose to residents around a nuclear power plant during operation is calculated by the FEDOSE code, which evaluates the dose due to external exposure in normal operation and in accidents, and the internal dose from inhalation of the radioactive plume and from foods. Further study of this code system by comparison with experimental data is planned. (Kato, T.)

  17. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering Based on the Newly Developed Self-consistent RC/EMIC Waves Model by Khazanov et al. [2006

    Science.gov (United States)

    Khazanov, G. V.; Gallagher, D. L.; Gamayunov, K.

    2007-01-01

    It is well known that the effects of EMIC waves on RC ion and RB electron dynamics strongly depend on such particle/wave characteristics as the phase-space distribution function, frequency, wave-normal angle, wave energy, and the form of the wave spectral energy density. Therefore, realistic characteristics of EMIC waves should be properly determined by modeling the RC-EMIC wave evolution self-consistently. Such a self-consistent model has been progressively developed by Khazanov et al. [2002-2006]. It solves a system of two coupled kinetic equations: one describes the RC ion dynamics and the other describes the energy density evolution of EMIC waves. Using this model, we present the effectiveness of relativistic electron scattering and compare our results with previous work in this area of research.

  18. Differential geometry of CR-submanifolds of a normal almost para contact manifold

    International Nuclear Information System (INIS)

    Shahid, M.H.

    1992-12-01

    The aim of this paper is to study the geometry of CR-submanifolds of a normal almost para contact manifold. We discuss the integrability conditions of distributions involved in the definition and geometry of leaves of CR-submanifolds, some results on CR-submanifolds with parallel structures and contact CR-product are also given. (author). 10 refs

  19. Distributing the Corporate Income Tax: Revised U.S. Treasury Methodology

    OpenAIRE

    Cronin, Julie Anne; Lin, Emily Y.; Power, Laura; Cooper, Michael

    2013-01-01

    The purpose of this analysis is to improve the U.S. Department of the Treasury’s distributional model and methodology by defining new model parameters. We compute the percentage of capital income attributable to normal versus supernormal return, the percentage of normal return attributable to the "cash flow tax" portion of the tax that does not impose a tax burden, and the portion of the burdensome tax on the normal return to capital borne by capital income versus labor income. In summary, 82...

  20. Normal differential renal function does not indicate a normal kidney after partial ureteropelvic obstruction and subsequent relief in 2-week-old piglets

    Energy Technology Data Exchange (ETDEWEB)

    Dissing, Thomas H.; Mikkelsen, Mette Marie; Pedersen, Michael; Froekiaer, Joergen; Djurhuus, Jens Christian [University of Aarhus, Institute of Clinical Medicine, Aarhus (Denmark); Eskild-Jensen, Anni [Aarhus University Hospital, Department of Nuclear Medicine, Aarhus Sygehus, Aarhus (Denmark); Gordon, Isky [University College London, Institute of Child Health, London (United Kingdom); University College London, Radiology and Physics Unit, Institute of Child Health, London (United Kingdom)

    2008-09-15

    We investigated the functional consequences of relieving ureteric obstruction in young pigs with experimental hydronephrosis (HN) induced by partial unilateral ureteropelvic obstruction. Three groups of animals were followed from the age of 2 weeks to the age of 14 weeks: eight animals had severe (grades 3-4) HN throughout the study; six animals had relief of the obstruction after 4 weeks; six animals received sham operations at both ages. Morphological and functional examinations were performed at age 6 weeks and again at age 14 weeks and consisted of magnetic resonance imaging (MRI), technetium-diethylenetriaminepentaacetic acid (99mTc-DTPA) renography, renal technetium-dimercaptosuccinic acid (99mTc-DMSA) scintigraphy, and glomerular filtration rate (GFR) measurement. After relief of the partial obstruction, there was reduction of the pelvic diameter and improvement of urinary drainage. Global and relative kidney function was not significantly affected by either obstruction or its relief. Renal 99mTc-DMSA scintigraphy showed a change both in the appearance of the kidney and in the distribution of tracer within it, even after relief of obstruction. This study shows that partial ureteric obstruction in young pigs may be associated with little effect on global and differential kidney function. However, even after relief of HN, the distribution of 99mTc-DMSA in the kidney remains abnormal, suggesting that a normal differential renal function may not represent a normal kidney. (orig.)